[Resource Name]
gotokenizer - A Go tokenizer based on a dictionary and a Bigram language model
[Resource Source]
github.com
[Resource Description]
GitHub - xujiajun/gotokenizer: A tokenizer based on dictionary and Bigram language models for Go. (Currently only supports Chinese segmentation)
- gotokenizer - A Go tokenizer based on a dictionary and a Bigram language model. (Currently only supports Chinese word segmentation)
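
For readers unfamiliar with the approach, the sketch below illustrates the general idea behind dictionary-plus-bigram segmentation: enumerate candidate words from a dictionary and pick the segmentation whose bigram language-model score is highest. This is a minimal illustration only, not gotokenizer's actual API; the dictionary entries, bigram probabilities, and function names here are toy assumptions.

```go
package main

import (
	"fmt"
	"math"
	"strings"
)

// Toy dictionary and bigram probabilities. In a real tokenizer these would be
// loaded from a dictionary file and a trained language model (hypothetical data here).
var dict = map[string]bool{
	"研究": true, "研究生": true, "生命": true, "命": true,
	"的": true, "起源": true, "生": true,
}

var bigram = map[string]float64{ // key is "prev next", value is P(next | prev)
	"<s> 研究": 0.4, "<s> 研究生": 0.2,
	"研究 生命": 0.5, "研究生 命": 0.1,
	"生命 的": 0.6, "命 的": 0.2,
	"的 起源": 0.7,
}

// score returns the log-probability of word following prev, with a small
// floor so unseen pairs are penalised rather than forbidden.
func score(prev, word string) float64 {
	if p, ok := bigram[prev+" "+word]; ok {
		return math.Log(p)
	}
	return math.Log(1e-6)
}

// segment runs a simple Viterbi-style search: for each rune position it keeps
// the best-scoring segmentation ending there, extending only with dictionary words.
func segment(text string) []string {
	runes := []rune(text)
	n := len(runes)
	type state struct {
		logp float64
		prev int    // start index of the last word on the best path
		word string // the last word itself
	}
	best := make([]*state, n+1)
	best[0] = &state{logp: 0, prev: -1, word: "<s>"}
	for end := 1; end <= n; end++ {
		for start := 0; start < end; start++ {
			if best[start] == nil {
				continue
			}
			w := string(runes[start:end])
			if !dict[w] {
				continue
			}
			lp := best[start].logp + score(best[start].word, w)
			if best[end] == nil || lp > best[end].logp {
				best[end] = &state{logp: lp, prev: start, word: w}
			}
		}
	}
	if best[n] == nil {
		return []string{text} // no dictionary path covers the input
	}
	// Walk back through the best path to recover the word sequence.
	var out []string
	for i := n; i > 0; i = best[i].prev {
		out = append([]string{best[i].word}, out...)
	}
	return out
}

func main() {
	// Prints: 研究 / 生命 / 的 / 起源
	fmt.Println(strings.Join(segment("研究生命的起源"), " / "))
}
```

The bigram scoring is what lets the segmenter prefer "研究 / 生命" over the greedy longest-match "研究生 / 命"; a pure dictionary maximum-match tokenizer cannot make that distinction.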
[Resource Collection]