[Resource Introduction]
GitHub - xujiajun/gotokenizer: A tokenizer for Go based on a dictionary and Bigram language models. (Currently only supports Chinese word segmentation.)
- gotokenizer - a Golang tokenizer based on a dictionary and Bigram language models. (Currently only supports Chinese word segmentation.)