A micro Chinese word-segmentation engine with a comprehensive set of algorithms | A micro tokenizer for Chinese
Some experiments with machine learning
Manually curated medical-domain vocabulary, terminology, and related corpora, usable for speech recognition, dialogue systems, and other NLP models...
Source code for an ACL2017 paper on Chinese word segmentation
Source codes for paper "Neural Networks Incorporating Dictionaries for C...
Chinese word segmentation tool; a Java implementation of THULAC.
Lucene/Solr analyzer plugin. Supports macOS, Linux x86/64, and Windows x86/64. ...
Chinese word segmentation implemented with deep learning
Open Source State-of-the-art Chinese Word Segmentation System with BiLST...
A deep-learning-based natural language processing library
Sub-Character Representation Learning