Add an additional Chinese tokenizer #6304

howard-haowen started this conversation in Language Support
Replies: 2 comments

(comment bodies not captured)
Labels: `enhancement` (Feature requests and improvements), `lang / zh` (Chinese language data and models), `feat / tokenizer` (Feature: Tokenizer)
2 participants
This discussion was converted from issue #6304 on December 10, 2020 at 12:38.