Is there an existing issue / discussion for this?

Is there an existing answer for this in FAQ?

Current Behavior
The Qwen-1 models ship with a qwen.tiktoken vocabulary file, and tokenization_qwen.py loads it in tiktoken's format, which is very fast. But from Qwen1.5 through Qwen2 to the current Qwen2.5, the config always specifies Qwen2Tokenizer, and even the Fast variant does not appear to use tiktoken.

I can copy qwen.tiktoken and tokenization_qwen.py directly into the Qwen2.5 tokenizer directory and use them. The token IDs produced by encode() match the originals (see the sketch below), but I have not yet tested for other inconsistencies, since the official guidance is not to mix the Qwen1 and Qwen2 tokenizers.
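For reference, a minimal sketch of the encode-consistency check described above; the model name and the local directory holding the copied qwen.tiktoken / tokenization_qwen.py are illustrative assumptions, not official paths:

```python
from transformers import AutoTokenizer

# Official Qwen2.5 tokenizer (Qwen2TokenizerFast)
hf_tok = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-7B-Instruct")

# Qwen-1 style tiktoken tokenizer, loaded from a local copy via remote code
tt_tok = AutoTokenizer.from_pretrained(
    "./qwen25-with-tiktoken",  # hypothetical local directory
    trust_remote_code=True,
)

# Compare raw token IDs on a few sample strings
for text in ["Hello, world!", "你好,世界!", "def f(x):\n    return x"]:
    a, b = hf_tok.encode(text), tt_tok.encode(text)
    assert a == b, f"token IDs diverge for {text!r}: {a} vs {b}"
print("encode() outputs match on the sampled inputs")
```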
Expected Behavior
Models from Qwen2 onward should be able to use tiktoken, with instructions provided for how to do so. A sketch of the Qwen-1 loading path follows.
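For context, a sketch of how Qwen-1's tokenization_qwen.py turns qwen.tiktoken (one "base64-token rank" pair per line) into a tiktoken.Encoding. The split regex and special-token ordering are reproduced from memory of the Qwen-1 code and the special-token list is abbreviated, so treat the details as assumptions:

```python
import base64
import tiktoken

# Word-splitting regex as used by the Qwen-1 tokenizer (assumed)
PAT_STR = (
    r"""(?i:'s|'t|'re|'ve|'m|'ll|'d)|[^\r\n\p{L}\p{N}]?\p{L}+|\p{N}"""
    r"""| ?[^\s\p{L}\p{N}]+[\r\n]*|\s*[\r\n]+|\s+(?!\S)|\s+"""
)

def load_tiktoken_bpe(path):
    # each line of qwen.tiktoken: "<base64-encoded token> <rank>"
    with open(path, "rb") as f:
        return {
            base64.b64decode(tok): int(rank)
            for tok, rank in (line.split() for line in f if line.strip())
        }

ranks = load_tiktoken_bpe("qwen.tiktoken")

# Special tokens are appended after the mergeable ranks; Qwen-1 also
# registers ~200 "<|extra_i|>" tokens here (omitted in this sketch)
base = len(ranks)
special_tokens = {
    "<|endoftext|>": base,
    "<|im_start|>": base + 1,
    "<|im_end|>": base + 2,
}

enc = tiktoken.Encoding(
    "qwen",
    pat_str=PAT_STR,
    mergeable_ranks=ranks,
    special_tokens=special_tokens,
)
print(enc.encode("Hello, world!"))
```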
Steps To Reproduce
No response
Environment

- OS: Ubuntu
- Python: 3.8
- Transformers: 4.40.2
- PyTorch: 2.0.1
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`): 12.1
Anything else?
No response