
flash attention support for chatglm3-6b #31652

Open
elimsjxr opened this issue Jun 27, 2024 · 1 comment

Comments

@elimsjxr
Is there anyone working on FlashAttention support for chatglm3-6b?

@amyeroberts
Collaborator

Hi @elimsjxr, thanks for opening this issue!

Since the modeling code for ChatGLM3-6B is hosted on the Hub rather than in `transformers` itself, the best place to request this feature is to open a discussion in the community tab, directly on the checkpoint's page on the Hub.
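
For context, this is roughly how FlashAttention-2 is requested through `transformers`' `from_pretrained`. For ChatGLM3-6B, the call below is only a sketch: whether it works depends entirely on the custom modeling code shipped with the `THUDM/chatglm3-6b` checkpoint (loaded via `trust_remote_code`), which is exactly what this issue is asking about. It also assumes a CUDA GPU and the `flash-attn` package installed.

```python
# Sketch: requesting FlashAttention-2 at load time via transformers.
# For hub-hosted modeling code, this is honored only if that code
# implements the flash_attention_2 path; otherwise loading raises an error.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "THUDM/chatglm3-6b"  # checkpoint with its own modeling code on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(
    model_id,
    trust_remote_code=True,                    # run the hub-hosted modeling code
    torch_dtype=torch.float16,                 # flash-attn requires fp16/bf16
    attn_implementation="flash_attention_2",   # honored only if the remote code supports it
    device_map="auto",                         # requires the accelerate package
)
```

If the remote code does not declare FlashAttention-2 support, `transformers` rejects the `attn_implementation` request at load time, which is why the feature request belongs on the checkpoint's community tab rather than in this repository.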
