Issues: flashinfer-ai/flashinfer
Pinned: #682: Deprecation Notice: Python 3.8 Wheel Support to End in future... (opened Dec 18, 2024 by yzh119)
#920: RMSNorm failed with "no kernel image is available for execution on the device" (opened Mar 7, 2025 by nanmi; see the rmsnorm sketch after this list)
#919: [Bugfix] CUDAGraph Compatibility in AppendPagedKVCacheKernel for Variable-Length Inputs (opened Mar 7, 2025 by SungBalance)
#916: ValueError: Invalid mode: forward_mode=<ForwardMode.TARGET_VERIFY: 6> (opened Mar 6, 2025 by tchaton)
#915: flashinfer.prefill.single_prefill_with_kv_cache fails with an error when running on A100 (opened Mar 6, 2025 by FengzhuoZhang; see the prefill sketch after this list)
#911: flashinfer_kernels.abi3.so: undefined symbol: _ZN5torch3jit17parseSchemaOrNameERKSsb (opened Mar 5, 2025 by Idonthaveaname-wq)
#907: AttributeError: module 'flashinfer' has no attribute 'mla' (opened Mar 3, 2025 by ThomasBaruzier)
#906: [Feature] Can flashinfer support different custom masks for different attention heads? (opened Mar 2, 2025 by yuzhenmao)
#903: Can't run on CUDA 12.8 with an older driver version because of a CUTLASS issue (opened Feb 27, 2025 by mickaelseznec)
#879: Possible Bug in chain_speculative_sampling: output_emitted_token_num Exceeds Expected Limit (opened Feb 19, 2025 by JaeminK)
#877: The current _append_paged_kv_cache_kernel does not support k_head_dim differing from v_head_dim (opened Feb 19, 2025 by qiyuxinlin)
#838: [Feature] Can BatchDecodeWithPagedKVCacheWrapper return attention scores for all tokens, not just logsumexp? (opened Feb 14, 2025 by yawnzh)
#809: PrefillPlan tries to allocate more memory than the float_workspace_size_in_bytes passed in (opened Feb 12, 2025 by rchardx)
#806: [BUG] attention/prefill.cuh(138): error: expression must have a constant value (opened Feb 11, 2025 by haohaibo)
#805: Numerical stability issue in recent commits since 0.2.0 (opened Feb 11, 2025 by rchardx) [labels: bug, priority: high]
#791: [Feature] Reuse JIT code path for building AOT wheel (opened Feb 6, 2025 by abcdabcd987) [label: enhancement]
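
The "no kernel image is available" error in #920 usually means the installed wheel was not compiled for the local GPU's compute capability. A minimal sketch of the kind of call that surfaces it, assuming a CUDA build of flashinfer; the shapes and dtypes below are illustrative, not taken from the report:

```python
import torch
import flashinfer

# Illustrative shapes: a [batch, hidden] input and a [hidden] RMSNorm weight.
x = torch.randn(4, 4096, dtype=torch.float16, device="cuda")
w = torch.ones(4096, dtype=torch.float16, device="cuda")

# Fused RMSNorm kernel; this call raises
# "no kernel image is available for execution on the device"
# when the wheel lacks compiled code for the GPU's architecture.
y = flashinfer.norm.rmsnorm(x, w)
print(y.shape)  # torch.Size([4, 4096])
```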
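
For #915, a minimal sketch of the single-request prefill API named in the title, assuming the default NHD layout and a grouped-query-attention head configuration; the exact setup that fails on A100 is in the report, and these shapes are only illustrative:

```python
import torch
import flashinfer

num_qo_heads, num_kv_heads, head_dim = 32, 8, 128
# NHD layout: [seq_len, num_heads, head_dim], fp16 on GPU.
q = torch.randn(128, num_qo_heads, head_dim, dtype=torch.float16, device="cuda")
k = torch.randn(1024, num_kv_heads, head_dim, dtype=torch.float16, device="cuda")
v = torch.randn(1024, num_kv_heads, head_dim, dtype=torch.float16, device="cuda")

# Single-request prefill attention over an in-memory (unpaged) KV cache.
o = flashinfer.single_prefill_with_kv_cache(q, k, v, causal=True)
print(o.shape)  # torch.Size([128, 32, 128])
```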