
Add support for Qwen3-vl models #2098

@Hansashawn

Description


Is your feature request related to a problem? Please describe.
When trying to load Qwen3-VL models such as Qwen3-VL-235B-A22B-Thinking or Qwen3-VL-8B-Instruct, I get the following error:

```
site-packages/llama_cpp/_internals.py", line 78, in close
    if self.sampler is not None:
AttributeError: 'LlamaModel' object has no attribute 'sampler'
```

Describe the solution you'd like
It looks like Qwen3-VL models are not supported yet. Could you add support for them?
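For context, the `AttributeError` suggests the traceback is masking the real load failure: if construction aborts before `sampler` is assigned (e.g. on an unsupported architecture), an unconditional attribute access during cleanup raises instead of letting the original error surface. A minimal sketch of that failure mode and a `getattr` guard (class and attribute names here are hypothetical, not llama-cpp-python's actual internals):

```python
class ModelSketch:
    """Hypothetical stand-in for a model wrapper whose __init__ can fail early."""

    def __init__(self, model_path: str, supported: bool = True):
        if not supported:
            # Load fails before self.sampler is ever assigned.
            raise ValueError(f"unsupported model architecture: {model_path}")
        self.sampler = object()

    def close(self) -> None:
        # Guard with getattr so teardown also works after a partial init,
        # instead of raising AttributeError and hiding the original error.
        if getattr(self, "sampler", None) is not None:
            self.sampler = None


# Simulate an object whose __init__ aborted before assigning sampler:
m = ModelSketch.__new__(ModelSketch)
m.close()  # no AttributeError thanks to the guard
```

With such a guard in place, the underlying "unsupported model" error from the loader would be reported instead of the misleading `AttributeError`.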
