Ollama and qwen models #432

@qubit0

Description
Here's my .env file:

# LLM API Configuration - pointing to local Ollama
LLM_API_KEY=ollama
LLM_BASE_URL=http://localhost:11434/api
LLM_MODEL_NAME=qwen3.5:0.8b
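
For reference, Ollama's native endpoints live under `/api` (e.g. `/api/chat`), while its OpenAI-compatible endpoints live under `/v1`. If the backend talks to the model through an OpenAI-style client (the `LLM_API_KEY=ollama` placeholder hints it might), a variant worth trying is the following — the `/v1` base URL here is an assumption on my part, not a confirmed fix:

```
# Hypothetical alternative .env — assumes the backend uses an OpenAI-compatible client
LLM_API_KEY=ollama
LLM_BASE_URL=http://localhost:11434/v1
LLM_MODEL_NAME=qwen3.5:0.8b
```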

I get the following in backend/logs/:

[2026-04-01 21:26:03] DEBUG [mirofish.api.generate_ontology:158] Simulation requirement: What would happen if they can't deliver 50% of their promise in 100 days?...
[2026-04-01 21:26:03] INFO [mirofish.api.generate_ontology:177] Created project: proj_450fa2d55c88
[2026-04-01 21:26:03] INFO [mirofish.api.generate_ontology:212] Text extraction complete, 44882 characters total
[2026-04-01 21:26:03] INFO [mirofish.api.generate_ontology:215] Calling LLM to generate ontology definition...
[2026-04-01 21:26:03] DEBUG [mirofish.request.log_response:62] Response: 500

Metadata

Assignees

No one assigned

    Labels

    LLM API (Any questions regarding the LLM API), question (Further information is requested)
