
Local Ollama Models Not Responsive #11049

@moosetunes

Description


Problem (one or two sentences)

I have several Ollama models downloaded to my PC, and I'm using the latest Roo Code build, 3.45. I was using Perplexity.ai to help me debug the VS Code output, and it reported: "Roo Code 3.45 has a fundamental bug - it creates a new task/shadow Git repo for every single chat, regardless of tool settings. The rm -rf temporarily clears it, but Roo immediately recreates the corrupted task system."

Context (who is affected and when)

This happens whenever I attempt to use a local model. Environment: Ubuntu, VS Code, Roo Code plugin, 32 GB RAM, Ryzen 7.

Reproduction steps

1. Select Provider: Ollama
2. Select Model: llama3.1:latest

Here are the contents of my config.yaml (the `models:` section appears exactly as shown, with nothing after it):

```yaml
name: Local Config
version: 1.0.0
schema: v1

models:
```

Expected result

Asking "What is Python?" or "List the files in my workspace." should get a response.

Actual result

"API Request" appears, then nothing.

Variations tried (optional)

The models respond normally when run directly from bash.
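One way to narrow this down (a hedged sketch, not part of the original report): query the Ollama HTTP API directly, which is what the extension talks to. `/api/tags` and `/api/generate` are standard Ollama endpoints, and `localhost:11434` is Ollama's default port; adjust if your server is configured differently.

```shell
# JSON body for a one-shot, non-streaming completion request.
payload='{"model": "llama3.1:latest", "prompt": "What is Python?", "stream": false}'

# Confirm the Ollama server is reachable on its default port and that
# llama3.1:latest shows up in the model list.
curl -s http://localhost:11434/api/tags || echo "Ollama server not reachable"

# Ask Ollama directly over HTTP, bypassing VS Code entirely. If this also
# hangs, the problem is on the Ollama side; if it answers, the hang is on
# the Roo Code / extension side.
curl -s -d "$payload" http://localhost:11434/api/generate || echo "generate request failed"
```

If the second `curl` returns a completion, that points the hang at the extension rather than the local model backend.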

App Version

3.45.0

API Provider (optional)

Ollama

Model Used (optional)

llama3.1:latest

Roo Code Task Links (optional)

No response

Relevant logs or errors (optional)

```
[createTask] parent task 019c05ab-1af1-722e-9d7c-dc20b32daf5c.5be4536c instantiated
[Task#getCheckpointService] initializing shadow git
[t#create] git = 2.51.0
[t#initShadowGit] creating shadow git repo at /home/mooseshoes/.config/Code/User/globalStorage/rooveterinaryinc.roo-cline/tasks/019c05ab-1af1-722e-9d7c-dc20b32daf5c/checkpoints
[t#initShadowGit] initialized shadow repo with base commit 3308c9fbd634374a0d5c3ca54231ed1a60a08f89 in 563ms
[Task#getCheckpointService] service initialized
```
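The description above mentions that an `rm -rf` temporarily clears the shadow git repos. A minimal sketch of that workaround, assuming the tasks path from the log output (the `clear_checkpoints` helper name is hypothetical, and this only removes the per-task `checkpoints` shadow repos, not the chats themselves; as the description notes, Roo recreates them on the next task):

```shell
# Remove each task's shadow git checkpoint repo under a given tasks directory.
clear_checkpoints() {
  local tasks_dir="$1"
  # Each task directory holds a "checkpoints" shadow git checkout (see the
  # initShadowGit log line above); delete only those subdirectories.
  rm -rf "$tasks_dir"/*/checkpoints
}

# Safe demo on a throwaway directory so the function can be exercised
# without touching the real Roo Code storage:
demo="$(mktemp -d)"
mkdir -p "$demo/task-1/checkpoints"
clear_checkpoints "$demo"
ls "$demo/task-1"   # the checkpoints subdirectory has been removed
```

Against the real storage (path taken from the log above) the call would be `clear_checkpoints "$HOME/.config/Code/User/globalStorage/rooveterinaryinc.roo-cline/tasks"`.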

Labels: bug
