
[FIX] No Local AI connection (tried, llama.cpp OR ollama) #796

Open · 3 of 14 tasks
jppaolim opened this issue Jun 3, 2024 · 0 comments

Labels
fix  Fix something that isn't working as expected

jppaolim commented Jun 3, 2024

Describe the bug

After following the tutorial and configuring everything, chat in the Khoj client doesn't return any answer.

To Reproduce

macOS Sonoma 14.5
Download Docker
docker-compose up
Download the client
Configure ollama (for instance) or LM Studio (see the connectivity check sketched after this list)
Start a chat
Also tried with some documents indexed, but it doesn't change anything
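A minimal connectivity check, assuming Ollama listens on its default port 11434 and the compose service is named server (inferred from the server-1 prefix in the logs below). One thing worth noting: when Khoj runs inside Docker on macOS, localhost inside the container refers to the container itself, so a local AI endpoint configured as http://localhost:11434 is not reachable from Khoj; Docker Desktop exposes the host as host.docker.internal instead.

# 1. Confirm Ollama answers on the macOS host
curl http://localhost:11434/api/tags

# 2. Confirm the Khoj container can reach it through host.docker.internal
#    ("server" is the assumed compose service name; if curl is missing from the
#    image, the same check can be done with a one-line python3 urllib call)
docker compose exec server curl http://host.docker.internal:11434/api/tags

# 3. Exercise Ollama's OpenAI-compatible chat endpoint, which an OpenAI-style
#    Khoj chat model config would call (the model name here is only an example)
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "hello"}]}'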

Screenshots

See the server logs below.

Platform

  • Server:
    • Cloud-Hosted (https://app.khoj.dev)
    • Self-Hosted Docker
    • Self-Hosted Python package
    • Self-Hosted source code
  • Client:
    • Obsidian
    • Emacs
    • Desktop app
    • Web browser
    • WhatsApp
  • OS:
    • Windows
    • macOS
    • Linux
    • Android
    • iOS

If self-hosted

  • Server Version [e.g. 1.0.1]: latest, built from Docker (June 2nd)
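To pin "latest" down to an exact image tag and ID, something along these lines should work (again assuming the compose service is called server; with the v1 CLI the command is docker-compose images):

docker compose images          # image tag and ID for every service in the stack
docker compose images server   # only the Khoj server service (name assumed)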

server-1 | [22:58:47.450854] DEBUG uvicorn.error: = connection is CONNECTING  protocol.py:256
server-1 | [22:58:47.456446] DEBUG uvicorn.error: < GET /api/chat/ws?conversation_id=3&region=% (MASKED IN THIS COPY / PASTE) HTTP/1.1  server.py:286
server-1 | [22:58:47.460393] DEBUG uvicorn.error: < host: localhost:42110  server.py:288
server-1 | [22:58:47.463016] DEBUG uvicorn.error: < connection: Upgrade  server.py:288
server-1 | [22:58:47.464992] DEBUG uvicorn.error: < pragma: no-cache  server.py:288
server-1 | [22:58:47.467932] DEBUG uvicorn.error: < cache-control: no-cache  server.py:288
server-1 | [22:58:47.471291] DEBUG uvicorn.error: < user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36  server.py:288
server-1 | [22:58:47.472692] DEBUG uvicorn.error: < upgrade: websocket  server.py:288
server-1 | [22:58:47.473238] DEBUG uvicorn.error: < origin: http://localhost:42110  server.py:288
server-1 | [22:58:47.473793] DEBUG uvicorn.error: < sec-websocket-version: 13  server.py:288
server-1 | [22:58:47.474464] DEBUG uvicorn.error: < accept-encoding: gzip, deflate, br, zstd  server.py:288
server-1 | [22:58:47.475405] DEBUG uvicorn.error: < accept-language: fr-FR,fr;q=0.9,en-US;q=0.8,en;q=0.7  server.py:288
server-1 | [22:58:47.476124] DEBUG uvicorn.error: < cookie: csrftoken=TcD5N3o51EGyvKLxSFLHXnvkVLGaFBWV; sessionid=mnu5v0bhf8puv3u78pdo35tggxw9chgo  server.py:288
server-1 | [22:58:47.476859] DEBUG uvicorn.error: < sec-websocket-key: u9XGbbzjoruqiKTC8t9wJA==  server.py:288
server-1 | [22:58:47.477413] DEBUG uvicorn.error: < sec-websocket-extensions: permessage-deflate; client_max_window_bits  server.py:288
server-1 | [22:58:47.487251] INFO uvicorn.error: ('192.168.65.1', 58271) - "WebSocket /api/chat/ws" [accepted]  websockets_impl.py:212
server-1 | [22:58:47.488508] DEBUG uvicorn.error: > HTTP/1.1 101 Switching Protocols  server.py:307
server-1 | [22:58:47.489475] DEBUG uvicorn.error: > Upgrade: websocket  server.py:309
server-1 | [22:58:47.490193] DEBUG uvicorn.error: > Connection: Upgrade  server.py:309
server-1 | [22:58:47.490789] DEBUG uvicorn.error: > Sec-WebSocket-Accept: hr5cJdI0NeonOqpcezc78EtUmrs=  server.py:309
server-1 | [22:58:47.491398] DEBUG uvicorn.error: > Sec-WebSocket-Extensions: permessage-deflate  server.py:309
server-1 | [22:58:47.491981] DEBUG uvicorn.error: > Date: Mon, 03 Jun 2024 22:58:47 GMT  server.py:309
server-1 | [22:58:47.492822] DEBUG uvicorn.error: > Server: Python/3.10 websockets/12.0  server.py:309
server-1 | [22:58:47.493497] INFO uvicorn.error: connection open  server.py:642
server-1 | [22:58:47.494049] DEBUG uvicorn.error: = connection is OPEN  protocol.py:357
server-1 | [22:58:53.316138] INFO uvicorn.access: 192.168.65.1:58266 - "GET /config HTTP/1.1" 200  httptools_impl.py:437
server-1 | [22:58:53.331505] DEBUG uvicorn.error: < CLOSE 1001 (going away) [2 bytes]  protocol.py:1172
server-1 | [22:58:53.332910] DEBUG uvicorn.error: = connection is CLOSING  protocol.py:1227
server-1 | [22:58:53.333710] DEBUG uvicorn.error: > CLOSE 1001 (going away) [2 bytes]  protocol.py:1178
server-1 | [22:58:53.334781] DEBUG uvicorn.error: x half-closing TCP connection  protocol.py:1319
server-1 | [22:58:53.335572] DEBUG uvicorn.error: = connection is CLOSED  protocol.py:1497
server-1 | [22:58:53.337237] DEBUG khoj.routers.api_chat: User default disconnected web socket  api_chat.py:491
server-1 | [22:58:53.338838] INFO uvicorn.error: connection closed  server.py:264
