ollama base URL as settings #97

Triggered via push June 29, 2024 02:02
Status Failure
Total duration 2m 22s

test.yml

on: push

Annotations

9 errors and 1 warning
tests/unit/config.test.ts > Load default settings: tests/unit/config.test.ts#L19
AssertionError: expected { general: { …(2) }, …(7) } to strictly equal { general: { …(2) }, …(7) }
- Expected
+ Received

  Object {
    "appearance": Object {
      "chat": Object {
        "fontSize": 3,
        "theme": "openai",
      },
    },
    "commands": Object {
      "engine": "",
      "model": "",
    },
    "engines": Object {
      "anthropic": Object {
        "model": Object { "chat": "" },
        "models": Object { "chat": Array [], "image": Array [] },
      },
      "google": Object {
        "model": Object { "chat": "" },
        "models": Object { "chat": Array [], "image": Array [] },
      },
      "groq": Object {
        "model": Object { "chat": "" },
        "models": Object { "chat": Array [], "image": Array [] },
      },
      "mistralai": Object {
        "model": Object { "chat": "" },
        "models": Object { "chat": Array [], "image": Array [] },
      },
      "ollama": Object {
-       "baseURL": "http://127.0.0.1:11434",
        "model": Object { "chat": "llama2" },
        "models": Object { "chat": Array [], "image": Array [] },
      },
      "openai": Object {
        "apiKey": "",
-       "baseURL": "https://api.openai.com/v1",
        "model": Object { "chat": "gpt-3.5-turbo", "image": "dall-e-2" },
        "models": Object { "chat": Array [], "image": Array [] },
        "tts": Object { "model": "tts-1", "voice": "alloy" },
      },
    },
    "general": Object {
      "keepRunning": true,
      "language": "",
    },
    "instructions": Object {
      "default": "You are a helpful assistant. You are here to help the user with any questions they have.",
      "titling": "You are an assistant whose task is to find the best title for the conversation below. The title should be just a few words.",
      "titling_user": "Provide a title for the conversation above. Do not return anything other than the title. Do not wrap responses in quotes.",
    },
    "llm": Object {
      "autoVisionSwitch": true,
      "conversationLength": 5,
      "engine": "openai",
    },
    "plugins": Object {
      "browse": Object {},
      "dalle": Object { "enabled": true },
      "python": Object {},
      "tavily": Object {},
    },
    "shortcuts": Object {
      "anywhere": Object { "ctrl": true, "key": "Space", "shift": true },
      "chat": Object { "ctrl": true, "key": "Space" },
      "command": Object { "alt": true, "ctrl": true, "key": "Space" },
    },
  }

❯ tests/unit/config.test.ts:19:18
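The missing keys in the diff above suggest the test's expected defaults were updated for the new base URL setting while the defaults produced by the config loader were not (or the other way around). A minimal sketch of the two missing defaults, using only the values shown in the diff; the variable name and layout below are assumptions, not the project's actual code:

  // Hypothetical excerpt of the default engine settings; only the keys
  // involved in the failing diff are shown.
  const engineDefaults = {
    ollama: {
      baseURL: 'http://127.0.0.1:11434',     // expected by the test, absent from the loaded settings
      model: { chat: 'llama2' },
      models: { chat: [], image: [] },
    },
    openai: {
      apiKey: '',
      baseURL: 'https://api.openai.com/v1',  // likewise expected but missing
      model: { chat: 'gpt-3.5-turbo', image: 'dall-e-2' },
      models: { chat: [], image: [] },
      tts: { model: 'tts-1', voice: 'alloy' },
    },
  }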
tests/unit/engine_ollama.test.ts > Ollama Load Models: tests/unit/engine_ollama.test.ts#L42
AssertionError: expected false to be true // Object.is equality
- Expected: true
+ Received: false
❯ tests/unit/engine_ollama.test.ts:42:36
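Both this assertion and the constructor errors below point at src/services/ollama.ts line 27, presumably where the Ollama client is instantiated with the new configurable base URL. A hedged sketch of what that construction might look like, assuming the config shape from the first failure; the function name and fallback handling are assumptions:

  import { Ollama } from 'ollama/browser'

  // Build an Ollama client from the engine settings, falling back to the
  // documented local default when no base URL is configured.
  export function buildOllamaClient(config: { engines: { ollama: { baseURL?: string } } }): Ollama {
    const host = config.engines.ollama.baseURL || 'http://127.0.0.1:11434'
    return new Ollama({ host })
  }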
tests/unit/engine_ollama.test.ts > Ollama Basic: src/services/ollama.ts#L27
Error: [vitest] No "Ollama" export is defined on the "ollama/browser" mock. Did you forget to return it from "vi.mock"? If you need to partially mock a module, you can use "importOriginal" helper inside:
vi.mock("ollama/browser", async (importOriginal) => {
  const actual = await importOriginal()
  return {
    ...actual,
    // your mocked methods
  }
})
❯ new __vite_ssr_exports__.default src/services/ollama.ts:27:23
❯ tests/unit/engine_ollama.test.ts:52:18
The five remaining Ollama tests fail with the same error, each raised from src/services/ollama.ts:27:23:
tests/unit/engine_ollama.test.ts > Ollama completion: tests/unit/engine_ollama.test.ts:59:18
tests/unit/engine_ollama.test.ts > Ollama stream: tests/unit/engine_ollama.test.ts:72:18
tests/unit/engine_ollama.test.ts > Ollama image: tests/unit/engine_ollama.test.ts:84:18
tests/unit/engine_ollama.test.ts > Ollama addImageToPayload: tests/unit/engine_ollama.test.ts:90:18
tests/unit/engine_ollama.test.ts > Ollama streamChunkToLlmChunk Text: tests/unit/engine_ollama.test.ts:99:18
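All six constructor failures come from the same mock: the test file mocks "ollama/browser" but the factory does not return the named Ollama export that src/services/ollama.ts constructs at line 27. A sketch of a factory that follows the hint in the error message; the mocked methods (list, chat) and their return shapes are assumptions about what the tests exercise:

  import { vi } from 'vitest'

  vi.mock('ollama/browser', () => {
    // Minimal stand-in for the named export the service instantiates.
    class Ollama {
      async list() { return { models: [{ name: 'llama2' }] } }
      async chat() { return { message: { role: 'assistant', content: 'hi' } } }
    }
    // Also expose a default instance in case the module's default export is used.
    return { Ollama, default: new Ollama() }
  })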
build
Process completed with exit code 1.
build
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: JesseTG/[email protected]. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.