Merge pull request #17 from sroussey/model-repository
Refactor KV Storage and Setup Models data to be stored
sroussey authored Jan 18, 2025
2 parents 933082e + b6c8d28 commit f0de635
Showing 106 changed files with 3,316 additions and 932 deletions.
Binary file modified bun.lockb
20 changes: 11 additions & 9 deletions docs/developers/01_getting_started.md
@@ -22,9 +22,9 @@
- [`packages/storage`](#packagesstorage)
- [`packages/ai`](#packagesai)
- [`packages/ai-provider`](#packagesai-provider)
- [`examples/cli`](#samplescli)
- [`examples/web`](#samplesweb)
- [`examples/ngraph`](#samplesngraph)
- [`examples/cli`](#examplescli)
- [`examples/web`](#examplesweb)
- [`examples/ngraph`](#examplesngraph)

# Developer Getting Started

@@ -51,13 +51,13 @@ After this, please read [Architecture](02_architecture.md) before attempting to [

```ts
import { TaskGraphBuilder } from "ellmers-core";
import { registerHuggingfaceLocalTasksInMemory } from "ellmers-ai-provider/hf-transformers/server";
import { registerHuggingfaceLocalTasksInMemory } from "ellmers-test";
// config and start up
registerHuggingfaceLocalTasksInMemory();

const builder = new TaskGraphBuilder();
builder
.DownloadModel({ model: "Xenova/LaMini-Flan-T5-783M" })
.DownloadModel({ model: "ONNX Xenova/LaMini-Flan-T5-783M q8" })
.TextRewriter({
text: "The quick brown fox jumps over the lazy dog.",
prompt: ["Rewrite the following text in reverse:", "Rewrite this to sound like a pirate:"],
@@ -79,15 +79,17 @@ import {
DataFlow,
TaskGraph,
TaskGraphRunner,
registerHuggingfaceLocalTasksInMemory,
} from "ellmers-core";
import { registerHuggingfaceLocalTasksInMemory } from "ellmers-test";

// config and start up
registerHuggingfaceLocalTasksInMemory();

// build and run graph
const graph = new TaskGraph();
graph.addTask(new DownloadModel({ id: "1", input: { model: "Xenova/LaMini-Flan-T5-783M" } }));
graph.addTask(
new DownloadModel({ id: "1", input: { model: "ONNX Xenova/LaMini-Flan-T5-783M q8" } })
);
graph.addTask(
new TextRewriterCompoundTask({
id: "2",
@@ -284,7 +286,7 @@ There is a JSONTask that can be used to build a graph. This is useful for saving
"id": "1",
"type": "DownloadModelCompoundTask",
"input": {
"model": ["Xenova/LaMini-Flan-T5-783M", "Xenova/m2m100_418M"]
"model": ["ONNX Xenova/LaMini-Flan-T5-783M q8", "ONNX Xenova/m2m100_418M q8"]
}
},
{
@@ -305,7 +307,7 @@ There is a JSONTask that can be used to build a graph. This is useful for saving
"id": "3",
"type": "TextTranslationCompoundTask",
"input": {
"model": "Xenova/m2m100_418M",
"model": "ONNX Xenova/m2m100_418M q8",
"source": "en",
"target": "es"
},
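For readers following the getting-started doc, a JSON graph like the one above is consumed by the `JsonTask` mentioned in the prose. A minimal sketch of that wiring is below; the `{ id, input: { json } }` constructor shape and the `graph.json` filename are assumptions for illustration, not confirmed ellmers-core signatures.

```ts
// Hypothetical sketch: loading the JSON graph shown above and handing it to JsonTask.
// The { id, input: { json } } constructor shape is an assumption; check the JsonTask
// source in ellmers-core for the real signature.
import { JsonTask, TaskGraph } from "ellmers-core";
import { readFile } from "node:fs/promises";

const json = await readFile("./graph.json", "utf8"); // the JSON array shown above, saved to disk
const graph = new TaskGraph();
graph.addTask(new JsonTask({ id: "json", input: { json } })); // assumed input shape
// From here the graph can be executed the same way as in examples/cli,
// e.g. by passing it to a runner such as runTask(graph).
```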
2 changes: 1 addition & 1 deletion docs/developers/02_architecture.md
@@ -123,7 +123,7 @@ classDiagram
static TaskOutputDefinition[] outputs$
static readonly sideeffects = false$
run() TaskOutput
runSyncOnly() TaskOutput
runReactive() TaskOutput
}
<<abstract>> TaskBase
style TaskBase type:abstract,stroke-dasharray: 5 5
2 changes: 1 addition & 1 deletion docs/developers/03_extending.md
@@ -147,4 +147,4 @@ Compound Tasks are not cached (though any or all of their children may be).

## Reactive Task UIs

Tasks can be reactive at a certain level. This means that they can be triggered by changes in the data they depend on, without "running" the expensive job based task runs. This is useful for a UI node editor. For example, you change a color in one task and it is propagated downstream without incurring costs for re-running the entire graph. It is like a spreadsheet where changing a cell can trigger a recalculation of other cells. This is implemented via a `runSyncOnly()` method that is called when the data changes. Typically, the `run()` will call `runSyncOnly()` on itself at the end of the method.
Tasks can be reactive at a certain level. This means that they can be triggered by changes in the data they depend on, without "running" the expensive job based task runs. This is useful for a UI node editor. For example, you change a color in one task and it is propagated downstream without incurring costs for re-running the entire graph. It is like a spreadsheet where changing a cell can trigger a recalculation of other cells. This is implemented via a `runReactive()` method that is called when the data changes. Typically, the `run()` will call `runReactive()` on itself at the end of the method.
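The `run()`/`runReactive()` split described above can be sketched as follows. The `ColorTask` class and its input/output shapes are illustrative assumptions; only the method names come from the docs.

```ts
// Minimal sketch of the reactive pattern: runReactive() is cheap and synchronous,
// run() does the expensive work and ends by delegating to runReactive().
type TaskOutput = Record<string, unknown>;

class ColorTask {
  constructor(public input: { color: string }) {}

  // Cheap recomputation, safe to call on every UI edit without queuing a job.
  runReactive(): TaskOutput {
    return { css: `background-color: ${this.input.color}` };
  }

  // The expensive path (model downloads, inference, etc.) would live here.
  async run(): Promise<TaskOutput> {
    // ...expensive work...
    return this.runReactive();
  }
}

// A node editor can call runReactive() on each change and reserve run() for real jobs.
console.log(new ColorTask({ color: "rebeccapurple" }).runReactive());
```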
42 changes: 17 additions & 25 deletions examples/cli/src/TaskCLI.ts
@@ -9,38 +9,23 @@ import { Command } from "commander";
import { runTask } from "./TaskStreamToListr2";
import "@huggingface/transformers";
import { TaskGraph, JsonTask, TaskGraphBuilder, JsonTaskItem } from "ellmers-core";

import {
DownloadModelTask,
DownloadModelCompoundTask,
findAllModels,
findModelByName,
findModelByUseCase,
ModelUseCaseEnum,
} from "ellmers-ai";
import { DownloadModelTask, getGlobalModelRepository } from "ellmers-ai";
import "ellmers-task";

export function AddBaseCommands(program: Command) {
program
.command("download")
.description("download models")
.option("--model <name>", "model to download")
.requiredOption("--model <name>", "model to download")
.action(async (options) => {
const models = findAllModels();
const graph = new TaskGraph();
if (options.model) {
const model = findModelByName(options.model);
const model = await getGlobalModelRepository().findByName(options.model);
if (model) {
graph.addTask(new DownloadModelTask({ input: { model: model.name } }));
} else {
program.error(`Unknown model ${options.model}`);
}
} else {
graph.addTask(
new DownloadModelCompoundTask({
input: { model: models.map((m) => m.name) },
})
);
}
await runTask(graph);
});
@@ -52,8 +37,11 @@ export function AddBaseCommands(program: Command) {
.option("--model <name>", "model to use")
.action(async (text: string, options) => {
const model = options.model
? findModelByName(options.model)?.name
: findModelByUseCase(ModelUseCaseEnum.TEXT_EMBEDDING).map((m) => m.name);
? (await getGlobalModelRepository().findByName(options.model))?.name
: (await getGlobalModelRepository().findModelsByTask("TextEmbeddingTask"))?.map(
(m) => m.name
);

if (!model) {
program.error(`Unknown model ${options.model}`);
} else {
@@ -70,8 +58,10 @@ export function AddBaseCommands(program: Command) {
.option("--model <name>", "model to use")
.action(async (text, options) => {
const model = options.model
? findModelByName(options.model)?.name
: findModelByUseCase(ModelUseCaseEnum.TEXT_SUMMARIZATION).map((m) => m.name);
? (await getGlobalModelRepository().findByName(options.model))?.name
: (await getGlobalModelRepository().findModelsByTask("TextSummaryTask"))?.map(
(m) => m.name
);
if (!model) {
program.error(`Unknown model ${options.model}`);
} else {
@@ -89,8 +79,10 @@ export function AddBaseCommands(program: Command) {
.option("--model <name>", "model to use")
.action(async (text, options) => {
const model = options.model
? findModelByName(options.model)?.name
: findModelByUseCase(ModelUseCaseEnum.TEXT_REWRITING).map((m) => m.name);
? (await getGlobalModelRepository().findByName(options.model))?.name
: (await getGlobalModelRepository().findModelsByTask("TextRewriterTask"))?.map(
(m) => m.name
);
if (!model) {
program.error(`Unknown model ${options.model}`);
} else {
@@ -111,7 +103,7 @@
id: "1",
type: "DownloadModelTask",
input: {
model: "Xenova/LaMini-Flan-T5-783M",
model: "ONNX Xenova/LaMini-Flan-T5-783M q8",
},
},
{
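The changes above replace the synchronous `findModelByName`/`findModelByUseCase` helpers with the async model repository. Below is a condensed sketch of the new lookup pattern, using only the calls shown in this diff (`findByName`, `findModelsByTask`); the `resolveModels` wrapper itself is illustrative.

```ts
// Sketch of the model-lookup pattern the CLI now uses.
import { getGlobalModelRepository } from "ellmers-ai";

async function resolveModels(name?: string): Promise<string[]> {
  if (name) {
    // Explicit --model flag: look the model up by name.
    const model = await getGlobalModelRepository().findByName(name);
    if (!model) throw new Error(`Unknown model ${name}`);
    return [model.name];
  }
  // No flag: fall back to every model registered for the task type
  // ("TextEmbeddingTask" here, as in the embedding command above).
  const models = await getGlobalModelRepository().findModelsByTask("TextEmbeddingTask");
  return (models ?? []).map((m) => m.name);
}
```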
18 changes: 13 additions & 5 deletions examples/cli/src/TaskHelper.ts
@@ -8,6 +8,18 @@
import chalk from "chalk";
import { ListrTaskWrapper } from "listr2";

/**
* TaskHelper provides CLI progress visualization utilities.
*
* Features:
* - Unicode-based progress bar generation
* - Customizable bar length and progress indication
* - Color-coded output using chalk
*
* Used to create visual feedback for long-running tasks in the CLI interface,
* with smooth progress transitions and clear visual indicators.
*/

export function createBar(progress: number, length: number): string {
let distance = progress * length;
let bar = "";
@@ -47,11 +59,7 @@ export function createBar(progress: number, length: number): string {
// Extend empty bar
bar += "\u258F".repeat(length > bar.length ? length - bar.length : 0);

return chalk.rgb(
70,
70,
240
)("\u2595" + chalk.bgRgb(20, 20, 70)(bar) + "\u258F");
return chalk.rgb(70, 70, 240)("\u2595" + chalk.bgRgb(20, 20, 70)(bar) + "\u258F");
}

export class TaskHelper<T = any> {
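The new doc comment describes `createBar` as a unicode progress bar helper; a small usage sketch is below. The 30-column width and the sample progress values are arbitrary choices for illustration.

```ts
// Usage sketch for createBar(progress, length) from TaskHelper.ts above.
import { createBar } from "./TaskHelper";

for (const progress of [0, 0.25, 0.5, 0.75, 1]) {
  // createBar returns an ANSI-colored unicode bar string sized to `length` columns.
  console.log(`${(progress * 100).toFixed(0).padStart(3)}%`, createBar(progress, 30));
}
```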
12 changes: 10 additions & 2 deletions examples/cli/src/ellmers.ts
@@ -4,13 +4,21 @@ import { program } from "commander";
import { argv } from "process";
import { AddBaseCommands } from "./TaskCLI";
import { getProviderRegistry } from "ellmers-ai";
import { registerHuggingfaceLocalTasksInMemory } from "ellmers-ai-provider/hf-transformers/server";
import { registerMediaPipeTfJsLocalInMemory } from "ellmers-ai-provider/tf-mediapipe/server";
import {
registerHuggingfaceLocalModels,
registerHuggingfaceLocalTasksInMemory,
registerMediaPipeTfJsLocalInMemory,
registerMediaPipeTfJsLocalModels,
} from "ellmers-test";
import "ellmers-test";

program.version("1.0.0").description("A CLI to run Ellmers.");

AddBaseCommands(program);

await registerHuggingfaceLocalModels();
await registerMediaPipeTfJsLocalModels();

registerHuggingfaceLocalTasksInMemory();
registerMediaPipeTfJsLocalInMemory();

21 changes: 11 additions & 10 deletions examples/web/package.json
@@ -3,13 +3,13 @@
"version": "0.0.0",
"type": "module",
"scripts": {
"dev": "concurrently --kill-others -c 'auto' -n app,types 'bunx --bun vite' 'tsc -w --noEmit'",
"dev": "concurrently --kill-others -c 'auto' -n app,types 'bunx --bun vite' 'tsc -w --noEmit --preserveWatchOutput'",
"build": "vite build && tsc --noEmit",
"lint": "eslint . --ext ts,tsx --report-unused-disable-directives --max-warnings 0",
"preview": "vite preview"
},
"dependencies": {
"@xyflow/react": "^12.3.6",
"@xyflow/react": "^12.4.1",
"react": "^19.0.0",
"react-dom": "^19.0.0",
"@uiw/react-codemirror": "^4.23.7",
@@ -24,20 +24,21 @@
"ellmers-core": "workspace:packages/core",
"ellmers-storage": "workspace:packages/storage",
"ellmers-ai-provider": "workspace:packages/ai-provider",
"ellmers-ai": "workspace:packages/ai"
"ellmers-ai": "workspace:packages/ai",
"ellmers-test": "workspace:packages/test"
},
"devDependencies": {
"@types/react": "^19.0.4",
"@types/react-dom": "^19.0.2",
"@typescript-eslint/eslint-plugin": "^8.19.1",
"@typescript-eslint/parser": "^8.19.1",
"@types/react": "^19.0.7",
"@types/react-dom": "^19.0.3",
"@typescript-eslint/eslint-plugin": "^8.20.0",
"@typescript-eslint/parser": "^8.20.0",
"@vitejs/plugin-react": "^4.3.4",
"eslint": "^9.17.0",
"eslint": "^9.18.0",
"eslint-plugin-react-hooks": "^5.1.0",
"eslint-plugin-react-refresh": "^0.4.16",
"eslint-plugin-react-refresh": "^0.4.18",
"vite": "^6.0.7",
"tailwindcss": "3.4.17",
"postcss": "8.4.49",
"postcss": "8.5.1",
"autoprefixer": "10.4.20"
},
"engines": {
51 changes: 46 additions & 5 deletions examples/web/src/App.tsx
@@ -2,7 +2,15 @@ import React, { useCallback, useEffect, useState } from "react";
import { ReactFlowProvider } from "@xyflow/react";
import { RunGraphFlow } from "./RunGraphFlow";
import { JsonEditor } from "./JsonEditor";
import { JsonTask, JsonTaskItem, TaskGraph, TaskGraphBuilder } from "ellmers-core";
import {
ConcurrencyLimiter,
JsonTask,
JsonTaskItem,
TaskGraph,
TaskGraphBuilder,
TaskInput,
TaskOutput,
} from "ellmers-core";
import {
IndexedDbTaskGraphRepository,
IndexedDbTaskOutputRepository,
@@ -11,10 +19,37 @@ import { ResizableHandle, ResizablePanel, ResizablePanelGroup } from "./Resize";
import { QueuesStatus } from "./QueueSatus";
import { OutputRepositoryStatus } from "./OutputRepositoryStatus";
import { GraphStoreStatus } from "./GraphStoreStatus";
import { registerHuggingfaceLocalTasksInMemory } from "ellmers-ai-provider/hf-transformers/browser";
import { InMemoryJobQueue } from "ellmers-storage/inmemory";
import { getProviderRegistry } from "ellmers-ai";
import {
LOCAL_ONNX_TRANSFORMERJS,
registerHuggingfaceLocalTasks,
} from "ellmers-ai-provider/hf-transformers/browser";
import {
MEDIA_PIPE_TFJS_MODEL,
registerMediaPipeTfJsLocalTasks,
} from "ellmers-ai-provider/tf-mediapipe/browser";
import "ellmers-task";
import "ellmers-test";
import { registerMediaPipeTfJsLocalModels } from "ellmers-test";
import { registerHuggingfaceLocalModels } from "ellmers-test";

const ProviderRegistry = getProviderRegistry();

registerHuggingfaceLocalTasksInMemory();
registerHuggingfaceLocalTasks();
ProviderRegistry.registerQueue(
LOCAL_ONNX_TRANSFORMERJS,
new InMemoryJobQueue<TaskInput, TaskOutput>("local_hft", new ConcurrencyLimiter(1, 10), 10)
);

registerMediaPipeTfJsLocalTasks();
ProviderRegistry.registerQueue(
MEDIA_PIPE_TFJS_MODEL,
new InMemoryJobQueue<TaskInput, TaskOutput>("local_mp", new ConcurrencyLimiter(1, 10), 10)
);

ProviderRegistry.clearQueues();
ProviderRegistry.startQueues();

const taskOutputCache = new IndexedDbTaskOutputRepository();
const builder = new TaskGraphBuilder(taskOutputCache);
@@ -31,13 +66,13 @@ const graph = await taskGraphRepo.getTaskGraph("default");
const resetGraph = () => {
builder
.reset()
.DownloadModel({ model: ["Xenova/LaMini-Flan-T5-783M", "Xenova/m2m100_418M"] })
.DownloadModel({ model: ["ONNX Xenova/LaMini-Flan-T5-783M q8", "ONNX Xenova/m2m100_418M q8"] })
.TextRewriter({
text: "The quick brown fox jumps over the lazy dog.",
prompt: ["Rewrite the following text in reverse:", "Rewrite this to sound like a pirate:"],
})
.TextTranslation({
model: "Xenova/m2m100_418M",
model: "ONNX Xenova/m2m100_418M q8",
source: "en",
target: "es",
})
@@ -76,6 +111,12 @@ export const App = () => {

// changes coming from builder in console
useEffect(() => {
async function init() {
await registerHuggingfaceLocalModels();
await registerMediaPipeTfJsLocalModels();
}
init();

function listen() {
setJsonData(JSON.stringify(builder.toDependencyJSON(), null, 2));
setGraph(builder.graph);
4 changes: 2 additions & 2 deletions examples/web/src/QueueSatus.tsx
@@ -1,8 +1,8 @@
import { JobStatus } from "ellmers-core";
import { ModelProcessorEnum, getProviderRegistry } from "ellmers-ai";
import { getProviderRegistry } from "ellmers-ai";
import { useCallback, useEffect, useState } from "react";

export function QueueStatus({ queueType }: { queueType: ModelProcessorEnum }) {
export function QueueStatus({ queueType }: { queueType: string }) {
const queue = getProviderRegistry().getQueue(queueType);
const [pending, setPending] = useState<number>(0);
const [processing, setProcessing] = useState<number>(0);
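With `queueType` now a plain string, `QueueStatus` can be keyed by the same constants used when registering queues in App.tsx above. A hedged TSX sketch follows; the `QueuesPanel` wrapper is illustrative, not part of the example app.

```tsx
// Sketch: rendering one status widget per registered provider queue.
import { QueueStatus } from "./QueueSatus";
import { LOCAL_ONNX_TRANSFORMERJS } from "ellmers-ai-provider/hf-transformers/browser";
import { MEDIA_PIPE_TFJS_MODEL } from "ellmers-ai-provider/tf-mediapipe/browser";

export function QueuesPanel() {
  return (
    <>
      <QueueStatus queueType={LOCAL_ONNX_TRANSFORMERJS} />
      <QueueStatus queueType={MEDIA_PIPE_TFJS_MODEL} />
    </>
  );
}
```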
5 changes: 0 additions & 5 deletions examples/web/src/RunGraphFlow.tsx
@@ -14,16 +14,11 @@ import { TurboNodeData, SingleNode, CompoundNode } from "./TurboNode";
import TurboEdge from "./TurboEdge";
import { FiFileText, FiClipboard, FiDownload, FiUpload } from "react-icons/fi";
import { Task, TaskGraph } from "ellmers-core";
import { registerHuggingfaceLocalTasksInMemory } from "ellmers-ai-provider/hf-transformers/browser";
import { registerMediaPipeTfJsLocalInMemory } from "ellmers-ai-provider/tf-mediapipe/browser";
import { GraphPipelineCenteredLayout, GraphPipelineLayout, computeLayout } from "./layout";

import "@xyflow/react/dist/base.css";
import "./RunGraphFlow.css";

registerHuggingfaceLocalTasksInMemory();
registerMediaPipeTfJsLocalInMemory();

const categoryIcons = {
"Text Model": <FiFileText />,
Input: <FiUpload />,