20 changes: 20 additions & 0 deletions README.md
@@ -56,6 +56,7 @@ AgentScope Studio is a powerful **local visualization toolkit** designed for dev
- **Tracing**: OpenTelemetry-based trace visualization for LLM calls, token usage, and agent invocations
- **Agent Evaluation**: Evaluation-oriented analysis from a statistical perspective
- **Built-in Copilot (Friday)**: A development assistant, playground for rapid secondary development, and integration hub for advanced features
- **Runtime Chat (Services)**: Integrate with [AgentScope Runtime](https://github.com/agentscope-ai/agentscope-runtime) to chat with deployed agents directly in Studio

<p align="center">
<img
@@ -147,6 +148,24 @@ agentscope.init(
# ...
```

## 💬 Runtime Chat

AgentScope Studio includes a built-in **Runtime Chat** service that connects to [AgentScope Runtime](https://github.com/agentscope-ai/agentscope-runtime) backends. To use it:

1. **Start an AgentScope Runtime backend** (see [AgentScope Runtime Quick Start](https://github.com/agentscope-ai/agentscope-runtime#-quick-start)):

```bash
pip install agentscope-runtime
# Create and run your agent app (see docs for full example)
python app.py # Starts on http://localhost:8090
```

2. **Connect from AgentScope Studio**: Navigate to **Services > Runtime Chat**, click the ⚙️ Settings icon, and set `baseURL` to your runtime endpoint (e.g., `http://localhost:8090/process`)

3. **Start chatting** with your deployed agent!

> For detailed configuration and full usage examples, see the [Runtime Chat documentation](./docs/tutorial/en/services/runtime-chat.md).

## 📚 Documentation

For more details, please refer to our documentation:
@@ -156,6 +175,7 @@ For more details, please refer to our documentation:
- [Project Management](./docs/tutorial/en/develop/project.md) - Managing projects and runs
- [Tracing](./docs/tutorial/en/develop/tracing.md) - OpenTelemetry integration and semantic conventions
- [Friday](./docs/tutorial/en/agent/friday.md) - Built-in Copilot guide
- [Runtime Chat](./docs/tutorial/en/services/runtime-chat.md) - Integrated chat service guide
- [Contributing](./docs/tutorial/en/tutorial/contributing.md) - How to contribute

## ⚖️ License
18 changes: 18 additions & 0 deletions docs/.vitepress/config.mts
@@ -65,6 +65,15 @@ export default defineConfig({
text: 'Agent',
items: [{ text: 'Friday', link: '/agent/friday' }],
},
{
text: 'Services',
items: [
{
text: 'Runtime Chat',
link: '/services/runtime-chat',
},
],
},
],
},
},
@@ -119,6 +128,15 @@ export default defineConfig({
{ text: 'Friday', link: '/zh_CN/agent/friday' },
],
},
{
text: '服务',
items: [
{
text: 'Runtime Chat',
link: '/zh_CN/services/runtime-chat',
},
],
},
],
},
},
4 changes: 4 additions & 0 deletions docs/tutorial/en/index.md
@@ -24,4 +24,8 @@ features:
details: Evaluate your agent system from a new statistical perspective
link: /develop/evaluation
linkText: Learn more
- title: Runtime Chat
details: Integrate with AgentScope Runtime to chat with deployed agents directly in Studio
link: /services/runtime-chat
linkText: Learn more
---
148 changes: 148 additions & 0 deletions docs/tutorial/en/services/runtime-chat.md
@@ -0,0 +1,148 @@
# Runtime Chat

## Overview

Runtime Chat is an integrated chat service within AgentScope Studio, powered by the [`@agentscope-ai/chat`](https://www.npmjs.com/package/@agentscope-ai/chat) component from [AgentScope Runtime](https://github.com/agentscope-ai/agentscope-runtime). It connects to an AgentScope Runtime backend via a streaming API, allowing you to interact with deployed agents directly from Studio.

## Features

- **Multi-Session Support**: Create, switch, and manage multiple chat sessions, persisted in browser localStorage
- **Configurable Theme**: Customize colors, dark mode, and header appearance
- **Welcome Screen**: Set greeting, description, avatar, and prompt suggestions
- **API Configuration**: Connect to any AgentScope Runtime endpoint with configurable base URL and token
- **Settings Panel**: Real-time configuration via the ⚙️ icon in the header
- **SSE Streaming**: Real-time streaming responses

## Usage Example

### Step 1: Start an AgentScope Runtime Backend

```bash
pip install agentscope-runtime
```

Create `app.py` (see [AgentScope Runtime Quick Start](https://github.com/agentscope-ai/agentscope-runtime#-quick-start) for details):

```python
import os

from agentscope.agent import ReActAgent
from agentscope.model import DashScopeChatModel
from agentscope.formatter import DashScopeChatFormatter
from agentscope.tool import Toolkit, execute_python_code
from agentscope.pipeline import stream_printing_messages
from agentscope.memory import InMemoryMemory

from agentscope_runtime.engine import AgentApp
from agentscope_runtime.engine.schemas.agent_schemas import AgentRequest
from agentscope_runtime.engine.services.agent_state import InMemoryStateService

agent_app = AgentApp(
app_name="MyAssistant",
app_description="A helpful assistant",
)


@agent_app.init
async def init_func(self):
self.state_service = InMemoryStateService()
await self.state_service.start()


@agent_app.shutdown
async def shutdown_func(self):
await self.state_service.stop()


@agent_app.query(framework="agentscope")
async def query_func(self, msgs, request: AgentRequest = None, **kwargs):
session_id = request.session_id
user_id = request.user_id

state = await self.state_service.export_state(
session_id=session_id, user_id=user_id,
)

toolkit = Toolkit()
toolkit.register_tool_function(execute_python_code)

agent = ReActAgent(
name="MyAssistant",
model=DashScopeChatModel(
"qwen-turbo",
api_key=os.getenv("DASHSCOPE_API_KEY"),
stream=True,
),
sys_prompt="You're a helpful assistant.",
toolkit=toolkit,
memory=InMemoryMemory(),
formatter=DashScopeChatFormatter(),
)
agent.set_console_output_enabled(enabled=False)

if state:
agent.load_state_dict(state)

async for msg, last in stream_printing_messages(
agents=[agent], coroutine_task=agent(msgs),
):
yield msg, last

state = agent.state_dict()
await self.state_service.save_state(
user_id=user_id, session_id=session_id, state=state,
)


agent_app.run(host="127.0.0.1", port=8090)
```

```bash
python app.py
# Server listens on http://localhost:8090/process
```

### Step 2: Connect from AgentScope Studio

1. Navigate to **Services > Runtime Chat** in the sidebar
2. Click the **⚙️ Settings** icon in the top-right corner
3. Set **baseURL** to `http://localhost:8090/process` and **token** if needed
4. Click **Save**, then create a new session and start chatting!

## Configuration

Click the **⚙️ Settings** icon to open the configuration panel.

### Theme

| Option | Description | Default |
| ------------------ | ------------------- | --------------- |
| `colorPrimary` | Primary theme color | `#615CED` |
| `colorBgBase` | Background color | — |
| `colorTextBase` | Text color | — |
| `darkMode` | Enable dark mode | `false` |
| `leftHeader.logo` | Logo image URL | AgentScope logo |
| `leftHeader.title` | Header title text | `Runtime Chat` |

### Sender

| Option | Description | Default |
| ------------ | ------------------------------ | ------------------ |
| `disclaimer` | Disclaimer text below input | AI disclaimer text |
| `maxLength` | Maximum input character length | `10000` |

### Welcome

| Option | Description | Default |
| ------------- | ------------------------------ | ------------------------------------------- |
| `greeting` | Welcome greeting text | `Hello, how can I help you today?` |
| `description` | Welcome description | `I am a helpful assistant...` |
| `avatar` | Welcome avatar URL | AgentScope logo |
| `prompts` | Quick-start prompt suggestions | `Hello`, `How are you?`, `What can you do?` |

### API

| Option | Description | Default |
| --------- | ------------------------------- | ------- |
| `baseURL` | AgentScope Runtime API base URL | — |
| `token` | API authentication token | — |
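Putting the tables above together, the settings can be thought of as one nested object. The sketch below assembles the documented defaults into a plain Python dict for illustration; the top-level grouping (`theme`/`sender`/`welcome`/`api`) mirrors the section names here and is an assumption, not the panel's exact persisted schema.

```python
# Illustrative settings object assembled from the tables above.
# ASSUMPTION: the nesting mirrors the doc sections; the exact schema
# persisted by the Settings panel may differ.
default_settings = {
    "theme": {
        "colorPrimary": "#615CED",
        "darkMode": False,
        # leftHeader.logo defaults to the AgentScope logo when unset
        "leftHeader": {"title": "Runtime Chat"},
    },
    "sender": {
        "maxLength": 10000,  # maximum input character length
    },
    "welcome": {
        "greeting": "Hello, how can I help you today?",
        "prompts": ["Hello", "How are you?", "What can you do?"],
    },
    "api": {
        "baseURL": "http://localhost:8090/process",  # from the usage example above
        "token": "",  # set only if your backend requires authentication
    },
}
```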
1 change: 1 addition & 0 deletions docs/tutorial/en/tutorial/overview.md
@@ -15,6 +15,7 @@ AgentScope-Studio provides:
- OpenTelemetry-based tracing visualization
- Evaluation-oriented analysis and visualization
- A built-in agent (AgentScope-Friday) for quick secondary development
- Integration with [AgentScope Runtime](https://github.com/agentscope-ai/agentscope-runtime) for chatting with deployed agents directly in Studio ([Runtime Chat](../services/runtime-chat.md))

## _How Does AgentScope-Studio Work?_

4 changes: 4 additions & 0 deletions docs/tutorial/zh_CN/index.md
@@ -24,4 +24,8 @@ features:
details: Evaluation, analysis, and attribution for agent applications, assessing agent performance from a statistical perspective
link: /zh_CN/develop/evaluation
linkText: Learn more
- title: Runtime Chat
details: Integrate with AgentScope Runtime to chat with deployed agents directly in Studio
link: /zh_CN/services/runtime-chat
linkText: Learn more
---
148 changes: 148 additions & 0 deletions docs/tutorial/zh_CN/services/runtime-chat.md
@@ -0,0 +1,148 @@
# Runtime Chat

## Overview

Runtime Chat is a built-in chat service in AgentScope Studio, built on the [`@agentscope-ai/chat`](https://www.npmjs.com/package/@agentscope-ai/chat) component from [AgentScope Runtime](https://github.com/agentscope-ai/agentscope-runtime). It connects to an AgentScope Runtime backend via a streaming API, letting you interact with deployed agents directly in Studio.

## Features

- **Multi-Session Support**: Create, switch, and manage multiple chat sessions, persisted in browser localStorage
- **Configurable Theme**: Customize colors, dark mode, and header appearance
- **Welcome Screen**: Set the greeting, description, avatar, and prompt suggestions
- **API Configuration**: Connect to any AgentScope Runtime endpoint with a configurable base URL and token
- **Settings Panel**: Adjust configuration in real time via the ⚙️ icon in the header
- **SSE Streaming**: Real-time streaming responses

## Usage Example

### Step 1: Start an AgentScope Runtime Backend

```bash
pip install agentscope-runtime
```

Create `app.py` (see the [AgentScope Runtime Quick Start](https://github.com/agentscope-ai/agentscope-runtime#-quick-start) for details):

```python
import os

from agentscope.agent import ReActAgent
from agentscope.model import DashScopeChatModel
from agentscope.formatter import DashScopeChatFormatter
from agentscope.tool import Toolkit, execute_python_code
from agentscope.pipeline import stream_printing_messages
from agentscope.memory import InMemoryMemory

from agentscope_runtime.engine import AgentApp
from agentscope_runtime.engine.schemas.agent_schemas import AgentRequest
from agentscope_runtime.engine.services.agent_state import InMemoryStateService

agent_app = AgentApp(
app_name="MyAssistant",
app_description="A helpful assistant",
)


@agent_app.init
async def init_func(self):
self.state_service = InMemoryStateService()
await self.state_service.start()


@agent_app.shutdown
async def shutdown_func(self):
await self.state_service.stop()


@agent_app.query(framework="agentscope")
async def query_func(self, msgs, request: AgentRequest = None, **kwargs):
session_id = request.session_id
user_id = request.user_id

state = await self.state_service.export_state(
session_id=session_id, user_id=user_id,
)

toolkit = Toolkit()
toolkit.register_tool_function(execute_python_code)

agent = ReActAgent(
name="MyAssistant",
model=DashScopeChatModel(
"qwen-turbo",
api_key=os.getenv("DASHSCOPE_API_KEY"),
stream=True,
),
sys_prompt="You're a helpful assistant.",
toolkit=toolkit,
memory=InMemoryMemory(),
formatter=DashScopeChatFormatter(),
)
agent.set_console_output_enabled(enabled=False)

if state:
agent.load_state_dict(state)

async for msg, last in stream_printing_messages(
agents=[agent], coroutine_task=agent(msgs),
):
yield msg, last

state = agent.state_dict()
await self.state_service.save_state(
user_id=user_id, session_id=session_id, state=state,
)


agent_app.run(host="127.0.0.1", port=8090)
```

```bash
python app.py
# Server listens on http://localhost:8090/process
```

### Step 2: Connect from AgentScope Studio

1. Navigate to **Services > Runtime Chat** in the sidebar
2. Click the **⚙️ Settings** icon in the top-right corner
3. Set **baseURL** to `http://localhost:8090/process` and fill in **token** if needed
4. Click **Save**, then create a new session and start chatting!

## Configuration

Click the **⚙️ Settings** icon to open the configuration panel.

### Theme

| Option             | Description         | Default         |
| ------------------ | ------------------- | --------------- |
| `colorPrimary`     | Primary theme color | `#615CED`       |
| `colorBgBase`      | Background color    | —               |
| `colorTextBase`    | Text color          | —               |
| `darkMode`         | Enable dark mode    | `false`         |
| `leftHeader.logo`  | Logo image URL      | AgentScope logo |
| `leftHeader.title` | Header title text   | `Runtime Chat`  |

### Sender

| Option       | Description                    | Default            |
| ------------ | ------------------------------ | ------------------ |
| `disclaimer` | Disclaimer text below input    | AI disclaimer text |
| `maxLength`  | Maximum input character length | `10000`            |

### Welcome

| Option        | Description                    | Default                                     |
| ------------- | ------------------------------ | ------------------------------------------- |
| `greeting`    | Welcome greeting text          | `Hello, how can I help you today?`          |
| `description` | Welcome description            | `I am a helpful assistant...`               |
| `avatar`      | Welcome avatar URL             | AgentScope logo                             |
| `prompts`     | Quick-start prompt suggestions | `Hello`, `How are you?`, `What can you do?` |

### API

| Option    | Description                     | Default |
| --------- | ------------------------------- | ------- |
| `baseURL` | AgentScope Runtime API base URL | —       |
| `token`   | API authentication token        | —       |