This repository has been archived by the owner on Jan 10, 2025. It is now read-only.

Ignore #276

Closed
4 tasks
l137295 opened this issue Dec 3, 2024 · 0 comments

Comments

@l137295

l137295 commented Dec 3, 2024

⚠️ Issues that are not created in the following format will not receive a reply ⚠️

Before filing a bug, check whether you have already done the following:

  • I have updated to the latest version of the code
  • I have carefully read the README
  • I have searched the FAQ for a possible answer
  • I have searched previously closed issues for a possible answer

Bug description

Provide useful information

  • Version currently in use: (the latest version has very likely already fixed it)
    from fastapi import FastAPI, Form
    from fastapi.responses import JSONResponse
    from loguru import logger  # assuming loguru, which matches the `{}` formatting used below
    import uvicorn

    app = FastAPI()

    @app.post("/receive")
    async def receive_message(
        type: str = Form(...),
        content: str = Form(...),
        source: str = Form(...),
        isMentioned: str = Form(...),
        isMsgFromSelf: str = Form(...),
    ):
        # Process the request data
        response_data = {
            "type": type,
            "content": content,
            "source": source,
            "isMentioned": isMentioned,
            "isMsgFromSelf": isMsgFromSelf,
        }
        try:
            # Fill in the processing logic - start
            logger.info("Received data: {}", response_data)
            # Fill in the processing logic - end
            return JSONResponse(content={"status": "success", "data": response_data})
        except Exception as e:
            logger.error("Error processing request: {}", e)
            return JSONResponse(content={"status": "error", "data": "processing failed"})

    if __name__ == "__main__":
        uvicorn.run(app, host="0.0.0.0", port=3003)
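
The retitled question asks how to have Ollama's qwen model generate the reply. As a starting point, here is a minimal sketch (not part of the original issue) of a helper that could be called between the "processing logic" markers above. It assumes Ollama is running locally on its default port 11434, that a model tagged qwen has been pulled, and that the httpx package is installed; the helper name ask_ollama is made up for illustration.

    import httpx

    OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint (assumed setup)

    async def ask_ollama(prompt: str, model: str = "qwen") -> str:
        """Send the incoming message text to Ollama and return the generated reply."""
        payload = {"model": model, "prompt": prompt, "stream": False}  # stream=False returns a single JSON object
        async with httpx.AsyncClient(timeout=60) as client:
            resp = await client.post(OLLAMA_URL, json=payload)
            resp.raise_for_status()
            # With stream=False, /api/generate carries the full generated text in the "response" field
            return resp.json()["response"]

Inside receive_message one could then call reply = await ask_ollama(content) and push reply back out through the bot's send-message API (the part the reporter says already works); that send call is not shown in the issue, so it is not reproduced here.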

@l137295 l137295 changed the title from "A newbie's question, asking the experts for help: sending messages works, but receiving messages and having ollama reply automatically is not implemented yet··· how do I get ollama's qwen model to receive messages and reply?" to "Ignore" on Jan 7, 2025