[Bug]: AttributeError: 'NoneType' object has no attribute 'steps' #996

Open
niklasfink opened this issue Jun 4, 2024 · 4 comments
Labels
bug Something isn't working

@niklasfink

Version

Visual Studio Code extension

Operating System

macOS

What happened?

Pythagora abruptly stopped with the following error:

Stopping Pythagora due to error:

File `core/cli/main.py`, line 38, in run_project
    success = await orca.run()
File `core/agents/orchestrator.py`, line 64, in run
    response = await agent.run()
File `core/agents/developer.py`, line 87, in run
    return await self.breakdown_current_iteration()
File `core/agents/developer.py`, line 153, in breakdown_current_iteration
    self.set_next_steps(response, source)
File `core/agents/developer.py`, line 241, in set_next_steps
    for step in response.steps
AttributeError: 'NoneType' object has no attribute 'steps'

Using Pythagora v0.2.0 / GPT Pilot v0.2.1. There was no error from the LLM API.

Just before that, the Developer Agent was returning the following output (I escaped the code fences with \ ):

Breaking down the current task iteration ...

Figuring out which project files are relevant for the next task ...

\```json
{
  "relevant_files": [
    "views/customers.ejs",
    "public/js/customers.js",
    "models/Customer.js",
    "routes/customerRoutes.js",
    "views/partials/_header.ejs",
    "views/partials/_footer.ejs",
    "views/partials/_head.ejs"
  ]
}
\```

\```json
{
  "steps": [
    {
      "type": "command",
      "command": {
        "command": "npm install dotenv",
        "timeout": 60,
        "success_message": "dotenv package installed successfully."
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "server.js",
        "content": "require('dotenv').config();\n" + require('fs').readFileSync('server.js', 'utf8')
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "package.json",
        "content": JSON.stringify({
          ...require('./package.json'),
          scripts: {
            ...require('./package.json').scripts,
            start: "nodemon server.js"
          }
        }, null, 2)
      }
    },
    {
      "type": "command",
      "command": {
        "command": "npm start",
        "timeout": 60,
        "success_message": "Application started successfully using npm start."
      }
    }
  ]
}
\```

\```json
{
  "steps": [
    {
      "type": "command",
      "command": {
        "command": "npm install dotenv",
        "timeout": 60,
        "success_message": "dotenv package installed successfully."
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "server.js",
        "content": "require('dotenv').config();\n" + require('fs').readFileSync('server.js', 'utf8')
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "package.json",
        "content": JSON.stringify({
          ...require('./package.json'),
          scripts: {
            ...require('./package.json').scripts,
            start: "nodemon server.js"
          }
        }, null, 2)
      }
    },
    {
      "type": "command",
      "command": {
        "command": "npm start",
        "timeout": 60,
        "success_message": "Application started successfully using npm start."
      }
    }
  ]
}
\```

\```json
{
  "steps": [
    {
      "type": "command",
      "command": {
        "command": "npm install dotenv",
        "timeout": 60,
        "success_message": "dotenv package installed successfully."
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "server.js",
        "content": "require('dotenv').config();\n" + require('fs').readFileSync('server.js', 'utf8')
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "package.json",
        "content": "{\n  \"name\": \"tt0\",\n  \"version\": \"1.0.0\",\n  \"main\": \"server.js\",\n  \"scripts\": {\n    \"start\": \"nodemon server.js\"\n  },\n  \"dependencies\": {\n    \"express\": \"^4.17.1\",\n    \"mongoose\": \"^5.10.9\",\n    \"bcrypt\": \"^5.0.0\",\n    \"express-session\": \"^1.17.1\",\n    \"ejs\": \"^3.1.5\",\n    \"bootstrap\": \"^5.3.2\",\n    \"nodemailer\": \"^6.4.11\",\n    \"pdfkit\": \"^0.11.0\",\n    \"dotenv\": \"^8.2.0\"\n  }\n}"
      }
    },
    {
      "type": "command",
      "command": {
        "command": "npm start",
        "timeout": 60,
        "success_message": "Application started successfully using npm start."
      }
    }
  ]
}
\```
niklasfink added the bug (Something isn't working) label on Jun 4, 2024
@senko
Contributor

senko commented Jun 4, 2024

Thanks for reporting this @niklasfink.

Which LLM is that?

It looks like the LLM was giving an incorrect (invalid JSON) response. Pythagora retried twice, but failed to parse the JSON all three times (if you look at the output you pasted, none of the three is valid JSON, which leads me to believe it's some local model that has trouble producing correct JSON), and just returned None.
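
For example, the "content" value in the second step of each attempt glues a JavaScript expression onto a string literal, so a strict JSON parser rejects the whole document. A quick way to see this (just an illustration, not the parsing code Pythagora actually uses):

```python
import json

# The "content" value mixes a JSON string literal with a JavaScript
# expression (the `+ require(...)` part), so json.loads cannot parse it.
snippet = r'''{"content": "require('dotenv').config();\n" + require('fs').readFileSync('server.js', 'utf8')}'''

try:
    json.loads(snippet)
except json.JSONDecodeError as exc:
    print("Not valid JSON:", exc)  # e.g. "Expecting ',' delimiter: ..."
```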

So I don't think the cause of the error is in Pythagora, but I'd still like to keep this bug open as we should handle that edge case in a more graceful way.
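
Something along these lines, roughly (hypothetical names, not the actual Pythagora code paths):

```python
# Rough sketch of more graceful handling: treat a None parse result as a
# recoverable condition instead of letting it crash with an AttributeError.

class LLMResponseParseError(Exception):
    """The LLM response could not be parsed into steps after all retries."""

def set_next_steps(self, response, source):
    if response is None or not getattr(response, "steps", None):
        # Surface a clear, recoverable error instead of crashing the whole
        # run deep inside the agent.
        raise LLMResponseParseError(
            "The LLM did not return a valid list of steps; please retry or switch models."
        )
    for step in response.steps:
        ...  # existing step-handling logic
```

The orchestrator could then catch that and either re-prompt the LLM or stop with a readable message instead of a traceback.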

@niklasfink
Author

niklasfink commented Jun 5, 2024

Hi @senko, it’s GPT-4o on Azure OpenAI.
From my perspective it’s important to handle these cases, as otherwise Pythagora needs to be restarted and all progress since the last task is lost.

@nikzart3

nikzart3 commented Jun 5, 2024

> Hi @senko, it’s GPT-4o on Azure OpenAI. From my perspective it’s important to handle these cases, as otherwise Pythagora needs to be restarted and all progress since the last task is lost.

Hey, can you tell me how you got it to work with the Azure OpenAI API?

@maxfinnsjo

[Pythagora] Stopping Pythagora due to error:

File `core/cli/main.py`, line 38, in run_project
    success = await orca.run()
File `core/agents/orchestrator.py`, line 66, in run
    response = await agent.run()
File `core/agents/external_docs.py`, line 48, in run
    selected_docsets = await self._select_docsets(available_docsets)
File `core/agents/external_docs.py`, line 95, in _select_docsets
    return {k: available_docsets[k] for k in llm_response.docsets}
File `core/agents/external_docs.py`, line 95, in <dictcomp>
    return {k: available_docsets[k] for k in llm_response.docsets}
KeyError: 'express'

Using the OpenAI API with gpt-3.5-turbo.
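
Same class of problem: the comprehension at `core/agents/external_docs.py` line 95 trusts whatever docset names the LLM returns, and here 'express' isn't in available_docsets. A guarded version would look roughly like this (a sketch with a simplified signature, not the actual fix):

```python
def _select_docsets(llm_response, available_docsets):
    # Defensive version of the dict comprehension from the traceback: keep
    # only docset names that actually exist, instead of trusting the LLM.
    return {k: available_docsets[k] for k in llm_response.docsets if k in available_docsets}
```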
