diff --git a/README.md b/README.md
index a5be9999..aa08c173 100644
--- a/README.md
+++ b/README.md
@@ -30,13 +30,12 @@ OWASP Noir is an open-source project specializing in identifying attack surfaces
 
 ## Key Features
 
-- Identify API endpoints and parameters from source code.
-- Support various source code languages and frameworks.
-- Provide analysts with technical information and security issues identified during source code analysis.
-- Friendly pipeline & DevOps integration, offering multiple output formats (JSON, YAML, OAS spec) and compatibility with tools like curl and httpie.
-- Friendly Offensive Security Tools integration, allowing usage with tools such as ZAP and Caido, Burpsuite.
-- Identify security issues within the source code through rule-based passive scanning.
-- Generate elegant and clear output results.
+- Extract API endpoints and parameters from source code.
+- Support multiple languages and frameworks.
+- Uncover security issues with detailed analysis and rule-based passive scanning.
+- Integrate seamlessly with DevOps pipelines and tools like curl, ZAP, and Caido.
+- Deliver clear, actionable results in formats like JSON, YAML, and OAS.
+- Enhance endpoint discovery with AI for unfamiliar frameworks and hidden APIs.
 
 ## Usage
diff --git a/docs/_advanced/ai_integration.md b/docs/_advanced/ai_integration.md
new file mode 100644
index 00000000..7966e0cb
--- /dev/null
+++ b/docs/_advanced/ai_integration.md
@@ -0,0 +1,56 @@
+---
+title: AI Integration
+has_children: false
+nav_order: 5
+layout: page
+---
+
+# AI Integration
+{: .d-inline-block }
+
+New (v0.19.0)
+{: .label .label-green }
+
+
+## Overview Flags
+
+* `--ollama http://localhost:11434` Specify the Ollama server URL to connect to.
+* `--ollama-model MODEL` Specify the Ollama model name to be used for analysis.
+
+
+## How to Use AI Integration
+### Step 1: Install and Run Ollama
+
+1. Install Ollama: Follow the instructions on the official Ollama website to install the required software.
+2. Run the Model: Start the Ollama server and ensure the desired model is available. For example:
+
+```bash
+# Download LLM model
+ollama pull llama3
+
+# Run LLM model
+ollama run llama3
+```
+
+### Step 2: Run Noir with AI Analysis
+
+To leverage AI capabilities for additional analysis, use the following command:
+
+```bash
+noir -b . --ollama http://localhost:11434 --ollama-model llama3
+```
+
+This command performs the standard Noir operations while utilizing the specified AI model for enhanced analysis.
+
+![](../../images/advanced/ollama.jpeg)
+
+## Benefits of AI Integration
+
+* Using an LLM allows Noir to handle frameworks or languages that are beyond its original support scope.
+* Additional endpoints that might be missed during a standard Noir scan can be identified.
+* Note that there is a possibility of false positives, and the scanning speed may decrease depending on the number of LLM parameters and the performance of the machine hosting the service.
+
+## Notes
+
+* Ensure that the Ollama server is running and accessible at the specified URL before executing the command.
+* Replace `llama3` with the name of the desired model as required.
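The note above about ensuring the Ollama server is reachable can be automated before invoking Noir. A minimal sketch, assuming the default URL from the docs (`GET /api/tags` is Ollama's standard endpoint for listing locally available models; `OLLAMA_URL` is a name chosen here, not a Noir or Ollama convention):

```shell
#!/bin/sh
# Confirm the Ollama server answers before pointing Noir at it.
# OLLAMA_URL defaults to the address used throughout these docs;
# override it if your server runs elsewhere.
OLLAMA_URL="${OLLAMA_URL:-http://localhost:11434}"

if curl -sf "$OLLAMA_URL/api/tags" > /dev/null; then
    echo "Ollama is reachable at $OLLAMA_URL"
else
    echo "Ollama is NOT reachable at $OLLAMA_URL" >&2
fi
```

If the check fails, start the server (e.g. `ollama serve`) before running `noir` with the `--ollama` flags.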
\ No newline at end of file
diff --git a/docs/_advanced/diff.md b/docs/_advanced/diff.md
index 05ceb95b..5b311a71 100644
--- a/docs/_advanced/diff.md
+++ b/docs/_advanced/diff.md
@@ -1,7 +1,7 @@
 ---
 title: Diff Mode
 has_children: false
-nav_order: 5
+nav_order: 6
 layout: page
 ---
 
diff --git a/docs/_includes/usage.md b/docs/_includes/usage.md
index 86025b83..e2abf1be 100644
--- a/docs/_includes/usage.md
+++ b/docs/_includes/usage.md
@@ -41,6 +41,10 @@ FLAGS:
     --use-matchers string            Send URLs that match specific conditions to the Deliver
     --use-filters string             Exclude URLs that match specified conditions and send the rest to Deliver
 
+  AI Integration:
+    --ollama http://localhost:11434  Specify the Ollama server URL
+    --ollama-model MODEL             Specify the Ollama model name
+
   DIFF:
     --diff-path ./app2               Specify the path to the old version of the source code for comparison
 
@@ -51,7 +55,7 @@ FLAGS:
 
   CONFIG:
     --config-file ./config.yaml      Specify the path to a configuration file in YAML format
-    --concurrency 100                Set concurrency
+    --concurrency 50                 Set concurrency
     --generate-completion zsh        Generate Zsh/Bash/Fish completion script
 
  DEBUG:
diff --git a/docs/images/advanced/ollama.jpeg b/docs/images/advanced/ollama.jpeg
new file mode 100644
index 00000000..b22cb1ec
Binary files /dev/null and b/docs/images/advanced/ollama.jpeg differ
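The `--config-file ./config.yaml` flag above implies the same settings can live in a YAML file. A hypothetical sketch under the assumption that flag names map one-to-one to YAML keys — the key names here are guesses, not the documented schema, so check Noir's generated default config for the real shape:

```yaml
# Hypothetical config.yaml sketch; every key name below is an
# assumption based on the CLI flags and may differ in practice.
base: "."
concurrency: 50
ollama: "http://localhost:11434"
ollama_model: "llama3"
```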