A lightweight proxy server with LLM-powered content filtering capabilities.
- Simple and lightweight HTTP/HTTPS proxy server
- LLM-powered content filtering (Ollama, OpenRouter)
- Customizable content transformation
- Optional robots.txt bypass
- Customizable User-Agent management
- Comprehensive logging system
- Node.js 18.x or higher
- npm or yarn
- Ollama (if using local LLM filtering)
- Clone the repository

  ```bash
  git clone https://github.com/GOROman/SubtractProxy.git
  cd SubtractProxy
  ```

- Install dependencies

  ```bash
  npm install
  ```

- Start the development server

  ```bash
  npm run dev
  ```

The proxy server will start at `http://localhost:8080` by default.
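To verify the proxy is answering, you can send a request through it with curl (a quick sanity check; adjust the address if you change the defaults in config.json):

```bash
# Route a request through the local proxy and print the response headers
curl -x http://127.0.0.1:8080 -I http://example.com
```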
Create a `config.json` file in the project root:

```json
{
  "port": 8080,
  "host": "127.0.0.1",
  "ignoreRobotsTxt": false,
  "llm": {
    "enabled": true,
    "type": "ollama",
    "model": "gemma",
    "baseUrl": "http://localhost:11434"
  },
  "logging": {
    "level": "info",
    "file": "proxy.log"
  },
  "userAgent": {
    "enabled": true,
    "rotate": true,
    "value": "CustomUserAgent/1.0",
    "presets": [
      "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
      "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15"
    ]
  }
}
```
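The feature list also mentions OpenRouter as an alternative LLM backend. A minimal sketch of what that might look like, assuming the `type`, `model`, and an `apiKey` field are accepted; check the project's actual configuration schema for the exact field names:

```json
{
  "llm": {
    "enabled": true,
    "type": "openrouter",
    "model": "anthropic/claude-3-haiku",
    "apiKey": "YOUR_OPENROUTER_API_KEY"
  }
}
```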
SubtractProxy is built with a modular architecture that ensures extensibility and maintainability:

```
SubtractProxy/
├── src/
│   ├── proxy/    # Core proxy server implementation
│   ├── llm/      # LLM integration and filtering
│   ├── config/   # Configuration management
│   └── utils/    # Shared utilities and logging
```
- Proxy Server: Handles HTTP/HTTPS requests and responses
- LLM Filter: Processes content through LLM models
- Config Manager: Manages application settings
- Logger: Provides comprehensive logging capabilities
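To illustrate how these components relate, here is a rough TypeScript sketch of the response flow; the class and method names are illustrative and do not necessarily match the actual exports of `src/proxy` or `src/llm`:

```typescript
// Illustrative wiring of the core components (names are hypothetical)
interface ContentFilter {
  // Returns the (possibly transformed) response body
  filter(url: string, body: string): Promise<string>;
}

class ProxyPipeline {
  constructor(
    private filter: ContentFilter,
    private logger: (msg: string) => void,
  ) {}

  async handleResponse(url: string, body: string): Promise<string> {
    this.logger(`filtering ${url}`);
    // The LLM filter decides whether to pass the body through unchanged,
    // rewrite it, or replace it with an error page.
    return this.filter.filter(url, body);
  }
}
```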
- `GET /*`: Handles all HTTP/HTTPS proxy requests
- `POST /api/config`: Updates proxy configuration
- `GET /api/status`: Returns proxy server status
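For example, the management endpoints can be exercised with curl; the accepted body shape for `/api/config` is an assumption here, taken to mirror the `ProxyConfig` interface shown below:

```bash
# Query the proxy status (assumes the proxy listens on 127.0.0.1:8080)
curl http://127.0.0.1:8080/api/status

# Update part of the configuration; the body shape is assumed to follow ProxyConfig
curl -X POST http://127.0.0.1:8080/api/config \
  -H "Content-Type: application/json" \
  -d '{"logging": {"level": "debug"}}'
```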
```typescript
interface ProxyConfig {
  port: number;
  host: string;
  llm: LLMConfig;
  logging: LogConfig;
  userAgent: UserAgentConfig;
}
```

SubtractProxy uses LLM-based content filtering with customizable rules:
- Content Analysis
  - Text content is analyzed for specified patterns
  - Images and media are processed separately
  - Response headers are preserved
- Filtering Criteria
  - Content categories (NSFW, spam, malware)
  - Custom keywords and patterns
  - URL patterns and domains
- Actions (see the sketch after this list)
  - Allow: Pass content through
  - Block: Return error page
  - Modify: Transform content
  - Log: Record without blocking
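The action model above can be expressed as a small TypeScript type; this is a sketch of the idea, not the project's actual filter API:

```typescript
// Possible outcomes of LLM-based filtering, mirroring the actions listed above
type FilterAction =
  | { kind: "allow" }                        // Allow: pass content through
  | { kind: "block"; reason: string }        // Block: return an error page
  | { kind: "modify"; transformed: string }  // Modify: transform the content
  | { kind: "log"; note: string };           // Log: record without blocking

// Hypothetical helper that applies an action to a response body
function applyAction(action: FilterAction, body: string): string {
  switch (action.kind) {
    case "allow":
      return body;
    case "block":
      return `<h1>Blocked</h1><p>${action.reason}</p>`;
    case "modify":
      return action.transformed;
    case "log":
      console.log(action.note);
      return body;
  }
}
```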
- Set your browser's proxy settings to the SubtractProxy address (default: `127.0.0.1:8080`)
- Browse normally; all requests will be filtered through SubtractProxy
- Start Ollama server (if using local LLM)
- Configure LLM settings in config.json
- Start SubtractProxy
- Content will be automatically filtered based on LLM analysis
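A typical local setup might look like the following; the `gemma` model name matches the sample config above, and `ollama serve` / `ollama pull` are standard Ollama CLI commands:

```bash
# Start the local Ollama server and fetch the model referenced in config.json
ollama serve &
ollama pull gemma

# Then start SubtractProxy
npm start
```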
SubtractProxy provides flexible User-Agent management:

- Custom User-Agent: set a specific User-Agent string using the `value` field

  ```json
  { "userAgent": { "enabled": true, "value": "CustomUserAgent/1.0" } }
  ```

- User-Agent Rotation: enable automatic rotation between preset User-Agents

  ```json
  { "userAgent": { "enabled": true, "rotate": true, "presets": [ "Mozilla/5.0 (Windows NT 10.0; Win64; x64)", "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)" ] } }
  ```

- Disable User-Agent Modification: keep the original request's User-Agent

  ```json
  { "userAgent": { "enabled": false } }
  ```

Common issues to check when troubleshooting:

- Connection Refused (see the quick checks after this list)
  - Check if the proxy server is running
  - Verify the port is not already in use
  - Confirm firewall settings
- LLM Integration
  - Ensure the Ollama server is running
  - Check model availability
  - Verify API endpoints
- Performance
  - Monitor memory usage
  - Check logging levels
  - Optimize request handling
```bash
# Start development server with hot reload
npm run dev

# Build for production
npm run build

# Start production server
npm start

# Run tests
npm test

# Lint code
npm run lint

# Format code
npm run format
```

- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'feat: add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
Created by GOROman.
- Ollama for local LLM support
- Express.js for the web server framework
- http-proxy for proxy functionality