feat: Add VS Code Language Model API integration (#80) #102
Merged

ultmaster merged 24 commits into microsoft:main from ankit-apk:feature/vscode-lm-api-integration on Sep 2, 2025.

Changes from 1 commit

Commits (24)
d7be26f feat: Add VS Code Language Model API integration (#80) (ankit-apk)
21405f6 refactor: Address PR feedback for VS Code LM API integration (ankit-apk)
1fead53 cleanup: Remove unused files after merging changes (ankit-apk)
c63fe7f fix: Register VS Code LM commands and fix compilation issues (ankit-apk)
a89a3e5 chore: Update .gitignore to exclude test artifacts (ankit-apk)
5409c08 docs: Update PR description with test results and review feedback (ankit-apk)
5e6e33a refactor: address PR feedback - simplify implementation (ankit-apk)
1232c08 fix: update code review example to pass tests (ankit-apk)
58d3c14 refactor: address all PR review feedback (ankit-apk)
cb42de1 refactor: address all PR review feedback (ankit-apk)
8a4d65b Merge branch 'feature/vscode-lm-api-integration' of https://github.co… (ankit-apk)
7bd1a65 Merge upstream/main - resolve conflicts in VS Code LM API integration (ankit-apk)
9d46562 merge upstream/main (ultmaster)
c6b019a revert bump version (ultmaster)
fb71b55 refactor vscode command (ultmaster)
16741ef Add comments (ultmaster)
fa49518 fix command bugs (ultmaster)
458e505 fix example (ultmaster)
6e05d27 restore tags and version (ultmaster)
b26db04 restore tags and version (ultmaster)
1ad65d8 Merge branch 'feature/vscode-lm-api-integration' of https://github.co… (ultmaster)
7097dbb fix tests (ultmaster)
743b6e6 revert (ultmaster)
9cfdf9f minor fix (ultmaster)
@@ -0,0 +1,170 @@
# Add VS Code Language Model API Integration

## Summary

This PR implements comprehensive support for VS Code's Language Model API, allowing POML users to leverage GitHub Copilot and other language models without configuring API keys. This addresses issue #80 and significantly improves the user experience by removing configuration friction.

## Motivation

Currently, users must manually configure API keys and endpoints for language models, which creates several pain points:
- New users face a steep configuration barrier
- API key management poses security concerns
- Multiple configuration steps reduce adoption
- No seamless integration with existing VS Code language models

This PR solves these issues by integrating directly with VS Code's Language Model API, which is already used by GitHub Copilot and other extensions.

## Changes

### Core Implementation

1. **New VS Code LM Provider** (`packages/poml-vscode/providers/vscodeLMProvider.ts`)
   - `VSCodeLMProvider` class for interacting with the VS Code LM API (see the sketch after this list)
   - `VSCodeLMIntegration` helper for seamless integration
   - Support for all major VS Code LM features (streaming, cancellation, error handling)

2. **Enhanced Test Command** (`packages/poml-vscode/command/testCommandEnhanced.ts`)
   - Automatic detection of VS Code LM availability
   - Intelligent fallback to VS Code LM when no API key is configured
   - Seamless switching between providers

3. **Auto-Configuration Commands** (`packages/poml-vscode/command/detectModelsCommand.ts`)
   - `poml.detectVSCodeModels` - Detect available language models
   - `poml.autoConfigureLM` - Auto-configure POML to use VS Code LM

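The core call pattern into the VS Code Language Model API that such a provider wraps looks roughly like this. This is a minimal sketch, not the PR's actual `VSCodeLMProvider` code; the `streamCompletion` helper is a hypothetical name used only for illustration.

```typescript
import * as vscode from 'vscode';

// Minimal sketch: ask VS Code for a Copilot-backed chat model and stream its reply.
// `streamCompletion` is a hypothetical helper, not part of this PR's public API.
async function* streamCompletion(
  prompt: string,
  token: vscode.CancellationToken
): AsyncGenerator<string> {
  // Pick any available Copilot model (vendor/family filters are optional).
  const [model] = await vscode.lm.selectChatModels({ vendor: 'copilot' });
  if (!model) {
    throw new Error('No VS Code language models are available.');
  }

  const messages = [vscode.LanguageModelChatMessage.User(prompt)];
  const response = await model.sendRequest(messages, {}, token);

  // `response.text` is an async iterable of text fragments.
  for await (const fragment of response.text) {
    yield fragment;
  }
}
```
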
### Configuration Updates

- Added "vscode" as a language model provider option
- Updated package.json with new commands and settings
- Enhanced settings.ts to support the new provider type (sketched below)

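To illustrate the settings change, the provider union in `settings.ts` conceptually gains a `"vscode"` member, and the extension reads the documented `poml.languageModel.*` keys. The types and helper below are assumptions sketched for illustration; the actual names and the full list of providers in `settings.ts` may differ.

```typescript
import * as vscode from 'vscode';

// Sketch only: the real union in settings.ts may list different providers.
type LanguageModelProvider = 'openai' | 'microsoft' | 'vscode';

interface LanguageModelSettings {
  provider: LanguageModelProvider;
  model?: string;       // e.g. "copilot/gpt-4o"
  apiKey?: string;      // unused when provider === 'vscode'
  temperature?: number;
  maxTokens?: number;
}

// Hypothetical reader for the documented poml.languageModel.* settings.
function readLanguageModelSettings(): LanguageModelSettings {
  const config = vscode.workspace.getConfiguration('poml.languageModel');
  return {
    provider: config.get<LanguageModelProvider>('provider', 'openai'),
    model: config.get<string>('model'),
    apiKey: config.get<string>('apiKey'),
    temperature: config.get<number>('temperature'),
    maxTokens: config.get<number>('maxTokens'),
  };
}
```
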
### Documentation

- Comprehensive guide for VS Code LM API usage (`docs/vscode/vscode-lm-api.md`)
- Migration guides from other providers
- Troubleshooting section
- Best practices and examples

### Testing

- Complete test suite for the VS Code LM provider (`packages/poml-vscode/tests/vscodeLMProvider.test.ts`)
- Tests for error handling, model detection, and streaming
- Mock implementations for VS Code API components (see the sketch after this list)

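For flavor, a hand-rolled test double for the small slice of the VS Code LM surface a provider touches might look like the sketch below; the PR's actual test suite may structure its mocks differently (for example via Jest module mocks), and all names here are illustrative.

```typescript
// Sketch of a test double for the parts of the VS Code LM API a provider uses.
// The shapes and names below are illustrative, not the PR's actual mocks.
interface FakeChatResponse {
  text: AsyncIterable<string>;
}

interface FakeChatModel {
  id: string;
  vendor: string;
  family: string;
  sendRequest(messages: unknown[], options?: unknown): Promise<FakeChatResponse>;
}

function createFakeModel(reply: string): FakeChatModel {
  return {
    id: 'copilot/gpt-4o',
    vendor: 'copilot',
    family: 'gpt-4o',
    async sendRequest() {
      // Stream the canned reply back in two fragments.
      async function* fragments() {
        yield reply.slice(0, Math.ceil(reply.length / 2));
        yield reply.slice(Math.ceil(reply.length / 2));
      }
      return { text: fragments() };
    },
  };
}

const fakeLm = {
  // Mirrors vscode.lm.selectChatModels: always "finds" one fake model.
  selectChatModels: async () => [createFakeModel('Hello from the mock model')],
};
```
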
## Features

### 🚀 Zero Configuration
- Automatically detects and uses GitHub Copilot if available
- No API keys required
- One-click auto-configuration

### 🔄 Intelligent Fallback
- Automatically uses VS Code LM when no other provider is configured
- Seamless switching between providers
- Graceful error handling

### 🔍 Model Discovery
- Detect all available language models
- Support for multiple model families (GPT-4o, Claude, o1)
- Real-time availability checking

### 🔒 Enhanced Security
- No API keys stored in settings
- Leverages VS Code's built-in authentication
- Respects user consent requirements

### 📊 Better User Experience
- Unified billing through GitHub Copilot subscription
- Automatic model updates
- Consistent with VS Code ecosystem

## Usage

### Quick Start
1. Install the updated POML extension
2. Run the command `POML: Auto-Configure Language Model`
3. Start using POML with GitHub Copilot!

### Manual Configuration

```json
{
  "poml.languageModel.provider": "vscode",
  "poml.languageModel.model": "copilot/gpt-4o"
}
```

## Testing

The implementation has been thoroughly tested with:
- ✅ Unit tests for all new components
- ✅ Integration tests with mock VS Code APIs
- ✅ Error handling scenarios
- ✅ Model detection and validation
- ✅ Streaming and cancellation

## Compatibility

- Requires VS Code 1.95.0 or later
- Backward compatible with existing POML configurations
- Works alongside traditional API key configurations

## Migration Path

Users can migrate seamlessly:
1. Existing configurations continue to work
2. New users get VS Code LM by default if available
3. One-command migration for existing users

## Future Enhancements

This PR lays the groundwork for:
- Support for VS Code's upcoming language model features
- Integration with VS Code's model selection UI
- Enhanced model-specific optimizations
- Tool/function calling support when available

## Checklist

- [x] Code implementation complete
- [x] Tests written and passing
- [x] Documentation updated
- [x] Package.json updated with new commands
- [x] Settings schema updated
- [x] Backward compatibility maintained
- [x] Error handling implemented
- [x] TypeScript types properly defined

## Screenshots/Demo

### Auto-Configuration Flow
1. User runs "Auto-Configure Language Model"
2. POML detects GitHub Copilot
3. Settings automatically updated
4. Ready to use without API keys!

### Model Detection
- Shows all available models
- One-click configuration
- Helpful error messages

## Related Issues

- Fixes #80: "make the extension use VS Code LM API"
- Addresses configuration issues mentioned in #84 and #98
- Improves onboarding experience

## Breaking Changes

None. This PR is fully backward compatible.

## Review Notes

Key files to review:
1. `packages/poml-vscode/providers/vscodeLMProvider.ts` - Core implementation
2. `packages/poml-vscode/command/testCommandEnhanced.ts` - Integration logic
3. `packages/poml-vscode/tests/vscodeLMProvider.test.ts` - Test coverage
4. `docs/vscode/vscode-lm-api.md` - User documentation

## Acknowledgments

Thanks to the VS Code team for the excellent Language Model API documentation and to the community for highlighting this need in issue #80.

docs/vscode/vscode-lm-api.md
@@ -0,0 +1,222 @@
# Using VS Code Language Model API with POML
POML now supports VS Code's built-in Language Model API, allowing you to use GitHub Copilot and other language models without configuring API keys.

## Overview

The VS Code Language Model API integration enables POML to:
- Use GitHub Copilot's language models directly
- Automatically detect available models
- Work without API key configuration
- Seamlessly fall back to VS Code LM when no other provider is configured

## Requirements

- VS Code version 1.95.0 or later
- Active GitHub Copilot subscription (or other language model provider)
- POML extension installed

## Quick Start

### Automatic Configuration

1. Open any `.poml` file in VS Code
2. Run the command: `POML: Auto-Configure Language Model`
3. If GitHub Copilot is available, POML will automatically configure itself to use it

### Manual Configuration

1. Open VS Code Settings (`Cmd/Ctrl + ,`)
2. Search for "POML Language Model"
3. Set the following:
   - **Provider**: `VS Code LM (GitHub Copilot)`
   - **Model**: `copilot/gpt-4o` (or leave blank for auto-detection)
   - **API Key**: Leave blank (not needed for VS Code LM)

### Detecting Available Models

To see which language models are available in your VS Code instance:

1. Run the command: `POML: Detect VS Code Language Models`
2. The extension will show you all available models
3. Choose "Use VS Code LM" to automatically configure POML

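Under the hood, detection amounts to asking VS Code for its registered chat models. The sketch below is illustrative; the `detectVSCodeModels` function name mirrors the command id but is not necessarily the extension's exact code.

```typescript
import * as vscode from 'vscode';

// Illustrative helper: list whatever chat models VS Code currently exposes.
async function detectVSCodeModels(): Promise<string[]> {
  // An empty selector returns every available model, from any vendor.
  const models = await vscode.lm.selectChatModels();
  return models.map(model => `${model.vendor}/${model.family} (${model.id})`);
}

// Example: surface the result to the user, as the detect command does.
async function showDetectedModels(): Promise<void> {
  const models = await detectVSCodeModels();
  if (models.length === 0) {
    vscode.window.showWarningMessage('No VS Code language models found.');
    return;
  }
  vscode.window.showInformationMessage(`Available models: ${models.join(', ')}`);
}
```
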
## Supported Models

When using the VS Code LM API, the following models are typically available:

- `copilot/gpt-4o` - GPT-4o (recommended)
- `copilot/gpt-4o-mini` - Smaller, faster variant
- `copilot/claude-3.5-sonnet` - Claude 3.5 Sonnet
- `copilot/o1` - OpenAI o1 model
- `copilot/o1-mini` - OpenAI o1 mini model

## Configuration Options

### settings.json

```json
{
  "poml.languageModel.provider": "vscode",
  "poml.languageModel.model": "copilot/gpt-4o",
  "poml.languageModel.temperature": 0.7,
  "poml.languageModel.maxTokens": 2000
}
```

### Provider Selection Priority

POML uses the following priority order when selecting a language model provider:

1. **Explicit VS Code LM**: if the provider is set to "vscode"
2. **Automatic VS Code LM**: if no API key is configured and VS Code LM is available
3. **Configured Provider**: if an API key and provider are configured

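Expressed as code, that priority order looks roughly like the sketch below. The `resolveProvider` helper and the settings shape are assumptions for illustration, not the extension's actual implementation.

```typescript
import * as vscode from 'vscode';

// Assumed settings shape, mirroring the documented poml.languageModel.* keys.
interface LanguageModelSettings {
  provider?: string;
  apiKey?: string;
}

// Illustrative resolution of the documented priority order.
async function resolveProvider(settings: LanguageModelSettings): Promise<string> {
  // 1. Explicit VS Code LM: the user asked for it directly.
  if (settings.provider === 'vscode') {
    return 'vscode';
  }
  // 2. Automatic VS Code LM: nothing configured, but a model is available.
  if (!settings.apiKey) {
    const models = await vscode.lm.selectChatModels();
    if (models.length > 0) {
      return 'vscode';
    }
  }
  // 3. Configured provider: fall back to whatever the user set up (e.g. "openai").
  return settings.provider ?? 'none';
}
```
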
## Features

### Automatic Fallback

If you haven't configured any language model settings, POML will automatically try to use VS Code's Language Model API if available.

### User Consent

The first time you use the VS Code LM API, you may be prompted to grant consent. This is a one-time authorization that allows extensions to use your language model subscription.

### Rate Limiting

The VS Code LM API respects rate limits set by your language model provider. POML handles rate-limiting errors gracefully and provides appropriate feedback.

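Both consent and rate-limit failures surface to extensions as `vscode.LanguageModelError`. The sketch below shows one hedged way to report them; how POML itself formats these messages may differ, and the helper name is illustrative.

```typescript
import * as vscode from 'vscode';

// Illustrative error handling around a chat request (names are examples only).
async function requestWithFriendlyErrors(
  model: vscode.LanguageModelChat,
  messages: vscode.LanguageModelChatMessage[],
  token: vscode.CancellationToken
): Promise<string> {
  try {
    const response = await model.sendRequest(messages, {}, token);
    let result = '';
    for await (const fragment of response.text) {
      result += fragment;
    }
    return result;
  } catch (err) {
    if (err instanceof vscode.LanguageModelError) {
      // err.code distinguishes failures such as missing consent ("NoPermissions")
      // or the provider refusing or limiting the request; err.message is readable.
      vscode.window.showErrorMessage(
        `Language model request failed (${err.code}): ${err.message}`
      );
      return '';
    }
    throw err; // not a language-model error; let it propagate
  }
}
```
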
## Advantages

1. **No API Key Management**: Use your existing GitHub Copilot subscription
2. **Automatic Updates**: Models are updated automatically through VS Code
3. **Unified Billing**: Costs are included in your GitHub Copilot subscription
4. **Better Security**: No need to store API keys in settings
5. **Seamless Integration**: Works with VS Code's built-in authentication

## Troubleshooting

### "No language models found"

**Solution**:
- Ensure you're signed in to GitHub Copilot
- Run `GitHub Copilot: Sign In` from the command palette
- Update VS Code to version 1.95.0 or later

### "User consent required"

**Solution**:
- This is normal for first-time use
- Click "Allow" when prompted
- The consent is remembered for future sessions

### "Rate limit exceeded"

**Solution**:
- Wait a few moments before trying again
- Check your GitHub Copilot usage limits
- Consider using a model with higher rate limits

### Models not appearing

**Solution**:
1. Check your VS Code version: `Help > About`
2. Verify the GitHub Copilot extension is installed and active
3. Run `POML: Detect VS Code Language Models` to refresh

## Migration Guide

### From OpenAI API

```json
// Before
{
  "poml.languageModel.provider": "openai",
  "poml.languageModel.model": "gpt-4",
  "poml.languageModel.apiKey": "sk-..."
}

// After
{
  "poml.languageModel.provider": "vscode",
  "poml.languageModel.model": "copilot/gpt-4o"
  // No API key needed!
}
```

### From Azure OpenAI

```json
// Before
{
  "poml.languageModel.provider": "microsoft",
  "poml.languageModel.model": "gpt-4-deployment",
  "poml.languageModel.apiKey": "...",
  "poml.languageModel.apiUrl": "https://....openai.azure.com"
}

// After
{
  "poml.languageModel.provider": "vscode",
  "poml.languageModel.model": "copilot/gpt-4o"
}
```

## Best Practices

1. **Model Selection**: Use `copilot/gpt-4o` for best performance and quality
2. **Temperature**: Adjust temperature based on your use case (0.0-1.0)
3. **Token Limits**: Be aware of model token limits (GPT-4o supports up to 64K tokens)
4. **Error Handling**: Implement fallback logic for when models are unavailable (see the sketch below)

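For the error-handling point above, one way to express "prefer `copilot/gpt-4o`, otherwise take whatever is available" with the VS Code API is sketched below; the helper name is illustrative rather than part of POML's API.

```typescript
import * as vscode from 'vscode';

// Illustrative fallback: prefer a Copilot gpt-4o model, otherwise accept any
// available chat model, otherwise report that none can be used.
async function pickModelWithFallback(): Promise<vscode.LanguageModelChat | undefined> {
  const preferred = await vscode.lm.selectChatModels({ vendor: 'copilot', family: 'gpt-4o' });
  if (preferred.length > 0) {
    return preferred[0];
  }
  const anyModel = await vscode.lm.selectChatModels();
  if (anyModel.length > 0) {
    return anyModel[0];
  }
  vscode.window.showWarningMessage('No language models are currently available.');
  return undefined;
}
```
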
## Example Usage

### Basic Prompt Testing

1. Create a `.poml` file:
```xml
<poml>
  <role>You are a helpful assistant.</role>
  <task>Explain quantum computing in simple terms.</task>
</poml>
```

2. Click the "Test" button or run `POML: Test current prompt on Chat Models`
3. POML will automatically use VS Code LM if configured

### Programmatic Usage

```typescript
import { VSCodeLMIntegration } from 'poml-vscode/providers/vscodeLMProvider';

// `settings` and `messages` are assumed to come from the extension's current
// configuration and the prompt being tested.
// Check if VS Code LM should be used
if (VSCodeLMIntegration.shouldUseVSCodeLM(settings)) {
  // Stream responses from VS Code LM
  const stream = VSCodeLMIntegration.createStream(messages, settings);

  for await (const chunk of stream) {
    console.log(chunk);
  }
}
```

## API Reference

### Commands

- `poml.detectVSCodeModels` - Detect available VS Code Language Models
- `poml.autoConfigureLM` - Automatically configure language model settings

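Both commands can also be invoked from other extension code or keybinding handlers through the standard VS Code command API, for example:

```typescript
import * as vscode from 'vscode';

// Run the documented POML commands from code (e.g. from another command,
// a keybinding handler, or a walkthrough step).
async function configurePomlForVSCodeLM(): Promise<void> {
  // List the models VS Code currently exposes.
  await vscode.commands.executeCommand('poml.detectVSCodeModels');
  // Then let POML write the corresponding settings.
  await vscode.commands.executeCommand('poml.autoConfigureLM');
}
```
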
### Settings

- `poml.languageModel.provider` - Set to "vscode" to use VS Code LM API
- `poml.languageModel.model` - Model identifier (e.g., "copilot/gpt-4o")
- `poml.languageModel.temperature` - Response randomness (0.0-1.0)
- `poml.languageModel.maxTokens` - Maximum response length

## Support

For issues or questions about VS Code LM integration:
1. Check the [POML GitHub Issues](https://github.com/microsoft/poml/issues)
2. Review [VS Code Language Model API docs](https://code.visualstudio.com/api/extension-guides/language-model)
3. Verify your GitHub Copilot subscription is active