SemanticKernelLMStudio

A very simple command-line app (built with Spectre.Console.Cli) that implements a custom Semantic Kernel IChatCompletionService to interact with the LM Studio Local LLM Server API.
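
Conceptually, the custom service forwards the Semantic Kernel chat history to LM Studio's OpenAI-compatible /v1/chat/completions endpoint (served at http://localhost:1234 by default). The sketch below shows one way that could look; the class name, endpoint, and model id are illustrative assumptions rather than the repo's actual code, and it assumes a modern .NET SDK project with implicit usings enabled.

using System.Net.Http.Json;
using System.Runtime.CompilerServices;
using System.Text.Json;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Minimal sketch of a custom IChatCompletionService that posts the chat history
// to LM Studio's OpenAI-compatible endpoint. Names here are illustrative assumptions.
public sealed class LmStudioChatCompletionService : IChatCompletionService
{
    private static readonly HttpClient Http = new()
    {
        BaseAddress = new Uri("http://localhost:1234/v1/")
    };

    public IReadOnlyDictionary<string, object?> Attributes { get; } =
        new Dictionary<string, object?>();

    public async Task<IReadOnlyList<ChatMessageContent>> GetChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        // Project the Semantic Kernel chat history onto the OpenAI-style payload.
        var payload = new
        {
            model = "mistral-7b-instruct-v0.3",
            messages = chatHistory.Select(m => new { role = m.Role.Label, content = m.Content })
        };

        using var response = await Http.PostAsJsonAsync("chat/completions", payload, cancellationToken);
        response.EnsureSuccessStatusCode();

        // Pull the first choice's message text out of the OpenAI-style response.
        using var json = JsonDocument.Parse(await response.Content.ReadAsStringAsync(cancellationToken));
        var text = json.RootElement
            .GetProperty("choices")[0]
            .GetProperty("message")
            .GetProperty("content")
            .GetString();

        return new[] { new ChatMessageContent(AuthorRole.Assistant, text) };
    }

    public async IAsyncEnumerable<StreamingChatMessageContent> GetStreamingChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        // Simplification: reuse the non-streaming call and yield the reply as a single chunk.
        foreach (var message in await this.GetChatMessageContentsAsync(chatHistory, executionSettings, kernel, cancellationToken))
        {
            yield return new StreamingChatMessageContent(message.Role, message.Content);
        }
    }
}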

Getting Started

  • Install the latest version of LM Studio from https://lmstudio.ai.
  • Download a model (the demo used mistral-7b-instruct-v0.3).
  • Start the Local LLM Server.
  • Run the SemanticKernelLMStudio app; a sketch of the wiring it performs follows this list.
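
When the app runs, the custom service needs to be registered with a Kernel before a chat turn can be sent to the local server. A minimal sketch, reusing the hypothetical LmStudioChatCompletionService from above; the Spectre.Console.Cli command plumbing is omitted.

using Microsoft.Extensions.DependencyInjection;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Register the hypothetical LmStudioChatCompletionService with a Kernel
// and send a single chat turn to LM Studio.
var builder = Kernel.CreateBuilder();
builder.Services.AddSingleton<IChatCompletionService>(new LmStudioChatCompletionService());
var kernel = builder.Build();

var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddUserMessage("Say hello in one short sentence.");

var replies = await chat.GetChatMessageContentsAsync(history, kernel: kernel);
Console.WriteLine(replies[0].Content);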

Example

  • Scenario: Could an SLM be used to perform small editing tasks, such as parsing an author string into a first and last name?
  • Conjecture: The SLM may have more encoded knowledge of first names and family names across the world's cultures than the editor of the content.

Prompt:

Given the following string contains a name "firstnamelastname" break into first and last name and return a json string as your only output. use `first_name` and `last_name` as keys. Capitalize the first letter of each name.

Response:

{
  "first_name": "Firstname",
  "last_name": "Lastname"
}
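
Because the prompt constrains the model to a JSON-only reply, downstream code could deserialize the response into a typed value. A minimal sketch, assuming a hypothetical AuthorName record that is not part of the repo:

using System.Text.Json;
using System.Text.Json.Serialization;

// Deserialize the model's JSON-only reply into a typed result.
var reply = """
{
  "first_name": "Firstname",
  "last_name": "Lastname"
}
""";

var name = JsonSerializer.Deserialize<AuthorName>(reply)!;
Console.WriteLine($"{name.FirstName} {name.LastName}");

// Hypothetical record mapping the snake_case keys requested in the prompt.
public sealed record AuthorName(
    [property: JsonPropertyName("first_name")] string FirstName,
    [property: JsonPropertyName("last_name")] string LastName);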

History

Created for Howard van Rooijen's End of Week Show & Tell Session on 2024-11-29
