# Laravel Ollama

This app uses Laravel, Livewire, and Volt to create a simple interface that generates a response from an AI model using Ollama.

*(Screenshot: the Laravel Ollama interface)*

First, download and install Ollama. Then pull any model you want to use, for example:

```shell
ollama pull codellama
```

This application retrieves the response in Laravel by hitting the following endpoint, which Ollama exposes:

```shell
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "codellama",
  "prompt": "Write me a function that outputs the fibonacci sequence"
}'
```
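By default, `/api/generate` streams its reply as newline-delimited JSON objects, each carrying a fragment of the answer in its `response` field, with `done` set to `true` on the final chunk. The sketch below reassembles those fragments in Python; it is illustrative only (this app does the equivalent in Laravel), and the sample fragments are made up:

```python
import json

# A made-up sample of the newline-delimited JSON that /api/generate
# streams back; field names follow Ollama's API.
stream = """\
{"model": "codellama", "response": "def fib", "done": false}
{"model": "codellama", "response": "(n):", "done": false}
{"model": "codellama", "response": "", "done": true}
"""

def collect_response(ndjson: str) -> str:
    """Concatenate the 'response' fragments until a chunk reports done."""
    text = ""
    for line in ndjson.splitlines():
        chunk = json.loads(line)
        text += chunk.get("response", "")
        if chunk.get("done"):
            break
    return text

print(collect_response(stream))  # prints: def fib(n):
```

If you prefer a single JSON object instead of a stream, Ollama's generate endpoint also accepts `"stream": false` in the request body.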

For testing purposes, you may also use the CLI to get a response:

```shell
ollama run codellama "Write me a function that outputs the fibonacci sequence"
```