This library provides wrapper functions to use Langfuse LLM monitoring in your application. It was built for Symfony but can be used in any PHP application.
Install the library and required dependencies via Composer:
```bash
composer require janzaba/langfuse
```
In your .env file, add your Langfuse PUBLIC_KEY and SECRET_KEY:
```env
LANGFUSE_PUBLIC_KEY=your-public-key
LANGFUSE_SECRET_KEY=your-secret-key
```

In your `config/services.yaml`, add the following service definitions:
```yaml
parameters:
    langfuse_config:
        public_key: '%env(LANGFUSE_PUBLIC_KEY)%'
        secret_key: '%env(LANGFUSE_SECRET_KEY)%'
        # Optional: langfuse_base_uri: 'https://custom.langfuse.endpoint/'

services:
    Langfuse\Config\Config:
        class: Langfuse\Config\Config
        arguments:
            - '%langfuse_config%'
        public: false

    Langfuse\Client\LangfuseClient:
        arguments:
            $config: '@Langfuse\Config\Config'

    Langfuse\LangfuseManager:
        arguments:
            $langfuseClient: '@Langfuse\Client\LangfuseClient'
```

Now you can wrap your code with the helper methods:
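The examples below call `$this->langfuseManager`, which assumes Symfony has injected `Langfuse\LangfuseManager` into your service via constructor autowiring. A minimal sketch of that wiring follows; the consumer class name is illustrative, and a stub class stands in for the real manager so the snippet is self-contained:

```php
<?php
// Stub standing in for Langfuse\LangfuseManager so this sketch runs on its own;
// in a real app you would write `use Langfuse\LangfuseManager;` instead.
class LangfuseManager
{
}

// Illustrative consumer service: with Symfony autowiring, the manager is
// injected automatically through the constructor.
final class ExampleService
{
    public function __construct(
        private readonly LangfuseManager $langfuseManager,
    ) {
    }

    public function manager(): LangfuseManager
    {
        return $this->langfuseManager;
    }
}

$service = new ExampleService(new LangfuseManager());
var_dump($service->manager() instanceof LangfuseManager); // bool(true)
```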
```php
$this->langfuseManager->withTrace(
    'Trace name',
    ['operation' => 'example operation name'],
    function () {
        // Your code here
    }
);
```

Inside a trace you can also record an LLM generation.
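Both helpers follow the same callback-wrapping pattern: the manager runs the closure you pass, reports telemetry around it, and hands the closure's return value back to you. A self-contained sketch of the idea (illustrative only, not the library's actual implementation):

```php
<?php
// Self-contained sketch of the callback-wrapping pattern behind withTrace and
// withGeneration. This is NOT the library's code; it only shows that the
// manager executes your closure and passes its return value through.
final class WrapperSketch
{
    public function withTrace(string $name, array $metadata, callable $fn): mixed
    {
        $start = microtime(true);
        try {
            return $fn(); // run the wrapped business logic
        } finally {
            $elapsedMs = (microtime(true) - $start) * 1000;
            // A real client would send $name, $metadata and $elapsedMs to Langfuse here.
        }
    }
}

$sketch = new WrapperSketch();
$answer = $sketch->withTrace('Trace name', ['operation' => 'demo'], fn () => 'model output');
echo $answer, PHP_EOL; // prints "model output"
```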
```php
$answer = $this->langfuseManager->withGeneration(
    'prompt name',
    'gpt-4o-mini',
    $prompt,
    function () use ($prompt) {
        return $this->openAIClient->chat()->create([
            'model' => 'gpt-4o-mini',
            'messages' => $prompt,
        ]);
    }
);
```

Contributions are welcome! Please submit a pull request or open an issue for any improvements or bugs.
This project is licensed under the MIT License.