---
title: Langfuse
subtitle: Using OpenRouter with Langfuse
headline: 'Langfuse Integration | OpenRouter SDK Support'
canonical-url: https://openrouter.ai/docs/guides/community/langfuse
og:site_name: OpenRouter Documentation
og:title: 'Langfuse Integration - OpenRouter SDK Support'
og:description: 'Integrate OpenRouter using Langfuse for observability and tracing. Complete guide for Langfuse integration with OpenRouter for Python applications.'
og:image: https://openrouter.ai/dynamic-og?title=Langfuse&description=Langfuse%20Integration
og:image:width: 1200
og:image:height: 630
twitter:card: summary_large_image
twitter:site: '@OpenRouterAI'
noindex: false
nofollow: false
---

# Langfuse
> **Tip:** Looking to auto-instrument without client code? Check out OpenRouter Broadcast to automatically send traces to Langfuse.
## Using Langfuse
Langfuse provides observability and analytics for LLM applications. Since OpenRouter uses the OpenAI API schema, you can use Langfuse's native integration with the OpenAI SDK to automatically trace and monitor your OpenRouter API calls.
## Installation
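The integration needs the Langfuse Python SDK (which includes the OpenAI wrapper) and the OpenAI SDK itself. A typical install, assuming pip:

```bash
pip install langfuse openai
```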
## Configuration
Set up your environment variables:
**Environment Setup**
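A minimal sketch that sets the keys from Python via `os.environ`. `LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, and `LANGFUSE_HOST` are the variable names the Langfuse SDK reads; `OPENROUTER_API_KEY` is simply the name this guide assumes for your OpenRouter key:

```python
import os

# Langfuse project credentials (Settings > API Keys in the Langfuse UI)
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
# EU region shown; US-region projects use https://us.cloud.langfuse.com
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"

# Assumed variable name for your OpenRouter API key
os.environ["OPENROUTER_API_KEY"] = "sk-or-..."
```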
## Simple LLM Call
Since OpenRouter provides an OpenAI-compatible API, you can use the Langfuse OpenAI SDK wrapper to automatically log OpenRouter calls as generations in Langfuse:
**Basic Integration**
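A minimal sketch: `langfuse.openai` offers a drop-in replacement for the OpenAI client, so pointing `base_url` at OpenRouter is all that changes. The model ID below is just an example; any ID from openrouter.ai/models works:

```python
import os

from langfuse.openai import OpenAI  # drop-in wrapper that logs calls to Langfuse

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # route requests through OpenRouter
    api_key=os.environ["OPENROUTER_API_KEY"],
)

completion = client.chat.completions.create(
    model="openai/gpt-4o",  # example model ID; swap in any OpenRouter model
    messages=[{"role": "user", "content": "What is Langfuse?"}],
)

# The request, response, latency, and token usage now appear as a
# generation in your Langfuse project.
print(completion.choices[0].message.content)
```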
## Advanced Tracing with Nested Calls
Use the `@observe()` decorator to capture execution details of functions with nested LLM calls:
**Nested Function Tracing**
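A sketch of nested tracing, assuming the client setup from the previous example and a current Langfuse Python SDK (older v2 releases import `observe` from `langfuse.decorators` instead). Each decorated function becomes a span, and the wrapped OpenAI call inside it is attached as a child generation; the function names here are illustrative:

```python
import os

from langfuse import observe  # SDK v3; on v2: from langfuse.decorators import observe
from langfuse.openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

@observe()  # traced as a span; the LLM call below becomes a nested generation
def summarize(text: str) -> str:
    completion = client.chat.completions.create(
        model="openai/gpt-4o",  # example model ID
        messages=[{"role": "user", "content": f"Summarize in one sentence: {text}"}],
    )
    return completion.choices[0].message.content

@observe()  # root span: the whole call tree shows up as one trace in Langfuse
def research_pipeline(topic: str) -> str:
    notes = summarize(f"Key facts about {topic}.")
    return summarize(f"Rewrite for a general audience: {notes}")

print(research_pipeline("LLM observability"))
```

Because both functions are decorated, the resulting trace shows `research_pipeline` as the root with two `summarize` spans beneath it, each containing its own generation.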
## Learn More
- Langfuse OpenRouter Integration: https://langfuse.com/docs/integrations/other/openrouter
- OpenRouter Quick Start Guide: https://openrouter.ai/docs/quickstart
- Langfuse `@observe()` Decorator: https://langfuse.com/docs/sdk/python/decorators