
How To Use Langfuse

📖 This guide was prepared by the ToolPazar team. All of our tools are free and ad-free.

What it is

Langfuse solves the “why did my chatbot say that?” problem. It captures every LLM call, tool invocation, and user interaction as a nested trace, adds latency and cost math, and lets you score outputs manually or with LLM-as-judge. It’s open-source, self-hostable, and drops into Python or JS apps with a few lines of code.
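
To make the trace model concrete, here is a minimal sketch in plain Python. This is illustrative only, not the Langfuse API: it mimics the trace → span → generation nesting described above, plus the kind of latency and cost rollup the UI computes per trace.

```python
from dataclasses import dataclass, field

# Illustrative data model only -- not the Langfuse SDK. A trace is a tree
# of spans; generations are the LLM calls hanging off those spans.

@dataclass
class Generation:
    name: str
    latency_ms: float
    cost_usd: float

@dataclass
class Span:
    name: str
    generations: list = field(default_factory=list)
    children: list = field(default_factory=list)

    def total_cost(self) -> float:
        # Roll up cost across this span and everything nested under it.
        return (sum(g.cost_usd for g in self.generations)
                + sum(c.total_cost() for c in self.children))

    def total_latency_ms(self) -> float:
        # Naive sum for illustration; real tracing uses per-span
        # wall-clock start/end timestamps instead.
        return (sum(g.latency_ms for g in self.generations)
                + sum(c.total_latency_ms() for c in self.children))

# One user turn: a retrieval sub-span plus a final answer generation.
root = Span("handle_user_message")
retrieval = Span("retrieve_docs")
retrieval.generations.append(Generation("embed_query", 40.0, 0.0001))
root.children.append(retrieval)
root.generations.append(Generation("answer", 900.0, 0.0042))

print(round(root.total_cost(), 4))   # cost of the whole tree
print(root.total_latency_ms())       # latency of the whole tree
```

The real SDKs build this tree for you from instrumented calls; the point is only that every cost and latency number the dashboard shows is an aggregate over a tree like this one.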

How it works

Langfuse runs as a Next.js app on top of Postgres and ClickHouse, with Redis for queuing and caching and S3-compatible object storage. SDKs for Python and TypeScript send events to the ingestion API, which assembles them into traces made of spans, generations, and events. The UI renders traces, aggregates metrics, and runs dataset-based evals against prompt versions.
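
The assembly step can be sketched as follows. This is a hypothetical reconstruction, not Langfuse's actual ingestion code: clients emit flat observation events tagged with an id and a parent id, and the backend wires them back into a nested trace. Field names here are assumptions for illustration.

```python
# Hypothetical sketch of trace assembly: flat events in, nested tree out.
# Field names ("id", "parent_id", "type") are illustrative, not the
# actual Langfuse ingestion schema.

def assemble_trace(events):
    """Index events by id, then link each node under its parent."""
    nodes = {e["id"]: {**e, "children": []} for e in events}
    roots = []
    for node in nodes.values():
        parent_id = node.get("parent_id")
        if parent_id and parent_id in nodes:
            nodes[parent_id]["children"].append(node)
        else:
            roots.append(node)  # no parent -> top of a trace
    return roots

events = [
    {"id": "obs-1", "parent_id": None,    "type": "span",       "name": "chat_turn"},
    {"id": "obs-2", "parent_id": "obs-1", "type": "generation", "name": "llm_call"},
    {"id": "obs-3", "parent_id": "obs-1", "type": "event",      "name": "tool:search"},
]

tree = assemble_trace(events)
print(tree[0]["name"], [c["name"] for c in tree[0]["children"]])
```

Because events arrive independently, the backend can accept them out of order and still reconstruct the tree, which is part of why ingestion is queue-based rather than synchronous.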

Gotchas and tips

ClickHouse is required as of v3 and it’s heavier than the old Postgres-only stack. If you self-host on a small VM, the ingestion worker can fall behind and traces arrive minutes late. Size your instance for ClickHouse, not for Next.js.

Who it's for

Any team shipping an LLM feature to real users. The moment you have more than 10 daily conversations and someone asks “is it getting better or worse?”, you need Langfuse or something like it.
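
The "better or worse" question reduces to comparing scored outputs across prompt versions. A minimal sketch, assuming scores (manual or LLM-as-judge) tagged with a prompt version; the row layout is illustrative, not Langfuse's export format:

```python
from statistics import mean

# Illustrative only: per-response quality scores tagged with the prompt
# version that produced them. In practice these would come from manual
# review or an LLM-as-judge evaluator.

scores = [
    {"prompt_version": "v1", "score": 0.62},
    {"prompt_version": "v1", "score": 0.58},
    {"prompt_version": "v2", "score": 0.74},
    {"prompt_version": "v2", "score": 0.70},
]

def mean_score_by_version(rows):
    """Average the scores per prompt version to compare releases."""
    by_version = {}
    for row in rows:
        by_version.setdefault(row["prompt_version"], []).append(row["score"])
    return {v: round(mean(vals), 3) for v, vals in by_version.items()}

summary = mean_score_by_version(scores)
print(summary)
```

With real traffic you would also segment by user cohort or conversation length, but even this crude per-version mean answers the question a spreadsheet of raw chat logs cannot.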
