
Comparison and Usage of LangSmith and Langfuse

BlogType: IT · ReleaseTime: 2025-12-08 18:00:00

An introduction to and comparison of LangSmith and Langfuse, with a hands-on attempt at reporting an LLM application's execution traces to each.

Overview

LangSmith and Langfuse are both mainstream LLMOps (LLM operations) platforms for LLM application development. Their core features overlap heavily (both provide tracing, evaluation, and prompt management), but they differ markedly in product positioning, open-source strategy, and ecosystem integration.


The following is a detailed comparison of the two:

| Dimension | LangSmith | Langfuse |
| --- | --- | --- |
| Core positioning | Closed-source SaaS platform natively paired with LangChain | Open-source, vendor-neutral engineering platform |
| Open-source strategy | Closed source | Open source (MIT License) |
| Best fit | Teams deeply invested in LangChain/LangGraph | Teams that need self-hosting, are cost-sensitive, or use a non-LangChain stack |
| Deployment | Primarily SaaS (self-hosting is enterprise-only and expensive) | SaaS or free self-hosting |
| Main strengths | Works out of the box, excellent UI/UX, great Playground experience | Full data control, no vendor lock-in, flexible integration (SDK/API) |

Simply put, LangSmith offers an "Apple-like" experience (closed, polished, highly integrated, expensive), while Langfuse is the "Android/Linux-style" choice (open, flexible, hackable, fully under your own control).




LangSmith Usage

How to use it:

  1. Register an account on the official site: https://docs.langchain.com/langsmith/home

  2. You cannot create a project directly in the UI; it is created automatically on the first trace report. Request an API key first, then add the LangSmith settings to your .env file and specify a project name (if omitted, traces go to the default project).

  3. Once configured, run the program; all LLM interactions are reported automatically.


For LangChain projects, after installing the dependencies, simply add the LangSmith settings to the .env file:

# LangSmith
LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
LANGSMITH_API_KEY=
LANGSMITH_PROJECT=llmops


Run the llmops project and send a request.

image.png


You can see that the project has been created automatically in LangSmith.

image.png


You can drill into the call details.

image.png


You can inspect the LLM request JSON, token usage, and other information.

image.png


For projects not built on the LangChain framework, refer to the documentation for integration:

https://docs.langchain.com/langsmith/annotate-code





Langfuse Usage

Taking Spring AI as an example, follow the documentation for integration:

https://langfuse.com/integrations/frameworks/spring-ai


The main steps: add the relevant OTel dependencies to pom.xml, register the ChatModelCompletionContentObservationFilter component, and configure application.yml (sketches of the first two follow, then the yml). Note that you first need to register at https://cloud.langfuse.com/ and create a project to obtain the public and secret keys.
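
On the dependency side, a minimal sketch (assuming Spring Boot 3 with Micrometer Tracing, which matches the management.tracing settings in the yml below; verify artifacts and versions against the Langfuse guide):

<!-- pom.xml: Actuator enables tracing auto-configuration; the bridge maps Micrometer
     spans to OpenTelemetry, and the exporter ships them via OTLP.
     Versions are managed by the Spring Boot BOM. -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-tracing-bridge-otel</artifactId>
</dependency>
<dependency>
    <groupId>io.opentelemetry</groupId>
    <artifactId>opentelemetry-exporter-otlp</artifactId>
</dependency>

The filter component copies the completion text onto the chat observation so it shows up in Langfuse. A minimal sketch, assuming Micrometer's ObservationFilter and Spring AI's ChatModelObservationContext; the attribute key and accessors may differ slightly from the official guide:

import io.micrometer.common.KeyValue;
import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationFilter;
import org.springframework.ai.chat.observation.ChatModelObservationContext;
import org.springframework.stereotype.Component;

@Component
public class ChatModelCompletionContentObservationFilter implements ObservationFilter {

    @Override
    public Observation.Context map(Observation.Context context) {
        // Only touch chat-model observations that already carry a non-empty response.
        if (context instanceof ChatModelObservationContext chatContext
                && chatContext.getResponse() != null
                && !chatContext.getResponse().getResults().isEmpty()) {
            String completion = chatContext.getResponse().getResult().getOutput().getText();
            // Attach the completion as a high-cardinality span attribute
            // ("gen_ai.completion" follows the OTel GenAI convention; the guide's key may differ).
            context.addHighCardinalityKeyValue(KeyValue.of("gen_ai.completion", completion));
        }
        return context;
    }
}

With the dependencies and filter in place, application.yml turns on the relevant observations: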

spring:
  ai:
    chat:
      observations:
        log-prompt: true # Include prompt content in tracing (disabled by default for privacy)
        log-completion: true # Include completion content in tracing (disabled by default)
    tools:
      observations:
        include-content: true # Include Tool/MCP call args and results in tracing

management:
  tracing:
    sampling:
      probability: 1.0 # Sample 100% of requests for full tracing (adjust in production as needed)
  observations:
    annotations:
      enabled: true # Enable @Observed (if you use observation annotations in code)
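
The yml above controls what gets recorded, not where spans are sent. With the Micrometer/OTLP path, the exporter endpoint and credentials also go in application.yml; this sketch assumes Langfuse's OTLP trace endpoint is /api/public/otel/v1/traces and that it accepts Basic auth over base64(publicKey:secretKey) — verify both against the Langfuse guide:

management:
  otlp:
    tracing:
      endpoint: https://cloud.langfuse.com/api/public/otel/v1/traces
      headers:
        # Base64-encode "pk-lf-...:sk-lf-..." (publicKey:secretKey) from your Langfuse project
        Authorization: Basic ${LANGFUSE_BASIC_AUTH}

The annotations.enabled flag only matters if you mark methods with Micrometer's @Observed annotation; a small illustrative example (the service and method names here are made up):

import io.micrometer.observation.annotation.Observed;
import org.springframework.stereotype.Service;

@Service
public class RagService {

    // With annotations enabled, this method appears as its own span inside the trace.
    @Observed(name = "rag.retrieve")
    public String retrieve(String query) {
        return "docs for: " + query;
    }
}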


After completing the setup, run the project and make a request; you can then view the trace details on the Langfuse page (chat history and responses, latency, and so on).

image.png

image.png


If the chain calls MCP, the specific interfaces and arguments are also visible (the include-content setting shown above must be enabled).

image.png


However, the non-commercial tier of Langfuse appears to cap ingestion at 50,000 events; beyond that you have to pay for an upgrade or self-host Langfuse. To conserve quota, it's worth writing a small OtelFilterConfig class that filters out non-AI requests so they are not reported; a sketch follows.
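
A minimal sketch of such a filter config, relying on Spring Boot auto-applying any ObservationPredicate bean to the ObservationRegistry; the class name matches the post, but the route prefix and matching logic are illustrative and need adapting to your app:

import io.micrometer.observation.ObservationPredicate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.server.observation.ServerRequestObservationContext;

@Configuration
public class OtelFilterConfig {

    // Drop observations for plain HTTP endpoints so that only AI-related spans
    // (chat model calls, tool/MCP calls, etc.) count against the Langfuse quota.
    @Bean
    ObservationPredicate aiRequestsOnly() {
        return (name, context) -> {
            if (context instanceof ServerRequestObservationContext serverContext) {
                // Keep only requests under the AI route; adjust the prefix for your app.
                return serverContext.getCarrier().getRequestURI().startsWith("/ai/");
            }
            return true; // keep all non-HTTP observations (e.g. Spring AI chat spans)
        };
    }
}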

image.png