This is a beta feature. The API may change in future releases.
The LangSmith Collector-Proxy is a lightweight, high-performance proxy server that sits between your application and the LangSmith backend. It batches and compresses trace data before sending it to LangSmith, reducing network overhead and improving performance.

When to Use the Collector-Proxy

The Collector-Proxy is particularly valuable when:
  • You’re running multiple instances of your application in parallel and need to efficiently aggregate traces
  • You want more efficient tracing than direct OTEL API calls to LangSmith (the collector optimizes batching and compression)
  • You’re using a language that doesn’t have a native LangSmith SDK

Key Features

  • Efficient Data Transfer: Batches multiple spans into fewer, larger uploads.
  • Compression: Uses zstd to minimize payload size.
  • OTLP Support: Accepts OTLP JSON and Protobuf over HTTP POST.
  • Semantic Translation: Maps GenAI/OpenInference conventions to the LangSmith Run model.
  • Flexible Batching: Flush by span count or time interval.

Configuration

Configure via environment variables:
Variable             Description                         Default
HTTP_PORT            Port to run the proxy server        4318
LANGSMITH_ENDPOINT   LangSmith backend URL               https://api.smith.langchain.com
LANGSMITH_API_KEY    API key for LangSmith               Required (env var or header)
LANGSMITH_PROJECT    Default tracing project             Default project if not specified
BATCH_SIZE           Spans per upload batch              100
FLUSH_INTERVAL_MS    Flush interval in milliseconds      1000
MAX_BUFFER_BYTES     Max uncompressed buffer size        10485760 (10 MB)
MAX_BODY_BYTES       Max incoming request body size      209715200 (200 MB)
MAX_RETRIES          Retry attempts for failed uploads   3
RETRY_BACKOFF_MS     Initial backoff in milliseconds     100

Project Configuration

The Collector-Proxy supports LangSmith project configuration with the following priority:
  1. If a project is specified in the request headers (Langsmith-Project), that project is used.
  2. Otherwise, the project set in the LANGSMITH_PROJECT environment variable is used.
  3. If neither is set, traces go to the default project.
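This priority order is a plain fallback chain; a minimal sketch (the function name and the "default" literal are illustrative):

```go
package main

import "fmt"

// resolveProject applies the priority order above: the Langsmith-Project
// request header wins, then the LANGSMITH_PROJECT environment variable,
// then the account's default project.
func resolveProject(headerProject, envProject string) string {
	if headerProject != "" {
		return headerProject
	}
	if envProject != "" {
		return envProject
	}
	return "default" // fall back to the default project
}

func main() {
	fmt.Println(resolveProject("from-header", "from-env")) // header wins
	fmt.Println(resolveProject("", "from-env"))            // env fallback
	fmt.Println(resolveProject("", ""))                    // default project
}
```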

Authentication

The API key can be provided in either of two ways:
  • As an environment variable (LANGSMITH_API_KEY)
  • In the request headers (X-API-Key)

Deployment (Docker)

You can deploy the Collector-Proxy with Docker:
  1. Build the image
    docker build \
      -t langsmith-collector-proxy:beta .
    
  2. Run the container
    docker run -d \
      -p 4318:4318 \
      -e LANGSMITH_API_KEY=<your_api_key> \
      -e LANGSMITH_PROJECT=<your_project> \
      langsmith-collector-proxy:beta
    

Usage

Point any OTLP-compatible client or the OpenTelemetry Collector exporter at:
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://<host>:4318/v1/traces
export OTEL_EXPORTER_OTLP_HEADERS="X-API-Key=<your_api_key>,Langsmith-Project=<your_project>"
Send a test trace:
curl -X POST http://localhost:4318/v1/traces \
  -H "Content-Type: application/json" \
  --data '{
    "resourceSpans": [
      {
        "resource": {
          "attributes": [
            {
              "key": "service.name",
              "value": { "stringValue": "test-service" }
            }
          ]
        },
        "scopeSpans": [
          {
            "scope": {
              "name": "example/instrumentation",
              "version": "1.0.0"
            },
            "spans": [
              {
                "traceId": "T6nh/mMkIONaoHewS9UWIw==",
                "spanId": "0tEqJwCpvU0=",
                "name": "parent-span",
                "kind": "SPAN_KIND_INTERNAL",
                "startTimeUnixNano": "1747675155185223936",
                "endTimeUnixNano": "1747675156185223936",
                "attributes": [
                  {
                    "key": "gen_ai.prompt",
                    "value": {
                      "stringValue": "{\"text\":\"Hello, world!\"}"
                    }
                  },
                  {
                    "key": "gen_ai.usage.input_tokens",
                    "value": {
                      "intValue": "5"
                    }
                  },
                  {
                    "key": "gen_ai.completion",
                    "value": {
                      "stringValue": "{\"text\":\"Hi there!\"}"
                    }
                  },
                  {
                    "key": "gen_ai.usage.output_tokens",
                    "value": {
                      "intValue": "3"
                    }
                  }
                ],
                "droppedAttributesCount": 0,
                "events": [],
                "links": [],
                "status": {}
              }
            ]
          }
        ]
      }
    ]
  }'

Health & Scaling

  • Liveness: GET /live → 200
  • Readiness: GET /ready → 200

Horizontal Scaling

To ensure full traces are batched correctly, route spans with the same trace ID to the same instance (e.g., via consistent hashing).
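A minimal sticky-routing scheme hashes the trace ID to pick an instance. The sketch below uses a plain modulo hash for clarity; production setups typically use a consistent-hashing ring so that adding or removing an instance reshuffles only a fraction of traces:

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// instanceFor deterministically maps a trace ID to one of n proxy
// instances, so every span of a trace lands on the same collector.
func instanceFor(traceID string, n int) int {
	h := fnv.New32a()
	h.Write([]byte(traceID))
	return int(h.Sum32() % uint32(n))
}

func main() {
	// Spans sharing a trace ID always route to the same instance.
	fmt.Println(instanceFor("4fa9e1fe632420e35aa077b04bd51623", 4))
	fmt.Println(instanceFor("4fa9e1fe632420e35aa077b04bd51623", 4)) // same value as above
}
```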

Fork & Extend

Fork the Collector-Proxy repo on GitHub and implement your own converter:
  • Create a custom GenAiConverter or modify the existing one in internal/translator/otel_converter.go
  • Register the custom converter in internal/translator/translator.go
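At its core, a converter translates one OTLP span's GenAI attributes into a LangSmith run. The sketch below is a hypothetical shape only: the Span, Run, and Converter types here are simplified stand-ins, and the real types in internal/translator differ.

```go
package main

import "fmt"

// Span and Run are simplified stand-ins for the OTLP span and the
// LangSmith Run model; the actual types in internal/translator differ.
type Span struct {
	Name       string
	Attributes map[string]string
}

type Run struct {
	Name    string
	Inputs  string
	Outputs string
}

// Converter is a hypothetical interface for a custom converter:
// translate one span into a LangSmith run.
type Converter interface {
	Convert(s Span) (Run, error)
}

type myConverter struct{}

// Convert maps the GenAI prompt/completion attributes onto run inputs/outputs.
func (myConverter) Convert(s Span) (Run, error) {
	return Run{
		Name:    s.Name,
		Inputs:  s.Attributes["gen_ai.prompt"],
		Outputs: s.Attributes["gen_ai.completion"],
	}, nil
}

func main() {
	var c Converter = myConverter{}
	run, _ := c.Convert(Span{
		Name: "parent-span",
		Attributes: map[string]string{
			"gen_ai.prompt":     `{"text":"Hello, world!"}`,
			"gen_ai.completion": `{"text":"Hi there!"}`,
		},
	})
	fmt.Println(run.Name, run.Inputs)
}
```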