

@traceloop/ai-semantic-conventions

traceloop · 134k downloads · Apache-2.0 · v0.13.0 · TypeScript support: included

OpenTelemetry AI-specific semantic conventions

Keywords: opentelemetry, nodejs, tracing, attributes, semantic conventions
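
Since this page documents the conventions package itself, a minimal usage sketch may help. It sets LLM attributes on a span by hand; the SpanAttributes export and the exact key names are assumptions drawn from the package's typings, so verify them before relying on this:

import { Span } from "@opentelemetry/api";
import { SpanAttributes } from "@traceloop/ai-semantic-conventions";

// Annotate a span with LLM-specific attributes using the shared constants,
// so every backend sees the same attribute keys.
function annotateLlmSpan(span: Span) {
  span.setAttribute(SpanAttributes.LLM_REQUEST_MODEL, "gpt-4o"); // assumed key name
  span.setAttribute(SpanAttributes.LLM_USAGE_TOTAL_TOKENS, 1234); // assumed key name
}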


For JavaScript / TypeScript

Open-source observability for your LLM application

Get started with Node.js or Next.js »

Slack | Docs | Website


🎉 New: Our semantic conventions are now part of OpenTelemetry! Join the discussion and help us shape the future of LLM observability.

OpenLLMetry-JS is a set of extensions built on top of OpenTelemetry that gives you complete observability over your LLM application. Because it uses OpenTelemetry under the hood, it can be connected to your existing observability solution, such as Datadog, Honeycomb, and others.

It's built and maintained by Traceloop under the Apache 2.0 license.

The repo contains standard OpenTelemetry instrumentations for LLM providers and Vector DBs, as well as a Traceloop SDK that makes it easy to get started with OpenLLMetry-JS, while still outputting standard OpenTelemetry data that can be connected to your observability stack. If you already have OpenTelemetry instrumented, you can just add any of our instrumentations directly.
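
If you take that route, a single instrumentation can be registered against your existing OpenTelemetry setup on its own. A minimal sketch (the @traceloop/instrumentation-openai package name and OpenAIInstrumentation export are assumptions based on the repo's naming; check the docs for the exact ones):

import { registerInstrumentations } from "@opentelemetry/instrumentation";
// Assumed package and export names; verify against the OpenLLMetry-JS repo.
import { OpenAIInstrumentation } from "@traceloop/instrumentation-openai";

// Attach just the OpenAI instrumentation to an already-configured
// OpenTelemetry Node setup; no Traceloop SDK required.
registerInstrumentations({
  instrumentations: [new OpenAIInstrumentation()],
});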

🚀 Getting Started

The easiest way to get started is to use our SDK. For a complete guide, go to our docs.

Install the SDK:

npm install --save @traceloop/node-server-sdk

Then, to start instrumenting, add these two lines to your code:

import * as traceloop from "@traceloop/node-server-sdk";

traceloop.initialize();

Make sure to import the SDK before importing any LLM module.
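
For instance, with the OpenAI client the ordering looks like this (a sketch assuming a CommonJS build, where imports execute in source order; openai stands in for any LLM module):

// The SDK must load and initialize before any LLM client is imported,
// so its hooks can patch the client when it is required.
import * as traceloop from "@traceloop/node-server-sdk";
traceloop.initialize();

// Imported after initialization, so calls made through it are traced.
import OpenAI from "openai";

const openai = new OpenAI();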

That's it. You're now tracing your code with OpenLLMetry-JS! If you're running this locally, you may want to disable batch sending, so you can see the traces immediately:

traceloop.initialize({ disableBatch: true });

Now, you need to decide where to export the traces to.
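
A rough sketch of pointing the SDK at a specific backend through initialization options (the apiKey and baseUrl option names are assumptions; confirm them, and the supported environment variables, in the docs):

traceloop.initialize({
  appName: "my-llm-app", // service name shown in your backend
  apiKey: process.env.TRACELOOP_API_KEY, // assumed option name
  baseUrl: "https://api.my-observability-backend.example", // assumed option name
});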

⏫ Supported (and tested) destinations

See our docs for the full list of supported destinations and instructions on connecting to each one.

🪗 What do we instrument?

OpenLLMetry-JS can instrument everything that OpenTelemetry already instruments - so things like your DB, API calls, and more. On top of that, we built a set of custom extensions that instrument things like your calls to OpenAI or Anthropic, or your Vector DB like Pinecone, Chroma, or Weaviate.

LLM Providers

  • ✅ OpenAI
  • ✅ Azure OpenAI
  • ✅ Anthropic
  • ✅ Cohere
  • ⏳ Replicate
  • ⏳ HuggingFace
  • ✅ Vertex AI (GCP)
  • ✅ Bedrock (AWS)

Vector DBs

  • ✅ Pinecone
  • ✅ Chroma
  • ✅ Qdrant
  • ⏳ Weaviate
  • ⏳ Milvus

Frameworks

  • ✅ LangChain
  • ✅ LlamaIndex

🔎 Telemetry

The SDK provided with OpenLLMetry (not the instrumentations) contains a telemetry feature that collects anonymous usage information.

You can opt out of telemetry by setting the TRACELOOP_TELEMETRY environment variable to FALSE.
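
For example, a sketch that opts out programmatically, assuming the variable is read when the SDK initializes:

// Opt out of anonymous SDK telemetry before initializing.
process.env.TRACELOOP_TELEMETRY = "FALSE";
traceloop.initialize();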

Why we collect telemetry

  • The primary purpose is to detect exceptions within instrumentations. Since LLM providers frequently update their APIs, this helps us quickly identify and fix any breaking changes.
  • We only collect anonymous data, with no personally identifiable information. You can view exactly what data we collect in our Privacy documentation.
  • Telemetry is only collected in the SDK. If you use the instrumentations directly without the SDK, no telemetry is collected.

🌱 Contributing

Whether it's big or small, we love contributions ❤️ Check out our guide to see how to get started.

Not sure where to get started? Reach out on one of the community channels below and we'll point you in the right direction.

💚 Community & Support

  • Slack (For live discussion with the community and the Traceloop team)
  • GitHub Discussions (For help with building and deeper conversations about features)
  • GitHub Issues (For any bugs and errors you encounter using OpenLLMetry)
  • Twitter (Get news fast)


Change Log

All notable changes to this project will be documented in this file. See Conventional Commits for commit guidelines.

0.13.0 (2025-04-22)

Features

  • traceloop-sdk: standalone span processor (#596) (05a6326)

0.12.2 (2025-03-12)

Bug Fixes

  • sdk: headers as an argument in initialization (#569) (7aec655)

0.12.1 (2025-02-20)

Note: Version bump only for package openllmetry-js

0.12.0 (2025-01-13)

Features

0.11.7 (2024-12-20)

Bug Fixes

  • sdk: patch span attributes for vercel AI users (#478) (33eca03)

0.11.6 (2024-12-16)

Bug Fixes

  • deps: major, minor and various instrumentation fixes (0f18865)

0.11.5 (2024-12-11)

Bug Fixes

  • openai: structured output promise unwrapping exception (#474) (546fd9e)

0.11.4 (2024-11-13)

Bug Fixes

  • langchain: return exports in langchain runnables module patch (#470) (23fdb28)
  • vertex-ai: missing system prompt (#473) (663e438)

0.11.3 (2024-10-16)

Bug Fixes

  • openai: streaming tool_call + logging multiple tool_call (#463) (5d5de09)

0.11.2 (2024-09-10)

Bug Fixes

0.11.1 (2024-08-31)

Bug Fixes

  • langchain: instrument vector DB calls (#440) (c129aae)

0.11.0 (2024-08-27)

Bug Fixes

  • include dist .mjs for packages with modules declared (#425) (4a7ec33)
  • sdk: use headers from env if available (#435) (31aa015)

Features

  • instrumentation-chromadb,instrumentation-qdrant: add esm exports (#428) (dfd418b)

0.10.0 (2024-08-01)

Features

  • introduce traceloop.entity.path instead of traceloop.entity.name chaining (#393) (207f9fe)

0.9.5 (2024-07-30)

Bug Fixes

  • sdk: option to suppress instrumentations (#392) (d6ccf0d)

0.9.4 (2024-07-28)

Bug Fixes

  • sdk: properly initialize token enrich value for instrumentations (#384) (143bc66)

0.9.3 (2024-07-25)

Note: Version bump only for package openllmetry-js

0.9.2 (2024-07-17)

Bug Fixes

0.9.1 (2024-07-10)

Bug Fixes

  • sdk: support parameters needed by Sentry SDK (#360) (b1f195c)

0.9.0 (2024-07-04)

Bug Fixes

  • sdk: option to silence initialization message (#343) (75c68ce)
  • sdk: versions on workflows & tasks (#353) (eb6211f)

Features

0.8.9 (2024-06-17)

Bug Fixes

  • sdk: run workflows in parallel with different association proper… (#329) (9a8f84c)

0.8.8 (2024-06-16)

Bug Fixes

  • sdk: serialization of Map in sub-objects of inputs and outputs (#323) (49b032a)

0.8.7 (2024-06-12)

Bug Fixes

  • sdk: propagate association properties within a workflow (#318) (3e530bc)

0.8.6 (2024-06-03)

Bug Fixes

  • remove sentry; lower noise for instrumentation errors (#294) (c4e3782)

0.8.5 (2024-05-31)

Bug Fixes

0.8.4 (2024-05-20)

Bug Fixes

  • manual-tracing: add missing llm.request.type attribute (#269) (528c498)
  • sdk: serialize map outputs for non-promise outputs (#276) (b4a8948)
  • sdk: used wrong completion attribute in manual instrumentations (#277) (b434f61)

0.8.3 (2024-05-16)

Bug Fixes

  • sdk: api for manual logging of LLM calls (#264) (500097c)

0.8.2 (2024-05-07)

Bug Fixes

  • openai: switched to pure js tiktoken (#248) (9d8805e)

0.8.1 (2024-05-06)

Bug Fixes

0.8.0 (2024-04-29)

Features

0.7.0 (2024-04-22)

Bug Fixes

Features

0.6.1 (2024-04-22)

Bug Fixes

0.6.0 (2024-04-05)

Bug Fixes

  • anthropic: support streaming for completion API (#191) (0efb330)

Features

0.5.29 (2024-04-03)

Bug Fixes

  • openai: enrich token metrics on streaming requests (#183) (2ef0c13)
  • sdk: clean and typed instrumentations (#182) (83737ee)

0.5.28 (2024-04-02)

Bug Fixes

  • openai-instrumentation: logprobs reporting using span event (#172) (923df7f)
  • sdk: allow passing a function to the decorator (#181) (2178f1c)
  • sdk: decorator bug with passing this parameter (#180) (956bad4)

0.5.27 (2024-03-26)

Bug Fixes

0.5.26 (2024-03-26)

Bug Fixes

  • sdk: manual instrumentation in all modules (#171) (e5784a4)

0.5.25 (2024-03-15)

Bug Fixes

  • sdk: do not initialize logger if not instructed (#156) (cab900c)

0.5.24 (2024-03-15)

Bug Fixes

  • switch to rollup for all instrumentations (#155) (605fb46)

0.5.23 (2024-03-15)

Bug Fixes

0.5.22 (2024-03-15)

Bug Fixes