A trace comprises the series of events executed within an LLM chain (workflow). Tracing enables Baserun to capture and display the chain's entire lifecycle, whether synchronous or asynchronous. Tracing LLM chains lets you debug your application, monitor your chains' performance, and collect user feedback.
Alternatively, set the Baserun API key when initializing the SDK:
```typescript
import { baserun } from "baserun";

// init needs to be awaited. If top-level await is not available, wrap it in an async function,
// but make sure it is called before instantiating OpenAI, Anthropic or Replicate
await baserun.init({
  apiKey: "br-...",
});
```
3. Initialize Baserun
At your application’s startup, define the environment in which you’d like to run Baserun. Use the development environment while iterating on your features, for debugging and analysis, or the production environment to monitor your live application.
```typescript
import { baserun } from "baserun";

// in your main function
// init needs to be awaited. If top-level await is not available, wrap it in an async function,
// but make sure it is called before instantiating OpenAI, Anthropic or Replicate
await baserun.init();
```
4. Decide what to trace
Which function(s) to trace ultimately depends on your app. It could be a main() function, or a handler for an API call.

Note for TS/JS: always call await baserun.init() before you instantiate OpenAI, Anthropic or Replicate.
Congrats, you are done! You can now navigate to the monitoring tab. Here is what you will see when you interact with your application:
Optionally, you can add metadata like trace name, user ID, and session ID to aid in debugging. Read Logging > Advanced tracing features for more details.