LiteLLM Integration

LiteLLM provides callbacks, making it easy for you to log your completion responses.

Using Callbacks

First, sign up to get an app ID on the Lunary dashboard.

With these two lines of code, you can instantly log your responses across all providers with Lunary:

litellm.success_callback = ["lunary"]
litellm.failure_callback = ["lunary"]

Complete code

import os

import litellm
from litellm import completion

## set env variables
os.environ["LUNARY_PUBLIC_KEY"] = "YOUR PROJECT ID"
# Optional: os.environ["LUNARY_API_URL"] = "self-hosting-url"
os.environ["OPENAI_API_KEY"] = ""   # your OpenAI API key
os.environ["COHERE_API_KEY"] = ""   # your Cohere API key

# set callbacks
litellm.success_callback = ["lunary"]
litellm.failure_callback = ["lunary"]

# openai call
response = completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hi 👋 - i'm openai"}],
    user="some_user_id",
)

# cohere call
response = completion(
    model="command-nightly",
    messages=[{"role": "user", "content": "Hi 👋 - i'm cohere"}],
    user="some_user_id",
)

Questions? We're here to help.