Marc Klingen
April 9, 2024
OpenAI Integration tracks used Langfuse Prompts
The OpenAI SDK integration for Langfuse Tracing now captures which prompt (and prompt version) from Prompt Management was used for a generation.
Example
```python
from langfuse import Langfuse
from langfuse.openai import openai  # Langfuse drop-in replacement for the OpenAI SDK

langfuse = Langfuse()

# Fetch the prompt from Langfuse Prompt Management
prompt = langfuse.get_prompt("calculator")

# Pass the prompt object via `langfuse_prompt` to link it to the resulting generation
openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": prompt.compile(base=10)},
        {"role": "user", "content": "1 + 1 = "}],
    langfuse_prompt=prompt,
)
```
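The same linking works when you pin a specific prompt version instead of the current production version. A minimal sketch, continuing from the snippet above and assuming `langfuse.get_prompt` accepts a `version` parameter:

```python
# Sketch: fetch a pinned prompt version (assumes `get_prompt` accepts `version`)
prompt_v2 = langfuse.get_prompt("calculator", version=2)

openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": prompt_v2.compile(base=10)},
        {"role": "user", "content": "2 + 2 = "}],
    langfuse_prompt=prompt_v2,  # the pinned version is captured on the trace
)
```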
Thanks to @fancyweb for contributing this!
See the prompt management docs for more details and an example notebook.