Link multiple prompts to a single observation? #7608
Replies: 2 comments 3 replies
Hi @andrea-t94, currently it is only possible to link one prompt to a generation, but I agree that there are use cases where it would be useful to see the other prompt components that have been part of the prompt. I will turn this post into a GitHub ideas post to add it to our roadmap!
Do we have any update on this use case? We also have two prompts at the observation level for a few of our use cases/features. As of now, we can "Link trace to prompt" for only one of them, and the second prompt feels like an orphan: lost and still not properly decoupled.
Hi team,
We're using the Mastra SDK to integrate Langfuse prompt management in our AI agent system. At runtime, we assemble the final LLM prompt by combining several modular prompts: a base prompt plus tool-specific prompt modules.
Each of these prompts is stored separately in the Langfuse UI under prompt management.
We do not want to create a single static prompt in Langfuse, because the set of tool prompts included depends on the runtime context (only relevant tools are added per session).
How we assemble the prompt:
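Roughly like this (a simplified sketch with illustrative names and types, not our exact implementation):

```typescript
// Illustrative sketch: combine a base prompt with only the tool prompt
// modules that are active in the current session. All identifiers
// (PromptModule, assemblePrompt, activeTools) are assumptions.
interface PromptModule {
  name: string;    // Langfuse prompt name, e.g. "tool-search"
  version: number; // Langfuse prompt version
  text: string;    // compiled prompt text
}

function assemblePrompt(
  base: PromptModule,
  toolModules: PromptModule[],
  activeTools: string[]
): { text: string; modules: PromptModule[] } {
  // Keep only the modules whose tool is active in this session.
  const selected = toolModules.filter((m) => activeTools.includes(m.name));
  const modules = [base, ...selected];
  // Join the module texts into the final LLM prompt; also return the
  // module list so we can record which prompts contributed to the call.
  return { text: modules.map((m) => m.text).join("\n\n"), modules };
}
```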
How we currently track prompts in Langfuse (Mastra SDK):
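Simplified, it looks like this today (illustrative names; only the base prompt gets a first-class prompt link, the remaining modules end up as plain metadata, which is the limitation we are asking about):

```typescript
// Illustrative sketch: build a metadata object listing every prompt
// module that contributed to a generation, since only one prompt can
// be linked as first-class. promptMetadata and the "name@vN" format
// are our own convention, not a Langfuse feature.
function promptMetadata(
  modules: { name: string; version: number }[]
): Record<string, string> {
  // e.g. { "prompt.0": "base@v1", "prompt.1": "tool-search@v2" }
  return Object.fromEntries(
    modules.map((m, i) => [`prompt.${i}`, `${m.name}@v${m.version}`])
  );
}

// We then attach this alongside the single linked prompt, roughly:
//   trace.generation({ prompt: basePromptClient, metadata: promptMetadata(modules) })
// The metadata is visible on the observation but is not searchable the
// way a first-class prompt link is.
```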
Question
Is there a way to link multiple prompts (base + tool modules, each stored separately in Langfuse) as first-class prompts to a single observation or trace?
Or, is there a recommended approach for making all contributing prompt modules visible and searchable in Langfuse, without merging them into a single static prompt?
Goal:
We want to preserve the modularity and runtime flexibility of our prompts, but still have full visibility in Langfuse over which prompt modules contributed to each LLM call.
Thanks!