## Summary
Haystack (`haystack-ai`) is an open-source AI orchestration framework by deepset for building production-ready LLM applications. It has 24.6k GitHub stars, 300k+ monthly PyPI downloads, and active releases (latest: v2.28, April 20, 2026). This repository has zero instrumentation for any Haystack execution surface — no integration directory, no wrapper, no patcher, no `auto_instrument()` support.
Haystack provides a pipeline-based execution model for RAG, search, and agent workflows, with explicit control over retrieval, routing, memory, and generation. Its `Agent` component implements tool-calling agent functionality with provider-agnostic chat model support. The framework is used in production by organizations including Airbus and Netflix.
Comparable orchestration frameworks (LangChain, DSPy, Agno, Pydantic AI) have dedicated native integrations in this repository.
## What needs to be instrumented

The `haystack-ai` package exposes these execution surfaces, none of which are instrumented:
### Pipeline execution (highest priority)

| Class / Method | Description |
| --- | --- |
| `Pipeline.run()` | Executes a DAG of connected components — the primary execution surface |
| `AsyncPipeline.run()` | Async pipeline execution |
Pipelines are Haystack's core abstraction. A pipeline connects components (generators, retrievers, embedders, converters, routers) into a directed graph. `Pipeline.run()` executes the full graph, passing data between components. Tracing should capture the full pipeline run as a parent span with child spans for each component execution.
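The desired trace shape can be sketched without Haystack itself; the span recorder and toy components below are stand-ins for Braintrust spans and real pipeline components, not the actual API:

```python
from contextlib import contextmanager

SPANS = []  # stand-in for a Braintrust span tree

@contextmanager
def span(name, parent=None):
    # The real integration would open/close a Braintrust span here.
    SPANS.append({"name": name, "parent": parent})
    yield

def traced_pipeline_run(components, data):
    """One parent span for the pipeline run, one child span per component."""
    with span("haystack.pipeline.run"):
        for name, component in components:
            with span(f"haystack.component.run:{name}", parent="haystack.pipeline.run"):
                data = component(data)
    return data

# Two trivial "components" chained like a mini pipeline.
result = traced_pipeline_run(
    [("upper", str.upper), ("exclaim", lambda s: s + "!")],
    "hello",
)
```

A real integration would replace the toy `span` context manager with Braintrust span creation, but the parent/child nesting would be the same.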
### Agent execution

| Class / Method | Description |
| --- | --- |
| `Agent.run()` | Tool-calling agent that iterates until a final answer or max steps |
The `Agent` is a Haystack component that uses a chat model to perform multi-step reasoning with tool use. It supports any `ChatGenerator` backend (OpenAI, Anthropic, etc.) and can be used standalone or nested within a pipeline.
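The described loop (call the chat model, execute any requested tool, stop on a final answer or at max steps) can be sketched with stand-ins; `llm` and `tools` here are hypothetical callables, not Haystack's real `ChatGenerator` or tool types:

```python
def run_agent(llm, tools, question, max_steps=5):
    """Toy agent loop: one chat-model call per step, optional tool call,
    stop on a final answer or when max_steps is exhausted."""
    messages = [{"role": "user", "content": question}]
    for _ in range(max_steps):
        reply = llm(messages)
        messages.append(reply)
        if "tool" not in reply:  # no tool requested: this is the final answer
            return reply["content"]
        tool_result = tools[reply["tool"]](reply["args"])
        messages.append({"role": "tool", "content": str(tool_result)})
    return messages[-1]["content"]  # max steps reached

# Scripted "chat model": first requests the add tool, then answers.
script = iter([
    {"role": "assistant", "tool": "add", "args": (2, 3), "content": ""},
    {"role": "assistant", "content": "The answer is 5."},
])
answer = run_agent(
    lambda msgs: next(script),
    {"add": lambda a: a[0] + a[1]},
    "What is 2 + 3?",
)
```

For tracing, each loop iteration is a natural child span (one chat-model call plus any tool execution) under a parent `Agent.run()` span.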
### Component execution

| Class / Method | Description |
| --- | --- |
| `Component.run()` | Individual component execution within a pipeline |
Key component types with generative-AI execution surfaces:
- `OpenAIChatGenerator.run()` — Chat completions via OpenAI
- `AnthropicChatGenerator.run()` — Chat completions via Anthropic
- `GoogleAIGeminiChatGenerator.run()` — Chat completions via Google AI
- `HuggingFaceAPIChatGenerator.run()` — Chat completions via HF Inference API
- `OpenAITextEmbedder.run()` / `OpenAIDocumentEmbedder.run()` — Embeddings
- Various retrievers — Document retrieval from vector stores
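A patcher for these surfaces could wrap each class's `run()`. The sketch below uses a stub component in place of a real generator, and the `on_span` callback stands in for opening and closing a Braintrust span:

```python
import functools

def patch_run(component_cls, on_span):
    """Monkey-patch a component class's run() to report a span record.
    on_span receives (component_name, inputs, outputs); the real
    integration would create a Braintrust span instead."""
    original = component_cls.run

    @functools.wraps(original)
    def traced_run(self, *args, **kwargs):
        result = original(self, *args, **kwargs)
        on_span(component_cls.__name__, kwargs or args, result)
        return result

    component_cls.run = traced_run
    return component_cls

# Stub component standing in for e.g. OpenAIChatGenerator.
class EchoGenerator:
    def run(self, prompt: str):
        return {"replies": [prompt.upper()]}

spans = []
patch_run(EchoGenerator, lambda name, inp, out: spans.append((name, inp, out)))
out = EchoGenerator().run(prompt="hi")
```

A production patcher would also need to record exceptions and handle streaming outputs, which this sketch omits.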
### Chat model abstraction

| Class / Method | Description |
| --- | --- |
| `ChatGenerator.run()` | Provider-agnostic chat generation (wraps provider-specific generators) |
## Implementation notes
**Tracing system:** Haystack v2 has a built-in tracing system (`haystack.tracing`) with a `Tracer` interface and support for custom backends. It ships with OpenTelemetry support via `haystack.tracing.opentelemetry`. A native Braintrust integration could implement the `Tracer` interface or hook into the component `run()` lifecycle.
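A minimal sketch of that approach, assuming the `Tracer` protocol amounts to a `trace()` context manager plus `current_span()` (exact signatures should be checked against `haystack.tracing` before use); span storage here is a plain list standing in for Braintrust calls:

```python
from contextlib import contextmanager

class BraintrustTracer:
    """Sketch of a Tracer-style backend. The interface shape is assumed
    from Haystack's documented tracing protocol; Braintrust calls are
    stubbed out as in-memory records."""

    def __init__(self):
        self._stack = []    # active spans, innermost last
        self.finished = []  # closed spans, for inspection

    @contextmanager
    def trace(self, operation_name, tags=None):
        span = {"op": operation_name, "tags": dict(tags or {}),
                "parent": self._stack[-1]["op"] if self._stack else None}
        self._stack.append(span)
        try:
            yield span
        finally:
            self._stack.pop()
            self.finished.append(span)

    def current_span(self):
        return self._stack[-1] if self._stack else None

tracer = BraintrustTracer()
with tracer.trace("haystack.pipeline.run"):
    with tracer.trace("haystack.component.run", {"haystack.component.name": "llm"}):
        pass
```

Because the context-manager stack mirrors pipeline nesting, parent/child relationships fall out of the `trace()` call order without any extra bookkeeping.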
**Pipeline introspection:** Pipelines expose their component graph, allowing the integration to automatically discover and trace all components without manual configuration.
**Content tracing:** Haystack has a `tracing.content_tracing_enabled` flag that controls whether input/output content is included in traces (vs. just structural spans). The Braintrust integration should enable this for full observability.
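For example, Haystack's docs describe an environment variable that turns content tracing on; the exact flag name is an assumption here and should be verified against the installed version:

```shell
# Assumed from Haystack's tracing documentation; verify before relying on it.
export HAYSTACK_CONTENT_TRACING_ENABLED=true
```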
## No coverage in any instrumentation layer

- No integration directory (`py/src/braintrust/integrations/haystack/`)
- No wrapper function (e.g. `wrap_haystack()`)
- No patcher in any existing integration
- No nox test session (`test_haystack`)
- No version entry in `py/src/braintrust/integrations/versioning.py`
- No mention in `py/src/braintrust/integrations/__init__.py`
A grep for `haystack` (case-insensitive) across `py/src/braintrust/` returns zero matches.
## Braintrust docs status

`not_found` — Haystack is not mentioned on the Braintrust integrations directory or the tracing guide.
## Upstream references

### Local repo files inspected
- `py/src/braintrust/integrations/` — no `haystack/` directory exists on `main`
- `py/src/braintrust/wrappers/` — no Haystack wrapper
- `py/noxfile.py` — no `test_haystack` session
- `py/src/braintrust/integrations/__init__.py` — Haystack not listed in integration registry
- `py/src/braintrust/integrations/versioning.py` — no Haystack version matrix
- Full repo grep for "haystack" — zero matches in SDK source