
Tracing in AI Toolkit

AI Toolkit provides tracing to help you monitor and analyze the performance of your AI applications. You can trace the execution of your AI applications, including interactions with generative AI models, to gain insight into their behavior and performance.

AI Toolkit hosts a local HTTP and gRPC server to collect trace data. The collector server is compatible with OTLP (OpenTelemetry Protocol), and most language model SDKs either support OTLP directly or have non-Microsoft instrumentation libraries that do. Use AI Toolkit to visualize the collected instrumentation data.

All frameworks or SDKs that support OTLP and follow the OpenTelemetry semantic conventions for generative AI systems are supported. The following table lists common AI SDKs that have been tested for compatibility.

|  | Azure AI Inference | Foundry Agent Service | Anthropic | Gemini | LangChain | OpenAI SDK ³ | OpenAI Agents SDK |
|---|---|---|---|---|---|---|---|
| Python | ✅ | ✅ | ✅ (traceloop¹ ², monocle¹ ²) | ✅ (opentelemetry-python-contrib¹, monocle¹ ²) | ✅ (LangSmith¹ ², monocle¹ ²) | ✅ (opentelemetry-python-contrib¹, monocle¹ ²) | ✅ (Logfire¹ ², monocle¹ ²) |
| TS/JS | ✅ | ✅ | ✅ (traceloop¹ ²) | — | ✅ (traceloop¹ ²) | ✅ (traceloop¹ ²) | — |

  1. The SDKs in parentheses are non-Microsoft tools that add OTLP support, because the official SDKs don't support OTLP.
  2. These tools don't fully follow the OpenTelemetry semantic conventions for generative AI systems.
  3. For the OpenAI SDK, only the Chat Completions API is supported; the Responses API isn't supported.
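For SDKs built directly on the OpenTelemetry SDK, you can often skip per-exporter code and point them at the local collector with the standard OpenTelemetry environment variables. A minimal sketch (the endpoint assumes the collector's default OTLP/HTTP port described above; the service name is an arbitrary example):

```shell
# Standard OpenTelemetry exporter settings; the AI Toolkit collector
# listens locally on port 4318 for OTLP over HTTP.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
export OTEL_SERVICE_NAME="my-ai-app"
```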

Get started with tracing

  1. Open the tracing webview by selecting **Tracing** in the tree view.

  2. Select the **Start Collector** button to start the local OTLP trace collector server.

    Screenshot showing the Start Collector button in the Tracing webview.

  3. Enable instrumentation with a code snippet. See the Set up instrumentation section for code snippets for different languages and SDKs.

  4. Generate trace data by running your app.

  5. In the tracing webview, select the **Refresh** button to see new trace data.

    Screenshot showing the Trace List in the Tracing webview.

Set up instrumentation

Set up tracing in your AI application to collect trace data. The following code snippets show how to set up tracing for different SDKs and languages.

The process is similar across all SDKs:

  • Add tracing to your LLM or agent app.
  • Set up the OTLP trace exporter to use the AITK local collector.
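Under the hood, the OTLP/HTTP exporters configured in the snippets below are plain HTTP clients POSTing span batches to the collector. As an illustration only, here is a standard-library sketch that stands up a dummy collector (a stand-in for the real AITK collector, which listens on http://localhost:4318) on a free port and sends it a minimal OTLP/JSON trace payload:

```python
# Sketch: what an OTLP/HTTP trace export boils down to.
# The DummyCollector below is NOT the AI Toolkit collector; it just
# demonstrates the POST to the /v1/traces path that OTLP/HTTP uses.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

received = []  # (path, decoded payload) tuples seen by the dummy collector

class DummyCollector(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        received.append((self.path, json.loads(body)))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the sketch quiet
        pass

# Port 0 picks a free port, so this doesn't clash with a real collector.
server = HTTPServer(("127.0.0.1", 0), DummyCollector)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A minimal OTLP/JSON payload: one resource with a service.name, one span.
payload = {
    "resourceSpans": [{
        "resource": {"attributes": [{"key": "service.name",
                                     "value": {"stringValue": "my-app"}}]},
        "scopeSpans": [{"spans": [{"name": "chat gpt-4.1"}]}],
    }]
}
req = Request(
    f"http://127.0.0.1:{server.server_port}/v1/traces",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urlopen(req) as resp:
    print(resp.status, received[0][0])  # 200 /v1/traces
server.shutdown()
```

In practice you never write this by hand; the OTLP exporter classes in the snippets below do it for you, batching and encoding spans per the OTLP specification.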
Azure AI Inference SDK - Python

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]

Setup

import os
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-azure-ai-agents"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from azure.ai.inference.tracing import AIInferenceInstrumentor
AIInferenceInstrumentor().instrument(True)
Azure AI Inference SDK - TypeScript/JavaScript

Installation

npm install @azure/opentelemetry-instrumentation-azure-sdk @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/resources @opentelemetry/sdk-trace-node

Setup

const { context } = require('@opentelemetry/api');
const { resourceFromAttributes } = require('@opentelemetry/resources');
const {
  NodeTracerProvider,
  SimpleSpanProcessor
} = require('@opentelemetry/sdk-trace-node');
const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-proto');

const exporter = new OTLPTraceExporter({
  url: 'http://localhost:4318/v1/traces'
});
const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    'service.name': 'opentelemetry-instrumentation-azure-ai-inference'
  }),
  spanProcessors: [new SimpleSpanProcessor(exporter)]
});
provider.register();

const { registerInstrumentations } = require('@opentelemetry/instrumentation');
const {
  createAzureSdkInstrumentation
} = require('@azure/opentelemetry-instrumentation-azure-sdk');

registerInstrumentations({
  instrumentations: [createAzureSdkInstrumentation()]
});
Foundry Agent Service - Python

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]

Setup

import os
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-azure-ai-agents"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from azure.ai.agents.telemetry import AIAgentsInstrumentor
AIAgentsInstrumentor().instrument(True)
Foundry Agent Service - TypeScript/JavaScript

Installation

npm install @azure/opentelemetry-instrumentation-azure-sdk @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/resources @opentelemetry/sdk-trace-node

Setup

const { context } = require('@opentelemetry/api');
const { resourceFromAttributes } = require('@opentelemetry/resources');
const {
  NodeTracerProvider,
  SimpleSpanProcessor
} = require('@opentelemetry/sdk-trace-node');
const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-proto');

const exporter = new OTLPTraceExporter({
  url: 'http://localhost:4318/v1/traces'
});
const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    'service.name': 'opentelemetry-instrumentation-azure-ai-inference'
  }),
  spanProcessors: [new SimpleSpanProcessor(exporter)]
});
provider.register();

const { registerInstrumentations } = require('@opentelemetry/instrumentation');
const {
  createAzureSdkInstrumentation
} = require('@azure/opentelemetry-instrumentation-azure-sdk');

registerInstrumentations({
  instrumentations: [createAzureSdkInstrumentation()]
});
Anthropic - Python

OpenTelemetry

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-anthropic

Setup

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-anthropic-traceloop"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
AnthropicInstrumentor().instrument()

Monocle

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace

Setup

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Import monocle_apptrace
from monocle_apptrace import setup_monocle_telemetry

# Setup Monocle telemetry with OTLP span exporter for traces
setup_monocle_telemetry(
    workflow_name="opentelemetry-instrumentation-anthropic",
    span_processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    ]
)
Anthropic - TypeScript/JavaScript

Installation

npm install @traceloop/node-server-sdk

Setup

const { initialize } = require('@traceloop/node-server-sdk');
const { trace } = require('@opentelemetry/api');

initialize({
  appName: 'opentelemetry-instrumentation-anthropic-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
Google Gemini - Python

OpenTelemetry

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-google-genai

Setup

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-google-genai"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from opentelemetry.instrumentation.google_genai import GoogleGenAiSdkInstrumentor
GoogleGenAiSdkInstrumentor().instrument(enable_content_recording=True)

Monocle

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace

Setup

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Import monocle_apptrace
from monocle_apptrace import setup_monocle_telemetry

# Setup Monocle telemetry with OTLP span exporter for traces
setup_monocle_telemetry(
    workflow_name="opentelemetry-instrumentation-google-genai",
    span_processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    ]
)
LangChain - Python

LangSmith

Installation

pip install langsmith[otel]

Setup

import os
os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318"

Monocle

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace

Setup

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Import monocle_apptrace
from monocle_apptrace import setup_monocle_telemetry

# Setup Monocle telemetry with OTLP span exporter for traces
setup_monocle_telemetry(
    workflow_name="opentelemetry-instrumentation-langchain",
    span_processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    ]
)
LangChain - TypeScript/JavaScript

Installation

npm install @traceloop/node-server-sdk

Setup

const { initialize } = require('@traceloop/node-server-sdk');
initialize({
  appName: 'opentelemetry-instrumentation-langchain-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
OpenAI - Python

OpenTelemetry

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-openai-v2

Setup

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
import os

os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"

# Set up resource
resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-openai"
})

# Create tracer provider
trace.set_tracer_provider(TracerProvider(resource=resource))

# Configure OTLP exporter
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces"
)

# Add span processor
trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(otlp_exporter)
)

# Set up logger provider
logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

# Enable OpenAI instrumentation
OpenAIInstrumentor().instrument()

Monocle

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace

Setup

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Import monocle_apptrace
from monocle_apptrace import setup_monocle_telemetry

# Setup Monocle telemetry with OTLP span exporter for traces
setup_monocle_telemetry(
    workflow_name="opentelemetry-instrumentation-openai",
    span_processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    ]
)
OpenAI - TypeScript/JavaScript

Installation

npm install @traceloop/instrumentation-openai @traceloop/node-server-sdk

Setup

const { initialize } = require('@traceloop/node-server-sdk');
initialize({
  appName: 'opentelemetry-instrumentation-openai-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
OpenAI Agents SDK - Python

Logfire

Installation

pip install logfire

Setup

import logfire
import os

os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4318/v1/traces"

logfire.configure(
    service_name="opentelemetry-instrumentation-openai-agents-logfire",
    send_to_logfire=False,
)
logfire.instrument_openai_agents()

Monocle

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace

Setup

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Import monocle_apptrace
from monocle_apptrace import setup_monocle_telemetry

# Setup Monocle telemetry with OTLP span exporter for traces
setup_monocle_telemetry(
    workflow_name="opentelemetry-instrumentation-openai-agents",
    span_processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    ]
)

Example 1: Set up tracing with the Azure AI Inference SDK using OpenTelemetry

The following end-to-end example uses the Azure AI Inference SDK in Python and shows how to set up the tracing provider and instrumentation.

Prerequisites

To run this example, you need the following prerequisites:

Set up your development environment

Use the following instructions to deploy a preconfigured development environment that includes all required dependencies to run this example.

  1. Set up a GitHub personal access token

    Use the free GitHub Models as the example model.

    Open GitHub Developer Settings and select **Generate new token**.

    Important

    The token needs the models:read permission, or it will return Unauthorized. The token is sent to a Microsoft service.

  2. Create environment variables

    Create an environment variable to set your token as the key for the client code, using one of the following snippets. Replace <your-github-token-goes-here> with your actual GitHub token.

    bash

    export GITHUB_TOKEN="<your-github-token-goes-here>"
    

    powershell

    $Env:GITHUB_TOKEN="<your-github-token-goes-here>"
    

    Windows Command Prompt

    set GITHUB_TOKEN=<your-github-token-goes-here>
    
  3. Install Python packages

    The following command installs the Python packages required for tracing with the Azure AI Inference SDK.

    pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]
    
  4. Set up tracing

    1. Create a new local directory on your computer for the project.

      mkdir my-tracing-app

    2. Navigate to the directory you created.

      cd my-tracing-app

    3. Open Visual Studio Code in that directory.

      code .

  5. Create the Python file

    1. In the my-tracing-app directory, create a Python file named main.py.

      You'll add the code that sets up tracing and interacts with the Azure AI Inference SDK here.

    2. Add the following code to main.py and save the file.

      import os
      
      ### Set up for OpenTelemetry tracing ###
      os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
      os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"
      
      from opentelemetry import trace, _events
      from opentelemetry.sdk.resources import Resource
      from opentelemetry.sdk.trace import TracerProvider
      from opentelemetry.sdk.trace.export import BatchSpanProcessor
      from opentelemetry.sdk._logs import LoggerProvider
      from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
      from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
      from opentelemetry.sdk._events import EventLoggerProvider
      from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
      
      github_token = os.environ["GITHUB_TOKEN"]
      
      resource = Resource(attributes={
          "service.name": "opentelemetry-instrumentation-azure-ai-inference"
      })
      provider = TracerProvider(resource=resource)
      otlp_exporter = OTLPSpanExporter(
          endpoint="http://localhost:4318/v1/traces",
      )
      processor = BatchSpanProcessor(otlp_exporter)
      provider.add_span_processor(processor)
      trace.set_tracer_provider(provider)
      
      logger_provider = LoggerProvider(resource=resource)
      logger_provider.add_log_record_processor(
          BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
      )
      _events.set_event_logger_provider(EventLoggerProvider(logger_provider))
      
      from azure.ai.inference.tracing import AIInferenceInstrumentor
      AIInferenceInstrumentor().instrument()
      ### Set up for OpenTelemetry tracing ###
      
      from azure.ai.inference import ChatCompletionsClient
      from azure.ai.inference.models import UserMessage
      from azure.ai.inference.models import TextContentItem
      from azure.core.credentials import AzureKeyCredential
      
      client = ChatCompletionsClient(
          endpoint = "https://models.inference.ai.azure.com",
          credential = AzureKeyCredential(github_token),
          api_version = "2024-08-01-preview",
      )
      
      response = client.complete(
          messages = [
              UserMessage(content = [
                  TextContentItem(text = "hi"),
              ]),
          ],
          model = "gpt-4.1",
          tools = [],
          response_format = "text",
          temperature = 1,
          top_p = 1,
      )
      
      print(response.choices[0].message.content)
      
  6. Run the code

    1. Open a new terminal in Visual Studio Code.

    2. In the terminal, run the code with the command python main.py.

  7. Check the trace data in AI Toolkit

    After you run the code and refresh the tracing webview, there's a new trace in the list.

    Select the trace to open the trace details webview.

    Screenshot showing selecting a trace from the Trace List in the Tracing webview.

    Check the complete execution flow of your app in the span tree view on the left.

    Select a span in the span details view on the right, and view the generative AI messages in the **Input + Output** tab.

    Select the **Metadata** tab to view the raw metadata.

    Screenshot showing the Trace Details view in the Tracing webview.
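The GITHUB_TOKEN environment variable in the example above is easy to forget; if it's missing, the client fails later with an opaque authentication error. As a sketch, a small hypothetical helper (not part of the sample) can fail fast with a clear message before the client is created:

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or exit with a hint.

    Hypothetical convenience helper; not part of the Azure AI Inference SDK.
    """
    value = os.environ.get(name)
    if not value:
        raise SystemExit(f"{name} is not set; export it before running main.py")
    return value

# Usage in main.py would be, for example:
# github_token = require_env("GITHUB_TOKEN")
```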

Example 2: Set up tracing with the OpenAI Agents SDK using Monocle

The following end-to-end example uses the OpenAI Agents SDK with Monocle in Python and shows how to set up tracing for a multi-agent travel booking system.

Prerequisites

To run this example, you need the following prerequisites:

Set up your development environment

Use the following instructions to deploy a preconfigured development environment that includes all required dependencies to run this example.

  1. Create environment variables

    Create an environment variable for your OpenAI API key using one of the following snippets. Replace <your-openai-api-key> with your actual OpenAI API key.

    bash

    export OPENAI_API_KEY="<your-openai-api-key>"
    

    powershell

    $Env:OPENAI_API_KEY="<your-openai-api-key>"
    

    Windows Command Prompt

    set OPENAI_API_KEY=<your-openai-api-key>
    

    Alternatively, create a .env file in your project directory.

    OPENAI_API_KEY=<your-openai-api-key>
    
  2. Install Python packages

    Create a requirements.txt file with the following content.

    opentelemetry-sdk
    opentelemetry-exporter-otlp-proto-http
    monocle_apptrace
    openai-agents
    python-dotenv
    

    Install the packages with the following command:

    pip install -r requirements.txt
    
  3. Set up tracing

    1. Create a new local directory on your computer for the project.

      mkdir my-agents-tracing-app

    2. Navigate to the directory you created.

      cd my-agents-tracing-app

    3. Open Visual Studio Code in that directory.

      code .

  4. Create the Python file

    1. In the my-agents-tracing-app directory, create a Python file named main.py.

      You'll add the code that sets up tracing with Monocle and interacts with the OpenAI Agents SDK here.

    2. Add the following code to main.py and save the file.

      import os
      
      from dotenv import load_dotenv
      
      # Load environment variables from .env file
      load_dotenv()
      
      from opentelemetry.sdk.trace.export import BatchSpanProcessor
      from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
      
      # Import monocle_apptrace
      from monocle_apptrace import setup_monocle_telemetry
      
      # Setup Monocle telemetry with OTLP span exporter for traces
      setup_monocle_telemetry(
          workflow_name="opentelemetry-instrumentation-openai-agents",
          span_processors=[
              BatchSpanProcessor(
                  OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
              )
          ]
      )
      
      from agents import Agent, Runner, function_tool
      
      # Define tool functions
      @function_tool
      def book_flight(from_airport: str, to_airport: str) -> str:
          """Book a flight between airports."""
          return f"Successfully booked a flight from {from_airport} to {to_airport} for 100 USD."
      
      @function_tool
      def book_hotel(hotel_name: str, city: str) -> str:
          """Book a hotel reservation."""
          return f"Successfully booked a stay at {hotel_name} in {city} for 50 USD."
      
      @function_tool
      def get_weather(city: str) -> str:
          """Get weather information for a city."""
          return f"The weather in {city} is sunny and 75°F."
      
      # Create specialized agents
      flight_agent = Agent(
          name="Flight Agent",
          instructions="You are a flight booking specialist. Use the book_flight tool to book flights.",
          tools=[book_flight],
      )
      
      hotel_agent = Agent(
          name="Hotel Agent",
          instructions="You are a hotel booking specialist. Use the book_hotel tool to book hotels.",
          tools=[book_hotel],
      )
      
      weather_agent = Agent(
          name="Weather Agent",
          instructions="You are a weather information specialist. Use the get_weather tool to provide weather information.",
          tools=[get_weather],
      )
      
      # Create a coordinator agent with tools
      coordinator = Agent(
          name="Travel Coordinator",
          instructions="You are a travel coordinator. Delegate flight bookings to the Flight Agent, hotel bookings to the Hotel Agent, and weather queries to the Weather Agent.",
          tools=[
              flight_agent.as_tool(
                  tool_name="flight_expert",
                  tool_description="Handles flight booking questions and requests.",
              ),
              hotel_agent.as_tool(
                  tool_name="hotel_expert",
                  tool_description="Handles hotel booking questions and requests.",
              ),
              weather_agent.as_tool(
                  tool_name="weather_expert",
                  tool_description="Handles weather information questions and requests.",
              ),
          ],
      )
      
      # Run the multi-agent workflow
      if __name__ == "__main__":
          import asyncio
      
          result = asyncio.run(
              Runner.run(
                  coordinator,
                  "Book me a flight today from SEA to SFO, then book the best hotel there and tell me the weather.",
              )
          )
          print(result.final_output)
      
  5. Run the code

    1. Open a new terminal in Visual Studio Code.

    2. In the terminal, run the code with the command python main.py.

  6. Check the trace data in AI Toolkit

    After you run the code and refresh the tracing webview, there's a new trace in the list.

    Select the trace to open the trace details webview.

    Screenshot showing selecting a trace from the Trace List in the Tracing webview.

    In the span tree view on the left, check the complete execution flow of your app, including agent calls, tool calls, and agent delegation.

    Select a span in the span details view on the right, and view the generative AI messages in the **Input + Output** tab.

    Select the **Metadata** tab to view the raw metadata.

    Screenshot showing the Trace Details view in the Tracing webview.
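The agents-as-tools pattern in main.py, where the coordinator exposes each specialist agent as a callable tool, can be sketched without any SDK. The sketch below uses plain functions standing in for agents (hypothetical names; in the real SDK the model picks the tool, whereas here we dispatch directly):

```python
# Minimal stand-in for the agents-as-tools delegation pattern:
# a "coordinator" looks up a specialist by tool name and delegates.
from typing import Callable, Dict

def flight_expert(request: str) -> str:
    # Stand-in for the Flight Agent exposed via as_tool().
    return f"Booked flight: {request}"

def hotel_expert(request: str) -> str:
    # Stand-in for the Hotel Agent exposed via as_tool().
    return f"Booked hotel: {request}"

tools: Dict[str, Callable[[str], str]] = {
    "flight_expert": flight_expert,
    "hotel_expert": hotel_expert,
}

def coordinator(tool_name: str, request: str) -> str:
    # In the real SDK the model chooses which tool to call; here we
    # dispatch directly to illustrate the delegation structure.
    return tools[tool_name](request)

print(coordinator("flight_expert", "SEA to SFO"))  # Booked flight: SEA to SFO
```

Each delegated call shows up in the trace as a child span, which is why the span tree view reveals the coordinator-to-specialist structure.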

What you learned

In this article, you learned how to:

  • Set up tracing in an AI application by using the Azure AI Inference SDK and OpenTelemetry.
  • Configure the OTLP trace exporter to send trace data to the local collector server.
  • Run your application to generate trace data, and view traces in the AI Toolkit webview.
  • Use the tracing feature with multiple SDKs and languages, including Python and TypeScript/JavaScript, and with non-Microsoft tools via OTLP.
  • Instrument various AI frameworks (Anthropic, Gemini, LangChain, OpenAI, and more) by using the provided code snippets.
  • Use the tracing webview UI, including the **Start Collector** and **Refresh** buttons, to manage trace data.
  • Set up your development environment, including environment variables and package installation, to enable tracing.
  • Analyze your app's execution flow with the span tree and details views, including generative AI message flows and metadata.