diff --git a/docs/api/durable_exec.md b/docs/api/durable_exec.md
index b4269eed78..f8edeed380 100644
--- a/docs/api/durable_exec.md
+++ b/docs/api/durable_exec.md
@@ -1,3 +1,5 @@
 # `pydantic_ai.durable_exec`
 
 ::: pydantic_ai.durable_exec.temporal
+
+::: pydantic_ai.durable_exec.dbos
diff --git a/docs/durable_execution/dbos.md b/docs/durable_execution/dbos.md
new file mode 100644
index 0000000000..41a83129c5
--- /dev/null
+++ b/docs/durable_execution/dbos.md
@@ -0,0 +1,173 @@
+# Durable Execution with DBOS
+
+[DBOS](https://www.dbos.dev/) is a lightweight [durable execution](https://docs.dbos.dev/architecture) library natively integrated with Pydantic AI.
+
+## Durable Execution
+
+DBOS workflows make your program **durable** by checkpointing its state in a database. If your program ever fails, when it restarts, all your workflows will automatically resume from the last completed step.
+
+* **Workflows** must be deterministic and generally cannot include I/O.
+* **Steps** may perform I/O (network, disk, API calls). If a step fails, it restarts from the beginning.
+
+Every workflow input and step output is durably stored in the system database. When workflow execution fails, whether from crashes, network issues, or server restarts, DBOS leverages these checkpoints to recover workflows from their last completed step.
+
+DBOS **queues** provide durable, database-backed alternatives to systems like Celery or BullMQ, supporting features such as concurrency limits, rate limits, timeouts, and prioritization. See the [DBOS docs](https://docs.dbos.dev/architecture) for details.
+
+The diagram below shows the overall architecture of an agentic application in DBOS.
+DBOS runs fully in-process as a library. Functions remain normal Python functions but are checkpointed into a database (Postgres or SQLite).
+
+```text
+                        Clients
+               (HTTP, RPC, Kafka, etc.)
+                           |
+                           v
++------------------------------------------------------+
+|                  Application Servers                  |
+|                                                      |
+|   +----------------------------------------------+   |
+|   |         Pydantic AI + DBOS Libraries        |   |
+|   |                                              |   |
+|   |  [ Workflows (Agent Run Loop) ]              |   |
+|   |  [ Steps (Tool, MCP, Model) ]                |   |
+|   |  [ Queues ] [ Cron Jobs ] [ Messaging ]      |   |
+|   +----------------------------------------------+   |
+|                                                      |
++------------------------------------------------------+
+                           |
+                           v
++------------------------------------------------------+
+|                       Database                       |
+|  (Stores workflow and step state, schedules tasks)   |
++------------------------------------------------------+
+```
+
+See the [DBOS documentation](https://docs.dbos.dev/architecture) for more information.
+
+## Durable Agent
+
+Any agent can be wrapped in a [`DBOSAgent`][pydantic_ai.durable_exec.dbos.DBOSAgent] to get durable execution. `DBOSAgent` automatically:
+
+* Wraps `Agent.run` and `Agent.run_sync` as DBOS workflows.
+* Wraps [model requests](../models/overview.md) and [MCP communication](../mcp/client.md) as DBOS steps.
+
+Custom tool functions and event stream handlers are **not automatically wrapped** by DBOS.
+If they involve non-deterministic behavior or perform I/O, you should explicitly decorate them with `@DBOS.step`.
+
+The original agent, model, and MCP server can still be used as normal outside the DBOS workflow.
+
+Here is a simple but complete example of wrapping an agent for durable execution.
All it requires is to install Pydantic AI with the DBOS [open-source library](https://github.com/dbos-inc/dbos-transact-py):
+
+```bash
+pip/uv-add pydantic-ai[dbos]
+```
+
+Or if you're using the slim package, you can install it with the `dbos` optional group:
+
+```bash
+pip/uv-add pydantic-ai-slim[dbos]
+```
+
+```python {title="dbos_agent.py" test="skip"}
+from dbos import DBOS, DBOSConfig
+
+from pydantic_ai import Agent
+from pydantic_ai.durable_exec.dbos import DBOSAgent
+
+dbos_config: DBOSConfig = {
+    'name': 'pydantic_dbos_agent',
+    'system_database_url': 'sqlite:///dbostest.sqlite',  # (3)!
+}
+DBOS(config=dbos_config)
+
+agent = Agent(
+    'gpt-5',
+    instructions="You're an expert in geography.",
+    name='geography',  # (4)!
+)
+
+dbos_agent = DBOSAgent(agent)  # (1)!
+
+async def main():
+    DBOS.launch()
+    result = await dbos_agent.run('What is the capital of Mexico?')  # (2)!
+    print(result.output)
+    #> Mexico City (Ciudad de México, CDMX)
+```
+
+1. Workflows and `DBOSAgent` must be defined before `DBOS.launch()` so that recovery can correctly find all workflows.
+2. [`DBOSAgent.run()`][pydantic_ai.durable_exec.dbos.DBOSAgent.run] works like [`Agent.run()`][pydantic_ai.Agent.run], but runs as a DBOS workflow and executes model requests, decorated tool calls, and MCP communication as DBOS steps.
+3. This example uses SQLite. Postgres is recommended for production.
+4. The agent's `name` is used to uniquely identify its workflows.
+
+_(This example is complete, it can be run "as is" — you'll need to add `asyncio.run(main())` to run `main`)_
+
+Because the `DBOSAgent` instance automatically registers `run` and `run_sync` as DBOS workflows, and workflows must be registered before `DBOS.launch()` is called, the `DBOSAgent` itself also needs to be created before calling `DBOS.launch()`.
+
+For more information on how to use DBOS in Python applications, see their [Python SDK guide](https://docs.dbos.dev/python/programming-guide).
+
+## DBOS Integration Considerations
+
+When using DBOS with Pydantic AI agents, there are a few important considerations to ensure workflows and toolsets behave correctly.
+
+### Agent and Toolset Requirements
+
+Each agent instance must have a unique `name` so DBOS can correctly resume workflows after a failure or restart.
+
+Tools and event stream handlers are not automatically wrapped by DBOS. You can decide how to integrate them:
+
+* Decorate with `@DBOS.step` if the function involves non-determinism or I/O.
+* Skip the decorator if durability isn't needed, so you avoid the extra DB checkpoint write.
+* If the function needs to enqueue tasks or invoke other DBOS workflows, run it inside the agent's main workflow (not as a step).
+
+Other than that, any agent and toolset will just work!
+
+### Agent Run Context and Dependencies
+
+DBOS checkpoints workflow inputs/outputs and step outputs into a database using `jsonpickle`. This means you need to make sure the [dependencies](../dependencies.md) object provided to [`DBOSAgent.run()`][pydantic_ai.durable_exec.dbos.DBOSAgent.run] or [`DBOSAgent.run_sync()`][pydantic_ai.durable_exec.dbos.DBOSAgent.run_sync], as well as tool outputs, can be serialized using jsonpickle. You may also want to keep the inputs and outputs small (under \~2 MB). PostgreSQL and SQLite support up to 1 GB per field, but large objects may impact performance.
+
+### Streaming
+
+Because DBOS cannot stream output directly to the workflow or step call site, [`Agent.run_stream()`][pydantic_ai.Agent.run_stream] is not supported when running inside a DBOS workflow.
+ +Instead, you can implement streaming by setting an [`event_stream_handler`][pydantic_ai.agent.EventStreamHandler] on the `Agent` or `DBOSAgent` instance and using [`DBOSAgent.run()`][pydantic_ai.durable_exec.dbos.DBOSAgent.run]. +The event stream handler function will receive the agent [run context][pydantic_ai.tools.RunContext] and an async iterable of events from the model's streaming response and the agent's execution of tools. For examples, see the [streaming docs](../agents.md#streaming-all-events). + + +## Step Configuration + +You can customize DBOS step behavior, such as retries, by passing [`StepConfig`][pydantic_ai.durable_exec.dbos.StepConfig] objects to the `DBOSAgent` constructor: + +- `mcp_step_config`: The DBOS step config to use for MCP server communication. No retries if omitted. +- `model_step_config`: The DBOS step config to use for model request steps. No retries if omitted. + +For custom tools, you can annotate them directly with [`@DBOS.step`](https://docs.dbos.dev/python/reference/decorators#step) or [`@DBOS.workflow`](https://docs.dbos.dev/python/reference/decorators#workflow) decorators as needed. These decorators have no effect outside DBOS workflows, so tools remain usable in non-DBOS agents. + + +## Step Retries + +On top of the automatic retries for request failures that DBOS will perform, Pydantic AI and various provider API clients also have their own request retry logic. Enabling these at the same time may cause the request to be retried more often than expected, with improper `Retry-After` handling. + +When using DBOS, it's recommended to not use [HTTP Request Retries](../retries.md) and to turn off your provider API client's own retry logic, for example by setting `max_retries=0` on a [custom `OpenAIProvider` API client](../models/openai.md#custom-openai-client). + +You can customize DBOS's retry policy using [step configuration](#step-configuration). + +## Observability with Logfire + +When using [Pydantic Logfire](../logfire.md), we **recommend disabling DBOS's built-in OpenTelemetry tracing**. +DBOS automatically wraps workflow and step execution in spans, while Pydantic AI and Logfire already emit spans for the same function calls, model requests, and tool invocations. Without disabling DBOS tracing, these operations may appear twice in your trace tree. + +To disable DBOS traces and logs, you can set `disable_otlp=True` in `DBOSConfig`. For example: + + +```python {title="dbos_no_traces.py" test="skip"} +from dbos import DBOS, DBOSConfig + +dbos_config: DBOSConfig = { + 'name': 'pydantic_dbos_agent', + 'system_database_url': 'sqlite:///dbostest.sqlite', + 'disable_otlp': True # (1)! +} +DBOS(config=dbos_config) +``` + +1. If `True`, disables OpenTelemetry tracing and logging for DBOS. Default is `False`. diff --git a/docs/durable_execution/overview.md b/docs/durable_execution/overview.md new file mode 100644 index 0000000000..0753f1a4c1 --- /dev/null +++ b/docs/durable_execution/overview.md @@ -0,0 +1,10 @@ +# Durable Execution + +Pydantic AI allows you to build durable agents that can preserve their progress across transient API failures and application errors or restarts, and handle long-running, asynchronous, and human-in-the-loop workflows with production-grade reliability. Durable agents have full support for [streaming](../agents.md#streaming-all-events) and [MCP](../mcp/client.md), with the added benefit of fault tolerance. 
+
+Pydantic AI natively supports two durable execution solutions:
+
+- [Temporal](./temporal.md)
+- [DBOS](./dbos.md)
+
+These integrations only use Pydantic AI's public interface, so they also serve as a reference for integrating with other durable systems.
diff --git a/docs/temporal.md b/docs/durable_execution/temporal.md
similarity index 86%
rename from docs/temporal.md
rename to docs/durable_execution/temporal.md
index 8f9554652f..6d2bf2454d 100644
--- a/docs/temporal.md
+++ b/docs/durable_execution/temporal.md
@@ -1,11 +1,8 @@
 # Durable Execution with Temporal
 
-Pydantic AI enables you to build durable agents that can preserve their progress across transient API failures and application errors or restarts, and handle long-running, asynchronous, and human-in-the-loop workflows with production-grade reliability. Durable agents have full support for [streaming](agents.md#streaming-all-events) and [MCP](mcp/client.md), with the added benefit of fault tolerance.
-
 [Temporal](https://temporal.io) is a popular [durable execution](https://docs.temporal.io/evaluate/understanding-temporal#durable-execution) platform that's natively supported by Pydantic AI.
-The integration only uses Pydantic AI's public interface, so it can also serve as a reference for how to integrate with another durable execution systems.
 
-### Durable Execution
+## Durable Execution
 
 In Temporal's durable execution implementation, a program that crashes or encounters an exception while interacting with a model or API will retry until it can successfully complete.
 
@@ -29,7 +26,7 @@ Activity code faces no restrictions on I/O or external interactions, but if an a
 
 See the [Temporal documentation](https://docs.temporal.io/evaluate/understanding-temporal#temporal-application-the-building-blocks) for more information
 
-In the case of Pydantic AI agents, integration with Temporal means that [model requests](models/overview.md), [tool calls](tools.md) that may require I/O, and [MCP server communication](mcp/client.md) all need to be offloaded to Temporal activities due to their I/O requirements, while the logic that coordinates them (i.e. the agent run) lives in the workflow. Code that handles a scheduled job or web request can then execute the workflow, which will in turn execute the activities as needed.
+In the case of Pydantic AI agents, integration with Temporal means that [model requests](../models/overview.md), [tool calls](../tools.md) that may require I/O, and [MCP server communication](../mcp/client.md) all need to be offloaded to Temporal activities due to their I/O requirements, while the logic that coordinates them (i.e. the agent run) lives in the workflow. Code that handles a scheduled job or web request can then execute the workflow, which will in turn execute the activities as needed.
 
 The diagram below shows the overall architecture of an agentic application in Temporal. The Temporal Server is responsible for tracking program execution and making sure the associated state is preserved reliably (i.e., stored to an internal database, and possibly replicated across cloud regions).
 
@@ -71,7 +68,7 @@ See the [Temporal documentation](https://docs.temporal.io/evaluate/understanding
 
 Any agent can be wrapped in a [`TemporalAgent`][pydantic_ai.durable_exec.temporal.TemporalAgent] to get a durable agent that can be used inside a deterministic Temporal workflow, by automatically offloading all work that requires I/O (namely model requests, tool calls, and MCP server communication) to non-deterministic activities.
 
-At the time of wrapping, the agent's [model](models/overview.md) and [toolsets](toolsets.md) (including function tools registered on the agent and MCP servers) are frozen, activities are dynamically created for each, and the original model and toolsets are wrapped to call on the worker to execute the corresponding activities instead of directly performing the actions inside the workflow. The original agent can still be used as normal outside the Temporal workflow, but any changes to its model or toolsets after wrapping will not be reflected in the durable agent. +At the time of wrapping, the agent's [model](../models/overview.md) and [toolsets](../toolsets.md) (including function tools registered on the agent and MCP servers) are frozen, activities are dynamically created for each, and the original model and toolsets are wrapped to call on the worker to execute the corresponding activities instead of directly performing the actions inside the workflow. The original agent can still be used as normal outside the Temporal workflow, but any changes to its model or toolsets after wrapping will not be reflected in the durable agent. Here is a simple but complete example of wrapping an agent for durable execution, creating a Temporal workflow with durable execution logic, connecting to a Temporal server, and running the workflow from non-durable code. All it requires is a Temporal server to be [running locally](https://github.com/temporalio/temporal#download-and-start-temporal-server-locally): @@ -165,7 +162,7 @@ Other than that, any agent and toolset will just work! ### Instructions Functions, Output Functions, and History Processors -Pydantic AI runs non-async [instructions](agents.md#instructions) and [system prompt](agents.md#system-prompts) functions, [history processors](message-history.md#processing-message-history), [output functions](output.md#output-functions), and [output validators](output.md#output-validator-functions) in threads, which are not supported inside Temporal workflows and require an activity. Ensure that these functions are async instead. +Pydantic AI runs non-async [instructions](../agents.md#instructions) and [system prompt](../agents.md#system-prompts) functions, [history processors](../message-history.md#processing-message-history), [output functions](../output.md#output-functions), and [output validators](../output.md#output-validator-functions) in threads, which are not supported inside Temporal workflows and require an activity. Ensure that these functions are async instead. Synchronous tool functions are supported, as tools are automatically run in activities unless this is [explicitly disabled](#activity-configuration). Still, it's recommended to make tool functions async as well to improve performance. @@ -173,7 +170,7 @@ Synchronous tool functions are supported, as tools are automatically run in acti As workflows and activities run in separate processes, any values passed between them need to be serializable. As these payloads are stored in the workflow execution event history, Temporal limits their size to 2MB. -To account for these limitations, tool functions and the [event stream handler](#streaming) running inside activities receive a limited version of the agent's [`RunContext`][pydantic_ai.tools.RunContext], and it's your responsibility to make sure that the [dependencies](dependencies.md) object provided to [`TemporalAgent.run()`][pydantic_ai.durable_exec.temporal.TemporalAgent.run] can be serialized using Pydantic. 
+To account for these limitations, tool functions and the [event stream handler](#streaming) running inside activities receive a limited version of the agent's [`RunContext`][pydantic_ai.tools.RunContext], and it's your responsibility to make sure that the [dependencies](../dependencies.md) object provided to [`TemporalAgent.run()`][pydantic_ai.durable_exec.temporal.TemporalAgent.run] can be serialized using Pydantic. Specifically, only the `deps`, `retries`, `tool_call_id`, `tool_name`, `tool_call_approved`, `retry`, and `run_step` fields are available by default, and trying to access `model`, `usage`, `prompt`, `messages`, or `tracer` will raise an error. If you need one or more of these attributes to be available inside activities, you can create a [`TemporalRunContext`][pydantic_ai.durable_exec.temporal.TemporalRunContext] subclass with custom `serialize_run_context` and `deserialize_run_context` class methods and pass it to [`TemporalAgent`][pydantic_ai.durable_exec.temporal.TemporalAgent] as `run_context_type`. @@ -183,7 +180,7 @@ If you need one or more of these attributes to be available inside activities, y Because Temporal activities cannot stream output directly to the activity call site, [`Agent.run_stream()`][pydantic_ai.Agent.run_stream] and [`Agent.iter()`][pydantic_ai.Agent.iter] are not supported. Instead, you can implement streaming by setting an [`event_stream_handler`][pydantic_ai.agent.EventStreamHandler] on the `Agent` or `TemporalAgent` instance and using [`TemporalAgent.run()`][pydantic_ai.durable_exec.temporal.TemporalAgent.run] inside the workflow. -The event stream handler function will receive the agent [run context][pydantic_ai.tools.RunContext] and an async iterable of events from the model's streaming response and the agent's execution of tools. For examples, see the [streaming docs](agents.md#streaming-all-events). +The event stream handler function will receive the agent [run context][pydantic_ai.tools.RunContext] and an async iterable of events from the model's streaming response and the agent's execution of tools. For examples, see the [streaming docs](../agents.md#streaming-all-events). As the streaming model request activity, workflow, and workflow execution call all take place in separate processes, passing data between them requires some care: @@ -206,13 +203,13 @@ Temporal activity configuration, like timeouts and retry policies, can be custom On top of the automatic retries for request failures that Temporal will perform, Pydantic AI and various provider API clients also have their own request retry logic. Enabling these at the same time may cause the request to be retried more often than expected, with improper `Retry-After` handling. -When using Temporal, it's recommended to not use [HTTP Request Retries](retries.md) and to turn off your provider API client's own retry logic, for example by setting `max_retries=0` on a [custom `OpenAIProvider` API client](models/openai.md#custom-openai-client). +When using Temporal, it's recommended to not use [HTTP Request Retries](../retries.md) and to turn off your provider API client's own retry logic, for example by setting `max_retries=0` on a [custom `OpenAIProvider` API client](../models/openai.md#custom-openai-client). You can customize Temporal's retry policy using [activity configuration](#activity-configuration). 
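For illustration, here is a minimal sketch of what turning off the client's own retries can look like. It is not taken from the Pydantic AI docs and assumes the `openai` SDK's `AsyncOpenAI` client together with Pydantic AI's `OpenAIModel` and `OpenAIProvider`:

```python
from openai import AsyncOpenAI

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

# Disable the OpenAI SDK's built-in retries so that the durable execution layer
# (Temporal activities or DBOS steps) is the only thing retrying failed requests.
client = AsyncOpenAI(max_retries=0)
model = OpenAIModel('gpt-4o', provider=OpenAIProvider(openai_client=client))
agent = Agent(model, instructions='Be concise.', name='no_client_retries')
```

The exact model and provider classes depend on the provider you use; the point is simply that request retries should be configured in one place.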
## Observability with Logfire -Temporal generates telemetry events and metrics for each workflow and activity execution, and Pydantic AI generates events for each agent run, model request and tool call. These can be sent to [Pydantic Logfire](logfire.md) to get a complete picture of what's happening in your application. +Temporal generates telemetry events and metrics for each workflow and activity execution, and Pydantic AI generates events for each agent run, model request and tool call. These can be sent to [Pydantic Logfire](../logfire.md) to get a complete picture of what's happening in your application. To use Logfire with Temporal, you need to pass a [`LogfirePlugin`][pydantic_ai.durable_exec.temporal.LogfirePlugin] object to Temporal's `Client.connect()`: diff --git a/docs/index.md b/docs/index.md index 672c0cee4c..608998e5b3 100644 --- a/docs/index.md +++ b/docs/index.md @@ -32,7 +32,7 @@ Integrates the [Model Context Protocol](mcp/client.md), [Agent2Agent](a2a.md), a Easily lets you flag that certain tool calls [require approval](deferred-tools.md#human-in-the-loop-tool-approval) before they can proceed, possibly depending on tool call arguments, conversation history, or user preferences. 8. **Durable Execution**: -Enables you to build [durable agents](temporal.md) that can preserve their progress across transient API failures and application errors or restarts, and handle long-running, asynchronous, and human-in-the-loop workflows with production-grade reliability. +Enables you to build [durable agents](durable_execution/overview.md) that can preserve their progress across transient API failures and application errors or restarts, and handle long-running, asynchronous, and human-in-the-loop workflows with production-grade reliability. 9. **Streamed Outputs**: Provides the ability to [stream](output.md#streamed-results) structured output continuously, with immediate validation, ensuring real time access to generated data. diff --git a/docs/install.md b/docs/install.md index 1fd12a499d..b4fb7147b4 100644 --- a/docs/install.md +++ b/docs/install.md @@ -57,6 +57,7 @@ pip/uv-add "pydantic-ai-slim[openai]" * `mcp` - installs `mcp` [PyPI ↗](https://pypi.org/project/mcp){:target="_blank"} * `a2a` - installs `fasta2a` [PyPI ↗](https://pypi.org/project/fasta2a){:target="_blank"} * `ag-ui` - installs `ag-ui-protocol` [PyPI ↗](https://pypi.org/project/ag-ui-protocol){:target="_blank"} and `starlette` [PyPI ↗](https://pypi.org/project/starlette){:target="_blank"} +* `dbos` - installs [`dbos`](durable_execution/dbos.md) [PyPI ↗](https://pypi.org/project/dbos){:target="_blank"} See the [models](models/overview.md) documentation for information on which optional dependencies are required for each model. 
diff --git a/mkdocs.yml b/mkdocs.yml index eb128d7880..58c60a717d 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -61,7 +61,10 @@ nav: - Integrations: - Debugging & Monitoring with Pydantic Logfire: logfire.md - - temporal.md + - Durable Execution: + - Overview: durable_execution/overview.md + - Temporal: durable_execution/temporal.md + - DBOS: durable_execution/dbos.md - Agent-User Interaction (AG-UI): ag-ui.md - Agent2Agent (A2A): a2a.md diff --git a/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/__init__.py b/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/__init__.py new file mode 100644 index 0000000000..72e4ac6e6f --- /dev/null +++ b/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/__init__.py @@ -0,0 +1,6 @@ +from ._agent import DBOSAgent +from ._mcp_server import DBOSMCPServer +from ._model import DBOSModel +from ._utils import StepConfig + +__all__ = ['DBOSAgent', 'DBOSModel', 'DBOSMCPServer', 'StepConfig'] diff --git a/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_agent.py b/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_agent.py new file mode 100644 index 0000000000..b7e6b89827 --- /dev/null +++ b/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_agent.py @@ -0,0 +1,718 @@ +from __future__ import annotations + +from collections.abc import AsyncIterable, AsyncIterator, Iterator, Sequence +from contextlib import AbstractAsyncContextManager, asynccontextmanager, contextmanager +from typing import Any, overload + +from dbos import DBOS, DBOSConfiguredInstance +from typing_extensions import Never + +from pydantic_ai import ( + _utils, + messages as _messages, + models, + usage as _usage, +) +from pydantic_ai._run_context import AgentDepsT +from pydantic_ai.agent import AbstractAgent, AgentRun, AgentRunResult, EventStreamHandler, RunOutputDataT, WrapperAgent +from pydantic_ai.exceptions import UserError +from pydantic_ai.mcp import MCPServer +from pydantic_ai.models import Model +from pydantic_ai.output import OutputDataT, OutputSpec +from pydantic_ai.result import StreamedRunResult +from pydantic_ai.settings import ModelSettings +from pydantic_ai.tools import ( + DeferredToolResults, + RunContext, + Tool, + ToolFuncEither, +) +from pydantic_ai.toolsets import AbstractToolset + +from ._mcp_server import DBOSMCPServer +from ._model import DBOSModel +from ._utils import StepConfig + + +@DBOS.dbos_class() +class DBOSAgent(WrapperAgent[AgentDepsT, OutputDataT], DBOSConfiguredInstance): + def __init__( + self, + wrapped: AbstractAgent[AgentDepsT, OutputDataT], + *, + name: str | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + mcp_step_config: StepConfig | None = None, + model_step_config: StepConfig | None = None, + ): + """Wrap an agent to enable it with DBOS durable workflows, by automatically offloading model requests, tool calls, and MCP server communication to DBOS steps. + + After wrapping, the original agent can still be used as normal outside of the DBOS workflow. + + Args: + wrapped: The agent to wrap. + name: Optional unique agent name to use as the DBOS configured instance name. If not provided, the agent's `name` will be used. + event_stream_handler: Optional event stream handler to use instead of the one set on the wrapped agent. + mcp_step_config: The base DBOS step config to use for MCP server steps. If no config is provided, use the default settings of DBOS. + model_step_config: The DBOS step config to use for model request steps. If no config is provided, use the default settings of DBOS. 
+ """ + super().__init__(wrapped) + + self._name = name or wrapped.name + self._event_stream_handler = event_stream_handler + if self._name is None: + raise UserError( + "An agent needs to have a unique `name` in order to be used with DBOS. The name will be used to identify the agent's workflows and steps." + ) + + # Merge the config with the default DBOS config + self._mcp_step_config = mcp_step_config or {} + self._model_step_config = model_step_config or {} + + if not isinstance(wrapped.model, Model): + raise UserError( + 'An agent needs to have a `model` in order to be used with DBOS, it cannot be set at agent run time.' + ) + + dbos_model = DBOSModel( + wrapped.model, + step_name_prefix=self._name, + step_config=self._model_step_config, + event_stream_handler=self.event_stream_handler, + ) + self._model = dbos_model + + dbosagent_name = self._name + + def dbosify_toolset(toolset: AbstractToolset[AgentDepsT]) -> AbstractToolset[AgentDepsT]: + # Replace MCPServer with DBOSMCPServer + if isinstance(toolset, MCPServer): + return DBOSMCPServer( + wrapped=toolset, + step_name_prefix=dbosagent_name, + step_config=self._mcp_step_config, + ) + else: + return toolset + + dbos_toolsets = [toolset.visit_and_replace(dbosify_toolset) for toolset in wrapped.toolsets] + self._toolsets = dbos_toolsets + DBOSConfiguredInstance.__init__(self, self._name) + + # Wrap the `run` method in a DBOS workflow + @DBOS.workflow(name=f'{self._name}.run') + async def wrapped_run_workflow( + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: OutputSpec[RunOutputDataT] | None = None, + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + **_deprecated_kwargs: Never, + ) -> AgentRunResult[Any]: + with self._dbos_overrides(): + return await super(WrapperAgent, self).run( + user_prompt, + output_type=output_type, + message_history=message_history, + deferred_tool_results=deferred_tool_results, + model=model, + deps=deps, + model_settings=model_settings, + usage_limits=usage_limits, + usage=usage, + infer_name=infer_name, + toolsets=toolsets, + event_stream_handler=event_stream_handler, + **_deprecated_kwargs, + ) + + self.dbos_wrapped_run_workflow = wrapped_run_workflow + + # Wrap the `run_sync` method in a DBOS workflow + @DBOS.workflow(name=f'{self._name}.run_sync') + def wrapped_run_sync_workflow( + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: OutputSpec[RunOutputDataT] | None = None, + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + **_deprecated_kwargs: Never, + ) -> AgentRunResult[Any]: + with self._dbos_overrides(): + return 
super(DBOSAgent, self).run_sync( + user_prompt, + output_type=output_type, + message_history=message_history, + deferred_tool_results=deferred_tool_results, + model=model, + deps=deps, + model_settings=model_settings, + usage_limits=usage_limits, + usage=usage, + infer_name=infer_name, + toolsets=toolsets, + event_stream_handler=event_stream_handler, + **_deprecated_kwargs, + ) + + self.dbos_wrapped_run_sync_workflow = wrapped_run_sync_workflow + + @property + def name(self) -> str | None: + return self._name + + @name.setter + def name(self, value: str | None) -> None: # pragma: no cover + raise UserError( + 'The agent name cannot be changed after creation. If you need to change the name, create a new agent.' + ) + + @property + def model(self) -> Model: + return self._model + + @property + def event_stream_handler(self) -> EventStreamHandler[AgentDepsT] | None: + handler = self._event_stream_handler or super().event_stream_handler + if handler is None: + return None + elif DBOS.workflow_id is not None and DBOS.step_id is None: + # Special case if it's in a DBOS workflow but not a step, we need to iterate through all events and call the handler. + return self._call_event_stream_handler_in_workflow + else: + return handler + + async def _call_event_stream_handler_in_workflow( + self, ctx: RunContext[AgentDepsT], stream: AsyncIterable[_messages.AgentStreamEvent] + ) -> None: + handler = self._event_stream_handler or super().event_stream_handler + assert handler is not None + + async def streamed_response(event: _messages.AgentStreamEvent): + yield event + + async for event in stream: + await handler(ctx, streamed_response(event)) + + @property + def toolsets(self) -> Sequence[AbstractToolset[AgentDepsT]]: + with self._dbos_overrides(): + return super().toolsets + + @contextmanager + def _dbos_overrides(self) -> Iterator[None]: + # Override with DBOSModel and DBOSMCPServer in the toolsets. + with super().override(model=self._model, toolsets=self._toolsets, tools=[]): + yield + + @overload + async def run( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: None = None, + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + ) -> AgentRunResult[OutputDataT]: ... + + @overload + async def run( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: OutputSpec[RunOutputDataT], + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + ) -> AgentRunResult[RunOutputDataT]: ... 
+ + async def run( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: OutputSpec[RunOutputDataT] | None = None, + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + **_deprecated_kwargs: Never, + ) -> AgentRunResult[Any]: + """Run the agent with a user prompt in async mode. + + This method builds an internal agent graph (using system prompts, tools and result schemas) and then + runs the graph to completion. The result of the run is returned. + + Example: + ```python + from pydantic_ai import Agent + + agent = Agent('openai:gpt-4o') + + async def main(): + agent_run = await agent.run('What is the capital of France?') + print(agent_run.output) + #> The capital of France is Paris. + ``` + + Args: + user_prompt: User input to start/continue the conversation. + output_type: Custom output type to use for this run, `output_type` may only be used if the agent has no + output validators since output validators would expect an argument that matches the agent's output type. + message_history: History of the conversation so far. + deferred_tool_results: Optional results for deferred tool calls in the message history. + model: Optional model to use for this run, required if `model` was not set when creating the agent. + deps: Optional dependencies to use for this run. + model_settings: Optional settings to use for this model's request. + usage_limits: Optional limits on model request count or token usage. + usage: Optional usage to start with, useful for resuming a conversation or agents used in tools. + infer_name: Whether to try to infer the agent name from the call frame if it's not set. + toolsets: Optional additional toolsets for this run. + event_stream_handler: Optional event stream handler to use for this run. + + Returns: + The result of the run. + """ + return await self.dbos_wrapped_run_workflow( + user_prompt, + output_type=output_type, + message_history=message_history, + deferred_tool_results=deferred_tool_results, + model=model, + deps=deps, + model_settings=model_settings, + usage_limits=usage_limits, + usage=usage, + infer_name=infer_name, + toolsets=toolsets, + event_stream_handler=event_stream_handler, + **_deprecated_kwargs, + ) + + @overload + def run_sync( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: None = None, + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + ) -> AgentRunResult[OutputDataT]: ... 
+ + @overload + def run_sync( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: OutputSpec[RunOutputDataT], + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + ) -> AgentRunResult[RunOutputDataT]: ... + + def run_sync( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: OutputSpec[RunOutputDataT] | None = None, + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + **_deprecated_kwargs: Never, + ) -> AgentRunResult[Any]: + """Synchronously run the agent with a user prompt. + + This is a convenience method that wraps [`self.run`][pydantic_ai.agent.AbstractAgent.run] with `loop.run_until_complete(...)`. + You therefore can't use this method inside async code or if there's an active event loop. + + Example: + ```python + from pydantic_ai import Agent + + agent = Agent('openai:gpt-4o') + + result_sync = agent.run_sync('What is the capital of Italy?') + print(result_sync.output) + #> The capital of Italy is Rome. + ``` + + Args: + user_prompt: User input to start/continue the conversation. + output_type: Custom output type to use for this run, `output_type` may only be used if the agent has no + output validators since output validators would expect an argument that matches the agent's output type. + message_history: History of the conversation so far. + deferred_tool_results: Optional results for deferred tool calls in the message history. + model: Optional model to use for this run, required if `model` was not set when creating the agent. + deps: Optional dependencies to use for this run. + model_settings: Optional settings to use for this model's request. + usage_limits: Optional limits on model request count or token usage. + usage: Optional usage to start with, useful for resuming a conversation or agents used in tools. + infer_name: Whether to try to infer the agent name from the call frame if it's not set. + toolsets: Optional additional toolsets for this run. + event_stream_handler: Optional event stream handler to use for this run. + + Returns: + The result of the run. 
+ """ + return self.dbos_wrapped_run_sync_workflow( + user_prompt, + output_type=output_type, + message_history=message_history, + deferred_tool_results=deferred_tool_results, + model=model, + deps=deps, + model_settings=model_settings, + usage_limits=usage_limits, + usage=usage, + infer_name=infer_name, + toolsets=toolsets, + event_stream_handler=event_stream_handler, + **_deprecated_kwargs, + ) + + @overload + def run_stream( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: None = None, + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + ) -> AbstractAsyncContextManager[StreamedRunResult[AgentDepsT, OutputDataT]]: ... + + @overload + def run_stream( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: OutputSpec[RunOutputDataT], + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + ) -> AbstractAsyncContextManager[StreamedRunResult[AgentDepsT, RunOutputDataT]]: ... + + @asynccontextmanager + async def run_stream( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: OutputSpec[RunOutputDataT] | None = None, + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + event_stream_handler: EventStreamHandler[AgentDepsT] | None = None, + **_deprecated_kwargs: Never, + ) -> AsyncIterator[StreamedRunResult[AgentDepsT, Any]]: + """Run the agent with a user prompt in async mode, returning a streamed response. + + Example: + ```python + from pydantic_ai import Agent + + agent = Agent('openai:gpt-4o') + + async def main(): + async with agent.run_stream('What is the capital of the UK?') as response: + print(await response.get_output()) + #> The capital of the UK is London. + ``` + + Args: + user_prompt: User input to start/continue the conversation. + output_type: Custom output type to use for this run, `output_type` may only be used if the agent has no + output validators since output validators would expect an argument that matches the agent's output type. + message_history: History of the conversation so far. + deferred_tool_results: Optional results for deferred tool calls in the message history. 
+ model: Optional model to use for this run, required if `model` was not set when creating the agent. + deps: Optional dependencies to use for this run. + model_settings: Optional settings to use for this model's request. + usage_limits: Optional limits on model request count or token usage. + usage: Optional usage to start with, useful for resuming a conversation or agents used in tools. + infer_name: Whether to try to infer the agent name from the call frame if it's not set. + toolsets: Optional additional toolsets for this run. + event_stream_handler: Optional event stream handler to use for this run. It will receive all the events up until the final result is found, which you can then read or stream from inside the context manager. + + Returns: + The result of the run. + """ + if DBOS.workflow_id is not None and DBOS.step_id is None: + raise UserError( + '`agent.run_stream()` cannot currently be used inside a DBOS workflow. ' + 'Set an `event_stream_handler` on the agent and use `agent.run()` instead. ' + 'Please file an issue if this is not sufficient for your use case.' + ) + + async with super().run_stream( + user_prompt, + output_type=output_type, + message_history=message_history, + deferred_tool_results=deferred_tool_results, + model=model, + deps=deps, + model_settings=model_settings, + usage_limits=usage_limits, + usage=usage, + infer_name=infer_name, + toolsets=toolsets, + event_stream_handler=event_stream_handler, + **_deprecated_kwargs, + ) as result: + yield result + + @overload + def iter( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: None = None, + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + **_deprecated_kwargs: Never, + ) -> AbstractAsyncContextManager[AgentRun[AgentDepsT, OutputDataT]]: ... + + @overload + def iter( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: OutputSpec[RunOutputDataT], + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + **_deprecated_kwargs: Never, + ) -> AbstractAsyncContextManager[AgentRun[AgentDepsT, RunOutputDataT]]: ... 
+ + @asynccontextmanager + async def iter( + self, + user_prompt: str | Sequence[_messages.UserContent] | None = None, + *, + output_type: OutputSpec[RunOutputDataT] | None = None, + message_history: list[_messages.ModelMessage] | None = None, + deferred_tool_results: DeferredToolResults | None = None, + model: models.Model | models.KnownModelName | str | None = None, + deps: AgentDepsT = None, + model_settings: ModelSettings | None = None, + usage_limits: _usage.UsageLimits | None = None, + usage: _usage.RunUsage | None = None, + infer_name: bool = True, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | None = None, + **_deprecated_kwargs: Never, + ) -> AsyncIterator[AgentRun[AgentDepsT, Any]]: + """A contextmanager which can be used to iterate over the agent graph's nodes as they are executed. + + This method builds an internal agent graph (using system prompts, tools and output schemas) and then returns an + `AgentRun` object. The `AgentRun` can be used to async-iterate over the nodes of the graph as they are + executed. This is the API to use if you want to consume the outputs coming from each LLM model response, or the + stream of events coming from the execution of tools. + + The `AgentRun` also provides methods to access the full message history, new messages, and usage statistics, + and the final result of the run once it has completed. + + For more details, see the documentation of `AgentRun`. + + Example: + ```python + from pydantic_ai import Agent + + agent = Agent('openai:gpt-4o') + + async def main(): + nodes = [] + async with agent.iter('What is the capital of France?') as agent_run: + async for node in agent_run: + nodes.append(node) + print(nodes) + ''' + [ + UserPromptNode( + user_prompt='What is the capital of France?', + instructions=None, + instructions_functions=[], + system_prompts=(), + system_prompt_functions=[], + system_prompt_dynamic_functions={}, + ), + ModelRequestNode( + request=ModelRequest( + parts=[ + UserPromptPart( + content='What is the capital of France?', + timestamp=datetime.datetime(...), + ) + ] + ) + ), + CallToolsNode( + model_response=ModelResponse( + parts=[TextPart(content='The capital of France is Paris.')], + usage=RequestUsage(input_tokens=56, output_tokens=7), + model_name='gpt-4o', + timestamp=datetime.datetime(...), + ) + ), + End(data=FinalResult(output='The capital of France is Paris.')), + ] + ''' + print(agent_run.result.output) + #> The capital of France is Paris. + ``` + + Args: + user_prompt: User input to start/continue the conversation. + output_type: Custom output type to use for this run, `output_type` may only be used if the agent has no + output validators since output validators would expect an argument that matches the agent's output type. + message_history: History of the conversation so far. + deferred_tool_results: Optional results for deferred tool calls in the message history. + model: Optional model to use for this run, required if `model` was not set when creating the agent. + deps: Optional dependencies to use for this run. + model_settings: Optional settings to use for this model's request. + usage_limits: Optional limits on model request count or token usage. + usage: Optional usage to start with, useful for resuming a conversation or agents used in tools. + infer_name: Whether to try to infer the agent name from the call frame if it's not set. + toolsets: Optional additional toolsets for this run. + + Returns: + The result of the run. 
+ """ + if model is not None and not isinstance(model, DBOSModel): + raise UserError( + 'Non-DBOS model cannot be set at agent run time inside a DBOS workflow, it must be set at agent creation time.' + ) + + with self._dbos_overrides(): + async with super().iter( + user_prompt=user_prompt, + output_type=output_type, + message_history=message_history, + deferred_tool_results=deferred_tool_results, + model=model, + deps=deps, + model_settings=model_settings, + usage_limits=usage_limits, + usage=usage, + infer_name=infer_name, + toolsets=toolsets, + **_deprecated_kwargs, + ) as run: + yield run + + @contextmanager + def override( + self, + *, + deps: AgentDepsT | _utils.Unset = _utils.UNSET, + model: models.Model | models.KnownModelName | str | _utils.Unset = _utils.UNSET, + toolsets: Sequence[AbstractToolset[AgentDepsT]] | _utils.Unset = _utils.UNSET, + tools: Sequence[Tool[AgentDepsT] | ToolFuncEither[AgentDepsT, ...]] | _utils.Unset = _utils.UNSET, + ) -> Iterator[None]: + """Context manager to temporarily override agent dependencies, model, toolsets, or tools. + + This is particularly useful when testing. + You can find an example of this [here](../testing.md#overriding-model-via-pytest-fixtures). + + Args: + deps: The dependencies to use instead of the dependencies passed to the agent run. + model: The model to use instead of the model passed to the agent run. + toolsets: The toolsets to use instead of the toolsets passed to the agent constructor and agent run. + tools: The tools to use instead of the tools registered with the agent. + """ + if _utils.is_set(model) and not isinstance(model, (DBOSModel)): + raise UserError( + 'Non-DBOS model cannot be contextually overridden inside a DBOS workflow, it must be set at agent creation time.' + ) + + with super().override(deps=deps, model=model, toolsets=toolsets, tools=tools): + yield diff --git a/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_mcp_server.py b/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_mcp_server.py new file mode 100644 index 0000000000..f293cc025b --- /dev/null +++ b/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_mcp_server.py @@ -0,0 +1,89 @@ +from __future__ import annotations + +from abc import ABC +from collections.abc import Callable +from typing import Any + +from dbos import DBOS +from typing_extensions import Self + +from pydantic_ai.mcp import MCPServer, ToolResult +from pydantic_ai.tools import AgentDepsT, RunContext +from pydantic_ai.toolsets.abstract import AbstractToolset, ToolsetTool +from pydantic_ai.toolsets.wrapper import WrapperToolset + +from ._utils import StepConfig + + +class DBOSMCPServer(WrapperToolset[AgentDepsT], ABC): + """A wrapper for MCPServer that integrates with DBOS, turning call_tool and get_tools to DBOS steps.""" + + def __init__( + self, + wrapped: MCPServer, + *, + step_name_prefix: str, + step_config: StepConfig, + ): + super().__init__(wrapped) + self._step_config = step_config or {} + self._step_name_prefix = step_name_prefix + id_suffix = f'__{wrapped.id}' if wrapped.id else '' + self._name = f'{step_name_prefix}__mcp_server{id_suffix}' + + # Wrap get_tools in a DBOS step. + @DBOS.step( + name=f'{self._name}.get_tools', + **self._step_config, + ) + async def wrapped_get_tools_step( + ctx: RunContext[AgentDepsT], + ) -> dict[str, ToolsetTool[AgentDepsT]]: + return await super(DBOSMCPServer, self).get_tools(ctx) + + self._dbos_wrapped_get_tools_step = wrapped_get_tools_step + + # Wrap call_tool in a DBOS step. 
+ @DBOS.step( + name=f'{self._name}.call_tool', + **self._step_config, + ) + async def wrapped_call_tool_step( + name: str, + tool_args: dict[str, Any], + ctx: RunContext[AgentDepsT], + tool: ToolsetTool[AgentDepsT], + ) -> ToolResult: + return await super(DBOSMCPServer, self).call_tool(name, tool_args, ctx, tool) + + self._dbos_wrapped_call_tool_step = wrapped_call_tool_step + + @property + def id(self) -> str | None: + return self.wrapped.id + + async def __aenter__(self) -> Self: + # The wrapped MCPServer enters itself around listing and calling tools + # so we don't need to enter it here (nor could we because we're not inside a DBOS step). + return self + + async def __aexit__(self, *args: Any) -> bool | None: + return None + + def visit_and_replace( + self, visitor: Callable[[AbstractToolset[AgentDepsT]], AbstractToolset[AgentDepsT]] + ) -> AbstractToolset[AgentDepsT]: + # DBOS-ified toolsets cannot be swapped out after the fact. + return self + + async def get_tools(self, ctx: RunContext[AgentDepsT]) -> dict[str, ToolsetTool[AgentDepsT]]: + return await self._dbos_wrapped_get_tools_step(ctx) + + async def call_tool( + self, + name: str, + tool_args: dict[str, Any], + ctx: RunContext[AgentDepsT], + tool: ToolsetTool[AgentDepsT], + ) -> ToolResult: + return await self._dbos_wrapped_call_tool_step(name, tool_args, ctx, tool) diff --git a/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_model.py b/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_model.py new file mode 100644 index 0000000000..269b19cf8d --- /dev/null +++ b/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_model.py @@ -0,0 +1,137 @@ +from __future__ import annotations + +from collections.abc import AsyncIterator +from contextlib import asynccontextmanager +from datetime import datetime +from typing import Any + +from dbos import DBOS + +from pydantic_ai.agent import EventStreamHandler +from pydantic_ai.messages import ( + ModelMessage, + ModelResponse, + ModelResponseStreamEvent, +) +from pydantic_ai.models import Model, ModelRequestParameters, StreamedResponse +from pydantic_ai.models.wrapper import WrapperModel +from pydantic_ai.settings import ModelSettings +from pydantic_ai.tools import RunContext +from pydantic_ai.usage import RequestUsage + +from ._utils import StepConfig + + +class DBOSStreamedResponse(StreamedResponse): + def __init__(self, model_request_parameters: ModelRequestParameters, response: ModelResponse): + super().__init__(model_request_parameters) + self.response = response + + async def _get_event_iterator(self) -> AsyncIterator[ModelResponseStreamEvent]: + return + # noinspection PyUnreachableCode + yield + + def get(self) -> ModelResponse: + return self.response + + def usage(self) -> RequestUsage: + return self.response.usage # pragma: no cover + + @property + def model_name(self) -> str: + return self.response.model_name or '' # pragma: no cover + + @property + def provider_name(self) -> str: + return self.response.provider_name or '' # pragma: no cover + + @property + def timestamp(self) -> datetime: + return self.response.timestamp # pragma: no cover + + +class DBOSModel(WrapperModel): + """A wrapper for Model that integrates with DBOS, turning request and request_stream to DBOS steps.""" + + def __init__( + self, + model: Model, + *, + step_name_prefix: str, + step_config: StepConfig, + event_stream_handler: EventStreamHandler[Any] | None = None, + ): + super().__init__(model) + self.step_config = step_config + self.event_stream_handler = event_stream_handler + self._step_name_prefix = 
step_name_prefix + + # Wrap the request in a DBOS step. + @DBOS.step( + name=f'{self._step_name_prefix}__model.request', + **self.step_config, + ) + async def wrapped_request_step( + messages: list[ModelMessage], + model_settings: ModelSettings | None, + model_request_parameters: ModelRequestParameters, + ) -> ModelResponse: + return await super(DBOSModel, self).request(messages, model_settings, model_request_parameters) + + self._dbos_wrapped_request_step = wrapped_request_step + + # Wrap the request_stream in a DBOS step. + @DBOS.step( + name=f'{self._step_name_prefix}__model.request_stream', + **self.step_config, + ) + async def wrapped_request_stream_step( + messages: list[ModelMessage], + model_settings: ModelSettings | None, + model_request_parameters: ModelRequestParameters, + run_context: RunContext[Any] | None = None, + ) -> ModelResponse: + async with super(DBOSModel, self).request_stream( + messages, model_settings, model_request_parameters, run_context + ) as streamed_response: + if self.event_stream_handler is not None: + assert run_context is not None, ( + 'A DBOS model cannot be used with `pydantic_ai.direct.model_request_stream()` as it requires a `run_context`. Set an `event_stream_handler` on the agent and use `agent.run()` instead.' + ) + await self.event_stream_handler(run_context, streamed_response) + + async for _ in streamed_response: + pass + return streamed_response.get() + + self._dbos_wrapped_request_stream_step = wrapped_request_stream_step + + async def request( + self, + messages: list[ModelMessage], + model_settings: ModelSettings | None, + model_request_parameters: ModelRequestParameters, + ) -> ModelResponse: + return await self._dbos_wrapped_request_step(messages, model_settings, model_request_parameters) + + @asynccontextmanager + async def request_stream( + self, + messages: list[ModelMessage], + model_settings: ModelSettings | None, + model_request_parameters: ModelRequestParameters, + run_context: RunContext[Any] | None = None, + ) -> AsyncIterator[StreamedResponse]: + # If not in a workflow (could be in a step), just call the wrapped request_stream method. 
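+        # DBOS.workflow_id is None when there is no workflow context; DBOS.step_id is set when we are
+        # already executing inside a step. In either case the underlying model can stream directly.
+        # Otherwise the stream is consumed inside the step below and replayed as a DBOSStreamedResponse.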
+ if DBOS.workflow_id is None or DBOS.step_id is not None: + async with super().request_stream( + messages, model_settings, model_request_parameters, run_context + ) as streamed_response: + yield streamed_response + return + + response = await self._dbos_wrapped_request_stream_step( + messages, model_settings, model_request_parameters, run_context + ) + yield DBOSStreamedResponse(model_request_parameters, response) diff --git a/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_utils.py b/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_utils.py new file mode 100644 index 0000000000..7be0593011 --- /dev/null +++ b/pydantic_ai_slim/pydantic_ai/durable_exec/dbos/_utils.py @@ -0,0 +1,10 @@ +from typing_extensions import TypedDict + + +class StepConfig(TypedDict, total=False): + """Configuration for a step in the DBOS workflow.""" + + retries_allowed: bool + interval_seconds: float + max_attempts: int + backoff_rate: float diff --git a/pydantic_ai_slim/pydantic_ai/mcp.py b/pydantic_ai_slim/pydantic_ai/mcp.py index c3d0b3bbc4..3edd0925fe 100644 --- a/pydantic_ai_slim/pydantic_ai/mcp.py +++ b/pydantic_ai_slim/pydantic_ai/mcp.py @@ -517,7 +517,7 @@ def __repr__(self) -> str: f'args={self.args!r}', ] if self.id: - repr_args.append(f'id={self.id!r}') # pragma: no cover + repr_args.append(f'id={self.id!r}') return f'{self.__class__.__name__}({", ".join(repr_args)})' diff --git a/pydantic_ai_slim/pyproject.toml b/pydantic_ai_slim/pyproject.toml index 41e165d215..7145926b8f 100644 --- a/pydantic_ai_slim/pyproject.toml +++ b/pydantic_ai_slim/pyproject.toml @@ -98,6 +98,8 @@ ag-ui = ["ag-ui-protocol>=0.1.8", "starlette>=0.45.3"] retries = ["tenacity>=8.2.3"] # Temporal temporal = ["temporalio==1.17.0"] +# DBOS +dbos = ["dbos>=1.13.0"] [tool.hatch.metadata] allow-direct-references = true diff --git a/pydantic_evals/pydantic_evals/otel/_context_in_memory_span_exporter.py b/pydantic_evals/pydantic_evals/otel/_context_in_memory_span_exporter.py index 05d4d6bff9..5a25914861 100644 --- a/pydantic_evals/pydantic_evals/otel/_context_in_memory_span_exporter.py +++ b/pydantic_evals/pydantic_evals/otel/_context_in_memory_span_exporter.py @@ -18,7 +18,7 @@ ) _LOGFIRE_IS_INSTALLED = True -except ImportError: # pragma: no cover +except ImportError: # pragma: lax no cover _LOGFIRE_IS_INSTALLED = False # pyright: ignore[reportConstantRedefinition] # Ensure that we can do an isinstance check without erroring diff --git a/pyproject.toml b/pyproject.toml index 4237a55b64..6b647d78e8 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -52,6 +52,7 @@ dependencies = [ [tool.hatch.metadata.hooks.uv-dynamic-versioning.optional-dependencies] examples = ["pydantic-ai-examples=={{ version }}"] a2a = ["fasta2a>=0.4.1"] +dbos = ["pydantic-ai-slim[dbos]=={{ version }}"] [project.urls] Homepage = "https://ai.pydantic.dev" diff --git a/tests/cassettes/test_dbos/test_complex_agent_run.yaml b/tests/cassettes/test_dbos/test_complex_agent_run.yaml new file mode 100644 index 0000000000..a567fd8632 --- /dev/null +++ b/tests/cassettes/test_dbos/test_complex_agent_run.yaml @@ -0,0 +1,955 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4294' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: 
required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: "Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. 
+ name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C2QD1kGWsTW5OWiqAtOSFEAOfPfQH","object":"chat.completion.chunk","created":1754693439,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"role":"assistant","content":null},"logprobs":null,"finish_reason":null}],"obfuscation":"3Y6yaTIPlXV"} + + data: {"id":"chatcmpl-C2QD1kGWsTW5OWiqAtOSFEAOfPfQH","object":"chat.completion.chunk","created":1754693439,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"id":"call_q2UyBRP7eXNTzAoR8lEhjc9Z","type":"function","function":{"name":"get_country","arguments":""}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"g7emiFlCcG"} + + data: {"id":"chatcmpl-C2QD1kGWsTW5OWiqAtOSFEAOfPfQH","object":"chat.completion.chunk","created":1754693439,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{}"}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"S8ct"} + + data: {"id":"chatcmpl-C2QD1kGWsTW5OWiqAtOSFEAOfPfQH","object":"chat.completion.chunk","created":1754693439,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":1,"id":"call_b51ijcpFkDiTQG1bQzsrmtW5","type":"function","function":{"name":"get_product_name","arguments":""}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"uomn5"} + + data: {"id":"chatcmpl-C2QD1kGWsTW5OWiqAtOSFEAOfPfQH","object":"chat.completion.chunk","created":1754693439,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":1,"function":{"arguments":"{}"}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"yrJZ"} + + data: {"id":"chatcmpl-C2QD1kGWsTW5OWiqAtOSFEAOfPfQH","object":"chat.completion.chunk","created":1754693439,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"onPp"} + + data: {"id":"chatcmpl-C2QD1kGWsTW5OWiqAtOSFEAOfPfQH","object":"chat.completion.chunk","created":1754693439,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[],"usage":{"prompt_tokens":364,"completion_tokens":40,"total_tokens":404,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"F1wZCrV0lEsbu"} + + data: [DONE] + + headers: + access-control-expose-headers: + - 
X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '906' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4720' + content-type: + - application/json + cookie: + - __cf_bm=zft6YwMMTPDlJv9nFzcyqjatDLWL51IWoXRjiBo8lxg-1754693440-1.0.1.1-FT8HyZDjEEZIx76hp4IYd2Ke6ga_YHmugNrkwXCbkPQJM7bAIax9kMz_DGNsvY5gt.sE2g60Jc0zEEp43vK95vUKIk62fCzcc3i.7eygET0; + _cfuvid=I3WeF5lZoAzwC31zFIarQCYAjRSXcSCKQ0Z8Szv00_0-1754693440668-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + - role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country + id: call_q2UyBRP7eXNTzAoR8lEhjc9Z + type: function + - function: + arguments: '{}' + name: get_product_name + id: call_b51ijcpFkDiTQG1bQzsrmtW5 + type: function + - content: Mexico + role: tool + tool_call_id: call_q2UyBRP7eXNTzAoR8lEhjc9Z + - content: Pydantic AI + role: tool + tool_call_id: call_b51ijcpFkDiTQG1bQzsrmtW5 + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: "Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} 
+ type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. + name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C2QD2NQfRbWW5ww5we2oDjS1mgHtK","object":"chat.completion.chunk","created":1754693440,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"role":"assistant","content":null,"tool_calls":[{"index":0,"id":"call_LwxJUB9KppVyogRRLQsamRJv","type":"function","function":{"name":"get_weather","arguments":""}}],"refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"NA5VZTdCK"} + + data: {"id":"chatcmpl-C2QD2NQfRbWW5ww5we2oDjS1mgHtK","object":"chat.completion.chunk","created":1754693440,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"MYr"} + + data: {"id":"chatcmpl-C2QD2NQfRbWW5ww5we2oDjS1mgHtK","object":"chat.completion.chunk","created":1754693440,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"city"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"N0"} + + data: 
{"id":"chatcmpl-C2QD2NQfRbWW5ww5we2oDjS1mgHtK","object":"chat.completion.chunk","created":1754693440,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"S"} + + data: {"id":"chatcmpl-C2QD2NQfRbWW5ww5we2oDjS1mgHtK","object":"chat.completion.chunk","created":1754693440,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C2QD2NQfRbWW5ww5we2oDjS1mgHtK","object":"chat.completion.chunk","created":1754693440,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" City"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"P"} + + data: {"id":"chatcmpl-C2QD2NQfRbWW5ww5we2oDjS1mgHtK","object":"chat.completion.chunk","created":1754693440,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"FLs"} + + data: {"id":"chatcmpl-C2QD2NQfRbWW5ww5we2oDjS1mgHtK","object":"chat.completion.chunk","created":1754693440,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"d0LO"} + + data: {"id":"chatcmpl-C2QD2NQfRbWW5ww5we2oDjS1mgHtK","object":"chat.completion.chunk","created":1754693440,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[],"usage":{"prompt_tokens":423,"completion_tokens":15,"total_tokens":438,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"o9nP5OYjURlfY"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '379' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4969' + content-type: + - application/json + cookie: + - __cf_bm=zft6YwMMTPDlJv9nFzcyqjatDLWL51IWoXRjiBo8lxg-1754693440-1.0.1.1-FT8HyZDjEEZIx76hp4IYd2Ke6ga_YHmugNrkwXCbkPQJM7bAIax9kMz_DGNsvY5gt.sE2g60Jc0zEEp43vK95vUKIk62fCzcc3i.7eygET0; + _cfuvid=I3WeF5lZoAzwC31zFIarQCYAjRSXcSCKQ0Z8Szv00_0-1754693440668-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + - role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country + id: call_q2UyBRP7eXNTzAoR8lEhjc9Z + type: function + 
- function: + arguments: '{}' + name: get_product_name + id: call_b51ijcpFkDiTQG1bQzsrmtW5 + type: function + - content: Mexico + role: tool + tool_call_id: call_q2UyBRP7eXNTzAoR8lEhjc9Z + - content: Pydantic AI + role: tool + tool_call_id: call_b51ijcpFkDiTQG1bQzsrmtW5 + - role: assistant + tool_calls: + - function: + arguments: '{"city":"Mexico City"}' + name: get_weather + id: call_LwxJUB9KppVyogRRLQsamRJv + type: function + - content: sunny + role: tool + tool_call_id: call_LwxJUB9KppVyogRRLQsamRJv + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: "Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + 
type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. + name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"role":"assistant","content":null,"tool_calls":[{"index":0,"id":"call_CCGIWaMeYWmxOQ91orkmTvzn","type":"function","function":{"name":"final_result","arguments":""}}],"refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"3HGTeJmv"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"5oT"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answers"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Bkpoehp1mPtgpuo"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":["}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"XR"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"OrV"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"p"} + + data: 
{"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"r"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Capital"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"hPFa7nYTy1kqfHY"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"x"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"1"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"The"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Dfa"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" capital"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"uoFfXckIZzkxsw"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" of"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"4Iq"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"0YTqpF2Qa9jYzl0"} + + data: 
{"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" is"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"jh5"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"rbLc3P4VrYs4CjM"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" City"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"5"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":".\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Vu2"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"},{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"i"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"2"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"G"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Weather"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"G4dA7nClk85oh1y"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Q"} + + data: 
{"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"I"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"The"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"i98"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" weather"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"vXmI31omHBLPWx"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" in"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"8Z7"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"O2h4Rnr8tNEmqNy"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" City"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"8"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" is"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"dS8"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" currently"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"PaxQJe8HpcFV"} + + data: 
{"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" sunny"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":".\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"oZX"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"},{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"x"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"J"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"s"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Product"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"zHxJUlHIGtbB0wY"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" Name"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"l"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"b"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: 
{"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"j"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"The"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"TlE"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" product"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"10zmpKfDgdH40s"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" name"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"t"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" is"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"YN8"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" P"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Cy29"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"yd"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"skZD"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"antic"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"q"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" AI"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"be9"} + + data: 
{"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":".\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"5y4"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"IH76k"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"]}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"5HnB"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"P7V8"} + + data: {"id":"chatcmpl-C2QD4vblfNcSDeoXmULJR4umoKNqY","object":"chat.completion.chunk","created":1754693442,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[],"usage":{"prompt_tokens":448,"completion_tokens":62,"total_tokens":510,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"dRC1SdJDw80tk"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '482' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +version: 1 +... 
diff --git a/tests/cassettes/test_dbos/test_complex_agent_run_in_workflow.yaml b/tests/cassettes/test_dbos/test_complex_agent_run_in_workflow.yaml new file mode 100644 index 0000000000..cf50925805 --- /dev/null +++ b/tests/cassettes/test_dbos/test_complex_agent_run_in_workflow.yaml @@ -0,0 +1,929 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4294' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: "Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + 
description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. + name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C1KMEUDb1vVwsROQUCZTgG6A6vtWo","object":"chat.completion.chunk","created":1754432618,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","usage":null,"choices":[{"index":0,"delta":{"role":"assistant","content":null},"logprobs":null,"finish_reason":null}],"obfuscation":"jP9abrn9XF3"} + + data: {"id":"chatcmpl-C1KMEUDb1vVwsROQUCZTgG6A6vtWo","object":"chat.completion.chunk","created":1754432618,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"id":"call_3rqTYrA6H21AYUaRGP4F66oq","type":"function","function":{"name":"get_country","arguments":""}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"8RXlE4Z5NT"} + + data: {"id":"chatcmpl-C1KMEUDb1vVwsROQUCZTgG6A6vtWo","object":"chat.completion.chunk","created":1754432618,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{}"}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"CNtv"} + + data: {"id":"chatcmpl-C1KMEUDb1vVwsROQUCZTgG6A6vtWo","object":"chat.completion.chunk","created":1754432618,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":1,"id":"call_Xw9XMKBJU48kAAd78WgIswDx","type":"function","function":{"name":"get_product_name","arguments":""}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"TomlO"} + + data: {"id":"chatcmpl-C1KMEUDb1vVwsROQUCZTgG6A6vtWo","object":"chat.completion.chunk","created":1754432618,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":1,"function":{"arguments":"{}"}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"4Gko"} + + data: {"id":"chatcmpl-C1KMEUDb1vVwsROQUCZTgG6A6vtWo","object":"chat.completion.chunk","created":1754432618,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"GtnJ"} + + data: 
{"id":"chatcmpl-C1KMEUDb1vVwsROQUCZTgG6A6vtWo","object":"chat.completion.chunk","created":1754432618,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[],"usage":{"prompt_tokens":364,"completion_tokens":40,"total_tokens":404,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"Ohcj4NkgRRFLL"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '756' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4720' + content-type: + - application/json + cookie: + - __cf_bm=glnXI8zJVAJ0K_QdEGsOKKRgWNfzACGXNWXiwREzaLg-1754432619-1.0.1.1-_Ef07EWA.dA.ieqsA1ZV5wshb3Z4zXVgZ6bbQCLkVEXqzEUQ4cPSApZhDGjVWQMg9aEywh0CfTkaZwvW0rjDh_nKfoZ5Cc8fMVgN5gAyMNc; + _cfuvid=Ckz8lfebgV0n8QtvIIYuQIvlcwwiwc67I0Aw.L8t4rM-1754432619011-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + - role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country + id: call_3rqTYrA6H21AYUaRGP4F66oq + type: function + - function: + arguments: '{}' + name: get_product_name + id: call_Xw9XMKBJU48kAAd78WgIswDx + type: function + - content: Mexico + role: tool + tool_call_id: call_3rqTYrA6H21AYUaRGP4F66oq + - content: Pydantic AI + role: tool + tool_call_id: call_Xw9XMKBJU48kAAd78WgIswDx + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: "Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + 
description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. 
+ name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C1KMJC4uUHgeJ4A0e8jM8wufrmdxX","object":"chat.completion.chunk","created":1754432623,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"role":"assistant","content":null,"tool_calls":[{"index":0,"id":"call_Vz0Sie91Ap56nH0ThKGrZXT7","type":"function","function":{"name":"get_weather","arguments":""}}],"refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"pJt0pVt5b"} + + data: {"id":"chatcmpl-C1KMJC4uUHgeJ4A0e8jM8wufrmdxX","object":"chat.completion.chunk","created":1754432623,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"u17"} + + data: {"id":"chatcmpl-C1KMJC4uUHgeJ4A0e8jM8wufrmdxX","object":"chat.completion.chunk","created":1754432623,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"city"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"W5"} + + data: {"id":"chatcmpl-C1KMJC4uUHgeJ4A0e8jM8wufrmdxX","object":"chat.completion.chunk","created":1754432623,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"R"} + + data: {"id":"chatcmpl-C1KMJC4uUHgeJ4A0e8jM8wufrmdxX","object":"chat.completion.chunk","created":1754432623,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C1KMJC4uUHgeJ4A0e8jM8wufrmdxX","object":"chat.completion.chunk","created":1754432623,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" City"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"d"} + + data: {"id":"chatcmpl-C1KMJC4uUHgeJ4A0e8jM8wufrmdxX","object":"chat.completion.chunk","created":1754432623,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"y7y"} + + data: 
{"id":"chatcmpl-C1KMJC4uUHgeJ4A0e8jM8wufrmdxX","object":"chat.completion.chunk","created":1754432623,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"Wj82"} + + data: {"id":"chatcmpl-C1KMJC4uUHgeJ4A0e8jM8wufrmdxX","object":"chat.completion.chunk","created":1754432623,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[],"usage":{"prompt_tokens":423,"completion_tokens":15,"total_tokens":438,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"Umu4rjZrtKjmq"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '535' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4969' + content-type: + - application/json + cookie: + - __cf_bm=glnXI8zJVAJ0K_QdEGsOKKRgWNfzACGXNWXiwREzaLg-1754432619-1.0.1.1-_Ef07EWA.dA.ieqsA1ZV5wshb3Z4zXVgZ6bbQCLkVEXqzEUQ4cPSApZhDGjVWQMg9aEywh0CfTkaZwvW0rjDh_nKfoZ5Cc8fMVgN5gAyMNc; + _cfuvid=Ckz8lfebgV0n8QtvIIYuQIvlcwwiwc67I0Aw.L8t4rM-1754432619011-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + - role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country + id: call_3rqTYrA6H21AYUaRGP4F66oq + type: function + - function: + arguments: '{}' + name: get_product_name + id: call_Xw9XMKBJU48kAAd78WgIswDx + type: function + - content: Mexico + role: tool + tool_call_id: call_3rqTYrA6H21AYUaRGP4F66oq + - content: Pydantic AI + role: tool + tool_call_id: call_Xw9XMKBJU48kAAd78WgIswDx + - role: assistant + tool_calls: + - function: + arguments: '{"city":"Mexico City"}' + name: get_weather + id: call_Vz0Sie91Ap56nH0ThKGrZXT7 + type: function + - content: sunny + role: tool + tool_call_id: call_Vz0Sie91Ap56nH0ThKGrZXT7 + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: "Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n 
The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. 
+ name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"role":"assistant","content":null,"tool_calls":[{"index":0,"id":"call_4kc6691zCzjPnOuEtbEGUvz2","type":"function","function":{"name":"final_result","arguments":""}}],"refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"fJykmX6H"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"MvE"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answers"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"dktqfAJehiLPjyt"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":["}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"1O"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Lvw"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"f"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"U"} + + data: 
{"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Capital"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"dfpOaGxthacFGXR"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" of"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"ofN"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" the"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Lw"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" country"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"dTcI51M3iiQGmY"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"8"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"5"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" City"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"l"} + + data: 
{"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"},{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"joxN8cSzo5Vtw0V"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"H"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"x"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Weather"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"naC5I0l5UxNvni5"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" in"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Osv"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" the"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Hz"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" capital"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"XpkABYJn503NY0"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"D"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: 
{"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"u"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Sunny"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Y"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"},{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"q6lCElEngpao86s"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"l"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"p"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Product"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"xxsXQcQiDZz87WR"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" Name"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"z"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"g"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: 
{"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"i"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"P"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"4sQOS"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"yd"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"cEjx"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"antic"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"l"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" AI"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"1Fy"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"bId"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"]}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"YeZF"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"T9AO"} + + data: {"id":"chatcmpl-C1KMMrEA9QLIX25pFjKjoRdNkO0nN","object":"chat.completion.chunk","created":1754432626,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[],"usage":{"prompt_tokens":448,"completion_tokens":49,"total_tokens":497,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"0pSy41lq4PYDj"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; 
charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '585' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +version: 1 +... diff --git a/tests/cassettes/test_dbos/test_complex_agent_run_stream_in_workflow.yaml b/tests/cassettes/test_dbos/test_complex_agent_run_stream_in_workflow.yaml new file mode 100644 index 0000000000..99f14a3a15 --- /dev/null +++ b/tests/cassettes/test_dbos/test_complex_agent_run_stream_in_workflow.yaml @@ -0,0 +1,929 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4294' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: "Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + 
type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. + name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C3OoWwTCYbJ255lFXwWSCx64R8I6u","object":"chat.completion.chunk","created":1754926404,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"role":"assistant","content":null},"logprobs":null,"finish_reason":null}],"obfuscation":"2RzOMz6snzg"} + + data: {"id":"chatcmpl-C3OoWwTCYbJ255lFXwWSCx64R8I6u","object":"chat.completion.chunk","created":1754926404,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"id":"call_w5dFfgZ9tHc5AZxppyYFIuHl","type":"function","function":{"name":"get_country","arguments":""}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"irDFsH5lzC"} + + data: {"id":"chatcmpl-C3OoWwTCYbJ255lFXwWSCx64R8I6u","object":"chat.completion.chunk","created":1754926404,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{}"}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"MRXh"} + + data: {"id":"chatcmpl-C3OoWwTCYbJ255lFXwWSCx64R8I6u","object":"chat.completion.chunk","created":1754926404,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":1,"id":"call_8Ks83PNrVg7CjrgfA49cI6SR","type":"function","function":{"name":"get_product_name","arguments":""}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"qI2OI"} + + data: 
{"id":"chatcmpl-C3OoWwTCYbJ255lFXwWSCx64R8I6u","object":"chat.completion.chunk","created":1754926404,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":1,"function":{"arguments":"{}"}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"7sxu"} + + data: {"id":"chatcmpl-C3OoWwTCYbJ255lFXwWSCx64R8I6u","object":"chat.completion.chunk","created":1754926404,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"MdGw"} + + data: {"id":"chatcmpl-C3OoWwTCYbJ255lFXwWSCx64R8I6u","object":"chat.completion.chunk","created":1754926404,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[],"usage":{"prompt_tokens":364,"completion_tokens":40,"total_tokens":404,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"RNdcIwLOaDbhd"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '1236' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4720' + content-type: + - application/json + cookie: + - __cf_bm=qvTuiRBcW7Bg1izqtzFqOtqRu0FdltcnxI8HwrPcCLo-1754926405-1.0.1.1-c_MI0lTx2_gi3xrmoPwthTmLKeKKajC5fijqXwKfaytKJqHd2aGOttqRRjKHsoZaIaF6r95i.MVP9gTVsy2TvzO.WxJZQWwnY.oJMsxWDMc; + _cfuvid=a0UouwNeD6T07eGzpM64.qD3KG4SEfRY5kZYrLzgZiU-1754926405395-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + - role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country + id: call_w5dFfgZ9tHc5AZxppyYFIuHl + type: function + - function: + arguments: '{}' + name: get_product_name + id: call_8Ks83PNrVg7CjrgfA49cI6SR + type: function + - content: Mexico + role: tool + tool_call_id: call_w5dFfgZ9tHc5AZxppyYFIuHl + - content: Pydantic AI + role: tool + tool_call_id: call_8Ks83PNrVg7CjrgfA49cI6SR + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: 
"Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. 
+ name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C3OoaxA52FVqKNOcPSRU9dAB2N72T","object":"chat.completion.chunk","created":1754926408,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"role":"assistant","content":null,"tool_calls":[{"index":0,"id":"call_oAYsUVV4qquVvieF6Dp61ipv","type":"function","function":{"name":"get_weather","arguments":""}}],"refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"vrwPo0J2E"} + + data: {"id":"chatcmpl-C3OoaxA52FVqKNOcPSRU9dAB2N72T","object":"chat.completion.chunk","created":1754926408,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"HWd"} + + data: {"id":"chatcmpl-C3OoaxA52FVqKNOcPSRU9dAB2N72T","object":"chat.completion.chunk","created":1754926408,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"city"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"rm"} + + data: {"id":"chatcmpl-C3OoaxA52FVqKNOcPSRU9dAB2N72T","object":"chat.completion.chunk","created":1754926408,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"B"} + + data: {"id":"chatcmpl-C3OoaxA52FVqKNOcPSRU9dAB2N72T","object":"chat.completion.chunk","created":1754926408,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C3OoaxA52FVqKNOcPSRU9dAB2N72T","object":"chat.completion.chunk","created":1754926408,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" City"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"P"} + + data: {"id":"chatcmpl-C3OoaxA52FVqKNOcPSRU9dAB2N72T","object":"chat.completion.chunk","created":1754926408,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"vxH"} + + data: 
{"id":"chatcmpl-C3OoaxA52FVqKNOcPSRU9dAB2N72T","object":"chat.completion.chunk","created":1754926408,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"TIRC"} + + data: {"id":"chatcmpl-C3OoaxA52FVqKNOcPSRU9dAB2N72T","object":"chat.completion.chunk","created":1754926408,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[],"usage":{"prompt_tokens":423,"completion_tokens":15,"total_tokens":438,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"b2wb3y0L5jj6U"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '368' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4969' + content-type: + - application/json + cookie: + - __cf_bm=qvTuiRBcW7Bg1izqtzFqOtqRu0FdltcnxI8HwrPcCLo-1754926405-1.0.1.1-c_MI0lTx2_gi3xrmoPwthTmLKeKKajC5fijqXwKfaytKJqHd2aGOttqRRjKHsoZaIaF6r95i.MVP9gTVsy2TvzO.WxJZQWwnY.oJMsxWDMc; + _cfuvid=a0UouwNeD6T07eGzpM64.qD3KG4SEfRY5kZYrLzgZiU-1754926405395-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + - role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country + id: call_w5dFfgZ9tHc5AZxppyYFIuHl + type: function + - function: + arguments: '{}' + name: get_product_name + id: call_8Ks83PNrVg7CjrgfA49cI6SR + type: function + - content: Mexico + role: tool + tool_call_id: call_w5dFfgZ9tHc5AZxppyYFIuHl + - content: Pydantic AI + role: tool + tool_call_id: call_8Ks83PNrVg7CjrgfA49cI6SR + - role: assistant + tool_calls: + - function: + arguments: '{"city":"Mexico City"}' + name: get_weather + id: call_oAYsUVV4qquVvieF6Dp61ipv + type: function + - content: sunny + role: tool + tool_call_id: call_oAYsUVV4qquVvieF6Dp61ipv + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: "Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n 
The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. 
+ name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"role":"assistant","content":null,"tool_calls":[{"index":0,"id":"call_nryimwPYCh3YrGpAvvDhPxpO","type":"function","function":{"name":"final_result","arguments":""}}],"refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Rsplpznn"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"mLE"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answers"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"eeAi5JV3aIjxR6s"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":["}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"FU"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"09W"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"w"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"e"} + + data: 
{"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Capital"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"WYH8sFF1slrAXp1"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" of"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"x15"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" the"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Pc"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" country"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"RX7C17YDGvCQ55"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"c"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"D"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" City"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"x"} + + data: 
{"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"},{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"FiC5ZMbgN3lsQIx"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"i"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"4"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Weather"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"rpVPFBQDh0N6oTm"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" in"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"4YY"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" the"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"T9"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" capital"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"53tq8ACYnSSFVa"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"c"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: 
{"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"9"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Sunny"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"L"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"},{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"rexyTVLA5g4TqZU"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"S"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"0"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Product"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"maad8vsYEgcs6Io"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" Name"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"9"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"8"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: 
{"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"u"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"P"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Exwy0"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"yd"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"pHPa"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"antic"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"j"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" AI"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"qQ1"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"CNK"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"]}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"x2F2"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"F5EE"} + + data: {"id":"chatcmpl-C3OodA2lxT9iakx32WWKSC5FCMZMa","object":"chat.completion.chunk","created":1754926411,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[],"usage":{"prompt_tokens":448,"completion_tokens":49,"total_tokens":497,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"L2GnXfXPSsFok"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; 
charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '512' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +version: 1 +... diff --git a/tests/cassettes/test_dbos/test_dbos_agent_iter.yaml b/tests/cassettes/test_dbos/test_dbos_agent_iter.yaml new file mode 100644 index 0000000000..6b85fce101 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_iter.yaml @@ -0,0 +1,78 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '144' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? + role: user + model: gpt-4o + stream: true + stream_options: + include_usage: true + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"role":"assistant","content":"","refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"KhEoTT6u7JgGis"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":"The"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"68kWLPsNVG6FC"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" capital"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"6d1qUFRY"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" of"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Q7kmEmvYuS4Nk"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" Mexico"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"KrDQdBujM"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" is"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"qCG5yEQHHMYd2"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" Mexico"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"uFnoTvoy9"} + + data: 
{"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" City"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"bdnHY6qXdKt"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"2WeZHWZ8r9jetpK"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}],"usage":null,"obfuscation":"7LC080uo8W"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[],"usage":{"prompt_tokens":14,"completion_tokens":8,"total_tokens":22,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":""} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '259' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +version: 1 +... diff --git a/tests/cassettes/test_dbos/test_dbos_agent_iter_in_workflow.yaml b/tests/cassettes/test_dbos/test_dbos_agent_iter_in_workflow.yaml new file mode 100644 index 0000000000..6b85fce101 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_iter_in_workflow.yaml @@ -0,0 +1,78 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '144' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? 
+ role: user + model: gpt-4o + stream: true + stream_options: + include_usage: true + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"role":"assistant","content":"","refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"KhEoTT6u7JgGis"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":"The"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"68kWLPsNVG6FC"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" capital"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"6d1qUFRY"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" of"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Q7kmEmvYuS4Nk"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" Mexico"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"KrDQdBujM"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" is"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"qCG5yEQHHMYd2"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" Mexico"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"uFnoTvoy9"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" City"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"bdnHY6qXdKt"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"2WeZHWZ8r9jetpK"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}],"usage":null,"obfuscation":"7LC080uo8W"} + + data: 
{"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[],"usage":{"prompt_tokens":14,"completion_tokens":8,"total_tokens":22,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":""} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '259' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +version: 1 +... diff --git a/tests/cassettes/test_dbos/test_dbos_agent_override_deps_in_workflow.yaml b/tests/cassettes/test_dbos/test_dbos_agent_override_deps_in_workflow.yaml new file mode 100644 index 0000000000..0d9336e72e --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_override_deps_in_workflow.yaml @@ -0,0 +1,79 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '105' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? + role: user + model: gpt-4o + stream: false + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '838' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '467' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The capital of Mexico is Mexico City. + refusal: null + role: assistant + created: 1754937502 + id: chatcmpl-C3RhWZ6jbzOaAe9fKOSr5lWGY5Qi2 + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ff25b2783a + usage: + completion_tokens: 8 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 14 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 22 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_override_tools_in_workflow.yaml b/tests/cassettes/test_dbos/test_dbos_agent_override_tools_in_workflow.yaml new file mode 100644 index 0000000000..0d9336e72e --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_override_tools_in_workflow.yaml @@ -0,0 +1,79 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '105' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? 
+ role: user + model: gpt-4o + stream: false + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '838' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '467' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The capital of Mexico is Mexico City. + refusal: null + role: assistant + created: 1754937502 + id: chatcmpl-C3RhWZ6jbzOaAe9fKOSr5lWGY5Qi2 + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ff25b2783a + usage: + completion_tokens: 8 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 14 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 22 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_run.yaml b/tests/cassettes/test_dbos/test_dbos_agent_run.yaml new file mode 100644 index 0000000000..d84fdf771f --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_run.yaml @@ -0,0 +1,79 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '105' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? + role: user + model: gpt-4o + stream: false + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '838' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '580' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The capital of Mexico is Mexico City. 
+ refusal: null + role: assistant + created: 1754688958 + id: chatcmpl-C2P2k1mRRz7KMAtppLZz83Lyy33Jl + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ff25b2783a + usage: + completion_tokens: 8 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 14 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 22 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_run_in_workflow_with_toolsets.yaml b/tests/cassettes/test_dbos/test_dbos_agent_run_in_workflow_with_toolsets.yaml new file mode 100644 index 0000000000..d84fdf771f --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_run_in_workflow_with_toolsets.yaml @@ -0,0 +1,79 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '105' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? + role: user + model: gpt-4o + stream: false + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '838' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '580' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The capital of Mexico is Mexico City. + refusal: null + role: assistant + created: 1754688958 + id: chatcmpl-C2P2k1mRRz7KMAtppLZz83Lyy33Jl + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ff25b2783a + usage: + completion_tokens: 8 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 14 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 22 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_run_stream.yaml b/tests/cassettes/test_dbos/test_dbos_agent_run_stream.yaml new file mode 100644 index 0000000000..64d552affe --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_run_stream.yaml @@ -0,0 +1,78 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '144' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? 
+ role: user + model: gpt-4o + stream: true + stream_options: + include_usage: true + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"role":"assistant","content":"","refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"oouykO51ovJROe"} + + data: {"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"content":"The"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Qk5ENocFqkZOr"} + + data: {"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"content":" capital"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"G2ubBYwM"} + + data: {"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"content":" of"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"46wA1R0uj9DWX"} + + data: {"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"content":" Mexico"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"HF2DtMCaA"} + + data: {"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"content":" is"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"vFNThjPk0BGrI"} + + data: {"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"content":" Mexico"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"EU1SORHKK"} + + data: {"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"content":" City"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"gpuKBCmzsBH"} + + data: {"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Ow9nVgoSopFONI2"} + + data: {"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}],"usage":null,"obfuscation":"zkiywLbUBn"} + + data: 
{"id":"chatcmpl-C2P2HtMJhPkWjQ2adKerkdVilXmRL","object":"chat.completion.chunk","created":1754688929,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[],"usage":{"prompt_tokens":14,"completion_tokens":8,"total_tokens":22,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":""} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '433' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +version: 1 +... diff --git a/tests/cassettes/test_dbos/test_dbos_agent_run_sync.yaml b/tests/cassettes/test_dbos/test_dbos_agent_run_sync.yaml new file mode 100644 index 0000000000..934c945533 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_run_sync.yaml @@ -0,0 +1,79 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '105' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? + role: user + model: gpt-4o + stream: false + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '838' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '466' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The capital of Mexico is Mexico City. + refusal: null + role: assistant + created: 1754688941 + id: chatcmpl-C2P2TVJ3Qoyk6ajLKjYZF8QDAwt50 + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ff25b2783a + usage: + completion_tokens: 8 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 14 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 22 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_run_sync_in_workflow.yaml b/tests/cassettes/test_dbos/test_dbos_agent_run_sync_in_workflow.yaml new file mode 100644 index 0000000000..934c945533 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_run_sync_in_workflow.yaml @@ -0,0 +1,79 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '105' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? 
+ role: user + model: gpt-4o + stream: false + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '838' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '466' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The capital of Mexico is Mexico City. + refusal: null + role: assistant + created: 1754688941 + id: chatcmpl-C2P2TVJ3Qoyk6ajLKjYZF8QDAwt50 + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ff25b2783a + usage: + completion_tokens: 8 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 14 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 22 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_with_dataclass_deps_as_dict.yaml b/tests/cassettes/test_dbos/test_dbos_agent_with_dataclass_deps_as_dict.yaml new file mode 100644 index 0000000000..56e26e3b43 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_with_dataclass_deps_as_dict.yaml @@ -0,0 +1,195 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '298' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of the country? 
+ role: user + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: get_country_from_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '1071' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '762' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: tool_calls + index: 0 + logprobs: null + message: + annotations: [] + content: null + refusal: null + role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country_from_deps + id: call_5yRefuR6ypjiaezT7TLFs6cw + type: function + created: 1754931124 + id: chatcmpl-C3Q2eKnB6inIhhfgNgGnWbTMA2dxc + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ff25b2783a + usage: + completion_tokens: 13 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 42 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 55 + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '534' + content-type: + - application/json + cookie: + - __cf_bm=mSkun4f2D8ApNaB3wVpGAt_q1Nw1DAlbsGyBdx44ecs-1754931124-1.0.1.1-1wnrnf1KSSmpulBH_mYdlPZlUDzGlcxdUIezMYqWXFW1DBq0oXf_iFZ43bjWBar0yuXJ7TuX2f8JH6fLYqeoVJjutHZJeJNskPgLTocD3fE; + _cfuvid=ZfQ6ht_tZf58Z86BtFmF.8PgCy5MRXIbVNE3e0MV5dc-1754931124950-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of the country? + role: user + - role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country_from_deps + id: call_5yRefuR6ypjiaezT7TLFs6cw + type: function + - content: Mexico + role: tool + tool_call_id: call_5yRefuR6ypjiaezT7TLFs6cw + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: get_country_from_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '838' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '450' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The capital of Mexico is Mexico City. 
+ refusal: null + role: assistant + created: 1754931125 + id: chatcmpl-C3Q2f5OX9rYHpSnEVZwHjzqUVrTQF + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_07871e2ad8 + usage: + completion_tokens: 9 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 67 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 76 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_with_hitl_tool.yaml b/tests/cassettes/test_dbos/test_dbos_agent_with_hitl_tool.yaml new file mode 100644 index 0000000000..7d2de11026 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_with_hitl_tool.yaml @@ -0,0 +1,249 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '639' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: Just call tools without asking for confirmation. + role: system + - content: Delete the file `.env` and create `test.txt` + role: user + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: create_file + parameters: + additionalProperties: false + properties: + path: + type: string + required: + - path + type: object + strict: true + type: function + - function: + description: '' + name: delete_file + parameters: + additionalProperties: false + properties: + path: + type: string + required: + - path + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '1319' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '737' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: tool_calls + index: 0 + logprobs: null + message: + annotations: [] + content: null + refusal: null + role: assistant + tool_calls: + - function: + arguments: '{"path": ".env"}' + name: delete_file + id: call_jYdIdRZHxZTn5bWCq5jlMrJi + type: function + - function: + arguments: '{"path": "test.txt"}' + name: create_file + id: call_TmlTVWQbzrXCZ4jNsCVNbNqu + type: function + created: 1756419063 + id: chatcmpl-C9f7f1SbHRzcwlJwrYKdSonTRWPvV + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_df0f7b956c + usage: + completion_tokens: 46 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 71 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 117 + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '1109' + content-type: + - application/json + cookie: + - __cf_bm=mLR5rPRuew3fXXRgovBj3Ir6vRflwQucmAT8xYCnZaA-1756419064-1.0.1.1-HvvEmDcDOmavm0vBqLo_ZG_Bjh6B2CgT.etvrYCBiezHow_0xNZ64xqybhJl_NViSVDNXuaXviCcDgIZg13KgsWp0J_qNORRiVIHxO2UAO4; + 
_cfuvid=536Zye1eyyVQS9C8BV6AAFwv4vaW.H4o2O4LK_228oc-1756419064420-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: Just call tools without asking for confirmation. + role: system + - content: Delete the file `.env` and create `test.txt` + role: user + - content: null + role: assistant + tool_calls: + - function: + arguments: '{"path": ".env"}' + name: delete_file + id: call_jYdIdRZHxZTn5bWCq5jlMrJi + type: function + - function: + arguments: '{"path": "test.txt"}' + name: create_file + id: call_TmlTVWQbzrXCZ4jNsCVNbNqu + type: function + - content: 'true' + role: tool + tool_call_id: call_jYdIdRZHxZTn5bWCq5jlMrJi + - content: Success + role: tool + tool_call_id: call_TmlTVWQbzrXCZ4jNsCVNbNqu + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: create_file + parameters: + additionalProperties: false + properties: + path: + type: string + required: + - path + type: object + strict: true + type: function + - function: + description: '' + name: delete_file + parameters: + additionalProperties: false + properties: + path: + type: string + required: + - path + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '882' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '624' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The file `.env` has been deleted and `test.txt` has been created successfully. + refusal: null + role: assistant + created: 1756419066 + id: chatcmpl-C9f7iBeeNNBazDCMa7RSx3EFtiZoR + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_80956533cb + usage: + completion_tokens: 19 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 133 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 152 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_with_hitl_tool_sync.yaml b/tests/cassettes/test_dbos/test_dbos_agent_with_hitl_tool_sync.yaml new file mode 100644 index 0000000000..7d2de11026 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_with_hitl_tool_sync.yaml @@ -0,0 +1,249 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '639' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: Just call tools without asking for confirmation. 
+ role: system + - content: Delete the file `.env` and create `test.txt` + role: user + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: create_file + parameters: + additionalProperties: false + properties: + path: + type: string + required: + - path + type: object + strict: true + type: function + - function: + description: '' + name: delete_file + parameters: + additionalProperties: false + properties: + path: + type: string + required: + - path + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '1319' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '737' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: tool_calls + index: 0 + logprobs: null + message: + annotations: [] + content: null + refusal: null + role: assistant + tool_calls: + - function: + arguments: '{"path": ".env"}' + name: delete_file + id: call_jYdIdRZHxZTn5bWCq5jlMrJi + type: function + - function: + arguments: '{"path": "test.txt"}' + name: create_file + id: call_TmlTVWQbzrXCZ4jNsCVNbNqu + type: function + created: 1756419063 + id: chatcmpl-C9f7f1SbHRzcwlJwrYKdSonTRWPvV + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_df0f7b956c + usage: + completion_tokens: 46 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 71 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 117 + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '1109' + content-type: + - application/json + cookie: + - __cf_bm=mLR5rPRuew3fXXRgovBj3Ir6vRflwQucmAT8xYCnZaA-1756419064-1.0.1.1-HvvEmDcDOmavm0vBqLo_ZG_Bjh6B2CgT.etvrYCBiezHow_0xNZ64xqybhJl_NViSVDNXuaXviCcDgIZg13KgsWp0J_qNORRiVIHxO2UAO4; + _cfuvid=536Zye1eyyVQS9C8BV6AAFwv4vaW.H4o2O4LK_228oc-1756419064420-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: Just call tools without asking for confirmation. 
+ role: system + - content: Delete the file `.env` and create `test.txt` + role: user + - content: null + role: assistant + tool_calls: + - function: + arguments: '{"path": ".env"}' + name: delete_file + id: call_jYdIdRZHxZTn5bWCq5jlMrJi + type: function + - function: + arguments: '{"path": "test.txt"}' + name: create_file + id: call_TmlTVWQbzrXCZ4jNsCVNbNqu + type: function + - content: 'true' + role: tool + tool_call_id: call_jYdIdRZHxZTn5bWCq5jlMrJi + - content: Success + role: tool + tool_call_id: call_TmlTVWQbzrXCZ4jNsCVNbNqu + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: create_file + parameters: + additionalProperties: false + properties: + path: + type: string + required: + - path + type: object + strict: true + type: function + - function: + description: '' + name: delete_file + parameters: + additionalProperties: false + properties: + path: + type: string + required: + - path + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '882' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '624' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The file `.env` has been deleted and `test.txt` has been created successfully. + refusal: null + role: assistant + created: 1756419066 + id: chatcmpl-C9f7iBeeNNBazDCMa7RSx3EFtiZoR + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_80956533cb + usage: + completion_tokens: 19 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 133 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 152 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_with_model_retry.yaml b/tests/cassettes/test_dbos/test_dbos_agent_with_model_retry.yaml new file mode 100644 index 0000000000..59e9e12db7 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_with_model_retry.yaml @@ -0,0 +1,335 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '347' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the weather in CDMX? 
+ role: user + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: get_weather_in_city + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '1086' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '327' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: tool_calls + index: 0 + logprobs: null + message: + annotations: [] + content: null + refusal: null + role: assistant + tool_calls: + - function: + arguments: '{"city":"CDMX"}' + name: get_weather_in_city + id: call_fFAB8MNL3tUdfNIIdsIJTo0H + type: function + created: 1756423190 + id: chatcmpl-C9gCExiXILzHBQ4ZuERdiURkHUZZM + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ea40d5097a + usage: + completion_tokens: 17 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 47 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 64 + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '665' + content-type: + - application/json + cookie: + - __cf_bm=yUpeQm6kQWb.kRgbX0F391IMtd8nqGUrSigYrEakdXQ-1756423190-1.0.1.1-FPxZURmrG14EfyWc8ZgNIHTHoWxQ3IPV9X4uCZ2O_z1KiiSc66VTYHq86EbI5aUnd26QXGFdwTsriLpMHbGRxfqlOj.DSK3daVg9q_XBQDE; + _cfuvid=t8Kvct8UEUS4P6_7AIQ7_ayX0J6QaCWrtHqGpTB5uMo-1756423190751-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the weather in CDMX? + role: user + - content: null + role: assistant + tool_calls: + - function: + arguments: '{"city":"CDMX"}' + name: get_weather_in_city + id: call_fFAB8MNL3tUdfNIIdsIJTo0H + type: function + - content: |- + Did you mean Mexico City? + + Fix the errors and try again. 
+ role: tool + tool_call_id: call_fFAB8MNL3tUdfNIIdsIJTo0H + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: get_weather_in_city + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '1094' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '352' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: tool_calls + index: 0 + logprobs: null + message: + annotations: [] + content: null + refusal: null + role: assistant + tool_calls: + - function: + arguments: '{"city":"Mexico City"}' + name: get_weather_in_city + id: call_hLYHO5lK5lmiukTZv6VQzz3x + type: function + created: 1756423191 + id: chatcmpl-C9gCF2OpzQojDQTsp31IsAagNqEC6 + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ea40d5097a + usage: + completion_tokens: 17 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 87 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 104 + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '937' + content-type: + - application/json + cookie: + - __cf_bm=yUpeQm6kQWb.kRgbX0F391IMtd8nqGUrSigYrEakdXQ-1756423190-1.0.1.1-FPxZURmrG14EfyWc8ZgNIHTHoWxQ3IPV9X4uCZ2O_z1KiiSc66VTYHq86EbI5aUnd26QXGFdwTsriLpMHbGRxfqlOj.DSK3daVg9q_XBQDE; + _cfuvid=t8Kvct8UEUS4P6_7AIQ7_ayX0J6QaCWrtHqGpTB5uMo-1756423190751-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the weather in CDMX? + role: user + - content: null + role: assistant + tool_calls: + - function: + arguments: '{"city":"CDMX"}' + name: get_weather_in_city + id: call_fFAB8MNL3tUdfNIIdsIJTo0H + type: function + - content: |- + Did you mean Mexico City? + + Fix the errors and try again. 
+ role: tool + tool_call_id: call_fFAB8MNL3tUdfNIIdsIJTo0H + - content: null + role: assistant + tool_calls: + - function: + arguments: '{"city":"Mexico City"}' + name: get_weather_in_city + id: call_hLYHO5lK5lmiukTZv6VQzz3x + type: function + - content: sunny + role: tool + tool_call_id: call_hLYHO5lK5lmiukTZv6VQzz3x + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: get_weather_in_city + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '850' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '312' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The weather in Mexico City is currently sunny. + refusal: null + role: assistant + created: 1756423192 + id: chatcmpl-C9gCGg6DDdUlo7CuS04nK9k6dnkZG + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ea40d5097a + usage: + completion_tokens: 10 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 116 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 126 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_with_non_dict_deps.yaml b/tests/cassettes/test_dbos/test_dbos_agent_with_non_dict_deps.yaml new file mode 100644 index 0000000000..b4b64ef667 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_with_non_dict_deps.yaml @@ -0,0 +1,95 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '298' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of the country? 
+ role: user + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: get_country_from_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '1071' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '579' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: tool_calls + index: 0 + logprobs: null + message: + annotations: [] + content: null + refusal: null + role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country_from_deps + id: call_SdMSvu33V768Hpgpv2L9Vub0 + type: function + created: 1754931280 + id: chatcmpl-C3Q5AndpHahgIZa3e472KqmMMupyp + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_07871e2ad8 + usage: + completion_tokens: 13 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 42 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 55 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_agent_with_unserializable_deps_type.yaml b/tests/cassettes/test_dbos/test_dbos_agent_with_unserializable_deps_type.yaml new file mode 100644 index 0000000000..de8c761758 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_agent_with_unserializable_deps_type.yaml @@ -0,0 +1,95 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '279' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the model name? 
+ role: user + model: gpt-4o + stream: false + tool_choice: auto + tools: + - function: + description: '' + name: get_model_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '1064' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '1141' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: tool_calls + index: 0 + logprobs: null + message: + annotations: [] + content: null + refusal: null + role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_model_name + id: call_wB0C4FAOjxYgTNJrQT9NzzZ9 + type: function + created: 1755036404 + id: chatcmpl-C3rQisW29iISecZ6NMn4FrseeO3A9 + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_07871e2ad8 + usage: + completion_tokens: 11 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 38 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 49 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/cassettes/test_dbos/test_dbos_model_stream_direct.yaml b/tests/cassettes/test_dbos/test_dbos_model_stream_direct.yaml new file mode 100644 index 0000000000..6b85fce101 --- /dev/null +++ b/tests/cassettes/test_dbos/test_dbos_model_stream_direct.yaml @@ -0,0 +1,78 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '144' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? 
+ role: user + model: gpt-4o + stream: true + stream_options: + include_usage: true + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"role":"assistant","content":"","refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"KhEoTT6u7JgGis"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":"The"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"68kWLPsNVG6FC"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" capital"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"6d1qUFRY"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" of"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Q7kmEmvYuS4Nk"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" Mexico"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"KrDQdBujM"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" is"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"qCG5yEQHHMYd2"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" Mexico"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"uFnoTvoy9"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":" City"},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"bdnHY6qXdKt"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{"content":"."},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"2WeZHWZ8r9jetpK"} + + data: {"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}],"usage":null,"obfuscation":"7LC080uo8W"} + + data: 
{"id":"chatcmpl-C2P1wP1damHwC6sXvGAIh5PMvH6wM","object":"chat.completion.chunk","created":1754688908,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_ff25b2783a","choices":[],"usage":{"prompt_tokens":14,"completion_tokens":8,"total_tokens":22,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":""} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '259' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +version: 1 +... diff --git a/tests/cassettes/test_dbos/test_multiple_agents.yaml b/tests/cassettes/test_dbos/test_multiple_agents.yaml new file mode 100644 index 0000000000..92fab355e7 --- /dev/null +++ b/tests/cassettes/test_dbos/test_multiple_agents.yaml @@ -0,0 +1,1009 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '105' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? + role: user + model: gpt-4o + stream: false + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '838' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '714' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The capital of Mexico is Mexico City. 
+ refusal: null + role: assistant + created: 1754686067 + id: chatcmpl-C2OI7Ey3XvNe02fb41d1D6h1j6H1M + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_07871e2ad8 + usage: + completion_tokens: 8 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 14 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 22 + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4294' + content-type: + - application/json + cookie: + - __cf_bm=FMdLfGkYYRRxShv6d.6ULos8pStg0TmiWrGy26zbUnk-1754686067-1.0.1.1-E.y8vuMwOtCOXnzbZRfxF.uHql5wJ.TjpdhC2xGP7dLaiSdNu.imsaLxXWUsb3oWk_j4bh0I6jtaUIkz6FLH3DWe7bW3PJgXdHVWVeSDd5I; + _cfuvid=VKe1SA_RyQAHS9bkLXRq_LMGd8_CKmeHtRDscnoH.vk-1754686067987-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: "Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + 
description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. + name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C2OICn1CbpqxbWqJvEQJadV86H8q7","object":"chat.completion.chunk","created":1754686072,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"role":"assistant","content":null},"logprobs":null,"finish_reason":null}],"obfuscation":"bEcAVepxT2N"} + + data: {"id":"chatcmpl-C2OICn1CbpqxbWqJvEQJadV86H8q7","object":"chat.completion.chunk","created":1754686072,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"id":"call_fc0SDU3fpyNWhrPIoQKrxefP","type":"function","function":{"name":"get_country","arguments":""}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"r5vzM2B7QX"} + + data: {"id":"chatcmpl-C2OICn1CbpqxbWqJvEQJadV86H8q7","object":"chat.completion.chunk","created":1754686072,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{}"}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"f28G"} + + data: {"id":"chatcmpl-C2OICn1CbpqxbWqJvEQJadV86H8q7","object":"chat.completion.chunk","created":1754686072,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":1,"id":"call_QrIV88ppSKBV3sdKw9Dkr9L5","type":"function","function":{"name":"get_product_name","arguments":""}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"HGhzy"} + + data: 
{"id":"chatcmpl-C2OICn1CbpqxbWqJvEQJadV86H8q7","object":"chat.completion.chunk","created":1754686072,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","usage":null,"choices":[{"index":0,"delta":{"tool_calls":[{"index":1,"function":{"arguments":"{}"}}]},"logprobs":null,"finish_reason":null}],"obfuscation":"CPN0"} + + data: {"id":"chatcmpl-C2OICn1CbpqxbWqJvEQJadV86H8q7","object":"chat.completion.chunk","created":1754686072,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"CLx1"} + + data: {"id":"chatcmpl-C2OICn1CbpqxbWqJvEQJadV86H8q7","object":"chat.completion.chunk","created":1754686072,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[],"usage":{"prompt_tokens":364,"completion_tokens":40,"total_tokens":404,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"9UAlDTq6bfIST"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '952' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4720' + content-type: + - application/json + cookie: + - __cf_bm=FMdLfGkYYRRxShv6d.6ULos8pStg0TmiWrGy26zbUnk-1754686067-1.0.1.1-E.y8vuMwOtCOXnzbZRfxF.uHql5wJ.TjpdhC2xGP7dLaiSdNu.imsaLxXWUsb3oWk_j4bh0I6jtaUIkz6FLH3DWe7bW3PJgXdHVWVeSDd5I; + _cfuvid=VKe1SA_RyQAHS9bkLXRq_LMGd8_CKmeHtRDscnoH.vk-1754686067987-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + - role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country + id: call_fc0SDU3fpyNWhrPIoQKrxefP + type: function + - function: + arguments: '{}' + name: get_product_name + id: call_QrIV88ppSKBV3sdKw9Dkr9L5 + type: function + - content: Mexico + role: tool + tool_call_id: call_fc0SDU3fpyNWhrPIoQKrxefP + - content: Pydantic AI + role: tool + tool_call_id: call_QrIV88ppSKBV3sdKw9Dkr9L5 + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: 
"Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. 
+ name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C2OIHajL8O898rmLqIoxa4RVHOYix","object":"chat.completion.chunk","created":1754686077,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"role":"assistant","content":null,"tool_calls":[{"index":0,"id":"call_0sOcp1sdvSe58xn9EtpyT4Z7","type":"function","function":{"name":"get_weather","arguments":""}}],"refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"YE1nZFEdg"} + + data: {"id":"chatcmpl-C2OIHajL8O898rmLqIoxa4RVHOYix","object":"chat.completion.chunk","created":1754686077,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Izl"} + + data: {"id":"chatcmpl-C2OIHajL8O898rmLqIoxa4RVHOYix","object":"chat.completion.chunk","created":1754686077,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"city"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"q1"} + + data: {"id":"chatcmpl-C2OIHajL8O898rmLqIoxa4RVHOYix","object":"chat.completion.chunk","created":1754686077,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"o"} + + data: {"id":"chatcmpl-C2OIHajL8O898rmLqIoxa4RVHOYix","object":"chat.completion.chunk","created":1754686077,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C2OIHajL8O898rmLqIoxa4RVHOYix","object":"chat.completion.chunk","created":1754686077,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" City"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"5"} + + data: {"id":"chatcmpl-C2OIHajL8O898rmLqIoxa4RVHOYix","object":"chat.completion.chunk","created":1754686077,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"xub"} + + data: 
{"id":"chatcmpl-C2OIHajL8O898rmLqIoxa4RVHOYix","object":"chat.completion.chunk","created":1754686077,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"n6VC"} + + data: {"id":"chatcmpl-C2OIHajL8O898rmLqIoxa4RVHOYix","object":"chat.completion.chunk","created":1754686077,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[],"usage":{"prompt_tokens":423,"completion_tokens":15,"total_tokens":438,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"DPehNdReB8XBJ"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '571' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '4969' + content-type: + - application/json + cookie: + - __cf_bm=FMdLfGkYYRRxShv6d.6ULos8pStg0TmiWrGy26zbUnk-1754686067-1.0.1.1-E.y8vuMwOtCOXnzbZRfxF.uHql5wJ.TjpdhC2xGP7dLaiSdNu.imsaLxXWUsb3oWk_j4bh0I6jtaUIkz6FLH3DWe7bW3PJgXdHVWVeSDd5I; + _cfuvid=VKe1SA_RyQAHS9bkLXRq_LMGd8_CKmeHtRDscnoH.vk-1754686067987-0.0.1.1-604800000 + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: 'Tell me: the capital of the country; the weather there; the product name' + role: user + - role: assistant + tool_calls: + - function: + arguments: '{}' + name: get_country + id: call_fc0SDU3fpyNWhrPIoQKrxefP + type: function + - function: + arguments: '{}' + name: get_product_name + id: call_QrIV88ppSKBV3sdKw9Dkr9L5 + type: function + - content: Mexico + role: tool + tool_call_id: call_fc0SDU3fpyNWhrPIoQKrxefP + - content: Pydantic AI + role: tool + tool_call_id: call_QrIV88ppSKBV3sdKw9Dkr9L5 + - role: assistant + tool_calls: + - function: + arguments: '{"city":"Mexico City"}' + name: get_weather + id: call_0sOcp1sdvSe58xn9EtpyT4Z7 + type: function + - content: sunny + role: tool + tool_call_id: call_0sOcp1sdvSe58xn9EtpyT4Z7 + model: gpt-4o + stream: true + stream_options: + include_usage: true + tool_choice: required + tools: + - function: + description: '' + name: get_weather + parameters: + additionalProperties: false + properties: + city: + type: string + required: + - city + type: object + strict: true + type: function + - function: + description: '' + name: get_country + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Convert Celsius to Fahrenheit.\n\n Args:\n celsius: Temperature in Celsius\n\n Returns:\n + \ Temperature in Fahrenheit\n " + name: celsius_to_fahrenheit + parameters: + additionalProperties: false + properties: + celsius: + type: number + required: + - celsius + type: object + strict: true + type: function + - function: + description: "Get the weather forecast for a location.\n\n Args:\n location: The location to get the weather + forecast for.\n\n Returns:\n 
The weather forecast for the location.\n " + name: get_weather_forecast + parameters: + additionalProperties: false + properties: + location: + type: string + required: + - location + type: object + strict: true + type: function + - function: + description: '' + name: get_image_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_audio_resource_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_product_name_link + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_image + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_dict + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_error + parameters: + additionalProperties: false + properties: + value: + default: false + type: boolean + type: object + type: function + - function: + description: '' + name: get_none + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: '' + name: get_multiple_items + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Get the current log level.\n\n Returns:\n The current log level.\n " + name: get_log_level + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: "Echo the run context.\n\n Args:\n ctx: Context object containing request and session information.\n\n + \ Returns:\n Dictionary with an echo message and the deps.\n " + name: echo_deps + parameters: + additionalProperties: false + properties: {} + type: object + type: function + - function: + description: Use sampling callback. 
+ name: use_sampling + parameters: + additionalProperties: false + properties: + foo: + type: string + required: + - foo + type: object + strict: true + type: function + - function: + description: The final response which ends this conversation + name: final_result + parameters: + $defs: + Answer: + additionalProperties: false + properties: + answer: + type: string + label: + type: string + required: + - label + - answer + type: object + additionalProperties: false + properties: + answers: + items: + $ref: '#/$defs/Answer' + type: array + required: + - answers + type: object + strict: true + type: function + uri: https://api.openai.com/v1/chat/completions + response: + body: + string: |+ + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"role":"assistant","content":null,"tool_calls":[{"index":0,"id":"call_ilRpnEc1a17bm7xyvhNbaSOS","type":"function","function":{"name":"final_result","arguments":""}}],"refusal":null},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"0bScewxE"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"hp7"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answers"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"nfPPM72GKlkKYsc"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":["}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"HP"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Cdc"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"2"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"y"} + + data: 
{"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Capital"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"SDZEIE6GemLgnH2"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" of"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"OZB"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" the"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"k9"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" Country"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"ICmAU4LdEPUFx7"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"i"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"j"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" City"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"U"} + + data: 
{"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"},{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"87LtLruw12QiS2W"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"n"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"p"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Weather"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"e8ZZ9vaf5gjWfMW"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" in"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"ZJi"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" Mexico"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"8wEw5SfvrhCPdxY"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" City"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Q"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"j"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: 
{"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"J"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Sunny"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"b"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"},{\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"TdXcc4UNug5RPRX"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"label"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"O"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"V"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"Product"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"Yk14mLdHSY7N0fo"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" Name"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"J"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\",\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"9"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"answer"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":""} + + data: 
{"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\":\""}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"m"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"P"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"57VUB"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"yd"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"hqeM"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"antic"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"F"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":" AI"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"xQX"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"\"}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"q6r"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{"tool_calls":[{"index":0,"function":{"arguments":"]}"}}]},"logprobs":null,"finish_reason":null}],"usage":null,"obfuscation":"8scW"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"tool_calls"}],"usage":null,"obfuscation":"Df0J"} + + data: {"id":"chatcmpl-C2OIKzk3pk6v0J4StM6YwlYv4fnSE","object":"chat.completion.chunk","created":1754686080,"model":"gpt-4o-2024-08-06","service_tier":"default","system_fingerprint":"fp_07871e2ad8","choices":[],"usage":{"prompt_tokens":448,"completion_tokens":49,"total_tokens":497,"prompt_tokens_details":{"cached_tokens":0,"audio_tokens":0},"completion_tokens_details":{"reasoning_tokens":0,"audio_tokens":0,"accepted_prediction_tokens":0,"rejected_prediction_tokens":0}},"obfuscation":"tZUpwQLEqnpHt"} + + data: [DONE] + + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-type: + - text/event-stream; 
charset=utf-8 + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '561' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + status: + code: 200 + message: OK +version: 1 +... diff --git a/tests/cassettes/test_dbos/test_simple_agent_run_in_workflow.yaml b/tests/cassettes/test_dbos/test_simple_agent_run_in_workflow.yaml new file mode 100644 index 0000000000..9aa924f5f5 --- /dev/null +++ b/tests/cassettes/test_dbos/test_simple_agent_run_in_workflow.yaml @@ -0,0 +1,79 @@ +interactions: +- request: + headers: + accept: + - application/json + accept-encoding: + - gzip, deflate + connection: + - keep-alive + content-length: + - '105' + content-type: + - application/json + host: + - api.openai.com + method: POST + parsed_body: + messages: + - content: What is the capital of Mexico? + role: user + model: gpt-4o + stream: false + uri: https://api.openai.com/v1/chat/completions + response: + headers: + access-control-expose-headers: + - X-Request-ID + alt-svc: + - h3=":443"; ma=86400 + connection: + - keep-alive + content-length: + - '838' + content-type: + - application/json + openai-organization: + - pydantic-28gund + openai-processing-ms: + - '403' + openai-project: + - proj_dKobscVY9YJxeEaDJen54e3d + openai-version: + - '2020-10-01' + strict-transport-security: + - max-age=31536000; includeSubDomains; preload + transfer-encoding: + - chunked + parsed_body: + choices: + - finish_reason: stop + index: 0 + logprobs: null + message: + annotations: [] + content: The capital of Mexico is Mexico City. + refusal: null + role: assistant + created: 1754675179 + id: chatcmpl-C2LSVwAtcuMjKCHykKXgKphwTaQVB + model: gpt-4o-2024-08-06 + object: chat.completion + service_tier: default + system_fingerprint: fp_ff25b2783a + usage: + completion_tokens: 8 + completion_tokens_details: + accepted_prediction_tokens: 0 + audio_tokens: 0 + reasoning_tokens: 0 + rejected_prediction_tokens: 0 + prompt_tokens: 14 + prompt_tokens_details: + audio_tokens: 0 + cached_tokens: 0 + total_tokens: 22 + status: + code: 200 + message: OK +version: 1 diff --git a/tests/test_dbos.py b/tests/test_dbos.py new file mode 100644 index 0000000000..c88903caf8 --- /dev/null +++ b/tests/test_dbos.py @@ -0,0 +1,1565 @@ +from __future__ import annotations + +import asyncio +import os +import time +import uuid +from collections.abc import AsyncIterable, AsyncIterator, Generator, Iterator +from contextlib import contextmanager +from dataclasses import dataclass, field +from datetime import datetime +from typing import Any, Literal + +import pytest +from httpx import AsyncClient +from pydantic import BaseModel + +from pydantic_ai import Agent +from pydantic_ai._run_context import RunContext +from pydantic_ai.direct import model_request_stream +from pydantic_ai.exceptions import ApprovalRequired, CallDeferred, ModelRetry, UserError +from pydantic_ai.messages import ( + AgentStreamEvent, + FinalResultEvent, + FunctionToolCallEvent, + FunctionToolResultEvent, + ModelMessage, + ModelRequest, + ModelResponse, + PartDeltaEvent, + PartStartEvent, + RetryPromptPart, + TextPart, + ToolCallPart, + ToolCallPartDelta, + ToolReturnPart, + UserPromptPart, +) +from pydantic_ai.models import cached_async_http_client +from pydantic_ai.models.test import TestModel +from pydantic_ai.run import AgentRunResult +from pydantic_ai.usage import RequestUsage + +from .conftest import IsDatetime, IsStr + +try: + 
from dbos import DBOS, DBOSConfig, SetWorkflowID + + from pydantic_ai.durable_exec.dbos import DBOSAgent, DBOSMCPServer, DBOSModel +except ImportError: # pragma: lax no cover + pytest.skip('DBOS is not installed', allow_module_level=True) + +try: + import logfire + from logfire.testing import CaptureLogfire +except ImportError: # pragma: lax no cover + pytest.skip('logfire not installed', allow_module_level=True) + +try: + from pydantic_ai.mcp import MCPServerStdio +except ImportError: # pragma: lax no cover + pytest.skip('mcp not installed', allow_module_level=True) + + +try: + from pydantic_ai.models.openai import OpenAIChatModel + from pydantic_ai.providers.openai import OpenAIProvider +except ImportError: # pragma: lax no cover + pytest.skip('openai not installed', allow_module_level=True) + +from inline_snapshot import snapshot + +from pydantic_ai.tools import DeferredToolRequests, DeferredToolResults, ToolDefinition +from pydantic_ai.toolsets import ExternalToolset, FunctionToolset + +pytestmark = [ + pytest.mark.anyio, + pytest.mark.vcr, + pytest.mark.xdist_group(name='dbos'), +] + +# We need to use a custom cached HTTP client here as the default one created for OpenAIProvider will be closed automatically +# at the end of each test, but we need this one to live longer. +http_client = cached_async_http_client(provider='dbos') + + +@pytest.fixture(autouse=True, scope='module') +async def close_cached_httpx_client(anyio_backend: str) -> AsyncIterator[None]: + try: + yield + finally: + await http_client.aclose() + + +@pytest.fixture(autouse=True, scope='module') +def setup_logfire_instrumentation() -> Iterator[None]: + # Set up logfire for the tests. + logfire.configure(metrics=False) + yield + + +@contextmanager +def workflow_raises(exc_type: type[Exception], exc_message: str) -> Iterator[None]: + """Helper for asserting that a DBOS workflow fails with the expected error.""" + with pytest.raises(Exception) as exc_info: + yield + assert isinstance(exc_info.value, Exception) + assert str(exc_info.value) == exc_message + + +DBOS_SQLITE_FILE = 'dbostest.sqlite' +DBOS_CONFIG: DBOSConfig = { + 'name': 'pydantic_dbos_tests', + 'database_url': f'sqlite:///{DBOS_SQLITE_FILE}', + 'system_database_url': f'sqlite:///{DBOS_SQLITE_FILE}', + 'run_admin_server': False, + 'disable_otlp': True, # Disable DBOS OTLP to avoid conflicts with logfire +} + + +@pytest.fixture(scope='module') +def dbos() -> Generator[DBOS, Any, None]: + dbos = DBOS(config=DBOS_CONFIG) + DBOS.launch() + try: + yield dbos + finally: + DBOS.destroy() + + +# Automatically clean up old DBOS sqlite files +@pytest.fixture(autouse=True, scope='module') +def cleanup_test_sqlite_file() -> Iterator[None]: + if os.path.exists(DBOS_SQLITE_FILE): + os.remove(DBOS_SQLITE_FILE) # pragma: lax no cover + yield + + if os.path.exists(DBOS_SQLITE_FILE): + os.remove(DBOS_SQLITE_FILE) # pragma: lax no cover + + +model = OpenAIChatModel( + 'gpt-4o', + provider=OpenAIProvider( + api_key=os.getenv('OPENAI_API_KEY', 'mock-api-key'), + http_client=http_client, + ), +) + +# Not necessarily need to define it outside of the function. DBOS just requires workflows to be statically defined so recovery would be able to find those workflows. It's nice to reuse it in multiple tests. 
+simple_agent = Agent(model, name='simple_agent') +simple_dbos_agent = DBOSAgent(simple_agent) + + +async def test_simple_agent_run_in_workflow(allow_model_requests: None, dbos: DBOS, openai_api_key: str) -> None: + """Test that a simple agent can run in a DBOS workflow.""" + + @DBOS.workflow() + async def run_simple_agent() -> str: + result = await simple_dbos_agent.run('What is the capital of Mexico?') + return result.output + + output = await run_simple_agent() + assert output == snapshot('The capital of Mexico is Mexico City.') + + +class Deps(BaseModel): + country: str + + +# Wrap event_stream_handler as a DBOS step because it's non-deterministic (uses logfire) +@DBOS.step() +async def event_stream_handler( + ctx: RunContext[Deps], + stream: AsyncIterable[AgentStreamEvent], +): + logfire.info(f'{ctx.run_step=}') + async for event in stream: + logfire.info('event', event=event) + + +# This doesn't need to be a step +async def get_country(ctx: RunContext[Deps]) -> str: + return ctx.deps.country + + +class WeatherArgs(BaseModel): + city: str + + +@DBOS.step() +def get_weather(args: WeatherArgs) -> str: + if args.city == 'Mexico City': + return 'sunny' + else: + return 'unknown' # pragma: no cover + + +@dataclass +class Answer: + label: str + answer: str + + +@dataclass +class Response: + answers: list[Answer] + + +@dataclass +class BasicSpan: + content: str + children: list[BasicSpan] = field(default_factory=list) + parent_id: int | None = field(repr=False, compare=False, default=None) + + +complex_agent = Agent( + model, + deps_type=Deps, + output_type=Response, + toolsets=[ + FunctionToolset[Deps](tools=[get_country], id='country'), + MCPServerStdio('python', ['-m', 'tests.mcp_server'], timeout=20, id='mcp'), + ExternalToolset(tool_defs=[ToolDefinition(name='external')], id='external'), + ], + tools=[get_weather], + event_stream_handler=event_stream_handler, + instrument=True, # Enable instrumentation for testing + name='complex_agent', +) +complex_dbos_agent = DBOSAgent(complex_agent) + + +async def test_complex_agent_run_in_workflow(allow_model_requests: None, dbos: DBOS, capfire: CaptureLogfire) -> None: + # Set a workflow ID for testing list steps + wfid = str(uuid.uuid4()) + with SetWorkflowID(wfid): + # DBOSAgent already wraps the `run` function as a DBOS workflow, so we can just call it directly. + result = await complex_dbos_agent.run( + 'Tell me: the capital of the country; the weather there; the product name', deps=Deps(country='Mexico') + ) + assert result.output == snapshot( + Response( + answers=[ + Answer(label='Capital of the country', answer='Mexico City'), + Answer(label='Weather in the capital', answer='Sunny'), + Answer(label='Product Name', answer='Pydantic AI'), + ] + ) + ) + + # Make sure the steps are persisted correctly in the DBOS database. 
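+    # `list_workflow_steps_async` returns the checkpointed step records for the given
+    # workflow ID. The snapshot below shows that model requests, MCP calls, and the
+    # explicitly decorated steps (`event_stream_handler`, `get_weather`) were each
+    # recorded as DBOS steps, while the undecorated `get_country` tool was not.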
+ steps = await dbos.list_workflow_steps_async(wfid) + assert [step['function_name'] for step in steps] == snapshot( + [ + 'complex_agent__mcp_server__mcp.get_tools', + 'complex_agent__model.request_stream', + 'event_stream_handler', + 'event_stream_handler', + 'complex_agent__mcp_server__mcp.call_tool', + 'event_stream_handler', + 'complex_agent__mcp_server__mcp.get_tools', + 'complex_agent__model.request_stream', + 'event_stream_handler', + 'get_weather', + 'event_stream_handler', + 'complex_agent__mcp_server__mcp.get_tools', + 'complex_agent__model.request_stream', + ] + ) + + exporter = capfire.exporter + + spans = exporter.exported_spans_as_dict() + basic_spans_by_id = { + span['context']['span_id']: BasicSpan( + parent_id=span['parent']['span_id'] if span['parent'] else None, + content=attributes.get('event') or attributes['logfire.msg'], + ) + for span in spans + if (attributes := span.get('attributes')) + } + + assert len(basic_spans_by_id) > 0, 'No spans were exported' + root_span = None + for basic_span in basic_spans_by_id.values(): + if basic_span.parent_id is None: + root_span = basic_span + else: + parent_id = basic_span.parent_id + parent_span = basic_spans_by_id[parent_id] + parent_span.children.append(basic_span) + + # Assert the root span and its structure matches expected hierarchy + assert root_span == snapshot( + BasicSpan( + content='complex_agent run', + children=[ + BasicSpan( + content='chat gpt-4o', + children=[ + BasicSpan(content='ctx.run_step=1'), + BasicSpan( + content='{"index":0,"part":{"tool_name":"get_country","args":"","tool_call_id":"call_3rqTYrA6H21AYUaRGP4F66oq","part_kind":"tool-call"},"event_kind":"part_start"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"{}","tool_call_id":"call_3rqTYrA6H21AYUaRGP4F66oq","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":1,"part":{"tool_name":"get_product_name","args":"","tool_call_id":"call_Xw9XMKBJU48kAAd78WgIswDx","part_kind":"tool-call"},"event_kind":"part_start"}' + ), + BasicSpan( + content='{"index":1,"delta":{"tool_name_delta":null,"args_delta":"{}","tool_call_id":"call_Xw9XMKBJU48kAAd78WgIswDx","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + ], + ), + BasicSpan(content='ctx.run_step=1'), + BasicSpan( + content='{"part":{"tool_name":"get_country","args":"{}","tool_call_id":"call_3rqTYrA6H21AYUaRGP4F66oq","part_kind":"tool-call"},"event_kind":"function_tool_call"}' + ), + BasicSpan(content='ctx.run_step=1'), + BasicSpan( + content='{"part":{"tool_name":"get_product_name","args":"{}","tool_call_id":"call_Xw9XMKBJU48kAAd78WgIswDx","part_kind":"tool-call"},"event_kind":"function_tool_call"}' + ), + BasicSpan( + content='running 2 tools', + children=[ + BasicSpan(content='running tool: get_country'), + BasicSpan(content='ctx.run_step=1'), + BasicSpan( + content=IsStr( + regex=r'{"result":{"tool_name":"get_country","content":"Mexico","tool_call_id":"call_3rqTYrA6H21AYUaRGP4F66oq","metadata":null,"timestamp":".+?","part_kind":"tool-return"},"event_kind":"function_tool_result"}' + ) + ), + BasicSpan(content='running tool: get_product_name'), + BasicSpan(content='ctx.run_step=1'), + BasicSpan( + content=IsStr( + regex=r'{"result":{"tool_name":"get_product_name","content":"Pydantic AI","tool_call_id":"call_Xw9XMKBJU48kAAd78WgIswDx","metadata":null,"timestamp":".+?","part_kind":"tool-return"},"event_kind":"function_tool_result"}' + ) + ), + ], + ), + BasicSpan( + content='chat gpt-4o', + children=[ + 
BasicSpan(content='ctx.run_step=2'), + BasicSpan( + content='{"index":0,"part":{"tool_name":"get_weather","args":"","tool_call_id":"call_Vz0Sie91Ap56nH0ThKGrZXT7","part_kind":"tool-call"},"event_kind":"part_start"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"{\\"","tool_call_id":"call_Vz0Sie91Ap56nH0ThKGrZXT7","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"city","tool_call_id":"call_Vz0Sie91Ap56nH0ThKGrZXT7","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\":\\"","tool_call_id":"call_Vz0Sie91Ap56nH0ThKGrZXT7","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"Mexico","tool_call_id":"call_Vz0Sie91Ap56nH0ThKGrZXT7","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":" City","tool_call_id":"call_Vz0Sie91Ap56nH0ThKGrZXT7","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\"}","tool_call_id":"call_Vz0Sie91Ap56nH0ThKGrZXT7","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + ], + ), + BasicSpan(content='ctx.run_step=2'), + BasicSpan( + content='{"part":{"tool_name":"get_weather","args":"{\\"city\\":\\"Mexico City\\"}","tool_call_id":"call_Vz0Sie91Ap56nH0ThKGrZXT7","part_kind":"tool-call"},"event_kind":"function_tool_call"}' + ), + BasicSpan( + content='running 1 tool', + children=[ + BasicSpan(content='running tool: get_weather'), + BasicSpan(content='ctx.run_step=2'), + BasicSpan( + content=IsStr( + regex=r'{"result":{"tool_name":"get_weather","content":"sunny","tool_call_id":"call_Vz0Sie91Ap56nH0ThKGrZXT7","metadata":null,"timestamp":".+?","part_kind":"tool-return"},"event_kind":"function_tool_result"}' + ) + ), + ], + ), + BasicSpan( + content='chat gpt-4o', + children=[ + BasicSpan(content='ctx.run_step=3'), + BasicSpan( + content='{"index":0,"part":{"tool_name":"final_result","args":"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_kind":"tool-call"},"event_kind":"part_start"}' + ), + BasicSpan( + content='{"tool_name":"final_result","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","event_kind":"final_result"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"{\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"answers","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\":[","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"{\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"label","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + 
content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\":\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"Capital","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":" of","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":" the","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":" country","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\",\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"answer","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\":\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"Mexico","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":" City","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\"},{\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"label","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\":\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"Weather","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":" in","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":" the","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":" capital","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + 
content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\",\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"answer","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\":\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"Sunny","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\"},{\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"label","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\":\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"Product","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":" Name","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\",\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"answer","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\":\\"","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"P","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"yd","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"antic","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":" AI","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"\\"}","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + BasicSpan( + 
content='{"index":0,"delta":{"tool_name_delta":null,"args_delta":"]}","tool_call_id":"call_4kc6691zCzjPnOuEtbEGUvz2","part_delta_kind":"tool_call"},"event_kind":"part_delta"}' + ), + ], + ), + ], + ) + ) + + +# Note: since we wrap the agent run in a DBOS workflow, we cannot just use a DBOS agent without DBOS. This test shows we can use a complex agent with DBOS decorated tools. Without DBOS workflows, those steps are just normal functions. +async def test_complex_agent_run(allow_model_requests: None) -> None: + events: list[AgentStreamEvent] = [] + + async def event_stream_handler( + ctx: RunContext[Deps], + stream: AsyncIterable[AgentStreamEvent], + ): + async for event in stream: + events.append(event) + + with complex_agent.override(deps=Deps(country='Mexico')): + result = await complex_agent.run( + 'Tell me: the capital of the country; the weather there; the product name', + deps=Deps(country='The Netherlands'), + event_stream_handler=event_stream_handler, + ) + assert result.output == snapshot( + Response( + answers=[ + Answer(label='Capital', answer='The capital of Mexico is Mexico City.'), + Answer(label='Weather', answer='The weather in Mexico City is currently sunny.'), + Answer(label='Product Name', answer='The product name is Pydantic AI.'), + ] + ) + ) + + assert events == snapshot( + [ + PartStartEvent( + index=0, + part=ToolCallPart(tool_name='get_country', args='', tool_call_id='call_q2UyBRP7eXNTzAoR8lEhjc9Z'), + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='{}', tool_call_id='call_q2UyBRP7eXNTzAoR8lEhjc9Z') + ), + PartStartEvent( + index=1, + part=ToolCallPart(tool_name='get_product_name', args='', tool_call_id='call_b51ijcpFkDiTQG1bQzsrmtW5'), + ), + PartDeltaEvent( + index=1, delta=ToolCallPartDelta(args_delta='{}', tool_call_id='call_b51ijcpFkDiTQG1bQzsrmtW5') + ), + FunctionToolCallEvent( + part=ToolCallPart(tool_name='get_country', args='{}', tool_call_id='call_q2UyBRP7eXNTzAoR8lEhjc9Z') + ), + FunctionToolCallEvent( + part=ToolCallPart(tool_name='get_product_name', args='{}', tool_call_id='call_b51ijcpFkDiTQG1bQzsrmtW5') + ), + FunctionToolResultEvent( + result=ToolReturnPart( + tool_name='get_country', + content='Mexico', + tool_call_id='call_q2UyBRP7eXNTzAoR8lEhjc9Z', + timestamp=IsDatetime(), + ) + ), + FunctionToolResultEvent( + result=ToolReturnPart( + tool_name='get_product_name', + content='Pydantic AI', + tool_call_id='call_b51ijcpFkDiTQG1bQzsrmtW5', + timestamp=IsDatetime(), + ) + ), + PartStartEvent( + index=0, + part=ToolCallPart(tool_name='get_weather', args='', tool_call_id='call_LwxJUB9KppVyogRRLQsamRJv'), + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='{"', tool_call_id='call_LwxJUB9KppVyogRRLQsamRJv') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='city', tool_call_id='call_LwxJUB9KppVyogRRLQsamRJv') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='":"', tool_call_id='call_LwxJUB9KppVyogRRLQsamRJv') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='Mexico', tool_call_id='call_LwxJUB9KppVyogRRLQsamRJv') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' City', tool_call_id='call_LwxJUB9KppVyogRRLQsamRJv') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='"}', tool_call_id='call_LwxJUB9KppVyogRRLQsamRJv') + ), + FunctionToolCallEvent( + part=ToolCallPart( + tool_name='get_weather', args='{"city":"Mexico City"}', tool_call_id='call_LwxJUB9KppVyogRRLQsamRJv' + ) + ), + FunctionToolResultEvent( + 
result=ToolReturnPart( + tool_name='get_weather', + content='sunny', + tool_call_id='call_LwxJUB9KppVyogRRLQsamRJv', + timestamp=IsDatetime(), + ) + ), + PartStartEvent( + index=0, + part=ToolCallPart(tool_name='final_result', args='', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn'), + ), + FinalResultEvent(tool_name='final_result', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn'), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='{"', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='answers', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='":[', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='{"', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='label', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='":"', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='Capital', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='","', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='answer', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='":"', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='The', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' capital', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' of', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' Mexico', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' is', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' Mexico', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' City', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='."', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='},{"', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='label', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='":"', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='Weather', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='","', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='answer', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='":"', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='The', 
tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' weather', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' in', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' Mexico', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' City', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' is', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' currently', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' sunny', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='."', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='},{"', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='label', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='":"', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='Product', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' Name', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='","', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='answer', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='":"', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='The', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' product', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' name', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' is', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' P', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='yd', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='antic', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=' AI', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='."', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta='}', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + PartDeltaEvent( + index=0, delta=ToolCallPartDelta(args_delta=']}', tool_call_id='call_CCGIWaMeYWmxOQ91orkmTvzn') + ), + ] + ) + + +async def test_multiple_agents(allow_model_requests: None, dbos: DBOS): + """Test that multiple agents can run in a DBOS workflow.""" + # This is just a smoke 
test to ensure that multiple agents can run in a DBOS workflow. + # We don't need to check the output as it's already tested in the individual agent tests. + result = await simple_dbos_agent.run('What is the capital of Mexico?') + assert result.output == snapshot('The capital of Mexico is Mexico City.') + + result = await complex_dbos_agent.run( + 'Tell me: the capital of the country; the weather there; the product name', deps=Deps(country='Mexico') + ) + assert result.output == snapshot( + Response( + answers=[ + Answer(label='Capital of the Country', answer='Mexico City'), + Answer(label='Weather in Mexico City', answer='Sunny'), + Answer(label='Product Name', answer='Pydantic AI'), + ] + ) + ) + + +async def test_agent_name_collision(allow_model_requests: None, dbos: DBOS): + with pytest.raises( + Exception, match="Duplicate instance registration for class 'DBOSAgent' instance 'simple_agent'" + ): + DBOSAgent(simple_agent) + + +async def test_agent_without_name(): + with pytest.raises( + UserError, + match="An agent needs to have a unique `name` in order to be used with DBOS. The name will be used to identify the agent's workflows and steps.", + ): + DBOSAgent(Agent()) + + +async def test_agent_without_model(): + with pytest.raises( + UserError, + match='An agent needs to have a `model` in order to be used with DBOS, it cannot be set at agent run time.', + ): + DBOSAgent(Agent(name='test_agent')) + + +async def test_toolset_without_id(): + # Note: this is allowed in DBOS because we don't wrap the tools automatically in a workflow. It's up to the user to define the tools as DBOS steps if they want to use them as steps in a workflow. + DBOSAgent(Agent(model=model, name='test_agent', toolsets=[FunctionToolset()])) + + +async def test_dbos_agent(): + assert isinstance(complex_dbos_agent.model, DBOSModel) + assert complex_dbos_agent.model.wrapped == complex_agent.model + + # DBOS only wraps the MCP server toolsets. Other toolsets are not wrapped. + toolsets = complex_dbos_agent.toolsets + assert len(toolsets) == 5 + + # Empty function toolset for the agent's own tools + assert isinstance(toolsets[0], FunctionToolset) + assert toolsets[0].id == '' + assert toolsets[0].tools == {} + + # Function toolset for the wrapped agent's own tools + assert isinstance(toolsets[1], FunctionToolset) + assert toolsets[1].id == '' + assert toolsets[1].tools.keys() == {'get_weather'} + + # Wrapped 'country' toolset + assert isinstance(toolsets[2], FunctionToolset) + assert toolsets[2].id == 'country' + assert toolsets[2].tools.keys() == {'get_country'} + + # Wrapped 'mcp' MCP server + assert isinstance(toolsets[3], DBOSMCPServer) + assert toolsets[3].id == 'mcp' + assert toolsets[3].wrapped == complex_agent.toolsets[2] + + # Unwrapped 'external' toolset + assert isinstance(toolsets[4], ExternalToolset) + assert toolsets[4].id == 'external' + assert toolsets[4] == complex_agent.toolsets[3] + + +async def test_dbos_agent_run(allow_model_requests: None, dbos: DBOS): + # Note: this runs as a DBOS workflow because we automatically wrap the run function. + result = await simple_dbos_agent.run('What is the capital of Mexico?') + assert result.output == snapshot('The capital of Mexico is Mexico City.') + + +def test_dbos_agent_run_sync(allow_model_requests: None, dbos: DBOS): + # Note: this runs as a DBOS workflow because we automatically wrap the run_sync function. 
+ # This is equivalent to test_dbos_agent_run_sync_in_workflow + result = simple_dbos_agent.run_sync('What is the capital of Mexico?') + assert result.output == snapshot('The capital of Mexico is Mexico City.') + + +async def test_dbos_agent_run_stream(allow_model_requests: None): + # Run stream is not a DBOS workflow, so we can use it directly. + async with simple_dbos_agent.run_stream('What is the capital of Mexico?') as result: + assert [c async for c in result.stream_text(debounce_by=None)] == snapshot( + [ + 'The', + 'The capital', + 'The capital of', + 'The capital of Mexico', + 'The capital of Mexico is', + 'The capital of Mexico is Mexico', + 'The capital of Mexico is Mexico City', + 'The capital of Mexico is Mexico City.', + ] + ) + + +async def test_dbos_agent_iter(allow_model_requests: None): + output: list[str] = [] + async with simple_dbos_agent.iter('What is the capital of Mexico?') as run: + async for node in run: + if Agent.is_model_request_node(node): + async with node.stream(run.ctx) as stream: + async for chunk in stream.stream_text(debounce_by=None): + output.append(chunk) + assert output == snapshot( + [ + 'The', + 'The capital', + 'The capital of', + 'The capital of Mexico', + 'The capital of Mexico is', + 'The capital of Mexico is Mexico', + 'The capital of Mexico is Mexico City', + 'The capital of Mexico is Mexico City.', + ] + ) + + +def test_dbos_agent_run_sync_in_workflow(allow_model_requests: None, dbos: DBOS): + # DBOS allows calling `run_sync` inside a workflow as a child workflow. + @DBOS.workflow() + def run_sync_workflow(): + result = simple_dbos_agent.run_sync('What is the capital of Mexico?') + return result.output + + output = run_sync_workflow() + assert output == snapshot('The capital of Mexico is Mexico City.') + + +async def test_dbos_agent_run_stream_in_workflow(allow_model_requests: None, dbos: DBOS): + @DBOS.workflow() + async def run_stream_workflow(): + async with simple_dbos_agent.run_stream('What is the capital of Mexico?') as result: + pass + return await result.get_output() # pragma: no cover + + with workflow_raises( + UserError, + snapshot( + '`agent.run_stream()` cannot currently be used inside a DBOS workflow. ' + 'Set an `event_stream_handler` on the agent and use `agent.run()` instead. ' + 'Please file an issue if this is not sufficient for your use case.' + ), + ): + await run_stream_workflow() + + +async def test_dbos_agent_iter_in_workflow(allow_model_requests: None, dbos: DBOS): + # DBOS allows calling `iter` inside a workflow as a step. + @DBOS.workflow() + async def run_iter_workflow(): + output: list[str] = [] + async with simple_dbos_agent.iter('What is the capital of Mexico?') as run: + async for node in run: + if Agent.is_model_request_node(node): + async with node.stream(run.ctx) as stream: + async for chunk in stream.stream_text(debounce_by=None): + output.append(chunk) + return output + + output = await run_iter_workflow() + # If called in a workflow, the output is a single concatenated string. + assert output == snapshot( + [ + 'The capital of Mexico is Mexico City.', + ] + ) + + +async def simple_event_stream_handler( + ctx: RunContext[None], + stream: AsyncIterable[AgentStreamEvent], +): + pass + + +async def test_dbos_agent_run_in_workflow_with_event_stream_handler(allow_model_requests: None, dbos: DBOS) -> None: + # DBOS workflow input must be serializable, so we cannot use a function as a dependency. + # Therefore, we cannot pass in an event stream handler as an argument. 
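+    # (Setting the handler on the Agent itself before wrapping it in DBOSAgent, as
+    # complex_agent does above, avoids this: the handler is then part of the agent's
+    # static configuration instead of being serialized as workflow input.)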
+ with workflow_raises(TypeError, snapshot('Serialized data item should not be a function')): + await simple_dbos_agent.run('What is the capital of Mexico?', event_stream_handler=simple_event_stream_handler) + + +async def test_dbos_agent_run_in_workflow_with_model(allow_model_requests: None, dbos: DBOS): + # A non-DBOS model is not wrapped as steps so it's not deterministic and cannot be used in a DBOS workflow. + with workflow_raises( + UserError, + snapshot( + 'Non-DBOS model cannot be set at agent run time inside a DBOS workflow, it must be set at agent creation time.' + ), + ): + await simple_dbos_agent.run('What is the capital of Mexico?', model=model) + + +async def test_dbos_agent_run_in_workflow_with_toolsets(allow_model_requests: None, dbos: DBOS): + # Since DBOS does not automatically wrap the tools in a workflow, and allows dynamic steps, we can pass in toolsets directly. + result = await simple_dbos_agent.run('What is the capital of Mexico?', toolsets=[FunctionToolset()]) + assert result.output == snapshot('The capital of Mexico is Mexico City.') + + +async def test_dbos_agent_override_model_in_workflow(allow_model_requests: None, dbos: DBOS): + # We cannot override the model to a non-DBOS one in a DBOS workflow. + with workflow_raises( + UserError, + snapshot( + 'Non-DBOS model cannot be contextually overridden inside a DBOS workflow, it must be set at agent creation time.' + ), + ): + with simple_dbos_agent.override(model=model): + pass + + +async def test_dbos_agent_override_toolsets_in_workflow(allow_model_requests: None, dbos: DBOS): + # Since DBOS does not automatically wrap the tools in a workflow, and allows dynamic steps, we can override toolsets directly. + @DBOS.workflow() + async def run_with_toolsets(): + with simple_dbos_agent.override(toolsets=[FunctionToolset()]): + pass + + await run_with_toolsets() + + +async def test_dbos_agent_override_tools_in_workflow(allow_model_requests: None, dbos: DBOS): + # Since DBOS does not automatically wrap the tools in a workflow, and allows dynamic steps, we can override tools directly. + @DBOS.workflow() + async def run_with_tools(): + with simple_dbos_agent.override(tools=[get_weather]): + result = await simple_dbos_agent.run('What is the capital of Mexico?') + return result.output + + output = await run_with_tools() + assert output == snapshot('The capital of Mexico is Mexico City.') + + +async def test_dbos_agent_override_deps_in_workflow(allow_model_requests: None, dbos: DBOS): + # This is allowed + @DBOS.workflow() + async def run_with_deps(): + with simple_dbos_agent.override(deps=None): + result = await simple_dbos_agent.run('What is the capital of the country?') + return result.output + + output = await run_with_deps() + assert output == snapshot('The capital of Mexico is Mexico City.') + + +async def test_dbos_model_stream_direct(allow_model_requests: None, dbos: DBOS): + @DBOS.workflow() + async def run_model_stream(): + messages: list[ModelMessage] = [ModelRequest.user_text_prompt('What is the capital of Mexico?')] + async with model_request_stream(complex_dbos_agent.model, messages) as stream: + async for _ in stream: + pass + + with workflow_raises( + AssertionError, + snapshot( + 'A DBOS model cannot be used with `pydantic_ai.direct.model_request_stream()` as it requires a `run_context`. Set an `event_stream_handler` on the agent and use `agent.run()` instead.' 
+ ), + ): + await run_model_stream() + + +@dataclass +class UnserializableDeps: + client: AsyncClient + + +unserializable_deps_agent = Agent(model, name='unserializable_deps_agent', deps_type=UnserializableDeps) + + +@unserializable_deps_agent.tool +async def get_model_name(ctx: RunContext[UnserializableDeps]) -> int: + return ctx.deps.client.max_redirects # pragma: lax no cover + + +async def test_dbos_agent_with_unserializable_deps_type(allow_model_requests: None, dbos: DBOS): + unserializable_deps_dbos_agent = DBOSAgent(unserializable_deps_agent) + # Test this raises a serialization error because httpx.AsyncClient is not serializable. + with pytest.raises( + Exception, + match='object proxy must define __reduce_ex__()', + ): + async with AsyncClient() as client: + # This will trigger the client to be unserializable + logfire.instrument_httpx(client, capture_all=True) + await unserializable_deps_dbos_agent.run('What is the model name?', deps=UnserializableDeps(client=client)) + + +# Test dynamic toolsets in an agent with DBOS + + +@DBOS.step() +def temperature_celsius(city: str) -> float: + return 21.0 + + +@DBOS.step() +def temperature_fahrenheit(city: str) -> float: + return 69.8 + + +weather_toolset = FunctionToolset(tools=[temperature_celsius, temperature_fahrenheit]) + + +@weather_toolset.tool +@DBOS.step() +def conditions(ctx: RunContext, city: str) -> str: + if ctx.run_step % 2 == 0: + return "It's sunny" # pragma: lax no cover + else: + return "It's raining" + + +datetime_toolset = FunctionToolset() + + +@DBOS.step() +def now_func() -> datetime: + return datetime.now() + + +datetime_toolset.add_function(now_func, name='now') + + +@dataclass +class ToggleableDeps: + active: Literal['weather', 'datetime'] + + def toggle(self): + if self.active == 'weather': + self.active = 'datetime' + else: + self.active = 'weather' + + +test_model = TestModel() +dynamic_agent = Agent(name='dynamic_agent', model=test_model, deps_type=ToggleableDeps) + + +@dynamic_agent.toolset # type: ignore +def toggleable_toolset(ctx: RunContext[ToggleableDeps]) -> FunctionToolset[None]: + if ctx.deps.active == 'weather': + return weather_toolset + else: + return datetime_toolset + + +@dynamic_agent.tool +def toggle(ctx: RunContext[ToggleableDeps]): + ctx.deps.toggle() + + +dynamic_dbos_agent = DBOSAgent(dynamic_agent) + + +def test_dynamic_toolset(dbos: DBOS): + weather_deps = ToggleableDeps('weather') + + result = dynamic_dbos_agent.run_sync('Toggle the toolset', deps=weather_deps) + assert result.output == snapshot( + '{"toggle":null,"temperature_celsius":21.0,"temperature_fahrenheit":69.8,"conditions":"It\'s raining"}' + ) + + result = dynamic_dbos_agent.run_sync('Toggle the toolset', deps=weather_deps) + assert result.output == snapshot(IsStr(regex=r'{"toggle":null,"now":".+?"}')) + + +# Test human-in-the-loop with DBOS agent +hitl_agent = Agent( + model, + name='hitl_agent', + output_type=[str, DeferredToolRequests], + instructions='Just call tools without asking for confirmation.', +) + + +@hitl_agent.tool +@DBOS.step() +def create_file(ctx: RunContext[None], path: str) -> None: + raise CallDeferred + + +@hitl_agent.tool +@DBOS.step() +def delete_file(ctx: RunContext[None], path: str) -> bool: + if not ctx.tool_call_approved: + raise ApprovalRequired + return True + + +hitl_dbos_agent = DBOSAgent(hitl_agent) + + +async def test_dbos_agent_with_hitl_tool(allow_model_requests: None, dbos: DBOS): + # Main loop for the agent, keep running until we get a final string output. 
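+    # Sketch of the pattern exercised below: run the agent inside a durable workflow;
+    # whenever it returns DeferredToolRequests, publish them with DBOS.set_event_async
+    # and block on DBOS.recv_async until an external caller sends back
+    # DeferredToolResults, then loop with those results until a plain string output
+    # is produced.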
+ @DBOS.workflow() + async def hitl_main_loop(prompt: str) -> AgentRunResult[str | DeferredToolRequests]: + messages: list[ModelMessage] = [ModelRequest.user_text_prompt(prompt)] + deferred_tool_results: DeferredToolResults | None = None + while True: + result = await hitl_dbos_agent.run(message_history=messages, deferred_tool_results=deferred_tool_results) + messages = result.all_messages() + + if isinstance(result.output, DeferredToolRequests): + deferred_tool_requests = result.output + # Set deferred_tool_requests as a DBOS workflow event, so the external functions can see it. + await DBOS.set_event_async('deferred_tool_requests', deferred_tool_requests) + + # Wait for the deferred tool requests to be handled externally. + deferred_tool_results = await DBOS.recv_async('deferred_tool_results', timeout_seconds=30) + else: + return result + + wf_handle = await DBOS.start_workflow_async(hitl_main_loop, 'Delete the file `.env` and create `test.txt`') + + while True: + await asyncio.sleep(1) + status = await wf_handle.get_status() + if status.status == 'SUCCESS': + break + + assert status.status == 'PENDING' + # Wait and check if the workflow has set a deferred tool request event. + deferred_tool_requests = await DBOS.get_event_async( + wf_handle.workflow_id, 'deferred_tool_requests', timeout_seconds=1 + ) + if deferred_tool_requests is not None: # pragma: no branch + results = DeferredToolResults() + # Approve all calls + for tool_call in deferred_tool_requests.approvals: + results.approvals[tool_call.tool_call_id] = True + + for tool_call in deferred_tool_requests.calls: + results.calls[tool_call.tool_call_id] = 'Success' + + # Signal the workflow with the results. + await DBOS.send_async(wf_handle.workflow_id, results, topic='deferred_tool_results') + + result = await wf_handle.get_result() + assert result.output == snapshot('The file `.env` has been deleted and `test.txt` has been created successfully.') + assert result.all_messages() == snapshot( + [ + ModelRequest( + parts=[ + UserPromptPart( + content='Delete the file `.env` and create `test.txt`', + timestamp=IsDatetime(), + ) + ], + instructions='Just call tools without asking for confirmation.', + ), + ModelResponse( + parts=[ + ToolCallPart( + tool_name='delete_file', + args='{"path": ".env"}', + tool_call_id='call_jYdIdRZHxZTn5bWCq5jlMrJi', + ), + ToolCallPart( + tool_name='create_file', + args='{"path": "test.txt"}', + tool_call_id='call_TmlTVWQbzrXCZ4jNsCVNbNqu', + ), + ], + usage=RequestUsage( + input_tokens=71, + output_tokens=46, + details={ + 'accepted_prediction_tokens': 0, + 'audio_tokens': 0, + 'reasoning_tokens': 0, + 'rejected_prediction_tokens': 0, + }, + ), + model_name=IsStr(), + timestamp=IsDatetime(), + provider_name='openai', + provider_response_id=IsStr(), + ), + ModelRequest( + parts=[ + ToolReturnPart( + tool_name='delete_file', + content=True, + tool_call_id=IsStr(), + timestamp=IsDatetime(), + ), + ToolReturnPart( + tool_name='create_file', + content='Success', + tool_call_id=IsStr(), + timestamp=IsDatetime(), + ), + ], + instructions='Just call tools without asking for confirmation.', + ), + ModelResponse( + parts=[ + TextPart(content='The file `.env` has been deleted and `test.txt` has been created successfully.') + ], + usage=RequestUsage( + input_tokens=133, + output_tokens=19, + details={ + 'accepted_prediction_tokens': 0, + 'audio_tokens': 0, + 'reasoning_tokens': 0, + 'rejected_prediction_tokens': 0, + }, + ), + model_name='gpt-4o-2024-08-06', + timestamp=IsDatetime(), + provider_name='openai', + 
provider_response_id=IsStr(), + ), + ] + ) + + +def test_dbos_agent_with_hitl_tool_sync(allow_model_requests: None, dbos: DBOS): + # Main loop for the agent, keep running until we get a final string output. + @DBOS.workflow() + def hitl_main_loop_sync(prompt: str) -> AgentRunResult[str | DeferredToolRequests]: + messages: list[ModelMessage] = [ModelRequest.user_text_prompt(prompt)] + deferred_tool_results: DeferredToolResults | None = None + while True: + result = hitl_dbos_agent.run_sync(message_history=messages, deferred_tool_results=deferred_tool_results) + messages = result.all_messages() + + if isinstance(result.output, DeferredToolRequests): + deferred_tool_requests = result.output + # Set deferred_tool_requests as a DBOS workflow event, so the external functions can see it. + DBOS.set_event('deferred_tool_requests', deferred_tool_requests) + + # Wait for the deferred tool requests to be handled externally. + deferred_tool_results = DBOS.recv('deferred_tool_results', timeout_seconds=30) + else: + return result + + wf_handle = DBOS.start_workflow(hitl_main_loop_sync, 'Delete the file `.env` and create `test.txt`') + + while True: + time.sleep(1) + status = wf_handle.get_status() + if status.status == 'SUCCESS': + break + + # Wait and check if the workflow has set a deferred tool request event. + deferred_tool_requests = DBOS.get_event(wf_handle.workflow_id, 'deferred_tool_requests', timeout_seconds=1) + if deferred_tool_requests is not None: # pragma: no branch + results = DeferredToolResults() + # Approve all calls + for tool_call in deferred_tool_requests.approvals: + results.approvals[tool_call.tool_call_id] = True + + for tool_call in deferred_tool_requests.calls: + results.calls[tool_call.tool_call_id] = 'Success' + + # Signal the workflow with the results. 
+ DBOS.send(wf_handle.workflow_id, results, topic='deferred_tool_results') + + result = wf_handle.get_result() + assert result.output == snapshot('The file `.env` has been deleted and `test.txt` has been created successfully.') + assert result.all_messages() == snapshot( + [ + ModelRequest( + parts=[ + UserPromptPart( + content='Delete the file `.env` and create `test.txt`', + timestamp=IsDatetime(), + ) + ], + instructions='Just call tools without asking for confirmation.', + ), + ModelResponse( + parts=[ + ToolCallPart( + tool_name='delete_file', + args='{"path": ".env"}', + tool_call_id='call_jYdIdRZHxZTn5bWCq5jlMrJi', + ), + ToolCallPart( + tool_name='create_file', + args='{"path": "test.txt"}', + tool_call_id='call_TmlTVWQbzrXCZ4jNsCVNbNqu', + ), + ], + usage=RequestUsage( + input_tokens=71, + output_tokens=46, + details={ + 'accepted_prediction_tokens': 0, + 'audio_tokens': 0, + 'reasoning_tokens': 0, + 'rejected_prediction_tokens': 0, + }, + ), + model_name=IsStr(), + timestamp=IsDatetime(), + provider_name='openai', + provider_response_id=IsStr(), + ), + ModelRequest( + parts=[ + ToolReturnPart( + tool_name='delete_file', + content=True, + tool_call_id=IsStr(), + timestamp=IsDatetime(), + ), + ToolReturnPart( + tool_name='create_file', + content='Success', + tool_call_id=IsStr(), + timestamp=IsDatetime(), + ), + ], + instructions='Just call tools without asking for confirmation.', + ), + ModelResponse( + parts=[ + TextPart(content='The file `.env` has been deleted and `test.txt` has been created successfully.') + ], + usage=RequestUsage( + input_tokens=133, + output_tokens=19, + details={ + 'accepted_prediction_tokens': 0, + 'audio_tokens': 0, + 'reasoning_tokens': 0, + 'rejected_prediction_tokens': 0, + }, + ), + model_name='gpt-4o-2024-08-06', + timestamp=IsDatetime(), + provider_name='openai', + provider_response_id=IsStr(), + ), + ] + ) + + +# Test model retry + +model_retry_agent = Agent(model, name='model_retry_agent') + + +@model_retry_agent.tool_plain +@DBOS.step() +def get_weather_in_city(city: str) -> str: + if city != 'Mexico City': + raise ModelRetry('Did you mean Mexico City?') + return 'sunny' + + +model_retry_dbos_agent = DBOSAgent(model_retry_agent) + + +async def test_dbos_agent_with_model_retry(allow_model_requests: None, dbos: DBOS): + result = await model_retry_dbos_agent.run('What is the weather in CDMX?') + assert result.output == snapshot('The weather in Mexico City is currently sunny.') + + assert result.all_messages() == snapshot( + [ + ModelRequest( + parts=[ + UserPromptPart( + content='What is the weather in CDMX?', + timestamp=IsDatetime(), + ) + ] + ), + ModelResponse( + parts=[ + ToolCallPart( + tool_name='get_weather_in_city', + args='{"city":"CDMX"}', + tool_call_id=IsStr(), + ) + ], + usage=RequestUsage( + input_tokens=47, + output_tokens=17, + details={ + 'accepted_prediction_tokens': 0, + 'audio_tokens': 0, + 'reasoning_tokens': 0, + 'rejected_prediction_tokens': 0, + }, + ), + model_name='gpt-4o-2024-08-06', + timestamp=IsDatetime(), + provider_name='openai', + provider_response_id=IsStr(), + ), + ModelRequest( + parts=[ + RetryPromptPart( + content='Did you mean Mexico City?', + tool_name='get_weather_in_city', + tool_call_id=IsStr(), + timestamp=IsDatetime(), + ) + ] + ), + ModelResponse( + parts=[ + ToolCallPart( + tool_name='get_weather_in_city', + args='{"city":"Mexico City"}', + tool_call_id=IsStr(), + ) + ], + usage=RequestUsage( + input_tokens=87, + output_tokens=17, + details={ + 'accepted_prediction_tokens': 0, + 'audio_tokens': 0, + 
'reasoning_tokens': 0, + 'rejected_prediction_tokens': 0, + }, + ), + model_name='gpt-4o-2024-08-06', + timestamp=IsDatetime(), + provider_name='openai', + provider_response_id=IsStr(), + ), + ModelRequest( + parts=[ + ToolReturnPart( + tool_name='get_weather_in_city', + content='sunny', + tool_call_id=IsStr(), + timestamp=IsDatetime(), + ) + ] + ), + ModelResponse( + parts=[TextPart(content='The weather in Mexico City is currently sunny.')], + usage=RequestUsage( + input_tokens=116, + output_tokens=10, + details={ + 'accepted_prediction_tokens': 0, + 'audio_tokens': 0, + 'reasoning_tokens': 0, + 'rejected_prediction_tokens': 0, + }, + ), + model_name='gpt-4o-2024-08-06', + timestamp=IsDatetime(), + provider_name='openai', + provider_response_id=IsStr(), + ), + ] + ) diff --git a/tests/test_examples.py b/tests/test_examples.py index 54824ec1aa..f6402d4412 100644 --- a/tests/test_examples.py +++ b/tests/test_examples.py @@ -229,7 +229,7 @@ def print(self, *args: Any, **kwargs: Any) -> None: if opt_test.startswith('skip'): pytest.skip(opt_test[4:].lstrip(' -') or 'running code skipped') elif opt_test.startswith('ci_only') and os.environ.get('GITHUB_ACTIONS', '').lower() != 'true': - pytest.skip(opt_test[7:].lstrip(' -') or 'running code skipped in local tests') # pragma: no cover + pytest.skip(opt_test[7:].lstrip(' -') or 'running code skipped in local tests') # pragma: lax no cover else: test_globals: dict[str, str] = {'__name__': dunder_name} diff --git a/uv.lock b/uv.lock index 728e2fc53c..6359f6c5d9 100644 --- a/uv.lock +++ b/uv.lock @@ -146,6 +146,21 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/ec/6a/bc7e17a3e87a2985d3e8f4da4cd0f481060eb78fb08596c42be62c90a4d9/aiosignal-1.3.2-py2.py3-none-any.whl", hash = "sha256:45cde58e409a301715980c2b01d0c28bdde3770d8290b5eb2173759d9acb31a5", size = 7597, upload-time = "2024-12-13T17:10:38.469Z" }, ] +[[package]] +name = "alembic" +version = "1.16.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "mako" }, + { name = "sqlalchemy" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9a/ca/4dc52902cf3491892d464f5265a81e9dff094692c8a049a3ed6a05fe7ee8/alembic-1.16.5.tar.gz", hash = "sha256:a88bb7f6e513bd4301ecf4c7f2206fe93f9913f9b48dac3b78babde2d6fe765e", size = 1969868, upload-time = "2025-08-27T18:02:05.668Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/39/4a/4c61d4c84cfd9befb6fa08a702535b27b21fff08c946bc2f6139decbf7f7/alembic-1.16.5-py3-none-any.whl", hash = "sha256:e845dfe090c5ffa7b92593ae6687c5cb1a101e91fa53868497dbd79847f9dbe3", size = 247355, upload-time = "2025-08-27T18:02:07.37Z" }, +] + [[package]] name = "algoliasearch" version = "4.13.2" @@ -780,6 +795,53 @@ toml = [ { name = "tomli", marker = "python_full_version <= '3.11'" }, ] +[[package]] +name = "cryptography" +version = "45.0.7" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cffi", marker = "platform_python_implementation != 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a7/35/c495bffc2056f2dadb32434f1feedd79abde2a7f8363e1974afa9c33c7e2/cryptography-45.0.7.tar.gz", hash = "sha256:4b1654dfc64ea479c242508eb8c724044f1e964a47d1d1cacc5132292d851971", size = 744980, upload-time = "2025-09-01T11:15:03.146Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/0c/91/925c0ac74362172ae4516000fe877912e33b5983df735ff290c653de4913/cryptography-45.0.7-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:3be4f21c6245930688bd9e162829480de027f8bf962ede33d4f8ba7d67a00cee", size = 7041105, upload-time = "2025-09-01T11:13:59.684Z" }, + { url = "https://files.pythonhosted.org/packages/fc/63/43641c5acce3a6105cf8bd5baeceeb1846bb63067d26dae3e5db59f1513a/cryptography-45.0.7-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:67285f8a611b0ebc0857ced2081e30302909f571a46bfa7a3cc0ad303fe015c6", size = 4205799, upload-time = "2025-09-01T11:14:02.517Z" }, + { url = "https://files.pythonhosted.org/packages/bc/29/c238dd9107f10bfde09a4d1c52fd38828b1aa353ced11f358b5dd2507d24/cryptography-45.0.7-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:577470e39e60a6cd7780793202e63536026d9b8641de011ed9d8174da9ca5339", size = 4430504, upload-time = "2025-09-01T11:14:04.522Z" }, + { url = "https://files.pythonhosted.org/packages/62/62/24203e7cbcc9bd7c94739428cd30680b18ae6b18377ae66075c8e4771b1b/cryptography-45.0.7-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:4bd3e5c4b9682bc112d634f2c6ccc6736ed3635fc3319ac2bb11d768cc5a00d8", size = 4209542, upload-time = "2025-09-01T11:14:06.309Z" }, + { url = "https://files.pythonhosted.org/packages/cd/e3/e7de4771a08620eef2389b86cd87a2c50326827dea5528feb70595439ce4/cryptography-45.0.7-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:465ccac9d70115cd4de7186e60cfe989de73f7bb23e8a7aa45af18f7412e75bf", size = 3889244, upload-time = "2025-09-01T11:14:08.152Z" }, + { url = "https://files.pythonhosted.org/packages/96/b8/bca71059e79a0bb2f8e4ec61d9c205fbe97876318566cde3b5092529faa9/cryptography-45.0.7-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:16ede8a4f7929b4b7ff3642eba2bf79aa1d71f24ab6ee443935c0d269b6bc513", size = 4461975, upload-time = "2025-09-01T11:14:09.755Z" }, + { url = "https://files.pythonhosted.org/packages/58/67/3f5b26937fe1218c40e95ef4ff8d23c8dc05aa950d54200cc7ea5fb58d28/cryptography-45.0.7-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:8978132287a9d3ad6b54fcd1e08548033cc09dc6aacacb6c004c73c3eb5d3ac3", size = 4209082, upload-time = "2025-09-01T11:14:11.229Z" }, + { url = "https://files.pythonhosted.org/packages/0e/e4/b3e68a4ac363406a56cf7b741eeb80d05284d8c60ee1a55cdc7587e2a553/cryptography-45.0.7-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:b6a0e535baec27b528cb07a119f321ac024592388c5681a5ced167ae98e9fff3", size = 4460397, upload-time = "2025-09-01T11:14:12.924Z" }, + { url = "https://files.pythonhosted.org/packages/22/49/2c93f3cd4e3efc8cb22b02678c1fad691cff9dd71bb889e030d100acbfe0/cryptography-45.0.7-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:a24ee598d10befaec178efdff6054bc4d7e883f615bfbcd08126a0f4931c83a6", size = 4337244, upload-time = "2025-09-01T11:14:14.431Z" }, + { url = "https://files.pythonhosted.org/packages/04/19/030f400de0bccccc09aa262706d90f2ec23d56bc4eb4f4e8268d0ddf3fb8/cryptography-45.0.7-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:fa26fa54c0a9384c27fcdc905a2fb7d60ac6e47d14bc2692145f2b3b1e2cfdbd", size = 4568862, upload-time = "2025-09-01T11:14:16.185Z" }, + { url = "https://files.pythonhosted.org/packages/29/56/3034a3a353efa65116fa20eb3c990a8c9f0d3db4085429040a7eef9ada5f/cryptography-45.0.7-cp311-abi3-win32.whl", hash = "sha256:bef32a5e327bd8e5af915d3416ffefdbe65ed975b646b3805be81b23580b57b8", size = 2936578, upload-time = "2025-09-01T11:14:17.638Z" }, + { 
url = "https://files.pythonhosted.org/packages/b3/61/0ab90f421c6194705a99d0fa9f6ee2045d916e4455fdbb095a9c2c9a520f/cryptography-45.0.7-cp311-abi3-win_amd64.whl", hash = "sha256:3808e6b2e5f0b46d981c24d79648e5c25c35e59902ea4391a0dcb3e667bf7443", size = 3405400, upload-time = "2025-09-01T11:14:18.958Z" }, + { url = "https://files.pythonhosted.org/packages/63/e8/c436233ddf19c5f15b25ace33979a9dd2e7aa1a59209a0ee8554179f1cc0/cryptography-45.0.7-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:bfb4c801f65dd61cedfc61a83732327fafbac55a47282e6f26f073ca7a41c3b2", size = 7021824, upload-time = "2025-09-01T11:14:20.954Z" }, + { url = "https://files.pythonhosted.org/packages/bc/4c/8f57f2500d0ccd2675c5d0cc462095adf3faa8c52294ba085c036befb901/cryptography-45.0.7-cp37-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:81823935e2f8d476707e85a78a405953a03ef7b7b4f55f93f7c2d9680e5e0691", size = 4202233, upload-time = "2025-09-01T11:14:22.454Z" }, + { url = "https://files.pythonhosted.org/packages/eb/ac/59b7790b4ccaed739fc44775ce4645c9b8ce54cbec53edf16c74fd80cb2b/cryptography-45.0.7-cp37-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3994c809c17fc570c2af12c9b840d7cea85a9fd3e5c0e0491f4fa3c029216d59", size = 4423075, upload-time = "2025-09-01T11:14:24.287Z" }, + { url = "https://files.pythonhosted.org/packages/b8/56/d4f07ea21434bf891faa088a6ac15d6d98093a66e75e30ad08e88aa2b9ba/cryptography-45.0.7-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:dad43797959a74103cb59c5dac71409f9c27d34c8a05921341fb64ea8ccb1dd4", size = 4204517, upload-time = "2025-09-01T11:14:25.679Z" }, + { url = "https://files.pythonhosted.org/packages/e8/ac/924a723299848b4c741c1059752c7cfe09473b6fd77d2920398fc26bfb53/cryptography-45.0.7-cp37-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:ce7a453385e4c4693985b4a4a3533e041558851eae061a58a5405363b098fcd3", size = 3882893, upload-time = "2025-09-01T11:14:27.1Z" }, + { url = "https://files.pythonhosted.org/packages/83/dc/4dab2ff0a871cc2d81d3ae6d780991c0192b259c35e4d83fe1de18b20c70/cryptography-45.0.7-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:b04f85ac3a90c227b6e5890acb0edbaf3140938dbecf07bff618bf3638578cf1", size = 4450132, upload-time = "2025-09-01T11:14:28.58Z" }, + { url = "https://files.pythonhosted.org/packages/12/dd/b2882b65db8fc944585d7fb00d67cf84a9cef4e77d9ba8f69082e911d0de/cryptography-45.0.7-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:48c41a44ef8b8c2e80ca4527ee81daa4c527df3ecbc9423c41a420a9559d0e27", size = 4204086, upload-time = "2025-09-01T11:14:30.572Z" }, + { url = "https://files.pythonhosted.org/packages/5d/fa/1d5745d878048699b8eb87c984d4ccc5da4f5008dfd3ad7a94040caca23a/cryptography-45.0.7-cp37-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:f3df7b3d0f91b88b2106031fd995802a2e9ae13e02c36c1fc075b43f420f3a17", size = 4449383, upload-time = "2025-09-01T11:14:32.046Z" }, + { url = "https://files.pythonhosted.org/packages/36/8b/fc61f87931bc030598e1876c45b936867bb72777eac693e905ab89832670/cryptography-45.0.7-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:dd342f085542f6eb894ca00ef70236ea46070c8a13824c6bde0dfdcd36065b9b", size = 4332186, upload-time = "2025-09-01T11:14:33.95Z" }, + { url = "https://files.pythonhosted.org/packages/0b/11/09700ddad7443ccb11d674efdbe9a832b4455dc1f16566d9bd3834922ce5/cryptography-45.0.7-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:1993a1bb7e4eccfb922b6cd414f072e08ff5816702a0bdb8941c247a6b1b287c", size = 4561639, upload-time = "2025-09-01T11:14:35.343Z" }, + { url 
= "https://files.pythonhosted.org/packages/71/ed/8f4c1337e9d3b94d8e50ae0b08ad0304a5709d483bfcadfcc77a23dbcb52/cryptography-45.0.7-cp37-abi3-win32.whl", hash = "sha256:18fcf70f243fe07252dcb1b268a687f2358025ce32f9f88028ca5c364b123ef5", size = 2926552, upload-time = "2025-09-01T11:14:36.929Z" }, + { url = "https://files.pythonhosted.org/packages/bc/ff/026513ecad58dacd45d1d24ebe52b852165a26e287177de1d545325c0c25/cryptography-45.0.7-cp37-abi3-win_amd64.whl", hash = "sha256:7285a89df4900ed3bfaad5679b1e668cb4b38a8de1ccbfc84b05f34512da0a90", size = 3392742, upload-time = "2025-09-01T11:14:38.368Z" }, + { url = "https://files.pythonhosted.org/packages/13/3e/e42f1528ca1ea82256b835191eab1be014e0f9f934b60d98b0be8a38ed70/cryptography-45.0.7-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:de58755d723e86175756f463f2f0bddd45cc36fbd62601228a3f8761c9f58252", size = 3572442, upload-time = "2025-09-01T11:14:39.836Z" }, + { url = "https://files.pythonhosted.org/packages/59/aa/e947693ab08674a2663ed2534cd8d345cf17bf6a1facf99273e8ec8986dc/cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:a20e442e917889d1a6b3c570c9e3fa2fdc398c20868abcea268ea33c024c4083", size = 4142233, upload-time = "2025-09-01T11:14:41.305Z" }, + { url = "https://files.pythonhosted.org/packages/24/06/09b6f6a2fc43474a32b8fe259038eef1500ee3d3c141599b57ac6c57612c/cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:258e0dff86d1d891169b5af222d362468a9570e2532923088658aa866eb11130", size = 4376202, upload-time = "2025-09-01T11:14:43.047Z" }, + { url = "https://files.pythonhosted.org/packages/00/f2/c166af87e95ce6ae6d38471a7e039d3a0549c2d55d74e059680162052824/cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:d97cf502abe2ab9eff8bd5e4aca274da8d06dd3ef08b759a8d6143f4ad65d4b4", size = 4141900, upload-time = "2025-09-01T11:14:45.089Z" }, + { url = "https://files.pythonhosted.org/packages/16/b9/e96e0b6cb86eae27ea51fa8a3151535a18e66fe7c451fa90f7f89c85f541/cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:c987dad82e8c65ebc985f5dae5e74a3beda9d0a2a4daf8a1115f3772b59e5141", size = 4375562, upload-time = "2025-09-01T11:14:47.166Z" }, + { url = "https://files.pythonhosted.org/packages/36/d0/36e8ee39274e9d77baf7d0dafda680cba6e52f3936b846f0d56d64fec915/cryptography-45.0.7-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:c13b1e3afd29a5b3b2656257f14669ca8fa8d7956d509926f0b130b600b50ab7", size = 3322781, upload-time = "2025-09-01T11:14:48.747Z" }, + { url = "https://files.pythonhosted.org/packages/99/4e/49199a4c82946938a3e05d2e8ad9482484ba48bbc1e809e3d506c686d051/cryptography-45.0.7-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:4a862753b36620af6fc54209264f92c716367f2f0ff4624952276a6bbd18cbde", size = 3584634, upload-time = "2025-09-01T11:14:50.593Z" }, + { url = "https://files.pythonhosted.org/packages/16/ce/5f6ff59ea9c7779dba51b84871c19962529bdcc12e1a6ea172664916c550/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:06ce84dc14df0bf6ea84666f958e6080cdb6fe1231be2a51f3fc1267d9f3fb34", size = 4149533, upload-time = "2025-09-01T11:14:52.091Z" }, + { url = "https://files.pythonhosted.org/packages/ce/13/b3cfbd257ac96da4b88b46372e662009b7a16833bfc5da33bb97dd5631ae/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d0c5c6bac22b177bf8da7435d9d27a6834ee130309749d162b26c3105c0795a9", size = 4385557, upload-time = "2025-09-01T11:14:53.551Z" }, + { url = 
"https://files.pythonhosted.org/packages/1c/c5/8c59d6b7c7b439ba4fc8d0cab868027fd095f215031bc123c3a070962912/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:2f641b64acc00811da98df63df7d59fd4706c0df449da71cb7ac39a0732b40ae", size = 4149023, upload-time = "2025-09-01T11:14:55.022Z" }, + { url = "https://files.pythonhosted.org/packages/55/32/05385c86d6ca9ab0b4d5bb442d2e3d85e727939a11f3e163fc776ce5eb40/cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:f5414a788ecc6ee6bc58560e85ca624258a55ca434884445440a810796ea0e0b", size = 4385722, upload-time = "2025-09-01T11:14:57.319Z" }, + { url = "https://files.pythonhosted.org/packages/23/87/7ce86f3fa14bc11a5a48c30d8103c26e09b6465f8d8e9d74cf7a0714f043/cryptography-45.0.7-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:1f3d56f73595376f4244646dd5c5870c14c196949807be39e79e7bd9bac3da63", size = 3332908, upload-time = "2025-09-01T11:14:58.78Z" }, +] + [[package]] name = "cssselect2" version = "0.7.0" @@ -817,6 +879,34 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/eb/62/eb8157afb21bd229c864521c1ab4fa8e9b4f1b06bafdd8c4668a7a31b5dd/datasets-4.0.0-py3-none-any.whl", hash = "sha256:7ef95e62025fd122882dbce6cb904c8cd3fbc829de6669a5eb939c77d50e203d", size = 494825, upload-time = "2025-07-09T14:35:50.658Z" }, ] +[[package]] +name = "dbos" +version = "1.13.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "alembic" }, + { name = "cryptography" }, + { name = "docker" }, + { name = "fastapi", extra = ["standard"] }, + { name = "jsonpickle" }, + { name = "jsonschema" }, + { name = "opentelemetry-api" }, + { name = "opentelemetry-exporter-otlp-proto-http" }, + { name = "opentelemetry-sdk" }, + { name = "psycopg", extra = ["binary"] }, + { name = "pyjwt" }, + { name = "python-dateutil" }, + { name = "pyyaml" }, + { name = "rich" }, + { name = "tomlkit" }, + { name = "typer" }, + { name = "websockets" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/47/d2/fa283887930293f079400b5525ceab7f0b0301910cf0070c1aebe621d85f/dbos-1.13.0.tar.gz", hash = "sha256:40fd2419bd4e72821fe1726e12ff7fae7b6e4b482971cc43cef08693468bac4c", size = 209587, upload-time = "2025-09-02T18:56:50.996Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1a/a0/5cd6f990c5835067fdb71a29e507ed13cbbd92fbf0c857450dfa134099c0/dbos-1.13.0-py3-none-any.whl", hash = "sha256:fec850e95a92997c7c0dc05ebd702dde941221465006957a314207bad7caf45f", size = 147294, upload-time = "2025-09-02T18:56:48.772Z" }, +] + [[package]] name = "ddgs" version = "9.2.3" @@ -908,6 +998,29 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/12/b3/231ffd4ab1fc9d679809f356cebee130ac7daa00d6d6f3206dd4fd137e9e/distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2", size = 20277, upload-time = "2023-12-24T09:54:30.421Z" }, ] +[[package]] +name = "dnspython" +version = "2.7.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b5/4a/263763cb2ba3816dd94b08ad3a33d5fdae34ecb856678773cc40a3605829/dnspython-2.7.0.tar.gz", hash = "sha256:ce9c432eda0dc91cf618a5cedf1a4e142651196bbcd2c80e89ed5a907e5cfaf1", size = 345197, upload-time = "2024-10-05T20:14:59.362Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/68/1b/e0a87d256e40e8c888847551b20a017a6b98139178505dc7ffb96f04e954/dnspython-2.7.0-py3-none-any.whl", hash = 
"sha256:b4c34b7d10b51bcc3a5071e7b8dee77939f1e878477eeecc965e9835f63c6c86", size = 313632, upload-time = "2024-10-05T20:14:57.687Z" }, +] + +[[package]] +name = "docker" +version = "7.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pywin32", marker = "sys_platform == 'win32'" }, + { name = "requests" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/91/9b/4a2ea29aeba62471211598dac5d96825bb49348fa07e906ea930394a83ce/docker-7.1.0.tar.gz", hash = "sha256:ad8c70e6e3f8926cb8a92619b832b4ea5299e2831c14284663184e200546fa6c", size = 117834, upload-time = "2024-05-23T11:13:57.216Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e3/26/57c6fb270950d476074c087527a558ccb6f4436657314bfb6cdf484114c4/docker-7.1.0-py3-none-any.whl", hash = "sha256:c96b93b7f0a746f9e77d325bcfb87422a3d8bd4f03136ae8a85b37f1898d5fc0", size = 147774, upload-time = "2024-05-23T11:13:55.01Z" }, +] + [[package]] name = "duckdb" version = "1.3.2" @@ -958,6 +1071,19 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/75/21/fc2c821a2c92c021f8f8adf9fb36235d1b49525b7cd953e85624296aab94/duckduckgo_search-7.5.0-py3-none-any.whl", hash = "sha256:6a2d3f12ae29b3e076cd43be61f5f73cd95261e0a0f318fe0ad3648d7a5dff03", size = 20238, upload-time = "2025-02-24T14:50:48.179Z" }, ] +[[package]] +name = "email-validator" +version = "2.3.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "dnspython" }, + { name = "idna" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f5/22/900cb125c76b7aaa450ce02fd727f452243f2e91a61af068b40adba60ea9/email_validator-2.3.0.tar.gz", hash = "sha256:9fc05c37f2f6cf439ff414f8fc46d917929974a82244c20eb10231ba60c54426", size = 51238, upload-time = "2025-08-26T13:09:06.831Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/de/15/545e2b6cf2e3be84bc1ed85613edd75b8aea69807a71c26f4ca6a9258e82/email_validator-2.3.0-py3-none-any.whl", hash = "sha256:80f13f623413e6b197ae73bb10bf4eb0908faf509ad8362c5edeb0be7fd450b4", size = 35604, upload-time = "2025-08-26T13:09:05.858Z" }, +] + [[package]] name = "eval-type-backport" version = "0.2.2" @@ -1022,6 +1148,54 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/8f/7d/2d6ce181d7a5f51dedb8c06206cbf0ec026a99bf145edd309f9e17c3282f/fastapi-0.115.8-py3-none-any.whl", hash = "sha256:753a96dd7e036b34eeef8babdfcfe3f28ff79648f86551eb36bfc1b0bf4a8cbf", size = 94814, upload-time = "2025-01-30T14:06:38.564Z" }, ] +[package.optional-dependencies] +standard = [ + { name = "email-validator" }, + { name = "fastapi-cli", extra = ["standard"] }, + { name = "httpx" }, + { name = "jinja2" }, + { name = "python-multipart" }, + { name = "uvicorn", extra = ["standard"] }, +] + +[[package]] +name = "fastapi-cli" +version = "0.0.10" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "rich-toolkit" }, + { name = "typer" }, + { name = "uvicorn", extra = ["standard"] }, +] +sdist = { url = "https://files.pythonhosted.org/packages/31/b6/ed25b8874a27f684bf601990c48fcb3edb478edca2b9a38cc2ba196fb304/fastapi_cli-0.0.10.tar.gz", hash = "sha256:85a93df72ff834c3d2a356164512cabaf8f093d50eddad9309065a9c9ac5193a", size = 16994, upload-time = "2025-08-31T17:43:20.702Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7c/62/0f00036925c0614e333a2baf739c861453a6779331ffb47ec9a6147f860b/fastapi_cli-0.0.10-py3-none-any.whl", hash = "sha256:04bef56b49f7357c6c4acd4f793b4433ed3f511be431ed0af68db6d3f8bd44b3", size = 10851, 
upload-time = "2025-08-31T17:43:19.481Z" }, +] + +[package.optional-dependencies] +standard = [ + { name = "fastapi-cloud-cli" }, + { name = "uvicorn", extra = ["standard"] }, +] + +[[package]] +name = "fastapi-cloud-cli" +version = "0.1.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "httpx" }, + { name = "pydantic", extra = ["email"] }, + { name = "rich-toolkit" }, + { name = "rignore" }, + { name = "sentry-sdk" }, + { name = "typer" }, + { name = "uvicorn", extra = ["standard"] }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a9/2e/3b6e5016affc310e5109bc580f760586eabecea0c8a7ab067611cd849ac0/fastapi_cloud_cli-0.1.5.tar.gz", hash = "sha256:341ee585eb731a6d3c3656cb91ad38e5f39809bf1a16d41de1333e38635a7937", size = 22710, upload-time = "2025-07-28T13:30:48.216Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/a6/5aa862489a2918a096166fd98d9fe86b7fd53c607678b3fa9d8c432d88d5/fastapi_cloud_cli-0.1.5-py3-none-any.whl", hash = "sha256:d80525fb9c0e8af122370891f9fa83cf5d496e4ad47a8dd26c0496a6c85a012a", size = 18992, upload-time = "2025-07-28T13:30:47.427Z" }, +] + [[package]] name = "fastavro" version = "1.10.0" @@ -1281,6 +1455,57 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/16/52/4fe9dfc2239e7b748ad8dc3b80ad8755f5c9378432715193586c3ab74bf9/gradio_client-1.7.1-py3-none-any.whl", hash = "sha256:d7737bc473a2093549c06004379c42f0a3510a98095cf7cea9033837e252149f", size = 321994, upload-time = "2025-02-19T20:05:21.305Z" }, ] +[[package]] +name = "greenlet" +version = "3.2.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/03/b8/704d753a5a45507a7aab61f18db9509302ed3d0a27ac7e0359ec2905b1a6/greenlet-3.2.4.tar.gz", hash = "sha256:0dca0d95ff849f9a364385f36ab49f50065d76964944638be9691e1832e9f86d", size = 188260, upload-time = "2025-08-07T13:24:33.51Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7d/ed/6bfa4109fcb23a58819600392564fea69cdc6551ffd5e69ccf1d52a40cbc/greenlet-3.2.4-cp310-cp310-macosx_11_0_universal2.whl", hash = "sha256:8c68325b0d0acf8d91dde4e6f930967dd52a5302cd4062932a6b2e7c2969f47c", size = 271061, upload-time = "2025-08-07T13:17:15.373Z" }, + { url = "https://files.pythonhosted.org/packages/2a/fc/102ec1a2fc015b3a7652abab7acf3541d58c04d3d17a8d3d6a44adae1eb1/greenlet-3.2.4-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:94385f101946790ae13da500603491f04a76b6e4c059dab271b3ce2e283b2590", size = 629475, upload-time = "2025-08-07T13:42:54.009Z" }, + { url = "https://files.pythonhosted.org/packages/c5/26/80383131d55a4ac0fb08d71660fd77e7660b9db6bdb4e8884f46d9f2cc04/greenlet-3.2.4-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f10fd42b5ee276335863712fa3da6608e93f70629c631bf77145021600abc23c", size = 640802, upload-time = "2025-08-07T13:45:25.52Z" }, + { url = "https://files.pythonhosted.org/packages/9f/7c/e7833dbcd8f376f3326bd728c845d31dcde4c84268d3921afcae77d90d08/greenlet-3.2.4-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:c8c9e331e58180d0d83c5b7999255721b725913ff6bc6cf39fa2a45841a4fd4b", size = 636703, upload-time = "2025-08-07T13:53:12.622Z" }, + { url = "https://files.pythonhosted.org/packages/e9/49/547b93b7c0428ede7b3f309bc965986874759f7d89e4e04aeddbc9699acb/greenlet-3.2.4-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:58b97143c9cc7b86fc458f215bd0932f1757ce649e05b640fea2e79b54cedb31", size = 635417, upload-time = 
"2025-08-07T13:18:25.189Z" }, + { url = "https://files.pythonhosted.org/packages/7f/91/ae2eb6b7979e2f9b035a9f612cf70f1bf54aad4e1d125129bef1eae96f19/greenlet-3.2.4-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c2ca18a03a8cfb5b25bc1cbe20f3d9a4c80d8c3b13ba3df49ac3961af0b1018d", size = 584358, upload-time = "2025-08-07T13:18:23.708Z" }, + { url = "https://files.pythonhosted.org/packages/f7/85/433de0c9c0252b22b16d413c9407e6cb3b41df7389afc366ca204dbc1393/greenlet-3.2.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:9fe0a28a7b952a21e2c062cd5756d34354117796c6d9215a87f55e38d15402c5", size = 1113550, upload-time = "2025-08-07T13:42:37.467Z" }, + { url = "https://files.pythonhosted.org/packages/a1/8d/88f3ebd2bc96bf7747093696f4335a0a8a4c5acfcf1b757717c0d2474ba3/greenlet-3.2.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:8854167e06950ca75b898b104b63cc646573aa5fef1353d4508ecdd1ee76254f", size = 1137126, upload-time = "2025-08-07T13:18:20.239Z" }, + { url = "https://files.pythonhosted.org/packages/d6/6f/b60b0291d9623c496638c582297ead61f43c4b72eef5e9c926ef4565ec13/greenlet-3.2.4-cp310-cp310-win_amd64.whl", hash = "sha256:73f49b5368b5359d04e18d15828eecc1806033db5233397748f4ca813ff1056c", size = 298654, upload-time = "2025-08-07T13:50:00.469Z" }, + { url = "https://files.pythonhosted.org/packages/a4/de/f28ced0a67749cac23fecb02b694f6473f47686dff6afaa211d186e2ef9c/greenlet-3.2.4-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:96378df1de302bc38e99c3a9aa311967b7dc80ced1dcc6f171e99842987882a2", size = 272305, upload-time = "2025-08-07T13:15:41.288Z" }, + { url = "https://files.pythonhosted.org/packages/09/16/2c3792cba130000bf2a31c5272999113f4764fd9d874fb257ff588ac779a/greenlet-3.2.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:1ee8fae0519a337f2329cb78bd7a8e128ec0f881073d43f023c7b8d4831d5246", size = 632472, upload-time = "2025-08-07T13:42:55.044Z" }, + { url = "https://files.pythonhosted.org/packages/ae/8f/95d48d7e3d433e6dae5b1682e4292242a53f22df82e6d3dda81b1701a960/greenlet-3.2.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:94abf90142c2a18151632371140b3dba4dee031633fe614cb592dbb6c9e17bc3", size = 644646, upload-time = "2025-08-07T13:45:26.523Z" }, + { url = "https://files.pythonhosted.org/packages/d5/5e/405965351aef8c76b8ef7ad370e5da58d57ef6068df197548b015464001a/greenlet-3.2.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:4d1378601b85e2e5171b99be8d2dc85f594c79967599328f95c1dc1a40f1c633", size = 640519, upload-time = "2025-08-07T13:53:13.928Z" }, + { url = "https://files.pythonhosted.org/packages/25/5d/382753b52006ce0218297ec1b628e048c4e64b155379331f25a7316eb749/greenlet-3.2.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:0db5594dce18db94f7d1650d7489909b57afde4c580806b8d9203b6e79cdc079", size = 639707, upload-time = "2025-08-07T13:18:27.146Z" }, + { url = "https://files.pythonhosted.org/packages/1f/8e/abdd3f14d735b2929290a018ecf133c901be4874b858dd1c604b9319f064/greenlet-3.2.4-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2523e5246274f54fdadbce8494458a2ebdcdbc7b802318466ac5606d3cded1f8", size = 587684, upload-time = "2025-08-07T13:18:25.164Z" }, + { url = "https://files.pythonhosted.org/packages/5d/65/deb2a69c3e5996439b0176f6651e0052542bb6c8f8ec2e3fba97c9768805/greenlet-3.2.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1987de92fec508535687fb807a5cea1560f6196285a4cde35c100b8cd632cc52", size = 
1116647, upload-time = "2025-08-07T13:42:38.655Z" }, + { url = "https://files.pythonhosted.org/packages/3f/cc/b07000438a29ac5cfb2194bfc128151d52f333cee74dd7dfe3fb733fc16c/greenlet-3.2.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:55e9c5affaa6775e2c6b67659f3a71684de4c549b3dd9afca3bc773533d284fa", size = 1142073, upload-time = "2025-08-07T13:18:21.737Z" }, + { url = "https://files.pythonhosted.org/packages/d8/0f/30aef242fcab550b0b3520b8e3561156857c94288f0332a79928c31a52cf/greenlet-3.2.4-cp311-cp311-win_amd64.whl", hash = "sha256:9c40adce87eaa9ddb593ccb0fa6a07caf34015a29bf8d344811665b573138db9", size = 299100, upload-time = "2025-08-07T13:44:12.287Z" }, + { url = "https://files.pythonhosted.org/packages/44/69/9b804adb5fd0671f367781560eb5eb586c4d495277c93bde4307b9e28068/greenlet-3.2.4-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:3b67ca49f54cede0186854a008109d6ee71f66bd57bb36abd6d0a0267b540cdd", size = 274079, upload-time = "2025-08-07T13:15:45.033Z" }, + { url = "https://files.pythonhosted.org/packages/46/e9/d2a80c99f19a153eff70bc451ab78615583b8dac0754cfb942223d2c1a0d/greenlet-3.2.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ddf9164e7a5b08e9d22511526865780a576f19ddd00d62f8a665949327fde8bb", size = 640997, upload-time = "2025-08-07T13:42:56.234Z" }, + { url = "https://files.pythonhosted.org/packages/3b/16/035dcfcc48715ccd345f3a93183267167cdd162ad123cd93067d86f27ce4/greenlet-3.2.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f28588772bb5fb869a8eb331374ec06f24a83a9c25bfa1f38b6993afe9c1e968", size = 655185, upload-time = "2025-08-07T13:45:27.624Z" }, + { url = "https://files.pythonhosted.org/packages/31/da/0386695eef69ffae1ad726881571dfe28b41970173947e7c558d9998de0f/greenlet-3.2.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:5c9320971821a7cb77cfab8d956fa8e39cd07ca44b6070db358ceb7f8797c8c9", size = 649926, upload-time = "2025-08-07T13:53:15.251Z" }, + { url = "https://files.pythonhosted.org/packages/68/88/69bf19fd4dc19981928ceacbc5fd4bb6bc2215d53199e367832e98d1d8fe/greenlet-3.2.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c60a6d84229b271d44b70fb6e5fa23781abb5d742af7b808ae3f6efd7c9c60f6", size = 651839, upload-time = "2025-08-07T13:18:30.281Z" }, + { url = "https://files.pythonhosted.org/packages/19/0d/6660d55f7373b2ff8152401a83e02084956da23ae58cddbfb0b330978fe9/greenlet-3.2.4-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b3812d8d0c9579967815af437d96623f45c0f2ae5f04e366de62a12d83a8fb0", size = 607586, upload-time = "2025-08-07T13:18:28.544Z" }, + { url = "https://files.pythonhosted.org/packages/8e/1a/c953fdedd22d81ee4629afbb38d2f9d71e37d23caace44775a3a969147d4/greenlet-3.2.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:abbf57b5a870d30c4675928c37278493044d7c14378350b3aa5d484fa65575f0", size = 1123281, upload-time = "2025-08-07T13:42:39.858Z" }, + { url = "https://files.pythonhosted.org/packages/3f/c7/12381b18e21aef2c6bd3a636da1088b888b97b7a0362fac2e4de92405f97/greenlet-3.2.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:20fb936b4652b6e307b8f347665e2c615540d4b42b3b4c8a321d8286da7e520f", size = 1151142, upload-time = "2025-08-07T13:18:22.981Z" }, + { url = "https://files.pythonhosted.org/packages/e9/08/b0814846b79399e585f974bbeebf5580fbe59e258ea7be64d9dfb253c84f/greenlet-3.2.4-cp312-cp312-win_amd64.whl", hash = "sha256:a7d4e128405eea3814a12cc2605e0e6aedb4035bf32697f72deca74de4105e02", size = 299899, 
upload-time = "2025-08-07T13:38:53.448Z" }, + { url = "https://files.pythonhosted.org/packages/49/e8/58c7f85958bda41dafea50497cbd59738c5c43dbbea5ee83d651234398f4/greenlet-3.2.4-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:1a921e542453fe531144e91e1feedf12e07351b1cf6c9e8a3325ea600a715a31", size = 272814, upload-time = "2025-08-07T13:15:50.011Z" }, + { url = "https://files.pythonhosted.org/packages/62/dd/b9f59862e9e257a16e4e610480cfffd29e3fae018a68c2332090b53aac3d/greenlet-3.2.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:cd3c8e693bff0fff6ba55f140bf390fa92c994083f838fece0f63be121334945", size = 641073, upload-time = "2025-08-07T13:42:57.23Z" }, + { url = "https://files.pythonhosted.org/packages/f7/0b/bc13f787394920b23073ca3b6c4a7a21396301ed75a655bcb47196b50e6e/greenlet-3.2.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:710638eb93b1fa52823aa91bf75326f9ecdfd5e0466f00789246a5280f4ba0fc", size = 655191, upload-time = "2025-08-07T13:45:29.752Z" }, + { url = "https://files.pythonhosted.org/packages/f2/d6/6adde57d1345a8d0f14d31e4ab9c23cfe8e2cd39c3baf7674b4b0338d266/greenlet-3.2.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:c5111ccdc9c88f423426df3fd1811bfc40ed66264d35aa373420a34377efc98a", size = 649516, upload-time = "2025-08-07T13:53:16.314Z" }, + { url = "https://files.pythonhosted.org/packages/7f/3b/3a3328a788d4a473889a2d403199932be55b1b0060f4ddd96ee7cdfcad10/greenlet-3.2.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d76383238584e9711e20ebe14db6c88ddcedc1829a9ad31a584389463b5aa504", size = 652169, upload-time = "2025-08-07T13:18:32.861Z" }, + { url = "https://files.pythonhosted.org/packages/ee/43/3cecdc0349359e1a527cbf2e3e28e5f8f06d3343aaf82ca13437a9aa290f/greenlet-3.2.4-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:23768528f2911bcd7e475210822ffb5254ed10d71f4028387e5a99b4c6699671", size = 610497, upload-time = "2025-08-07T13:18:31.636Z" }, + { url = "https://files.pythonhosted.org/packages/b8/19/06b6cf5d604e2c382a6f31cafafd6f33d5dea706f4db7bdab184bad2b21d/greenlet-3.2.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:00fadb3fedccc447f517ee0d3fd8fe49eae949e1cd0f6a611818f4f6fb7dc83b", size = 1121662, upload-time = "2025-08-07T13:42:41.117Z" }, + { url = "https://files.pythonhosted.org/packages/a2/15/0d5e4e1a66fab130d98168fe984c509249c833c1a3c16806b90f253ce7b9/greenlet-3.2.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d25c5091190f2dc0eaa3f950252122edbbadbb682aa7b1ef2f8af0f8c0afefae", size = 1149210, upload-time = "2025-08-07T13:18:24.072Z" }, + { url = "https://files.pythonhosted.org/packages/0b/55/2321e43595e6801e105fcfdee02b34c0f996eb71e6ddffca6b10b7e1d771/greenlet-3.2.4-cp313-cp313-win_amd64.whl", hash = "sha256:554b03b6e73aaabec3745364d6239e9e012d64c68ccd0b8430c64ccc14939a8b", size = 299685, upload-time = "2025-08-07T13:24:38.824Z" }, + { url = "https://files.pythonhosted.org/packages/22/5c/85273fd7cc388285632b0498dbbab97596e04b154933dfe0f3e68156c68c/greenlet-3.2.4-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:49a30d5fda2507ae77be16479bdb62a660fa51b1eb4928b524975b3bde77b3c0", size = 273586, upload-time = "2025-08-07T13:16:08.004Z" }, + { url = "https://files.pythonhosted.org/packages/d1/75/10aeeaa3da9332c2e761e4c50d4c3556c21113ee3f0afa2cf5769946f7a3/greenlet-3.2.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = 
"sha256:299fd615cd8fc86267b47597123e3f43ad79c9d8a22bebdce535e53550763e2f", size = 686346, upload-time = "2025-08-07T13:42:59.944Z" }, + { url = "https://files.pythonhosted.org/packages/c0/aa/687d6b12ffb505a4447567d1f3abea23bd20e73a5bed63871178e0831b7a/greenlet-3.2.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:c17b6b34111ea72fc5a4e4beec9711d2226285f0386ea83477cbb97c30a3f3a5", size = 699218, upload-time = "2025-08-07T13:45:30.969Z" }, + { url = "https://files.pythonhosted.org/packages/dc/8b/29aae55436521f1d6f8ff4e12fb676f3400de7fcf27fccd1d4d17fd8fecd/greenlet-3.2.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b4a1870c51720687af7fa3e7cda6d08d801dae660f75a76f3845b642b4da6ee1", size = 694659, upload-time = "2025-08-07T13:53:17.759Z" }, + { url = "https://files.pythonhosted.org/packages/92/2e/ea25914b1ebfde93b6fc4ff46d6864564fba59024e928bdc7de475affc25/greenlet-3.2.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:061dc4cf2c34852b052a8620d40f36324554bc192be474b9e9770e8c042fd735", size = 695355, upload-time = "2025-08-07T13:18:34.517Z" }, + { url = "https://files.pythonhosted.org/packages/72/60/fc56c62046ec17f6b0d3060564562c64c862948c9d4bc8aa807cf5bd74f4/greenlet-3.2.4-cp314-cp314-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:44358b9bf66c8576a9f57a590d5f5d6e72fa4228b763d0e43fee6d3b06d3a337", size = 657512, upload-time = "2025-08-07T13:18:33.969Z" }, + { url = "https://files.pythonhosted.org/packages/e3/a5/6ddab2b4c112be95601c13428db1d8b6608a8b6039816f2ba09c346c08fc/greenlet-3.2.4-cp314-cp314-win_amd64.whl", hash = "sha256:e37ab26028f12dbb0ff65f29a8d3d44a765c61e729647bf2ddfbbed621726f01", size = 303425, upload-time = "2025-08-07T13:32:27.59Z" }, +] + [[package]] name = "griffe" version = "1.5.7" @@ -1391,6 +1616,42 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/87/f5/72347bc88306acb359581ac4d52f23c0ef445b57157adedb9aee0cd689d2/httpcore-1.0.7-py3-none-any.whl", hash = "sha256:a3fff8f43dc260d5bd363d9f9cf1830fa3a458b332856f34282de498ed420edd", size = 78551, upload-time = "2024-11-15T12:30:45.782Z" }, ] +[[package]] +name = "httptools" +version = "0.6.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a7/9a/ce5e1f7e131522e6d3426e8e7a490b3a01f39a6696602e1c4f33f9e94277/httptools-0.6.4.tar.gz", hash = "sha256:4e93eee4add6493b59a5c514da98c939b244fce4a0d8879cd3f466562f4b7d5c", size = 240639, upload-time = "2024-10-16T19:45:08.902Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3b/6f/972f8eb0ea7d98a1c6be436e2142d51ad2a64ee18e02b0e7ff1f62171ab1/httptools-0.6.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:3c73ce323711a6ffb0d247dcd5a550b8babf0f757e86a52558fe5b86d6fefcc0", size = 198780, upload-time = "2024-10-16T19:44:06.882Z" }, + { url = "https://files.pythonhosted.org/packages/6a/b0/17c672b4bc5c7ba7f201eada4e96c71d0a59fbc185e60e42580093a86f21/httptools-0.6.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:345c288418f0944a6fe67be8e6afa9262b18c7626c3ef3c28adc5eabc06a68da", size = 103297, upload-time = "2024-10-16T19:44:08.129Z" }, + { url = "https://files.pythonhosted.org/packages/92/5e/b4a826fe91971a0b68e8c2bd4e7db3e7519882f5a8ccdb1194be2b3ab98f/httptools-0.6.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:deee0e3343f98ee8047e9f4c5bc7cedbf69f5734454a94c38ee829fb2d5fa3c1", size = 443130, upload-time = "2024-10-16T19:44:09.45Z" }, + { url = 
"https://files.pythonhosted.org/packages/b0/51/ce61e531e40289a681a463e1258fa1e05e0be54540e40d91d065a264cd8f/httptools-0.6.4-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ca80b7485c76f768a3bc83ea58373f8db7b015551117375e4918e2aa77ea9b50", size = 442148, upload-time = "2024-10-16T19:44:11.539Z" }, + { url = "https://files.pythonhosted.org/packages/ea/9e/270b7d767849b0c96f275c695d27ca76c30671f8eb8cc1bab6ced5c5e1d0/httptools-0.6.4-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:90d96a385fa941283ebd231464045187a31ad932ebfa541be8edf5b3c2328959", size = 415949, upload-time = "2024-10-16T19:44:13.388Z" }, + { url = "https://files.pythonhosted.org/packages/81/86/ced96e3179c48c6f656354e106934e65c8963d48b69be78f355797f0e1b3/httptools-0.6.4-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:59e724f8b332319e2875efd360e61ac07f33b492889284a3e05e6d13746876f4", size = 417591, upload-time = "2024-10-16T19:44:15.258Z" }, + { url = "https://files.pythonhosted.org/packages/75/73/187a3f620ed3175364ddb56847d7a608a6fc42d551e133197098c0143eca/httptools-0.6.4-cp310-cp310-win_amd64.whl", hash = "sha256:c26f313951f6e26147833fc923f78f95604bbec812a43e5ee37f26dc9e5a686c", size = 88344, upload-time = "2024-10-16T19:44:16.54Z" }, + { url = "https://files.pythonhosted.org/packages/7b/26/bb526d4d14c2774fe07113ca1db7255737ffbb119315839af2065abfdac3/httptools-0.6.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:f47f8ed67cc0ff862b84a1189831d1d33c963fb3ce1ee0c65d3b0cbe7b711069", size = 199029, upload-time = "2024-10-16T19:44:18.427Z" }, + { url = "https://files.pythonhosted.org/packages/a6/17/3e0d3e9b901c732987a45f4f94d4e2c62b89a041d93db89eafb262afd8d5/httptools-0.6.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:0614154d5454c21b6410fdf5262b4a3ddb0f53f1e1721cfd59d55f32138c578a", size = 103492, upload-time = "2024-10-16T19:44:19.515Z" }, + { url = "https://files.pythonhosted.org/packages/b7/24/0fe235d7b69c42423c7698d086d4db96475f9b50b6ad26a718ef27a0bce6/httptools-0.6.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f8787367fbdfccae38e35abf7641dafc5310310a5987b689f4c32cc8cc3ee975", size = 462891, upload-time = "2024-10-16T19:44:21.067Z" }, + { url = "https://files.pythonhosted.org/packages/b1/2f/205d1f2a190b72da6ffb5f41a3736c26d6fa7871101212b15e9b5cd8f61d/httptools-0.6.4-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40b0f7fe4fd38e6a507bdb751db0379df1e99120c65fbdc8ee6c1d044897a636", size = 459788, upload-time = "2024-10-16T19:44:22.958Z" }, + { url = "https://files.pythonhosted.org/packages/6e/4c/d09ce0eff09057a206a74575ae8f1e1e2f0364d20e2442224f9e6612c8b9/httptools-0.6.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:40a5ec98d3f49904b9fe36827dcf1aadfef3b89e2bd05b0e35e94f97c2b14721", size = 433214, upload-time = "2024-10-16T19:44:24.513Z" }, + { url = "https://files.pythonhosted.org/packages/3e/d2/84c9e23edbccc4a4c6f96a1b8d99dfd2350289e94f00e9ccc7aadde26fb5/httptools-0.6.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:dacdd3d10ea1b4ca9df97a0a303cbacafc04b5cd375fa98732678151643d4988", size = 434120, upload-time = "2024-10-16T19:44:26.295Z" }, + { url = "https://files.pythonhosted.org/packages/d0/46/4d8e7ba9581416de1c425b8264e2cadd201eb709ec1584c381f3e98f51c1/httptools-0.6.4-cp311-cp311-win_amd64.whl", hash = "sha256:288cd628406cc53f9a541cfaf06041b4c71d751856bab45e3702191f931ccd17", size = 88565, upload-time = 
"2024-10-16T19:44:29.188Z" }, + { url = "https://files.pythonhosted.org/packages/bb/0e/d0b71465c66b9185f90a091ab36389a7352985fe857e352801c39d6127c8/httptools-0.6.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:df017d6c780287d5c80601dafa31f17bddb170232d85c066604d8558683711a2", size = 200683, upload-time = "2024-10-16T19:44:30.175Z" }, + { url = "https://files.pythonhosted.org/packages/e2/b8/412a9bb28d0a8988de3296e01efa0bd62068b33856cdda47fe1b5e890954/httptools-0.6.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:85071a1e8c2d051b507161f6c3e26155b5c790e4e28d7f236422dbacc2a9cc44", size = 104337, upload-time = "2024-10-16T19:44:31.786Z" }, + { url = "https://files.pythonhosted.org/packages/9b/01/6fb20be3196ffdc8eeec4e653bc2a275eca7f36634c86302242c4fbb2760/httptools-0.6.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69422b7f458c5af875922cdb5bd586cc1f1033295aa9ff63ee196a87519ac8e1", size = 508796, upload-time = "2024-10-16T19:44:32.825Z" }, + { url = "https://files.pythonhosted.org/packages/f7/d8/b644c44acc1368938317d76ac991c9bba1166311880bcc0ac297cb9d6bd7/httptools-0.6.4-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:16e603a3bff50db08cd578d54f07032ca1631450ceb972c2f834c2b860c28ea2", size = 510837, upload-time = "2024-10-16T19:44:33.974Z" }, + { url = "https://files.pythonhosted.org/packages/52/d8/254d16a31d543073a0e57f1c329ca7378d8924e7e292eda72d0064987486/httptools-0.6.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:ec4f178901fa1834d4a060320d2f3abc5c9e39766953d038f1458cb885f47e81", size = 485289, upload-time = "2024-10-16T19:44:35.111Z" }, + { url = "https://files.pythonhosted.org/packages/5f/3c/4aee161b4b7a971660b8be71a92c24d6c64372c1ab3ae7f366b3680df20f/httptools-0.6.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:f9eb89ecf8b290f2e293325c646a211ff1c2493222798bb80a530c5e7502494f", size = 489779, upload-time = "2024-10-16T19:44:36.253Z" }, + { url = "https://files.pythonhosted.org/packages/12/b7/5cae71a8868e555f3f67a50ee7f673ce36eac970f029c0c5e9d584352961/httptools-0.6.4-cp312-cp312-win_amd64.whl", hash = "sha256:db78cb9ca56b59b016e64b6031eda5653be0589dba2b1b43453f6e8b405a0970", size = 88634, upload-time = "2024-10-16T19:44:37.357Z" }, + { url = "https://files.pythonhosted.org/packages/94/a3/9fe9ad23fd35f7de6b91eeb60848986058bd8b5a5c1e256f5860a160cc3e/httptools-0.6.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ade273d7e767d5fae13fa637f4d53b6e961fb7fd93c7797562663f0171c26660", size = 197214, upload-time = "2024-10-16T19:44:38.738Z" }, + { url = "https://files.pythonhosted.org/packages/ea/d9/82d5e68bab783b632023f2fa31db20bebb4e89dfc4d2293945fd68484ee4/httptools-0.6.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:856f4bc0478ae143bad54a4242fccb1f3f86a6e1be5548fecfd4102061b3a083", size = 102431, upload-time = "2024-10-16T19:44:39.818Z" }, + { url = "https://files.pythonhosted.org/packages/96/c1/cb499655cbdbfb57b577734fde02f6fa0bbc3fe9fb4d87b742b512908dff/httptools-0.6.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:322d20ea9cdd1fa98bd6a74b77e2ec5b818abdc3d36695ab402a0de8ef2865a3", size = 473121, upload-time = "2024-10-16T19:44:41.189Z" }, + { url = "https://files.pythonhosted.org/packages/af/71/ee32fd358f8a3bb199b03261f10921716990808a675d8160b5383487a317/httptools-0.6.4-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:4d87b29bd4486c0093fc64dea80231f7c7f7eb4dc70ae394d70a495ab8436071", size = 473805, upload-time = "2024-10-16T19:44:42.384Z" }, + { url = "https://files.pythonhosted.org/packages/8a/0a/0d4df132bfca1507114198b766f1737d57580c9ad1cf93c1ff673e3387be/httptools-0.6.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:342dd6946aa6bda4b8f18c734576106b8a31f2fe31492881a9a160ec84ff4bd5", size = 448858, upload-time = "2024-10-16T19:44:43.959Z" }, + { url = "https://files.pythonhosted.org/packages/1e/6a/787004fdef2cabea27bad1073bf6a33f2437b4dbd3b6fb4a9d71172b1c7c/httptools-0.6.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4b36913ba52008249223042dca46e69967985fb4051951f94357ea681e1f5dc0", size = 452042, upload-time = "2024-10-16T19:44:45.071Z" }, + { url = "https://files.pythonhosted.org/packages/4d/dc/7decab5c404d1d2cdc1bb330b1bf70e83d6af0396fd4fc76fc60c0d522bf/httptools-0.6.4-cp313-cp313-win_amd64.whl", hash = "sha256:28908df1b9bb8187393d5b5db91435ccc9c8e891657f9cbb42a2541b44c82fc8", size = 87682, upload-time = "2024-10-16T19:44:46.46Z" }, +] + [[package]] name = "httpx" version = "0.28.1" @@ -1574,6 +1835,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/31/b4/b9b800c45527aadd64d5b442f9b932b00648617eb5d63d2c7a6587b7cafc/jmespath-1.0.1-py3-none-any.whl", hash = "sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980", size = 20256, upload-time = "2022-06-17T18:00:10.251Z" }, ] +[[package]] +name = "jsonpickle" +version = "4.1.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e4/a6/d07afcfdef402900229bcca795f80506b207af13a838d4d99ad45abf530c/jsonpickle-4.1.1.tar.gz", hash = "sha256:f86e18f13e2b96c1c1eede0b7b90095bbb61d99fedc14813c44dc2f361dbbae1", size = 316885, upload-time = "2025-06-02T20:36:11.57Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c1/73/04df8a6fa66d43a9fd45c30f283cc4afff17da671886e451d52af60bdc7e/jsonpickle-4.1.1-py3-none-any.whl", hash = "sha256:bb141da6057898aa2438ff268362b126826c812a1721e31cf08a6e142910dc91", size = 47125, upload-time = "2025-06-02T20:36:08.647Z" }, +] + [[package]] name = "jsonschema" version = "4.25.0" @@ -1725,6 +1995,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/83/29/00b9b0322a473aee6cda87473401c9abb19506cd650cc69a8aa38277ea74/lxml-5.3.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:48fd46bf7155def2e15287c6f2b133a2f78e2d22cdf55647269977b873c65499", size = 3487718, upload-time = "2025-02-10T07:50:31.231Z" }, ] +[[package]] +name = "mako" +version = "1.3.10" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9e/38/bd5b78a920a64d708fe6bc8e0a2c075e1389d53bef8413725c63ba041535/mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28", size = 392474, upload-time = "2025-04-10T12:44:31.16Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/87/fb/99f81ac72ae23375f22b7afdb7642aba97c00a713c217124420147681a2f/mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59", size = 78509, upload-time = "2025-04-10T12:50:53.297Z" }, +] + [[package]] name = "markdown" version = "3.7" @@ -2865,6 +3147,75 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/fd/b2/ab07b09e0f6d143dfb839693aa05765257bceaa13d03bf1a696b78323e7a/protobuf-5.29.3-py3-none-any.whl", hash = 
"sha256:0a18ed4a24198528f2333802eb075e59dea9d679ab7a6c5efb017a59004d849f", size = 172550, upload-time = "2025-01-08T21:38:50.439Z" }, ] +[[package]] +name = "psycopg" +version = "3.2.9" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, + { name = "tzdata", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/27/4a/93a6ab570a8d1a4ad171a1f4256e205ce48d828781312c0bbaff36380ecb/psycopg-3.2.9.tar.gz", hash = "sha256:2fbb46fcd17bc81f993f28c47f1ebea38d66ae97cc2dbc3cad73b37cefbff700", size = 158122, upload-time = "2025-05-13T16:11:15.533Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/44/b0/a73c195a56eb6b92e937a5ca58521a5c3346fb233345adc80fd3e2f542e2/psycopg-3.2.9-py3-none-any.whl", hash = "sha256:01a8dadccdaac2123c916208c96e06631641c0566b22005493f09663c7a8d3b6", size = 202705, upload-time = "2025-05-13T16:06:26.584Z" }, +] + +[package.optional-dependencies] +binary = [ + { name = "psycopg-binary", marker = "implementation_name != 'pypy'" }, +] + +[[package]] +name = "psycopg-binary" +version = "3.2.9" +source = { registry = "https://pypi.org/simple" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b6/ce/d677bc51f9b180986e5515268603519cee682eb6b5e765ae46cdb8526579/psycopg_binary-3.2.9-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:528239bbf55728ba0eacbd20632342867590273a9bacedac7538ebff890f1093", size = 4033081, upload-time = "2025-05-13T16:06:29.666Z" }, + { url = "https://files.pythonhosted.org/packages/de/f4/b56263eb20dc36d71d7188622872098400536928edf86895736e28546b3c/psycopg_binary-3.2.9-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e4978c01ca4c208c9d6376bd585e2c0771986b76ff7ea518f6d2b51faece75e8", size = 4082141, upload-time = "2025-05-13T16:06:33.81Z" }, + { url = "https://files.pythonhosted.org/packages/68/47/5316c3b0a2b1ff5f1d440a27638250569994534874a2ce88bf24f5c51c0f/psycopg_binary-3.2.9-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1ed2bab85b505d13e66a914d0f8cdfa9475c16d3491cf81394e0748b77729af2", size = 4678993, upload-time = "2025-05-13T16:06:36.309Z" }, + { url = "https://files.pythonhosted.org/packages/53/24/b2c667b59f07fd7d7805c0c2074351bf2b98a336c5030d961db316512ffb/psycopg_binary-3.2.9-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:799fa1179ab8a58d1557a95df28b492874c8f4135101b55133ec9c55fc9ae9d7", size = 4500117, upload-time = "2025-05-13T16:06:38.847Z" }, + { url = "https://files.pythonhosted.org/packages/ae/91/a08f8878b0fe0b34b083c149df950bce168bc1b18b2fe849fa42bf4378d4/psycopg_binary-3.2.9-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bb37ac3955d19e4996c3534abfa4f23181333974963826db9e0f00731274b695", size = 4766985, upload-time = "2025-05-13T16:06:42.502Z" }, + { url = "https://files.pythonhosted.org/packages/10/be/3a45d5b7d8f4c4332fd42465f2170b5aef4d28a7c79e79ac7e5e1dac74d7/psycopg_binary-3.2.9-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:001e986656f7e06c273dd4104e27f4b4e0614092e544d950c7c938d822b1a894", size = 4461990, upload-time = "2025-05-13T16:06:45.971Z" }, + { url = "https://files.pythonhosted.org/packages/03/ce/20682b9a4fc270d8dc644a0b16c1978732146c6ff0abbc48fbab2f4a70aa/psycopg_binary-3.2.9-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:fa5c80d8b4cbf23f338db88a7251cef8bb4b68e0f91cf8b6ddfa93884fdbb0c1", size = 3777947, upload-time = "2025-05-13T16:06:49.134Z" }, 
+ { url = "https://files.pythonhosted.org/packages/07/5c/f6d486e00bcd8709908ccdd436b2a190d390dfd61e318de4060bc6ee2a1e/psycopg_binary-3.2.9-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:39a127e0cf9b55bd4734a8008adf3e01d1fd1cb36339c6a9e2b2cbb6007c50ee", size = 3337502, upload-time = "2025-05-13T16:06:51.378Z" }, + { url = "https://files.pythonhosted.org/packages/0b/a1/086508e929c0123a7f532840bb0a0c8a1ebd7e06aef3ee7fa44a3589bcdf/psycopg_binary-3.2.9-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:fb7599e436b586e265bea956751453ad32eb98be6a6e694252f4691c31b16edb", size = 3440809, upload-time = "2025-05-13T16:06:54.552Z" }, + { url = "https://files.pythonhosted.org/packages/40/f2/3a347a0f894355a6b173fca2202eca279b6197727b24e4896cf83f4263ee/psycopg_binary-3.2.9-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:5d2c9fe14fe42b3575a0b4e09b081713e83b762c8dc38a3771dd3265f8f110e7", size = 3497231, upload-time = "2025-05-13T16:06:58.858Z" }, + { url = "https://files.pythonhosted.org/packages/18/31/0845a385eb6f4521b398793293b5f746a101e80d5c43792990442d26bc2e/psycopg_binary-3.2.9-cp310-cp310-win_amd64.whl", hash = "sha256:7e4660fad2807612bb200de7262c88773c3483e85d981324b3c647176e41fdc8", size = 2936845, upload-time = "2025-05-13T16:07:02.712Z" }, + { url = "https://files.pythonhosted.org/packages/b6/84/259ea58aca48e03c3c793b4ccfe39ed63db7b8081ef784d039330d9eed96/psycopg_binary-3.2.9-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2504e9fd94eabe545d20cddcc2ff0da86ee55d76329e1ab92ecfcc6c0a8156c4", size = 4040785, upload-time = "2025-05-13T16:07:07.569Z" }, + { url = "https://files.pythonhosted.org/packages/25/22/ce58ffda2b7e36e45042b4d67f1bbd4dd2ccf4cfd2649696685c61046475/psycopg_binary-3.2.9-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:093a0c079dd6228a7f3c3d82b906b41964eaa062a9a8c19f45ab4984bf4e872b", size = 4087601, upload-time = "2025-05-13T16:07:11.75Z" }, + { url = "https://files.pythonhosted.org/packages/c6/4f/b043e85268650c245025e80039b79663d8986f857bc3d3a72b1de67f3550/psycopg_binary-3.2.9-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:387c87b51d72442708e7a853e7e7642717e704d59571da2f3b29e748be58c78a", size = 4676524, upload-time = "2025-05-13T16:07:17.038Z" }, + { url = "https://files.pythonhosted.org/packages/da/29/7afbfbd3740ea52fda488db190ef2ef2a9ff7379b85501a2142fb9f7dd56/psycopg_binary-3.2.9-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d9ac10a2ebe93a102a326415b330fff7512f01a9401406896e78a81d75d6eddc", size = 4495671, upload-time = "2025-05-13T16:07:21.709Z" }, + { url = "https://files.pythonhosted.org/packages/ea/eb/df69112d18a938cbb74efa1573082248437fa663ba66baf2cdba8a95a2d0/psycopg_binary-3.2.9-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:72fdbda5b4c2a6a72320857ef503a6589f56d46821592d4377c8c8604810342b", size = 4768132, upload-time = "2025-05-13T16:07:25.818Z" }, + { url = "https://files.pythonhosted.org/packages/76/fe/4803b20220c04f508f50afee9169268553f46d6eed99640a08c8c1e76409/psycopg_binary-3.2.9-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f34e88940833d46108f949fdc1fcfb74d6b5ae076550cd67ab59ef47555dba95", size = 4458394, upload-time = "2025-05-13T16:07:29.148Z" }, + { url = "https://files.pythonhosted.org/packages/0f/0f/5ecc64607ef6f62b04e610b7837b1a802ca6f7cb7211339f5d166d55f1dd/psycopg_binary-3.2.9-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a3e0f89fe35cb03ff1646ab663dabf496477bab2a072315192dbaa6928862891", size = 3776879, 
upload-time = "2025-05-13T16:07:32.503Z" }, + { url = "https://files.pythonhosted.org/packages/c8/d8/1c3d6e99b7db67946d0eac2cd15d10a79aa7b1e3222ce4aa8e7df72027f5/psycopg_binary-3.2.9-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:6afb3e62f2a3456f2180a4eef6b03177788df7ce938036ff7f09b696d418d186", size = 3333329, upload-time = "2025-05-13T16:07:35.555Z" }, + { url = "https://files.pythonhosted.org/packages/d7/02/a4e82099816559f558ccaf2b6945097973624dc58d5d1c91eb1e54e5a8e9/psycopg_binary-3.2.9-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:cc19ed5c7afca3f6b298bfc35a6baa27adb2019670d15c32d0bb8f780f7d560d", size = 3435683, upload-time = "2025-05-13T16:07:37.863Z" }, + { url = "https://files.pythonhosted.org/packages/91/e4/f27055290d58e8818bed8a297162a096ef7f8ecdf01d98772d4b02af46c4/psycopg_binary-3.2.9-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:bc75f63653ce4ec764c8f8c8b0ad9423e23021e1c34a84eb5f4ecac8538a4a4a", size = 3497124, upload-time = "2025-05-13T16:07:40.567Z" }, + { url = "https://files.pythonhosted.org/packages/67/3d/17ed07579625529534605eeaeba34f0536754a5667dbf20ea2624fc80614/psycopg_binary-3.2.9-cp311-cp311-win_amd64.whl", hash = "sha256:3db3ba3c470801e94836ad78bf11fd5fab22e71b0c77343a1ee95d693879937a", size = 2939520, upload-time = "2025-05-13T16:07:45.467Z" }, + { url = "https://files.pythonhosted.org/packages/29/6f/ec9957e37a606cd7564412e03f41f1b3c3637a5be018d0849914cb06e674/psycopg_binary-3.2.9-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:be7d650a434921a6b1ebe3fff324dbc2364393eb29d7672e638ce3e21076974e", size = 4022205, upload-time = "2025-05-13T16:07:48.195Z" }, + { url = "https://files.pythonhosted.org/packages/6b/ba/497b8bea72b20a862ac95a94386967b745a472d9ddc88bc3f32d5d5f0d43/psycopg_binary-3.2.9-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6a76b4722a529390683c0304501f238b365a46b1e5fb6b7249dbc0ad6fea51a0", size = 4083795, upload-time = "2025-05-13T16:07:50.917Z" }, + { url = "https://files.pythonhosted.org/packages/42/07/af9503e8e8bdad3911fd88e10e6a29240f9feaa99f57d6fac4a18b16f5a0/psycopg_binary-3.2.9-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:96a551e4683f1c307cfc3d9a05fec62c00a7264f320c9962a67a543e3ce0d8ff", size = 4655043, upload-time = "2025-05-13T16:07:54.857Z" }, + { url = "https://files.pythonhosted.org/packages/28/ed/aff8c9850df1648cc6a5cc7a381f11ee78d98a6b807edd4a5ae276ad60ad/psycopg_binary-3.2.9-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:61d0a6ceed8f08c75a395bc28cb648a81cf8dee75ba4650093ad1a24a51c8724", size = 4477972, upload-time = "2025-05-13T16:07:57.925Z" }, + { url = "https://files.pythonhosted.org/packages/5c/bd/8e9d1b77ec1a632818fe2f457c3a65af83c68710c4c162d6866947d08cc5/psycopg_binary-3.2.9-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad280bbd409bf598683dda82232f5215cfc5f2b1bf0854e409b4d0c44a113b1d", size = 4737516, upload-time = "2025-05-13T16:08:01.616Z" }, + { url = "https://files.pythonhosted.org/packages/46/ec/222238f774cd5a0881f3f3b18fb86daceae89cc410f91ef6a9fb4556f236/psycopg_binary-3.2.9-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:76eddaf7fef1d0994e3d536ad48aa75034663d3a07f6f7e3e601105ae73aeff6", size = 4436160, upload-time = "2025-05-13T16:08:04.278Z" }, + { url = "https://files.pythonhosted.org/packages/37/78/af5af2a1b296eeca54ea7592cd19284739a844974c9747e516707e7b3b39/psycopg_binary-3.2.9-cp312-cp312-musllinux_1_2_aarch64.whl", hash = 
"sha256:52e239cd66c4158e412318fbe028cd94b0ef21b0707f56dcb4bdc250ee58fd40", size = 3753518, upload-time = "2025-05-13T16:08:07.567Z" }, + { url = "https://files.pythonhosted.org/packages/ec/ac/8a3ed39ea069402e9e6e6a2f79d81a71879708b31cc3454283314994b1ae/psycopg_binary-3.2.9-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:08bf9d5eabba160dd4f6ad247cf12f229cc19d2458511cab2eb9647f42fa6795", size = 3313598, upload-time = "2025-05-13T16:08:09.999Z" }, + { url = "https://files.pythonhosted.org/packages/da/43/26549af068347c808fbfe5f07d2fa8cef747cfff7c695136172991d2378b/psycopg_binary-3.2.9-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:1b2cf018168cad87580e67bdde38ff5e51511112f1ce6ce9a8336871f465c19a", size = 3407289, upload-time = "2025-05-13T16:08:12.66Z" }, + { url = "https://files.pythonhosted.org/packages/67/55/ea8d227c77df8e8aec880ded398316735add8fda5eb4ff5cc96fac11e964/psycopg_binary-3.2.9-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:14f64d1ac6942ff089fc7e926440f7a5ced062e2ed0949d7d2d680dc5c00e2d4", size = 3472493, upload-time = "2025-05-13T16:08:15.672Z" }, + { url = "https://files.pythonhosted.org/packages/3c/02/6ff2a5bc53c3cd653d281666728e29121149179c73fddefb1e437024c192/psycopg_binary-3.2.9-cp312-cp312-win_amd64.whl", hash = "sha256:7a838852e5afb6b4126f93eb409516a8c02a49b788f4df8b6469a40c2157fa21", size = 2927400, upload-time = "2025-05-13T16:08:18.652Z" }, + { url = "https://files.pythonhosted.org/packages/28/0b/f61ff4e9f23396aca674ed4d5c9a5b7323738021d5d72d36d8b865b3deaf/psycopg_binary-3.2.9-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:98bbe35b5ad24a782c7bf267596638d78aa0e87abc7837bdac5b2a2ab954179e", size = 4017127, upload-time = "2025-05-13T16:08:21.391Z" }, + { url = "https://files.pythonhosted.org/packages/bc/00/7e181fb1179fbfc24493738b61efd0453d4b70a0c4b12728e2b82db355fd/psycopg_binary-3.2.9-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:72691a1615ebb42da8b636c5ca9f2b71f266be9e172f66209a361c175b7842c5", size = 4080322, upload-time = "2025-05-13T16:08:24.049Z" }, + { url = "https://files.pythonhosted.org/packages/58/fd/94fc267c1d1392c4211e54ccb943be96ea4032e761573cf1047951887494/psycopg_binary-3.2.9-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:25ab464bfba8c401f5536d5aa95f0ca1dd8257b5202eede04019b4415f491351", size = 4655097, upload-time = "2025-05-13T16:08:27.376Z" }, + { url = "https://files.pythonhosted.org/packages/41/17/31b3acf43de0b2ba83eac5878ff0dea5a608ca2a5c5dd48067999503a9de/psycopg_binary-3.2.9-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0e8aeefebe752f46e3c4b769e53f1d4ad71208fe1150975ef7662c22cca80fab", size = 4482114, upload-time = "2025-05-13T16:08:30.781Z" }, + { url = "https://files.pythonhosted.org/packages/85/78/b4d75e5fd5a85e17f2beb977abbba3389d11a4536b116205846b0e1cf744/psycopg_binary-3.2.9-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b7e4e4dd177a8665c9ce86bc9caae2ab3aa9360b7ce7ec01827ea1baea9ff748", size = 4737693, upload-time = "2025-05-13T16:08:34.625Z" }, + { url = "https://files.pythonhosted.org/packages/3b/95/7325a8550e3388b00b5e54f4ced5e7346b531eb4573bf054c3dbbfdc14fe/psycopg_binary-3.2.9-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7fc2915949e5c1ea27a851f7a472a7da7d0a40d679f0a31e42f1022f3c562e87", size = 4437423, upload-time = "2025-05-13T16:08:37.444Z" }, + { url = 
"https://files.pythonhosted.org/packages/1a/db/cef77d08e59910d483df4ee6da8af51c03bb597f500f1fe818f0f3b925d3/psycopg_binary-3.2.9-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a1fa38a4687b14f517f049477178093c39c2a10fdcced21116f47c017516498f", size = 3758667, upload-time = "2025-05-13T16:08:40.116Z" }, + { url = "https://files.pythonhosted.org/packages/95/3e/252fcbffb47189aa84d723b54682e1bb6d05c8875fa50ce1ada914ae6e28/psycopg_binary-3.2.9-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:5be8292d07a3ab828dc95b5ee6b69ca0a5b2e579a577b39671f4f5b47116dfd2", size = 3320576, upload-time = "2025-05-13T16:08:43.243Z" }, + { url = "https://files.pythonhosted.org/packages/1c/cd/9b5583936515d085a1bec32b45289ceb53b80d9ce1cea0fef4c782dc41a7/psycopg_binary-3.2.9-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:778588ca9897b6c6bab39b0d3034efff4c5438f5e3bd52fda3914175498202f9", size = 3411439, upload-time = "2025-05-13T16:08:47.321Z" }, + { url = "https://files.pythonhosted.org/packages/45/6b/6f1164ea1634c87956cdb6db759e0b8c5827f989ee3cdff0f5c70e8331f2/psycopg_binary-3.2.9-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f0d5b3af045a187aedbd7ed5fc513bd933a97aaff78e61c3745b330792c4345b", size = 3477477, upload-time = "2025-05-13T16:08:51.166Z" }, + { url = "https://files.pythonhosted.org/packages/7b/1d/bf54cfec79377929da600c16114f0da77a5f1670f45e0c3af9fcd36879bc/psycopg_binary-3.2.9-cp313-cp313-win_amd64.whl", hash = "sha256:2290bc146a1b6a9730350f695e8b670e1d1feb8446597bed0bbe7c3c30e0abcb", size = 2928009, upload-time = "2025-05-13T16:08:53.67Z" }, +] + [[package]] name = "pyarrow" version = "21.0.0" @@ -2953,6 +3304,11 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/6a/c0/ec2b1c8712ca690e5d61979dee872603e92b8a32f94cc1b72d53beab008a/pydantic-2.11.7-py3-none-any.whl", hash = "sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b", size = 444782, upload-time = "2025-06-14T08:33:14.905Z" }, ] +[package.optional-dependencies] +email = [ + { name = "email-validator" }, +] + [[package]] name = "pydantic-ai" source = { editable = "." 
} @@ -2964,6 +3320,9 @@ dependencies = [ a2a = [ { name = "fasta2a" }, ] +dbos = [ + { name = "pydantic-ai-slim", extra = ["dbos"] }, +] examples = [ { name = "pydantic-ai-examples" }, ] @@ -3016,8 +3375,9 @@ requires-dist = [ { name = "fasta2a", marker = "extra == 'a2a'", specifier = ">=0.4.1" }, { name = "pydantic-ai-examples", marker = "extra == 'examples'", editable = "examples" }, { name = "pydantic-ai-slim", extras = ["ag-ui", "anthropic", "bedrock", "cli", "cohere", "evals", "google", "groq", "huggingface", "logfire", "mcp", "mistral", "openai", "retries", "temporal", "vertexai"], editable = "pydantic_ai_slim" }, + { name = "pydantic-ai-slim", extras = ["dbos"], marker = "extra == 'dbos'", editable = "pydantic_ai_slim" }, ] -provides-extras = ["a2a", "examples"] +provides-extras = ["a2a", "dbos", "examples"] [package.metadata.requires-dev] dev = [ @@ -3139,6 +3499,9 @@ cli = [ cohere = [ { name = "cohere", marker = "sys_platform != 'emscripten'" }, ] +dbos = [ + { name = "dbos" }, +] duckduckgo = [ { name = "ddgs" }, ] @@ -3187,6 +3550,7 @@ requires-dist = [ { name = "argcomplete", marker = "extra == 'cli'", specifier = ">=3.5.0" }, { name = "boto3", marker = "extra == 'bedrock'", specifier = ">=1.39.0" }, { name = "cohere", marker = "sys_platform != 'emscripten' and extra == 'cohere'", specifier = ">=5.16.0" }, + { name = "dbos", marker = "extra == 'dbos'", specifier = ">=1.13.0" }, { name = "ddgs", marker = "extra == 'duckduckgo'", specifier = ">=9.0.0" }, { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, { name = "fasta2a", marker = "extra == 'a2a'", specifier = ">=0.4.1" }, @@ -3215,7 +3579,7 @@ requires-dist = [ { name = "tenacity", marker = "extra == 'retries'", specifier = ">=8.2.3" }, { name = "typing-inspection", specifier = ">=0.4.0" }, ] -provides-extras = ["a2a", "ag-ui", "anthropic", "bedrock", "cli", "cohere", "duckduckgo", "evals", "google", "groq", "huggingface", "logfire", "mcp", "mistral", "openai", "retries", "tavily", "temporal", "vertexai"] +provides-extras = ["a2a", "ag-ui", "anthropic", "bedrock", "cli", "cohere", "dbos", "duckduckgo", "evals", "google", "groq", "huggingface", "logfire", "mcp", "mistral", "openai", "retries", "tavily", "temporal", "vertexai"] [[package]] name = "pydantic-core" @@ -3382,6 +3746,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/8a/0b/9fcc47d19c48b59121088dd6da2488a49d5f72dacf8262e2790a1d2c7d15/pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c", size = 1225293, upload-time = "2025-01-06T17:26:25.553Z" }, ] +[[package]] +name = "pyjwt" +version = "2.10.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e7/46/bd74733ff231675599650d3e47f361794b22ef3e3770998dda30d3b63726/pyjwt-2.10.1.tar.gz", hash = "sha256:3cc5772eb20009233caf06e9d8a0577824723b44e6648ee0a2aedb6cf9381953", size = 87785, upload-time = "2024-11-28T03:43:29.933Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/61/ad/689f02752eeec26aed679477e80e632ef1b682313be70793d798c1d5fc8f/PyJWT-2.10.1-py3-none-any.whl", hash = "sha256:dcdd193e30abefd5debf142f9adfcdd2b58004e644f25406ffaebd50bd98dacb", size = 22997, upload-time = "2024-11-28T03:43:27.893Z" }, +] + [[package]] name = "pymdown-extensions" version = "10.14.3" @@ -3726,6 +4099,114 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/19/71/39c7c0d87f8d4e6c020a393182060eaefeeae6c01dab6a84ec346f2567df/rich-13.9.4-py3-none-any.whl", hash = 
"sha256:6049d5e6ec054bf2779ab3358186963bac2ea89175919d699e378b99738c2a90", size = 242424, upload-time = "2024-11-01T16:43:55.817Z" }, ] +[[package]] +name = "rich-toolkit" +version = "0.15.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "rich" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/65/36/cdb3d51371ad0cccbf1541506304783bd72d55790709b8eb68c0d401a13a/rich_toolkit-0.15.0.tar.gz", hash = "sha256:3f5730e9f2d36d0bfe01cf723948b7ecf4cc355d2b71e2c00e094f7963128c09", size = 115118, upload-time = "2025-08-11T10:55:37.909Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/75/e4/b0794eefb3cf78566b15e5bf576492c1d4a92ce5f6da55675bc11e9ef5d8/rich_toolkit-0.15.0-py3-none-any.whl", hash = "sha256:ddb91008283d4a7989fd8ff0324a48773a7a2276229c6a3070755645538ef1bb", size = 29062, upload-time = "2025-08-11T10:55:37.152Z" }, +] + +[[package]] +name = "rignore" +version = "0.6.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/73/46/05a94dc55ac03cf931d18e43b86ecee5ee054cb88b7853fffd741e35009c/rignore-0.6.4.tar.gz", hash = "sha256:e893fdd2d7fdcfa9407d0b7600ef2c2e2df97f55e1c45d4a8f54364829ddb0ab", size = 11633, upload-time = "2025-07-19T19:24:46.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ff/27/55ec2871e42c0a01669f7741598a5948f04bd32f3975478a0bead9e7e251/rignore-0.6.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:c201375cfe76e56e61fcdfe50d0882aafb49544b424bfc828e0508dc9fbc431b", size = 888088, upload-time = "2025-07-19T19:23:50.776Z" }, + { url = "https://files.pythonhosted.org/packages/3d/e0/6be3d7adf91f7d67f08833a29dea4f7c345554b385f9a797c397f6685f29/rignore-0.6.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4962d537e377394292c4828e1e9c620618dd8daa49ba746abe533733a89f8644", size = 824159, upload-time = "2025-07-19T19:23:44.395Z" }, + { url = "https://files.pythonhosted.org/packages/99/b7/fbb56b8cfa27971f9a19e87769dae0cb648343226eddda94ded32be2afc3/rignore-0.6.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8a6dd2f213cff6ca3c4d257fa3f5b0c7d4f6c23fe83bf292425fbe8d0c9c908a", size = 892493, upload-time = "2025-07-19T19:22:32.061Z" }, + { url = "https://files.pythonhosted.org/packages/d5/cf/21f130801c29c1fcf22f00a41d7530cef576819ee1a26c86bdb7bb06a0f2/rignore-0.6.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:64d379193f86a21fc93762783f36651927f54d5eea54c4922fdccb5e37076ed2", size = 872810, upload-time = "2025-07-19T19:22:45.554Z" }, + { url = "https://files.pythonhosted.org/packages/e4/4a/474a627263ef13a0ac28a0ce3a20932fbe41f6043f7280da47c7aca1f586/rignore-0.6.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:53c4f8682cf645b7a9160e0f1786af3201ed54a020bb4abd515c970043387127", size = 1160488, upload-time = "2025-07-19T19:22:58.359Z" }, + { url = "https://files.pythonhosted.org/packages/0b/c7/a10c180f77cbb456ab483c28e52efd6166cee787f11d21cb1d369b89e961/rignore-0.6.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:af1246e672bd835a17d3ae91579b3c235ec55b10924ef22608d3e9ec90fa2699", size = 938780, upload-time = "2025-07-19T19:23:10.604Z" }, + { url = "https://files.pythonhosted.org/packages/32/68/8e67701e8cc9f157f12b3742e14f14e395c7f3a497720c7f6aab7e5cdec4/rignore-0.6.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:82eed48fbc3097af418862e3c5c26fa81aa993e0d8b5f3a0a9a29cc6975eedff", size = 950347, upload-time = "2025-07-19T19:23:33.759Z" }, + { url = "https://files.pythonhosted.org/packages/1e/11/8eef123a2d029ed697b119806a0ca8a99d9457500c40b4d26cd21860eb89/rignore-0.6.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:df1215a071d42fd857fb6363c13803fbd915d48eaeaa9b103fb2266ba89c8995", size = 976679, upload-time = "2025-07-19T19:23:23.813Z" }, + { url = "https://files.pythonhosted.org/packages/09/7e/9584f4e4b3c1587ae09f286a14dab2376895d782be632289d151cb952432/rignore-0.6.4-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:82f2d318e66756066ed664015d8ca720078ab1d319377f1f61e3f4d01325faea", size = 1067469, upload-time = "2025-07-19T19:23:57.616Z" }, + { url = "https://files.pythonhosted.org/packages/c3/2c/d3515693b89c47761822219bb519cefd0cd45a38ff82c35a4ccdd8e95deb/rignore-0.6.4-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:e7d4258fc81051097c4d4c6ad17f0100c40088dbd2c6c31fc3c888a1d5a16190", size = 1136199, upload-time = "2025-07-19T19:24:09.922Z" }, + { url = "https://files.pythonhosted.org/packages/e7/39/94ea41846547ebb87d16527a3e978c8918632a060f77669a492f8a90b8b9/rignore-0.6.4-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:a0d0b9ec7929df8fd35ae89cb56619850dc140869139d61a2f4fa2941d2d1878", size = 1111179, upload-time = "2025-07-19T19:24:21.908Z" }, + { url = "https://files.pythonhosted.org/packages/ce/77/9acda68c7cea4d5dd027ef63163e0be30008f635acd75ea801e4c443fcdd/rignore-0.6.4-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8883d079b948ffcd56b67572831c9b8949eca7fe2e8f7bdbf7691c7a9388f054", size = 1121143, upload-time = "2025-07-19T19:24:33.958Z" }, + { url = "https://files.pythonhosted.org/packages/05/67/d1489e9224f33b9a87b7f870650bcab582ee3452df286bcb2fbb6a7ba257/rignore-0.6.4-cp310-cp310-win32.whl", hash = "sha256:5aeac5b354e15eb9f7857b02ad2af12ae2c2ed25a61921b0bd7e272774530f77", size = 643131, upload-time = "2025-07-19T19:24:54.437Z" }, + { url = "https://files.pythonhosted.org/packages/5d/d1/7d668bed51d3f0895e875e57c8e42f421635cdbcb96652ab24f297c9c5cf/rignore-0.6.4-cp310-cp310-win_amd64.whl", hash = "sha256:90419f881d05a1febb0578a175aa3e51d149ded1875421ed75a8af4392b7fe56", size = 721109, upload-time = "2025-07-19T19:24:47.458Z" }, + { url = "https://files.pythonhosted.org/packages/be/11/66992d271dbc44eac33f3b6b871855bc17e511b9279a2a0982b44c2b0c01/rignore-0.6.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:85f684dfc2c497e35ad34ffd6744a3bcdcac273ec1dbe7d0464bfa20f3331434", size = 888239, upload-time = "2025-07-19T19:23:51.835Z" }, + { url = "https://files.pythonhosted.org/packages/cb/1b/a9bde714e474043f97a06097925cf11e4597f9453adc267427d05ff9f38e/rignore-0.6.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:23954acc6debc852dbccbffbb70f0e26b12d230239e1ad0638eb5540694d0308", size = 824348, upload-time = "2025-07-19T19:23:45.54Z" }, + { url = "https://files.pythonhosted.org/packages/db/58/dabba227fee6553f9be069f58128419b6d4954c784c4cd566cfe59955c1f/rignore-0.6.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b2bf793bd58dbf3dee063a758b23ea446b5f037370405ecefc78e1e8923fc658", size = 892419, upload-time = "2025-07-19T19:22:33.763Z" }, + { url = "https://files.pythonhosted.org/packages/2c/fa/e3c16368ee32d6d1146cf219b127fd5c7e6baf22cad7a7a5967782ff3b20/rignore-0.6.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1eaeaa5a904e098604ea2012383a721de06211c8b4013abf0d41c3cfeb982f4f", size = 873285, 
upload-time = "2025-07-19T19:22:46.67Z" }, + { url = "https://files.pythonhosted.org/packages/78/9d/ef43d760dc3d18011d8482692b478785a846bba64157844b3068e428739c/rignore-0.6.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a48bdbeb03093e3fac2b40d62a718c59b5bb4f29cfdc8e7cbb360e1ea7bf0056", size = 1160457, upload-time = "2025-07-19T19:22:59.457Z" }, + { url = "https://files.pythonhosted.org/packages/95/de/eca1b035705e0b4e6c630fd1fcec45d14cf354a4acea88cf29ea0a322fea/rignore-0.6.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a8c5f9452d116be405f0967160b449c46ac929b50eaf527f33ee4680e3716e39", size = 938833, upload-time = "2025-07-19T19:23:11.657Z" }, + { url = "https://files.pythonhosted.org/packages/d4/2d/58912efa4137e989616d679a5390b53e93d5150be47217dd686ff60cd4cd/rignore-0.6.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6cf1039bfbdaa0f9710a6fb75436c25ca26d364881ec4d1e66d466bb36a7fb98", size = 950603, upload-time = "2025-07-19T19:23:35.245Z" }, + { url = "https://files.pythonhosted.org/packages/6f/3d/9827cc1c7674d8d884d3d231a224a2db8ea8eae075a1611dfdcd0c301e20/rignore-0.6.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:136629eb0ec2b6ac6ab34e71ce8065a07106fe615a53eceefc30200d528a4612", size = 976867, upload-time = "2025-07-19T19:23:24.919Z" }, + { url = "https://files.pythonhosted.org/packages/75/47/9dcee35e24897b62d66f7578f127bc91465c942a9d702d516d3fe7dcaa00/rignore-0.6.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:35e3d0ebaf01086e6454c3fecae141e2db74a5ddf4a97c72c69428baeff0b7d4", size = 1067603, upload-time = "2025-07-19T19:23:58.765Z" }, + { url = "https://files.pythonhosted.org/packages/4b/68/f66e7c0b0fc009f3e19ba8e6c3078a227285e3aecd9f6498d39df808cdfd/rignore-0.6.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:7ed1f9010fa1ef5ea0b69803d1dfb4b7355921779e03a30396034c52691658bc", size = 1136289, upload-time = "2025-07-19T19:24:11.136Z" }, + { url = "https://files.pythonhosted.org/packages/a6/b7/6fff161fe3ae5c0e0a0dded9a428e41d31c7fefc4e57c7553b9ffb064139/rignore-0.6.4-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:c16e9e898ed0afe2e20fa8d6412e02bd13f039f7e0d964a289368efd4d9ad320", size = 1111566, upload-time = "2025-07-19T19:24:23.065Z" }, + { url = "https://files.pythonhosted.org/packages/1f/c5/a5978ad65074a08dad46233a3333d154ae9cb9339325f3c181002a174746/rignore-0.6.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:7e6bc0bdcd404a7a8268629e8e99967127bb41e02d9eb09a471364c4bc25e215", size = 1121142, upload-time = "2025-07-19T19:24:35.151Z" }, + { url = "https://files.pythonhosted.org/packages/e8/af/91f084374b95dc2477a4bd066957beb3b61b551f2364b4f7f5bc52c9e4c7/rignore-0.6.4-cp311-cp311-win32.whl", hash = "sha256:fdd59bd63d2a49cc6d4f3598f285552ccb1a41e001df1012e0e0345cf2cabf79", size = 643031, upload-time = "2025-07-19T19:24:55.541Z" }, + { url = "https://files.pythonhosted.org/packages/07/3a/31672aa957aebba8903005313697127bbbad9db3afcfc9857150301fab1d/rignore-0.6.4-cp311-cp311-win_amd64.whl", hash = "sha256:7bf5be0e8a01845e57b5faa47ef9c623bb2070aa2f743c2fc73321ffaae45701", size = 721003, upload-time = "2025-07-19T19:24:48.867Z" }, + { url = "https://files.pythonhosted.org/packages/ec/6c/e5af4383cdd7829ef9aa63ac82a6507983e02dbc7c2e7b9aa64b7b8e2c7a/rignore-0.6.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:74720d074b79f32449d5d212ce732e0144a294a184246d1f1e7bcc1fc5c83b69", size = 885885, upload-time = "2025-07-19T19:23:53.236Z" }, + { url = 
"https://files.pythonhosted.org/packages/89/3e/1b02a868830e464769aa417ee195ac352fe71ff818df8ce50c4b998edb9c/rignore-0.6.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0a8184fcf567bd6b6d7b85a0c138d98dd40f63054141c96b175844414c5530d7", size = 819736, upload-time = "2025-07-19T19:23:46.565Z" }, + { url = "https://files.pythonhosted.org/packages/e0/75/b9be0c523d97c09f3c6508a67ce376aba4efe41c333c58903a0d7366439a/rignore-0.6.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bcb0d7d7ecc3fbccf6477bb187c04a091579ea139f15f139abe0b3b48bdfef69", size = 892779, upload-time = "2025-07-19T19:22:35.167Z" }, + { url = "https://files.pythonhosted.org/packages/91/f4/3064b06233697f2993485d132f06fe95061fef71631485da75aed246c4fd/rignore-0.6.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:feac73377a156fb77b3df626c76f7e5893d9b4e9e886ac8c0f9d44f1206a2a91", size = 872116, upload-time = "2025-07-19T19:22:47.828Z" }, + { url = "https://files.pythonhosted.org/packages/99/94/cb8e7af9a3c0a665f10e2366144e0ebc66167cf846aca5f1ac31b3661598/rignore-0.6.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:465179bc30beb1f7a3439e428739a2b5777ed26660712b8c4e351b15a7c04483", size = 1163345, upload-time = "2025-07-19T19:23:00.557Z" }, + { url = "https://files.pythonhosted.org/packages/86/6b/49faa7ad85ceb6ccef265df40091d9992232d7f6055fa664fe0a8b13781c/rignore-0.6.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4a4877b4dca9cf31a4d09845b300c677c86267657540d0b4d3e6d0ce3110e6e9", size = 939967, upload-time = "2025-07-19T19:23:13.494Z" }, + { url = "https://files.pythonhosted.org/packages/80/c8/b91afda10bd5ca1e3a80463340b899c0dc26a7750a9f3c94f668585c7f40/rignore-0.6.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:456456802b1e77d1e2d149320ee32505b8183e309e228129950b807d204ddd17", size = 949717, upload-time = "2025-07-19T19:23:36.404Z" }, + { url = "https://files.pythonhosted.org/packages/3f/f1/88bfdde58ae3fb1c1a92bb801f492eea8eafcdaf05ab9b75130023a4670b/rignore-0.6.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4c1ff2fc223f1d9473d36923160af37bf765548578eb9d47a2f52e90da8ae408", size = 975534, upload-time = "2025-07-19T19:23:25.988Z" }, + { url = "https://files.pythonhosted.org/packages/aa/8f/a80b4a2e48ceba56ba19e096d41263d844757e10aa36ede212571b5d8117/rignore-0.6.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:e445fbc214ae18e0e644a78086ea5d0f579e210229a4fbe86367d11a4cd03c11", size = 1067837, upload-time = "2025-07-19T19:23:59.888Z" }, + { url = "https://files.pythonhosted.org/packages/7d/90/0905597af0e78748909ef58418442a480ddd93e9fc89b0ca9ab170c357c0/rignore-0.6.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:e07d9c5270fc869bc431aadcfb6ed0447f89b8aafaa666914c077435dc76a123", size = 1134959, upload-time = "2025-07-19T19:24:12.396Z" }, + { url = "https://files.pythonhosted.org/packages/cc/7d/0fa29adf9183b61947ce6dc8a1a9779a8ea16573f557be28ec893f6ddbaa/rignore-0.6.4-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:7a6ccc0ea83d2c0c6df6b166f2acacedcc220a516436490f41e99a5ae73b6019", size = 1109708, upload-time = "2025-07-19T19:24:24.176Z" }, + { url = "https://files.pythonhosted.org/packages/4e/a7/92892ed86b2e36da403dd3a0187829f2d880414cef75bd612bfdf4dedebc/rignore-0.6.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:536392c5ec91755db48389546c833c4ab1426fe03e5a8522992b54ef8a244e7e", size = 1120546, upload-time = "2025-07-19T19:24:36.377Z" }, + { 
url = "https://files.pythonhosted.org/packages/31/1b/d29ae1fe901d523741d6d1d3ffe0d630734dd0ed6b047628a69c1e15ea44/rignore-0.6.4-cp312-cp312-win32.whl", hash = "sha256:f5f9dca46fc41c0a1e236767f68be9d63bdd2726db13a0ae3a30f68414472969", size = 642005, upload-time = "2025-07-19T19:24:56.671Z" }, + { url = "https://files.pythonhosted.org/packages/1a/41/a224944824688995374e4525115ce85fecd82442fc85edd5bcd81f4f256d/rignore-0.6.4-cp312-cp312-win_amd64.whl", hash = "sha256:e02eecb9e1b9f9bf7c9030ae73308a777bed3b2486204cc74dfcfbe699ab1497", size = 720358, upload-time = "2025-07-19T19:24:49.959Z" }, + { url = "https://files.pythonhosted.org/packages/db/a3/edd7d0d5cc0720de132b6651cef95ee080ce5fca11c77d8a47db848e5f90/rignore-0.6.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:2b3b1e266ce45189240d14dfa1057f8013ea34b9bc8b3b44125ec8d25fdb3985", size = 885304, upload-time = "2025-07-19T19:23:54.268Z" }, + { url = "https://files.pythonhosted.org/packages/93/a1/d8d2fb97a6548307507d049b7e93885d4a0dfa1c907af5983fd9f9362a21/rignore-0.6.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45fe803628cc14714df10e8d6cdc23950a47eb9eb37dfea9a4779f4c672d2aa0", size = 818799, upload-time = "2025-07-19T19:23:47.544Z" }, + { url = "https://files.pythonhosted.org/packages/b1/cd/949981fcc180ad5ba7b31c52e78b74b2dea6b7bf744ad4c0c4b212f6da78/rignore-0.6.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e439f034277a947a4126e2da79dbb43e33d73d7c09d3d72a927e02f8a16f59aa", size = 892024, upload-time = "2025-07-19T19:22:36.18Z" }, + { url = "https://files.pythonhosted.org/packages/b0/d3/9042d701a8062d9c88f87760bbc2695ee2c23b3f002d34486b72a85f8efe/rignore-0.6.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:84b5121650ae24621154c7bdba8b8970b0739d8146505c9f38e0cda9385d1004", size = 871430, upload-time = "2025-07-19T19:22:49.62Z" }, + { url = "https://files.pythonhosted.org/packages/eb/50/3370249b984212b7355f3d9241aa6d02e706067c6d194a2614dfbc0f5b27/rignore-0.6.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:52b0957b585ab48a445cf8ac1dbc33a272ab060835e583b4f95aa8c67c23fb2b", size = 1160559, upload-time = "2025-07-19T19:23:01.629Z" }, + { url = "https://files.pythonhosted.org/packages/6c/6f/2ad7f925838091d065524f30a8abda846d1813eee93328febf262b5cda21/rignore-0.6.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:50359e0d5287b5e2743bd2f2fbf05df619c8282fd3af12f6628ff97b9675551d", size = 939947, upload-time = "2025-07-19T19:23:14.608Z" }, + { url = "https://files.pythonhosted.org/packages/1f/01/626ec94d62475ae7ef8b00ef98cea61cbea52a389a666703c97c4673d406/rignore-0.6.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:efe18096dcb1596757dfe0b412aab6d32564473ae7ee58dea0a8b4be5b1a2e3b", size = 949471, upload-time = "2025-07-19T19:23:37.521Z" }, + { url = "https://files.pythonhosted.org/packages/e8/c3/699c4f03b3c46f4b5c02f17a0a339225da65aad547daa5b03001e7c6a382/rignore-0.6.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b79c212d9990a273ad91e8d9765e1766ef6ecedd3be65375d786a252762ba385", size = 974912, upload-time = "2025-07-19T19:23:27.13Z" }, + { url = "https://files.pythonhosted.org/packages/cd/35/04626c12f9f92a9fc789afc2be32838a5d9b23b6fa8b2ad4a8625638d15b/rignore-0.6.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c6ffa7f2a8894c65aa5dc4e8ac8bbdf39a326c0c6589efd27686cfbb48f0197d", size = 1067281, upload-time = "2025-07-19T19:24:01.016Z" }, + { url = 
"https://files.pythonhosted.org/packages/fe/9c/8f17baf3b984afea151cb9094716f6f1fb8e8737db97fc6eb6d494bd0780/rignore-0.6.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:a63f5720dffc8d8fb0a4d02fafb8370a4031ebf3f99a4e79f334a91e905b7349", size = 1134414, upload-time = "2025-07-19T19:24:13.534Z" }, + { url = "https://files.pythonhosted.org/packages/10/88/ef84ffa916a96437c12cefcc39d474122da9626d75e3a2ebe09ec5d32f1b/rignore-0.6.4-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ce33982da47ac5dc09d19b04fa8d7c9aa6292fc0bd1ecf33076989faa8886094", size = 1109330, upload-time = "2025-07-19T19:24:25.303Z" }, + { url = "https://files.pythonhosted.org/packages/27/43/2ada5a2ec03b82e903610a1c483f516f78e47700ee6db9823f739e08b3af/rignore-0.6.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:d899621867aa266824fbd9150e298f19d25b93903ef0133c09f70c65a3416eca", size = 1120381, upload-time = "2025-07-19T19:24:37.798Z" }, + { url = "https://files.pythonhosted.org/packages/3b/99/e7bcc643085131cb14dbea772def72bf1f6fe9037171ebe177c4f228abc8/rignore-0.6.4-cp313-cp313-win32.whl", hash = "sha256:d0615a6bf4890ec5a90b5fb83666822088fbd4e8fcd740c386fcce51e2f6feea", size = 641761, upload-time = "2025-07-19T19:24:58.096Z" }, + { url = "https://files.pythonhosted.org/packages/d9/25/7798908044f27dea1a8abdc75c14523e33770137651e5f775a15143f4218/rignore-0.6.4-cp313-cp313-win_amd64.whl", hash = "sha256:145177f0e32716dc2f220b07b3cde2385b994b7ea28d5c96fbec32639e9eac6f", size = 719876, upload-time = "2025-07-19T19:24:51.125Z" }, + { url = "https://files.pythonhosted.org/packages/b4/e3/ae1e30b045bf004ad77bbd1679b9afff2be8edb166520921c6f29420516a/rignore-0.6.4-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e55bf8f9bbd186f58ab646b4a08718c77131d28a9004e477612b0cbbd5202db2", size = 891776, upload-time = "2025-07-19T19:22:37.78Z" }, + { url = "https://files.pythonhosted.org/packages/45/a9/1193e3bc23ca0e6eb4f17cf4b99971237f97cfa6f241d98366dff90a6d09/rignore-0.6.4-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2521f7bf3ee1f2ab22a100a3a4eed39a97b025804e5afe4323528e9ce8f084a5", size = 871442, upload-time = "2025-07-19T19:22:50.972Z" }, + { url = "https://files.pythonhosted.org/packages/20/83/4c52ae429a0b2e1ce667e35b480e9a6846f9468c443baeaed5d775af9485/rignore-0.6.4-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0cc35773a8a9c119359ef974d0856988d4601d4daa6f532c05f66b4587cf35bc", size = 1159844, upload-time = "2025-07-19T19:23:02.751Z" }, + { url = "https://files.pythonhosted.org/packages/c1/2f/c740f5751f464c937bfe252dc15a024ae081352cfe80d94aa16d6a617482/rignore-0.6.4-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b665b1ea14457d7b49e834baabc635a3b8c10cfb5cca5c21161fabdbfc2b850e", size = 939456, upload-time = "2025-07-19T19:23:15.72Z" }, + { url = "https://files.pythonhosted.org/packages/fc/dd/68dbb08ac0edabf44dd144ff546a3fb0253c5af708e066847df39fc9188f/rignore-0.6.4-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:c7fd339f344a8548724f289495b835bed7b81174a0bc1c28c6497854bd8855db", size = 1067070, upload-time = "2025-07-19T19:24:02.803Z" }, + { url = "https://files.pythonhosted.org/packages/3b/3a/7e7ea6f0d31d3f5beb0f2cf2c4c362672f5f7f125714458673fc579e2bed/rignore-0.6.4-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:91dc94b1cc5af8d6d25ce6edd29e7351830f19b0a03b75cb3adf1f76d00f3007", size = 1134598, upload-time = "2025-07-19T19:24:15.039Z" }, + { url = 
"https://files.pythonhosted.org/packages/7e/06/1b3307f6437d29bede5a95738aa89e6d910ba68d4054175c9f60d8e2c6b1/rignore-0.6.4-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:4d1918221a249e5342b60fd5fa513bf3d6bf272a8738e66023799f0c82ecd788", size = 1108862, upload-time = "2025-07-19T19:24:26.765Z" }, + { url = "https://files.pythonhosted.org/packages/b0/d5/b37c82519f335f2c472a63fc6215c6f4c51063ecf3166e3acf508011afbd/rignore-0.6.4-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:240777332b859dc89dcba59ab6e3f1e062bc8e862ffa3e5f456e93f7fd5cb415", size = 1120002, upload-time = "2025-07-19T19:24:38.952Z" }, + { url = "https://files.pythonhosted.org/packages/ac/72/2f05559ed5e69bdfdb56ea3982b48e6c0017c59f7241f7e1c5cae992b347/rignore-0.6.4-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:66b0e548753e55cc648f1e7b02d9f74285fe48bb49cec93643d31e563773ab3f", size = 949454, upload-time = "2025-07-19T19:23:38.664Z" }, + { url = "https://files.pythonhosted.org/packages/0b/92/186693c8f838d670510ac1dfb35afbe964320fbffb343ba18f3d24441941/rignore-0.6.4-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6971ac9fdd5a0bd299a181096f091c4f3fd286643adceba98eccc03c688a6637", size = 974663, upload-time = "2025-07-19T19:23:28.24Z" }, + { url = "https://files.pythonhosted.org/packages/85/4d/5a69ea5ae7de78eddf0a0699b6dbd855f87c1436673425461188ea39662f/rignore-0.6.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40f493eef4b191777ba6d16879e3f73836142e04480d2e2f483675d652e6b559", size = 895408, upload-time = "2025-07-19T19:22:42.16Z" }, + { url = "https://files.pythonhosted.org/packages/a3/c3/b6cdf9b676d6774c5de3ca04a5f4dbaffae3bb06bdee395e095be24f098e/rignore-0.6.4-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6790635e4df35333e27cd9e8b31d1d559826cf8b52f2c374b81ab698ac0140cf", size = 873042, upload-time = "2025-07-19T19:22:54.663Z" }, + { url = "https://files.pythonhosted.org/packages/80/25/61182149b2f2ca86c22c6253b361ec0e983e60e913ca75588a7d559b41eb/rignore-0.6.4-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e326dab28787f07c6987c04686d4ad9d4b1e1caca1a15b85d443f91af2e133d2", size = 1162036, upload-time = "2025-07-19T19:23:06.916Z" }, + { url = "https://files.pythonhosted.org/packages/db/44/7fe55c2b7adc8c90dc8709ef2fac25fa526b0c8bfd1090af4e6b33c2e42f/rignore-0.6.4-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bd24cb0f58c6036b0f64ac6fc3f759b7f0de5506fa9f5a65e9d57f8cf44a026d", size = 940381, upload-time = "2025-07-19T19:23:19.364Z" }, + { url = "https://files.pythonhosted.org/packages/3a/a3/8cc0c9a9db980a1589007d0fedcaf41475820e0cd4950a5f6eeb8ebc0ee0/rignore-0.6.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:36cb95b0acae3c88b99a39f4246b395fd983848f3ec85ff26531d638b6584a45", size = 951924, upload-time = "2025-07-19T19:23:42.209Z" }, + { url = "https://files.pythonhosted.org/packages/07/f2/4f2c88307c84801d6c772c01e8d856deaa8e85117180b88aaa0f41d4f86f/rignore-0.6.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:dfc954973429ce545d06163d87a6bae0ccea5703adbc957ee3d332c9592a58eb", size = 976515, upload-time = "2025-07-19T19:23:31.524Z" }, + { url = "https://files.pythonhosted.org/packages/a4/bd/f701ddf897cf5e3f394107e6dad147216b3a0d84e9d53d7a5fed7cc97d26/rignore-0.6.4-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = 
"sha256:cbed37d7c128b58ab9ade80e131efc4a48b6d045cd0bd1d3254cbb6b4a0ad67e", size = 1069896, upload-time = "2025-07-19T19:24:06.24Z" }, + { url = "https://files.pythonhosted.org/packages/00/52/1ae54afad26aafcfee1b44a36b27bb0dd63f1c23081e1599dbf681368925/rignore-0.6.4-pp310-pypy310_pp73-musllinux_1_2_armv7l.whl", hash = "sha256:a0db910ef867d6ca2d52fefd22d8b6b63b20ec61661e2ad57e5c425a4e39431a", size = 1136337, upload-time = "2025-07-19T19:24:18.529Z" }, + { url = "https://files.pythonhosted.org/packages/85/9a/3b74aabb69ed118d0b493afa62d1aacc3bf12b8f11bf682a3c02174c3068/rignore-0.6.4-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:d664443a0a71d0a7d669adf32be59c4249bbff8b2810960f1b91d413ee4cf6b8", size = 1111677, upload-time = "2025-07-19T19:24:30.21Z" }, + { url = "https://files.pythonhosted.org/packages/70/7d/bd0f6c1bc89c80b116b526b77cdd5263c0ad218d5416aebf4ca9cce9ca73/rignore-0.6.4-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:b9f6f1d91429b4a6772152848815cf1459663796b7b899a0e15d9198e32c9371", size = 1122823, upload-time = "2025-07-19T19:24:42.476Z" }, + { url = "https://files.pythonhosted.org/packages/33/a1/daaa2df10dfa6d87c896a5783c8407c284530d5a056307d1f55a8ef0c533/rignore-0.6.4-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9b3da26d5a35ab15525b68d30b7352ad2247321f5201fc7e50ba6d547f78d5ea", size = 895772, upload-time = "2025-07-19T19:22:43.423Z" }, + { url = "https://files.pythonhosted.org/packages/35/e6/65130a50cd3ed11c967034dfd653e160abb7879fb4ee338a1cccaeda7acd/rignore-0.6.4-pp311-pypy311_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:43028f3587558231d9fa68accff58c901dc50fd7bbc5764d3ee3df95290f6ebf", size = 873093, upload-time = "2025-07-19T19:22:55.745Z" }, + { url = "https://files.pythonhosted.org/packages/32/c4/02ead1274ce935c59f2bb3deaaaa339df9194bc40e3c2d8d623e31e47ec4/rignore-0.6.4-pp311-pypy311_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bc56f1fcab7740751b98fead67b98ba64896424d8c834ea22089568db4e36dfa", size = 1162199, upload-time = "2025-07-19T19:23:08.376Z" }, + { url = "https://files.pythonhosted.org/packages/78/0c/94a4edce0e80af69f200cc35d8da4c727c52d28f0c9d819b388849ae8ef6/rignore-0.6.4-pp311-pypy311_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6033f2280898535a5f69935e08830a4e49ff1e29ef2c3f9a2b9ced59de06fdbf", size = 940176, upload-time = "2025-07-19T19:23:20.862Z" }, + { url = "https://files.pythonhosted.org/packages/43/92/21ec579c999a3ed4d1b2a5926a9d0edced7c65d8ac353bc9120d49b05a64/rignore-0.6.4-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f5ac0c4e6a24be88f3821e101ef4665e9e1dc015f9e45109f32fed71dbcdafa", size = 951632, upload-time = "2025-07-19T19:23:43.32Z" }, + { url = "https://files.pythonhosted.org/packages/67/c4/72e7ba244222b9efdeb18f9974d6f1e30cf5a2289e1b482a1e8b3ebee90f/rignore-0.6.4-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8906ac8dd585ece83b1346e0470260a1951058cc0ef5a17542069bde4aa3f42f", size = 976923, upload-time = "2025-07-19T19:23:32.678Z" }, + { url = "https://files.pythonhosted.org/packages/8e/14/e754c12bc953c7fa309687cd30a6ea95e5721168fb0b2a99a34bff24be5c/rignore-0.6.4-pp311-pypy311_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:14d095622969504a2e56f666286202dad583f08d3347b7be2d647ddfd7a9bf47", size = 1069861, upload-time = "2025-07-19T19:24:07.671Z" }, + { url = 
"https://files.pythonhosted.org/packages/a6/24/ba2bdaf04a19b5331c051b9d480e8daca832bed4aeaa156d6d679044c06c/rignore-0.6.4-pp311-pypy311_pp73-musllinux_1_2_armv7l.whl", hash = "sha256:30f3d688df7eb4850318f1b5864d14f2c5fe5dbf3803ed0fc8329d2a7ad560dc", size = 1136368, upload-time = "2025-07-19T19:24:19.68Z" }, + { url = "https://files.pythonhosted.org/packages/83/48/7cf52353299e02aa629150007fa75f4b91d99b4f2fa536f2e24ead810116/rignore-0.6.4-pp311-pypy311_pp73-musllinux_1_2_i686.whl", hash = "sha256:028f62a7b0a6235bb3f03c9e7f342352e7fa4b3f08c761c72f9de8faee40ed9c", size = 1111714, upload-time = "2025-07-19T19:24:31.717Z" }, + { url = "https://files.pythonhosted.org/packages/84/9c/3881ad34f01942af0cf713e25e476bf851e04e389cc3ff146c3b459ab861/rignore-0.6.4-pp311-pypy311_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:7e6c425603db2c147eace4f752ca3cd4551e7568c9d332175d586c68bcbe3d8d", size = 1122433, upload-time = "2025-07-19T19:24:43.973Z" }, +] + [[package]] name = "rpds-py" version = "0.26.0" @@ -3922,6 +4403,19 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/6a/23/8146aad7d88f4fcb3a6218f41a60f6c2d4e3a72de72da1825dc7c8f7877c/semantic_version-2.10.0-py2.py3-none-any.whl", hash = "sha256:de78a3b8e0feda74cabc54aab2da702113e33ac9d9eb9d2389bcf1f58b7d9177", size = 15552, upload-time = "2022-05-26T13:35:21.206Z" }, ] +[[package]] +name = "sentry-sdk" +version = "2.35.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/bd/79/0ecb942f3f1ad26c40c27f81ff82392d85c01d26a45e3c72c2b37807e680/sentry_sdk-2.35.2.tar.gz", hash = "sha256:e9e8f3c795044beb59f2c8f4c6b9b0f9779e5e604099882df05eec525e782cc6", size = 343377, upload-time = "2025-09-01T11:00:58.633Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c0/91/a43308dc82a0e32d80cd0dfdcfca401ecbd0f431ab45f24e48bb97b7800d/sentry_sdk-2.35.2-py2.py3-none-any.whl", hash = "sha256:38c98e3cbb620dd3dd80a8d6e39c753d453dd41f8a9df581b0584c19a52bc926", size = 363975, upload-time = "2025-09-01T11:00:56.574Z" }, +] + [[package]] name = "shellingham" version = "1.5.4" @@ -3970,6 +4464,51 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/d1/c2/fe97d779f3ef3b15f05c94a2f1e3d21732574ed441687474db9d342a7315/soupsieve-2.6-py3-none-any.whl", hash = "sha256:e72c4ff06e4fb6e4b5a9f0f55fe6e81514581fca1515028625d0f299c602ccc9", size = 36186, upload-time = "2024-08-13T13:39:10.986Z" }, ] +[[package]] +name = "sqlalchemy" +version = "2.0.43" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "greenlet", marker = "(python_full_version < '3.14' and platform_machine == 'AMD64') or (python_full_version < '3.14' and platform_machine == 'WIN32') or (python_full_version < '3.14' and platform_machine == 'aarch64') or (python_full_version < '3.14' and platform_machine == 'amd64') or (python_full_version < '3.14' and platform_machine == 'ppc64le') or (python_full_version < '3.14' and platform_machine == 'win32') or (python_full_version < '3.14' and platform_machine == 'x86_64')" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d7/bc/d59b5d97d27229b0e009bd9098cd81af71c2fa5549c580a0a67b9bed0496/sqlalchemy-2.0.43.tar.gz", hash = "sha256:788bfcef6787a7764169cfe9859fe425bf44559619e1d9f56f5bddf2ebf6f417", size = 9762949, upload-time = "2025-08-11T14:24:58.438Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/8f/4e/985f7da36f09592c5ade99321c72c15101d23c0bb7eecfd1daaca5714422/sqlalchemy-2.0.43-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:70322986c0c699dca241418fcf18e637a4369e0ec50540a2b907b184c8bca069", size = 2133162, upload-time = "2025-08-11T15:52:17.854Z" }, + { url = "https://files.pythonhosted.org/packages/37/34/798af8db3cae069461e3bc0898a1610dc469386a97048471d364dc8aae1c/sqlalchemy-2.0.43-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:87accdbba88f33efa7b592dc2e8b2a9c2cdbca73db2f9d5c510790428c09c154", size = 2123082, upload-time = "2025-08-11T15:52:19.181Z" }, + { url = "https://files.pythonhosted.org/packages/fb/0f/79cf4d9dad42f61ec5af1e022c92f66c2d110b93bb1dc9b033892971abfa/sqlalchemy-2.0.43-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c00e7845d2f692ebfc7d5e4ec1a3fd87698e4337d09e58d6749a16aedfdf8612", size = 3208871, upload-time = "2025-08-11T15:50:30.656Z" }, + { url = "https://files.pythonhosted.org/packages/56/b3/59befa58fb0e1a9802c87df02344548e6d007e77e87e6084e2131c29e033/sqlalchemy-2.0.43-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:022e436a1cb39b13756cf93b48ecce7aa95382b9cfacceb80a7d263129dfd019", size = 3209583, upload-time = "2025-08-11T15:57:47.697Z" }, + { url = "https://files.pythonhosted.org/packages/29/d2/124b50c0eb8146e8f0fe16d01026c1a073844f0b454436d8544fe9b33bd7/sqlalchemy-2.0.43-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:c5e73ba0d76eefc82ec0219d2301cb33bfe5205ed7a2602523111e2e56ccbd20", size = 3148177, upload-time = "2025-08-11T15:50:32.078Z" }, + { url = "https://files.pythonhosted.org/packages/83/f5/e369cd46aa84278107624617034a5825fedfc5c958b2836310ced4d2eadf/sqlalchemy-2.0.43-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:9c2e02f06c68092b875d5cbe4824238ab93a7fa35d9c38052c033f7ca45daa18", size = 3172276, upload-time = "2025-08-11T15:57:49.477Z" }, + { url = "https://files.pythonhosted.org/packages/de/2b/4602bf4c3477fa4c837c9774e6dd22e0389fc52310c4c4dfb7e7ba05e90d/sqlalchemy-2.0.43-cp310-cp310-win32.whl", hash = "sha256:e7a903b5b45b0d9fa03ac6a331e1c1d6b7e0ab41c63b6217b3d10357b83c8b00", size = 2101491, upload-time = "2025-08-11T15:54:59.191Z" }, + { url = "https://files.pythonhosted.org/packages/38/2d/bfc6b6143adef553a08295490ddc52607ee435b9c751c714620c1b3dd44d/sqlalchemy-2.0.43-cp310-cp310-win_amd64.whl", hash = "sha256:4bf0edb24c128b7be0c61cd17eef432e4bef507013292415f3fb7023f02b7d4b", size = 2125148, upload-time = "2025-08-11T15:55:00.593Z" }, + { url = "https://files.pythonhosted.org/packages/9d/77/fa7189fe44114658002566c6fe443d3ed0ec1fa782feb72af6ef7fbe98e7/sqlalchemy-2.0.43-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:52d9b73b8fb3e9da34c2b31e6d99d60f5f99fd8c1225c9dad24aeb74a91e1d29", size = 2136472, upload-time = "2025-08-11T15:52:21.789Z" }, + { url = "https://files.pythonhosted.org/packages/99/ea/92ac27f2fbc2e6c1766bb807084ca455265707e041ba027c09c17d697867/sqlalchemy-2.0.43-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f42f23e152e4545157fa367b2435a1ace7571cab016ca26038867eb7df2c3631", size = 2126535, upload-time = "2025-08-11T15:52:23.109Z" }, + { url = "https://files.pythonhosted.org/packages/94/12/536ede80163e295dc57fff69724caf68f91bb40578b6ac6583a293534849/sqlalchemy-2.0.43-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4fb1a8c5438e0c5ea51afe9c6564f951525795cf432bed0c028c1cb081276685", size = 3297521, upload-time = "2025-08-11T15:50:33.536Z" }, + { url = 
"https://files.pythonhosted.org/packages/03/b5/cacf432e6f1fc9d156eca0560ac61d4355d2181e751ba8c0cd9cb232c8c1/sqlalchemy-2.0.43-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db691fa174e8f7036afefe3061bc40ac2b770718be2862bfb03aabae09051aca", size = 3297343, upload-time = "2025-08-11T15:57:51.186Z" }, + { url = "https://files.pythonhosted.org/packages/ca/ba/d4c9b526f18457667de4c024ffbc3a0920c34237b9e9dd298e44c7c00ee5/sqlalchemy-2.0.43-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:fe2b3b4927d0bc03d02ad883f402d5de201dbc8894ac87d2e981e7d87430e60d", size = 3232113, upload-time = "2025-08-11T15:50:34.949Z" }, + { url = "https://files.pythonhosted.org/packages/aa/79/c0121b12b1b114e2c8a10ea297a8a6d5367bc59081b2be896815154b1163/sqlalchemy-2.0.43-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4d3d9b904ad4a6b175a2de0738248822f5ac410f52c2fd389ada0b5262d6a1e3", size = 3258240, upload-time = "2025-08-11T15:57:52.983Z" }, + { url = "https://files.pythonhosted.org/packages/79/99/a2f9be96fb382f3ba027ad42f00dbe30fdb6ba28cda5f11412eee346bec5/sqlalchemy-2.0.43-cp311-cp311-win32.whl", hash = "sha256:5cda6b51faff2639296e276591808c1726c4a77929cfaa0f514f30a5f6156921", size = 2101248, upload-time = "2025-08-11T15:55:01.855Z" }, + { url = "https://files.pythonhosted.org/packages/ee/13/744a32ebe3b4a7a9c7ea4e57babae7aa22070d47acf330d8e5a1359607f1/sqlalchemy-2.0.43-cp311-cp311-win_amd64.whl", hash = "sha256:c5d1730b25d9a07727d20ad74bc1039bbbb0a6ca24e6769861c1aa5bf2c4c4a8", size = 2126109, upload-time = "2025-08-11T15:55:04.092Z" }, + { url = "https://files.pythonhosted.org/packages/61/db/20c78f1081446095450bdc6ee6cc10045fce67a8e003a5876b6eaafc5cc4/sqlalchemy-2.0.43-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:20d81fc2736509d7a2bd33292e489b056cbae543661bb7de7ce9f1c0cd6e7f24", size = 2134891, upload-time = "2025-08-11T15:51:13.019Z" }, + { url = "https://files.pythonhosted.org/packages/45/0a/3d89034ae62b200b4396f0f95319f7d86e9945ee64d2343dcad857150fa2/sqlalchemy-2.0.43-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:25b9fc27650ff5a2c9d490c13c14906b918b0de1f8fcbb4c992712d8caf40e83", size = 2123061, upload-time = "2025-08-11T15:51:14.319Z" }, + { url = "https://files.pythonhosted.org/packages/cb/10/2711f7ff1805919221ad5bee205971254845c069ee2e7036847103ca1e4c/sqlalchemy-2.0.43-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6772e3ca8a43a65a37c88e2f3e2adfd511b0b1da37ef11ed78dea16aeae85bd9", size = 3320384, upload-time = "2025-08-11T15:52:35.088Z" }, + { url = "https://files.pythonhosted.org/packages/6e/0e/3d155e264d2ed2778484006ef04647bc63f55b3e2d12e6a4f787747b5900/sqlalchemy-2.0.43-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a113da919c25f7f641ffbd07fbc9077abd4b3b75097c888ab818f962707eb48", size = 3329648, upload-time = "2025-08-11T15:56:34.153Z" }, + { url = "https://files.pythonhosted.org/packages/5b/81/635100fb19725c931622c673900da5efb1595c96ff5b441e07e3dd61f2be/sqlalchemy-2.0.43-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:4286a1139f14b7d70141c67a8ae1582fc2b69105f1b09d9573494eb4bb4b2687", size = 3258030, upload-time = "2025-08-11T15:52:36.933Z" }, + { url = "https://files.pythonhosted.org/packages/0c/ed/a99302716d62b4965fded12520c1cbb189f99b17a6d8cf77611d21442e47/sqlalchemy-2.0.43-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:529064085be2f4d8a6e5fab12d36ad44f1909a18848fcfbdb59cc6d4bbe48efe", size = 3294469, upload-time = "2025-08-11T15:56:35.553Z" }, + { url = 
"https://files.pythonhosted.org/packages/5d/a2/3a11b06715149bf3310b55a98b5c1e84a42cfb949a7b800bc75cb4e33abc/sqlalchemy-2.0.43-cp312-cp312-win32.whl", hash = "sha256:b535d35dea8bbb8195e7e2b40059e2253acb2b7579b73c1b432a35363694641d", size = 2098906, upload-time = "2025-08-11T15:55:00.645Z" }, + { url = "https://files.pythonhosted.org/packages/bc/09/405c915a974814b90aa591280623adc6ad6b322f61fd5cff80aeaef216c9/sqlalchemy-2.0.43-cp312-cp312-win_amd64.whl", hash = "sha256:1c6d85327ca688dbae7e2b06d7d84cfe4f3fffa5b5f9e21bb6ce9d0e1a0e0e0a", size = 2126260, upload-time = "2025-08-11T15:55:02.965Z" }, + { url = "https://files.pythonhosted.org/packages/41/1c/a7260bd47a6fae7e03768bf66451437b36451143f36b285522b865987ced/sqlalchemy-2.0.43-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e7c08f57f75a2bb62d7ee80a89686a5e5669f199235c6d1dac75cd59374091c3", size = 2130598, upload-time = "2025-08-11T15:51:15.903Z" }, + { url = "https://files.pythonhosted.org/packages/8e/84/8a337454e82388283830b3586ad7847aa9c76fdd4f1df09cdd1f94591873/sqlalchemy-2.0.43-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:14111d22c29efad445cd5021a70a8b42f7d9152d8ba7f73304c4d82460946aaa", size = 2118415, upload-time = "2025-08-11T15:51:17.256Z" }, + { url = "https://files.pythonhosted.org/packages/cf/ff/22ab2328148492c4d71899d62a0e65370ea66c877aea017a244a35733685/sqlalchemy-2.0.43-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21b27b56eb2f82653168cefe6cb8e970cdaf4f3a6cb2c5e3c3c1cf3158968ff9", size = 3248707, upload-time = "2025-08-11T15:52:38.444Z" }, + { url = "https://files.pythonhosted.org/packages/dc/29/11ae2c2b981de60187f7cbc84277d9d21f101093d1b2e945c63774477aba/sqlalchemy-2.0.43-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c5a9da957c56e43d72126a3f5845603da00e0293720b03bde0aacffcf2dc04f", size = 3253602, upload-time = "2025-08-11T15:56:37.348Z" }, + { url = "https://files.pythonhosted.org/packages/b8/61/987b6c23b12c56d2be451bc70900f67dd7d989d52b1ee64f239cf19aec69/sqlalchemy-2.0.43-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5d79f9fdc9584ec83d1b3c75e9f4595c49017f5594fee1a2217117647225d738", size = 3183248, upload-time = "2025-08-11T15:52:39.865Z" }, + { url = "https://files.pythonhosted.org/packages/86/85/29d216002d4593c2ce1c0ec2cec46dda77bfbcd221e24caa6e85eff53d89/sqlalchemy-2.0.43-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9df7126fd9db49e3a5a3999442cc67e9ee8971f3cb9644250107d7296cb2a164", size = 3219363, upload-time = "2025-08-11T15:56:39.11Z" }, + { url = "https://files.pythonhosted.org/packages/b6/e4/bd78b01919c524f190b4905d47e7630bf4130b9f48fd971ae1c6225b6f6a/sqlalchemy-2.0.43-cp313-cp313-win32.whl", hash = "sha256:7f1ac7828857fcedb0361b48b9ac4821469f7694089d15550bbcf9ab22564a1d", size = 2096718, upload-time = "2025-08-11T15:55:05.349Z" }, + { url = "https://files.pythonhosted.org/packages/ac/a5/ca2f07a2a201f9497de1928f787926613db6307992fe5cda97624eb07c2f/sqlalchemy-2.0.43-cp313-cp313-win_amd64.whl", hash = "sha256:971ba928fcde01869361f504fcff3b7143b47d30de188b11c6357c0505824197", size = 2123200, upload-time = "2025-08-11T15:55:07.932Z" }, + { url = "https://files.pythonhosted.org/packages/b8/d9/13bdde6521f322861fab67473cec4b1cc8999f3871953531cf61945fad92/sqlalchemy-2.0.43-py3-none-any.whl", hash = "sha256:1681c21dd2ccee222c2fe0bef671d1aef7c504087c9c4e800371cfcc8ac966fc", size = 1924759, upload-time = "2025-08-11T15:39:53.024Z" }, +] + [[package]] name = "sse-starlette" version = "2.2.1" @@ -4326,6 +4865,49 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/61/14/33a3a1352cfa71812a3a21e8c9bfb83f60b0011f5e36f2b1399d51928209/uvicorn-0.34.0-py3-none-any.whl", hash = "sha256:023dc038422502fa28a09c7a30bf2b6991512da7dcdb8fd35fe57cfc154126f4", size = 62315, upload-time = "2024-12-15T13:33:27.467Z" }, ] +[package.optional-dependencies] +standard = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "httptools" }, + { name = "python-dotenv" }, + { name = "pyyaml" }, + { name = "uvloop", marker = "platform_python_implementation != 'PyPy' and sys_platform != 'cygwin' and sys_platform != 'win32'" }, + { name = "watchfiles" }, + { name = "websockets" }, +] + +[[package]] +name = "uvloop" +version = "0.21.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/af/c0/854216d09d33c543f12a44b393c402e89a920b1a0a7dc634c42de91b9cf6/uvloop-0.21.0.tar.gz", hash = "sha256:3bf12b0fda68447806a7ad847bfa591613177275d35b6724b1ee573faa3704e3", size = 2492741, upload-time = "2024-10-14T23:38:35.489Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3d/76/44a55515e8c9505aa1420aebacf4dd82552e5e15691654894e90d0bd051a/uvloop-0.21.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ec7e6b09a6fdded42403182ab6b832b71f4edaf7f37a9a0e371a01db5f0cb45f", size = 1442019, upload-time = "2024-10-14T23:37:20.068Z" }, + { url = "https://files.pythonhosted.org/packages/35/5a/62d5800358a78cc25c8a6c72ef8b10851bdb8cca22e14d9c74167b7f86da/uvloop-0.21.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:196274f2adb9689a289ad7d65700d37df0c0930fd8e4e743fa4834e850d7719d", size = 801898, upload-time = "2024-10-14T23:37:22.663Z" }, + { url = "https://files.pythonhosted.org/packages/f3/96/63695e0ebd7da6c741ccd4489b5947394435e198a1382349c17b1146bb97/uvloop-0.21.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f38b2e090258d051d68a5b14d1da7203a3c3677321cf32a95a6f4db4dd8b6f26", size = 3827735, upload-time = "2024-10-14T23:37:25.129Z" }, + { url = "https://files.pythonhosted.org/packages/61/e0/f0f8ec84979068ffae132c58c79af1de9cceeb664076beea86d941af1a30/uvloop-0.21.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87c43e0f13022b998eb9b973b5e97200c8b90823454d4bc06ab33829e09fb9bb", size = 3825126, upload-time = "2024-10-14T23:37:27.59Z" }, + { url = "https://files.pythonhosted.org/packages/bf/fe/5e94a977d058a54a19df95f12f7161ab6e323ad49f4dabc28822eb2df7ea/uvloop-0.21.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:10d66943def5fcb6e7b37310eb6b5639fd2ccbc38df1177262b0640c3ca68c1f", size = 3705789, upload-time = "2024-10-14T23:37:29.385Z" }, + { url = "https://files.pythonhosted.org/packages/26/dd/c7179618e46092a77e036650c1f056041a028a35c4d76945089fcfc38af8/uvloop-0.21.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:67dd654b8ca23aed0a8e99010b4c34aca62f4b7fce88f39d452ed7622c94845c", size = 3800523, upload-time = "2024-10-14T23:37:32.048Z" }, + { url = "https://files.pythonhosted.org/packages/57/a7/4cf0334105c1160dd6819f3297f8700fda7fc30ab4f61fbf3e725acbc7cc/uvloop-0.21.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:c0f3fa6200b3108919f8bdabb9a7f87f20e7097ea3c543754cabc7d717d95cf8", size = 1447410, upload-time = "2024-10-14T23:37:33.612Z" }, + { url = "https://files.pythonhosted.org/packages/8c/7c/1517b0bbc2dbe784b563d6ab54f2ef88c890fdad77232c98ed490aa07132/uvloop-0.21.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0878c2640cf341b269b7e128b1a5fed890adc4455513ca710d77d5e93aa6d6a0", 
size = 805476, upload-time = "2024-10-14T23:37:36.11Z" }, + { url = "https://files.pythonhosted.org/packages/ee/ea/0bfae1aceb82a503f358d8d2fa126ca9dbdb2ba9c7866974faec1cb5875c/uvloop-0.21.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b9fb766bb57b7388745d8bcc53a359b116b8a04c83a2288069809d2b3466c37e", size = 3960855, upload-time = "2024-10-14T23:37:37.683Z" }, + { url = "https://files.pythonhosted.org/packages/8a/ca/0864176a649838b838f36d44bf31c451597ab363b60dc9e09c9630619d41/uvloop-0.21.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8a375441696e2eda1c43c44ccb66e04d61ceeffcd76e4929e527b7fa401b90fb", size = 3973185, upload-time = "2024-10-14T23:37:40.226Z" }, + { url = "https://files.pythonhosted.org/packages/30/bf/08ad29979a936d63787ba47a540de2132169f140d54aa25bc8c3df3e67f4/uvloop-0.21.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:baa0e6291d91649c6ba4ed4b2f982f9fa165b5bbd50a9e203c416a2797bab3c6", size = 3820256, upload-time = "2024-10-14T23:37:42.839Z" }, + { url = "https://files.pythonhosted.org/packages/da/e2/5cf6ef37e3daf2f06e651aae5ea108ad30df3cb269102678b61ebf1fdf42/uvloop-0.21.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4509360fcc4c3bd2c70d87573ad472de40c13387f5fda8cb58350a1d7475e58d", size = 3937323, upload-time = "2024-10-14T23:37:45.337Z" }, + { url = "https://files.pythonhosted.org/packages/8c/4c/03f93178830dc7ce8b4cdee1d36770d2f5ebb6f3d37d354e061eefc73545/uvloop-0.21.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:359ec2c888397b9e592a889c4d72ba3d6befba8b2bb01743f72fffbde663b59c", size = 1471284, upload-time = "2024-10-14T23:37:47.833Z" }, + { url = "https://files.pythonhosted.org/packages/43/3e/92c03f4d05e50f09251bd8b2b2b584a2a7f8fe600008bcc4523337abe676/uvloop-0.21.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f7089d2dc73179ce5ac255bdf37c236a9f914b264825fdaacaded6990a7fb4c2", size = 821349, upload-time = "2024-10-14T23:37:50.149Z" }, + { url = "https://files.pythonhosted.org/packages/a6/ef/a02ec5da49909dbbfb1fd205a9a1ac4e88ea92dcae885e7c961847cd51e2/uvloop-0.21.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:baa4dcdbd9ae0a372f2167a207cd98c9f9a1ea1188a8a526431eef2f8116cc8d", size = 4580089, upload-time = "2024-10-14T23:37:51.703Z" }, + { url = "https://files.pythonhosted.org/packages/06/a7/b4e6a19925c900be9f98bec0a75e6e8f79bb53bdeb891916609ab3958967/uvloop-0.21.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:86975dca1c773a2c9864f4c52c5a55631038e387b47eaf56210f873887b6c8dc", size = 4693770, upload-time = "2024-10-14T23:37:54.122Z" }, + { url = "https://files.pythonhosted.org/packages/ce/0c/f07435a18a4b94ce6bd0677d8319cd3de61f3a9eeb1e5f8ab4e8b5edfcb3/uvloop-0.21.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:461d9ae6660fbbafedd07559c6a2e57cd553b34b0065b6550685f6653a98c1cb", size = 4451321, upload-time = "2024-10-14T23:37:55.766Z" }, + { url = "https://files.pythonhosted.org/packages/8f/eb/f7032be105877bcf924709c97b1bf3b90255b4ec251f9340cef912559f28/uvloop-0.21.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:183aef7c8730e54c9a3ee3227464daed66e37ba13040bb3f350bc2ddc040f22f", size = 4659022, upload-time = "2024-10-14T23:37:58.195Z" }, + { url = "https://files.pythonhosted.org/packages/3f/8d/2cbef610ca21539f0f36e2b34da49302029e7c9f09acef0b1c3b5839412b/uvloop-0.21.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:bfd55dfcc2a512316e65f16e503e9e450cab148ef11df4e4e679b5e8253a5281", size = 1468123, 
upload-time = "2024-10-14T23:38:00.688Z" }, + { url = "https://files.pythonhosted.org/packages/93/0d/b0038d5a469f94ed8f2b2fce2434a18396d8fbfb5da85a0a9781ebbdec14/uvloop-0.21.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:787ae31ad8a2856fc4e7c095341cccc7209bd657d0e71ad0dc2ea83c4a6fa8af", size = 819325, upload-time = "2024-10-14T23:38:02.309Z" }, + { url = "https://files.pythonhosted.org/packages/50/94/0a687f39e78c4c1e02e3272c6b2ccdb4e0085fda3b8352fecd0410ccf915/uvloop-0.21.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5ee4d4ef48036ff6e5cfffb09dd192c7a5027153948d85b8da7ff705065bacc6", size = 4582806, upload-time = "2024-10-14T23:38:04.711Z" }, + { url = "https://files.pythonhosted.org/packages/d2/19/f5b78616566ea68edd42aacaf645adbf71fbd83fc52281fba555dc27e3f1/uvloop-0.21.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f3df876acd7ec037a3d005b3ab85a7e4110422e4d9c1571d4fc89b0fc41b6816", size = 4701068, upload-time = "2024-10-14T23:38:06.385Z" }, + { url = "https://files.pythonhosted.org/packages/47/57/66f061ee118f413cd22a656de622925097170b9380b30091b78ea0c6ea75/uvloop-0.21.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bd53ecc9a0f3d87ab847503c2e1552b690362e005ab54e8a48ba97da3924c0dc", size = 4454428, upload-time = "2024-10-14T23:38:08.416Z" }, + { url = "https://files.pythonhosted.org/packages/63/9a/0962b05b308494e3202d3f794a6e85abe471fe3cafdbcf95c2e8c713aabd/uvloop-0.21.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a5c39f217ab3c663dc699c04cbd50c13813e31d917642d459fdcec07555cc553", size = 4660018, upload-time = "2024-10-14T23:38:10.888Z" }, +] + [[package]] name = "vcrpy" version = "5.1.0"