# Request Flow in AgentDock Core
This document outlines the typical sequence of events when a request is processed by AgentDock Core.
## Request Flow Diagram
## Detailed Steps
- **Request Initiation**: A client sends a request (e.g., a user message, potentially including a `sessionId`) to an API endpoint.
- **API Endpoint Handling**: Extracts the `sessionId` and message payload, determining the target `agentId`.
- **Agent Instantiation (`AgentNode`)**: Creates an instance of `AgentNode`, passing the agent configuration, API keys, and core managers.
- **State Retrieval / Initialization**: Loads or creates orchestration state for the session.
- **Orchestration & Tool Filtering**: Evaluates conditions to determine the `activeStep` and filters the available tools.
- **LLM Interaction (`CoreLLM.streamText`)**: Prepares the prompt, sets callbacks, and calls the LLM with the filtered tools.
- **Response Streaming & Tool Handling**:
  - The LLM streams its response (text chunks or tool-call requests)
  - Tool calls are executed and their results returned to the LLM
  - Text chunks are streamed back to the client
- **State Updates**: Updates token usage, tool history, sequence index, and timestamps.
- **Response Completion**: The stream ends and the response is finalized.
- **Cleanup**: The `AgentNode` instance is discarded while session state persists.
For implementation details, see `agentdock-core/src/nodes/agent-node.ts`.
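The lifecycle above can be sketched as a single turn: load the session's orchestration state, produce a reply, record usage, and persist the state while the agent instance itself is thrown away. This is an illustrative sketch, not the agentdock-core API — `runAgentTurn`, `getOrCreateState`, the `OrchestrationState` fields, and the in-memory `Map` standing in for persistent session storage are all assumptions, and the echo reply is a placeholder for the real `CoreLLM.streamText` call.

```typescript
// Hypothetical sketch of the request lifecycle described above.
// OrchestrationState fields mirror the "State Updates" step; the
// names and storage are illustrative, not the actual agentdock-core API.

interface OrchestrationState {
  activeStep: string;
  tokenUsage: number;
  toolHistory: string[];
  sequenceIndex: number;
  updatedAt: number;
}

// In-memory stand-in for persistent session storage (assumption).
const sessions = new Map<string, OrchestrationState>();

// State Retrieval / Initialization: load or create state for the session.
function getOrCreateState(sessionId: string): OrchestrationState {
  let state = sessions.get(sessionId);
  if (!state) {
    state = {
      activeStep: "start",
      tokenUsage: 0,
      toolHistory: [],
      sequenceIndex: 0,
      updatedAt: Date.now(),
    };
    sessions.set(sessionId, state);
  }
  return state;
}

// One turn: load state, produce a reply, update state, return the reply.
function runAgentTurn(sessionId: string, message: string): string {
  const state = getOrCreateState(sessionId);
  const reply = `echo: ${message}`; // placeholder for CoreLLM.streamText
  state.tokenUsage += message.length + reply.length; // rough token proxy
  state.sequenceIndex += 1;
  state.updatedAt = Date.now();
  return reply; // Response Completion; state persists after this turn
}
```

Note that the state outlives the turn: a second call with the same `sessionId` sees the updated `sequenceIndex`, which is the "Cleanup" property described above (the agent instance is discarded, the session state is not).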
## Enhanced Streaming Flow
AgentDock extends standard streaming with `AgentDockStreamResult`:
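As a rough illustration of what such a result can carry, here is a hedged sketch of the shape. Only the name `AgentDockStreamResult` comes from the source; the specific fields (`textStream`, `usage`, `toolsUsed`, `error`) and the `makeResult` helper are assumptions chosen to reflect the capabilities listed below, and the real interface in agentdock-core may differ.

```typescript
// Illustrative shape only: actual AgentDockStreamResult fields may differ.

interface TokenUsage {
  promptTokens: number;
  completionTokens: number;
}

interface AgentDockStreamResult {
  textStream: AsyncIterable<string>; // streamed text chunks
  usage: Promise<TokenUsage>; // resolves when the stream ends
  toolsUsed: string[]; // tools invoked during the turn
  error?: { provider: string; message: string }; // enriched provider errors
}

// Tiny helper (hypothetical) that builds a result from static chunks,
// so the shape can be exercised without a live LLM.
function makeResult(chunks: string[], tools: string[]): AgentDockStreamResult {
  async function* gen() {
    for (const c of chunks) yield c;
  }
  return {
    textStream: gen(),
    usage: Promise.resolve({ promptTokens: 0, completionTokens: chunks.length }),
    toolsUsed: tools,
  };
}
```

A consumer would iterate `textStream` with `for await...of` to forward chunks to the client, then await `usage` once the stream completes.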
This enhanced flow provides:
- **Automatic State Management**: Updates token usage and tracks the tools used
- **Error Handling**: Propagates errors from LLM providers with richer context
- **Tool Orchestration**: Manages tool execution and updates session state
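The tool-orchestration part of this flow can be sketched as a simple loop over stream events: text chunks are forwarded to the client, and tool-call events are executed so their results can be fed back to the model. The `StreamEvent` shape, `Tool` type, and `runTools` function are illustrative assumptions, not the agentdock-core API.

```typescript
// Hedged sketch of the tool-handling loop, not actual agentdock-core code.

type StreamEvent =
  | { type: "text"; chunk: string }
  | { type: "tool_call"; name: string; args: Record<string, unknown> };

type Tool = (args: Record<string, unknown>) => string;

// Walk the event stream: accumulate text for the client, execute tool
// calls, and collect their results (which would be returned to the LLM).
function runTools(
  events: StreamEvent[],
  tools: Record<string, Tool>
): { text: string; toolResults: string[] } {
  let text = "";
  const toolResults: string[] = [];
  for (const ev of events) {
    if (ev.type === "text") {
      text += ev.chunk; // streamed back to the client
    } else {
      const tool = tools[ev.name];
      toolResults.push(tool ? tool(ev.args) : `unknown tool: ${ev.name}`);
    }
  }
  return { text, toolResults };
}
```

In the real flow the events arrive asynchronously and tool results are sent back mid-stream, but the control structure (dispatch on event type, execute, continue streaming) is the same.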
For details on streaming implementation, see Response Streaming.