Core Steps
- **Initialize `ToolManager` and Register Tools**: As with Langchain, start by setting up your `ToolManager` and registering all necessary tools from your Applications or custom functions (see the sketches after this list).
- **Convert Tools to OpenAI Format**: The `ToolManager` can list tools in the JSON schema format expected by the OpenAI API. This uses the `convert_tool_to_openai_tool` adapter, which formats each MCP `Tool`'s name, description, and parameters schema into the required structure.
- **Make the API Call to OpenAI**: When making a `client.chat.completions.create` call, include the `tools` parameter with the JSON obtained in the previous step, and set `tool_choice` (e.g., to `"auto"` to let the model decide, or to a specific tool).
- **Check for Tool Calls in the Response**: The `response_message` from OpenAI might contain `tool_calls`. If it does, the model wants to invoke one or more of your tools.
- **Execute Tool Calls using `ToolManager`**: For each tool call requested by the model:
  - Get the function name and arguments.
  - Use your `ToolManager` instance's `call_tool` method to execute the actual MCP tool.
  - Append the results of the tool calls as new messages with `role: "tool"`.
- **Send Results Back to OpenAI (Optional)**: After executing all tool calls, you can send the updated list of messages (including the tool responses) back to the OpenAI model to get a final summarized response from the AI.
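Putting the first two steps together, here is a minimal sketch. The import paths and the registration/listing method names (`register_tool`, `list_tools`) are assumptions about your setup and should be adapted to your project; `convert_tool_to_openai_tool` is the adapter named above.

```python
from openai import OpenAI

# The import paths below are placeholders -- use your project's actual modules.
from your_mcp_package import ToolManager                           # hypothetical path
from your_mcp_package.adapters import convert_tool_to_openai_tool  # hypothetical path

manager = ToolManager()
manager.register_tool(my_custom_tool)  # assumed method name; my_custom_tool stands in for an Application or custom tool

# Convert each registered MCP Tool into the JSON schema structure
# (name, description, parameters) that the OpenAI API expects.
openai_tools = [convert_tool_to_openai_tool(tool) for tool in manager.list_tools()]
```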
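Continuing from that sketch, steps 3 through 5 might look like the following. The OpenAI client calls (`chat.completions.create`, `tool_calls`, `tool_call.function.arguments`) follow the current OpenAI Python SDK; the exact `call_tool` signature is an assumption, and it may be asynchronous in your setup.

```python
import json

client = OpenAI()
messages = [{"role": "user", "content": "List the open issues in my repository."}]

# Step 3: include the converted tool schemas and let the model decide whether to call one.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=openai_tools,
    tool_choice="auto",
)
response_message = response.choices[0].message

# Step 4: the model may have requested one or more tool invocations.
if response_message.tool_calls:
    messages.append(response_message)  # keep the assistant turn that requested the calls
    for tool_call in response_message.tool_calls:
        # Step 5: extract the function name and arguments, run the real MCP tool
        # through ToolManager, and record the result as a role="tool" message.
        name = tool_call.function.name
        arguments = json.loads(tool_call.function.arguments)
        result = manager.call_tool(name, arguments)  # assumed signature; may need await
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": str(result),
        })
```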
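Finally, for the optional last step, the accumulated messages (including the tool results) can be sent back so the model produces a natural-language answer that incorporates them:

```python
# Step 6 (optional): let the model summarize the tool results.
final_response = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
)
print(final_response.choices[0].message.content)
```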
The `examples/github.py` script provides a practical demonstration of this flow.