Function Calling

An LLM capability that allows the model to generate structured function calls with arguments, enabling it to interact with external tools and APIs.

Function calling (also called tool use) is a capability of modern LLMs to emit structured, machine-readable function calls, typically as JSON, instead of plain-text responses. It bridges the gap between natural language understanding and programmatic action.

When you define a set of functions with their parameters and descriptions, the LLM can decide when to call a function, select the appropriate one, and generate the correct arguments. The application then executes the function and returns the result to the model.
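As a minimal sketch of what such a definition looks like, here is a tool described in the JSON-Schema style that most providers use. The exact field names vary by provider; the layout below follows the common OpenAI-style format, and `get_weather_tool` is a name chosen for illustration.

```python
# A hypothetical tool definition in the JSON-Schema style used by most
# providers. The model reads the name, description, and parameter schema
# to decide when to call the function and how to fill in its arguments.
get_weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a given city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {
                "type": "string",
                "description": "City name, e.g. 'Toronto'",
            },
        },
        "required": ["city"],
    },
}
```

The description fields matter: the model relies on them to pick the right function and produce valid arguments, so they should be written as carefully as documentation for a human caller.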

For example, given a "get_weather" function, a user asking "What's the weather in Toronto?" would trigger the model to call get_weather(city="Toronto") rather than generating a text response. The actual weather data is fetched and returned to the model for a grounded response.
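The execute-and-return step can be sketched as a small dispatch loop. This is an illustrative outline, not any provider's SDK: `get_weather` is a stubbed local implementation, and the shape of `model_call` mirrors the common pattern of a function name plus JSON-encoded arguments.

```python
import json

# Hypothetical local implementation; a real app would call a weather API.
def get_weather(city: str) -> dict:
    return {"city": city, "temp_c": 21, "conditions": "sunny"}

# Registry mapping tool names the model may call to real functions.
TOOLS = {"get_weather": get_weather}

def handle_tool_call(call: dict) -> str:
    """Execute a model-generated function call and return a JSON string
    to send back to the model for its grounded final response."""
    fn = TOOLS[call["name"]]
    args = json.loads(call["arguments"])  # model emits arguments as JSON text
    return json.dumps(fn(**args))

# A call as the model might emit it for "What's the weather in Toronto?"
model_call = {"name": "get_weather", "arguments": '{"city": "Toronto"}'}
result = handle_tool_call(model_call)
```

Note that the model never runs code itself: it only produces the `model_call` structure, and the application decides whether and how to execute it.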

Function calling is the foundation of AI agents and tool-using AI systems. It's supported by OpenAI, Anthropic (Claude), Google (Gemini), and other providers, each with slightly different API formats.
