Function calling

A capability that lets an LLM-powered AI agent call out to external tools (booking system, CRM, calculator, search) instead of just generating text. The bridge between the model and the real world.

What it means

Function calling (also called tool use) lets an AI agent take actions, not just write replies. When the agent decides it needs to do something, such as checking available appointment slots, looking up an order, or sending a calendar invite, it emits a structured function call that your system executes. The function returns data, the agent reads that data, and the conversation continues.
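The loop above can be sketched in a few lines. This is a minimal illustration, not any provider's exact wire format: the shape of `tool_call` and the function name `check_slots` are assumptions made for the example.

```python
import json

def check_slots(date: str) -> list[str]:
    # Stand-in for a real booking-system query (hypothetical).
    return ["9:30", "11:00"]

# Registry mapping the names the model may call to real application code.
TOOLS = {"check_slots": check_slots}

def execute_tool_call(tool_call: dict) -> str:
    func = TOOLS[tool_call["name"]]   # look up the function the model chose
    args = tool_call["arguments"]     # structured arguments the model supplied
    result = func(**args)             # run it on the application side
    return json.dumps(result)         # serialize so the result can go back to the model

# Instead of plain text, the model emits something like this:
call = {"name": "check_slots", "arguments": {"date": "2024-06-01"}}
print(execute_tool_call(call))        # the agent reads this and keeps talking
```

The key point: the model never executes anything itself. It only emits the structured call; your code runs it and feeds the result back.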

Modern LLMs from OpenAI, Anthropic, and Google all support function calling natively. The integration work sits on the application side: define the functions and their parameters, wire them up to real systems, and trust the model to call the right one at the right time.
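Defining a function typically means describing its name, purpose, and parameters with JSON Schema. The exact field names vary by provider, so treat this as a representative sketch rather than any one API's format; the `check_slots` tool is invented for illustration.

```python
# A tool definition in the common JSON Schema style.
# The descriptions are what the model reads to decide when and how to call it.
check_slots_tool = {
    "name": "check_slots",
    "description": "List open appointment slots for a given date and service.",
    "parameters": {
        "type": "object",
        "properties": {
            "date": {"type": "string", "description": "ISO date, e.g. 2024-06-01"},
            "service": {"type": "string", "description": "e.g. 'haircut'"},
        },
        "required": ["date", "service"],
    },
}
```

The description strings matter as much as the code behind the function: they are the only thing steering the model toward the right call at the right time.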

Why it matters

Without function calling, an agent can only generate text. With it, the agent becomes a worker: it can book appointments in real time, look up real customer records, calculate real shipping costs, qualify leads against real criteria, and send real WhatsApp messages.

This is what separates a 'chatbot' from an 'AI agent'. The agent has hands.

Example

A salon's AI agent is asked, 'Do you have an open slot Saturday morning for a haircut?' The agent calls the booking-system function, receives back the actual slot list, and replies, 'Yes, I have 9:30am and 11am open. Would either work?' The customer picks one; the agent calls the booking function again to reserve it. No human is involved.
