Google Gemini Function Calling Intermediate
Google's Gemini models support function calling through function declarations. Gemini also offers a unique automatic function calling mode where the SDK can execute functions for you, reducing boilerplate code.
Defining Function Declarations
```python
import google.generativeai as genai

# Define your actual Python function
def get_weather(location: str, unit: str = "celsius") -> dict:
    """Get current weather for a location.

    Args:
        location: City name, e.g. 'Tokyo'
        unit: Temperature unit (celsius or fahrenheit)
    """
    # Your implementation here
    return {"temp": 22, "unit": unit, "condition": "Sunny"}

# Gemini can infer the schema from Python type hints!
model = genai.GenerativeModel(
    model_name="gemini-2.0-flash",
    tools=[get_weather],
)
```
Google's Advantage: Gemini can automatically infer function schemas from Python type hints and docstrings, so you do not need to hand-write JSON schemas as you do with OpenAI or Anthropic.
Manual Function Calling
```python
chat = model.start_chat()
response = chat.send_message("What's the weather in Tokyo?")

# Check for function calls
for part in response.parts:
    if part.function_call:
        name = part.function_call.name
        args = dict(part.function_call.args)
        print(f"Function: {name}, Args: {args}")

        # Execute the function and send the result back
        result = get_weather(**args)
        response = chat.send_message(
            genai.protos.Content(parts=[
                genai.protos.Part(function_response=genai.protos.FunctionResponse(
                    name=name,
                    response={"result": result},
                ))
            ])
        )

print(response.text)
```
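When you declare several tools, the `name` field on the function call tells you which local function to run. One way to handle that lookup is a dispatch table; the `TOOL_REGISTRY` dict and `dispatch` helper below are hypothetical names, not part of the SDK:

```python
# Hypothetical dispatch table mapping tool names to local callables.
# In a real app, each entry is one of the functions passed to the model.
def get_weather(location: str, unit: str = "celsius") -> dict:
    return {"temp": 22, "unit": unit, "condition": "Sunny"}

TOOL_REGISTRY = {"get_weather": get_weather}

def dispatch(name: str, args: dict) -> dict:
    """Look up a declared tool by name and invoke it with the model's args."""
    if name not in TOOL_REGISTRY:
        raise ValueError(f"Model requested unknown tool: {name}")
    return TOOL_REGISTRY[name](**args)

print(dispatch("get_weather", {"location": "Tokyo"}))
# → {'temp': 22, 'unit': 'celsius', 'condition': 'Sunny'}
```

Rejecting unknown tool names up front also guards against the model hallucinating a function you never declared.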
Automatic Function Calling
Gemini's SDK can automatically execute functions for you:
```python
# Enable automatic function calling
model = genai.GenerativeModel(
    model_name="gemini-2.0-flash",
    tools=[get_weather],
)

# The SDK handles the entire tool-use loop automatically!
chat = model.start_chat(enable_automatic_function_calling=True)
response = chat.send_message("What's the weather in Tokyo?")

# response.text already contains the final answer
print(response.text)
# e.g. "The weather in Tokyo is currently 22°C and sunny!"
```
Function Calling Modes
| Mode | Behavior |
|---|---|
| `AUTO` | Model decides whether to call functions (default) |
| `ANY` | Model must call at least one function |
| `NONE` | Model will not call any functions |
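These modes are passed via `tool_config`; with `ANY` you can also restrict which declared functions the model may call. The `make_tool_config` helper below is a hypothetical convenience, shown only to illustrate the shape of the config dict:

```python
from typing import Optional

def make_tool_config(mode: str, allowed: Optional[list] = None) -> dict:
    """Build a tool_config dict for the given function-calling mode.

    mode: one of "AUTO", "ANY", "NONE".
    allowed: with "ANY", optionally restrict which functions may be called.
    """
    if mode not in {"AUTO", "ANY", "NONE"}:
        raise ValueError(f"Unknown mode: {mode}")
    config = {"mode": mode}
    if allowed is not None:
        config["allowed_function_names"] = allowed
    return {"function_calling_config": config}

# Force the model to call get_weather and nothing else:
print(make_tool_config("ANY", ["get_weather"]))
# → {'function_calling_config': {'mode': 'ANY', 'allowed_function_names': ['get_weather']}}
```

The resulting dict would be passed as `tool_config=` to `generate_content` or `send_message`.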
Provider Comparison Summary
| Feature | OpenAI | Anthropic | Gemini |
|---|---|---|---|
| Schema definition | Manual JSON Schema | Manual JSON Schema | Auto from type hints |
| Parallel calls | Yes | Yes | Yes |
| Auto-execution | No | No | Yes (SDK feature) |
| Streaming support | Yes | Yes | Yes |
Security Note: Be careful with automatic function calling in production. It removes your ability to validate arguments before execution. For production use, prefer manual execution with proper input validation.
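With manual execution, you can sanity-check the model's arguments before running anything. A minimal sketch of such a check for the `get_weather` tool, using a hypothetical `validate_weather_args` helper (real code might use pydantic or jsonschema instead):

```python
def validate_weather_args(args: dict) -> dict:
    """Validate model-supplied arguments before executing get_weather.

    A minimal allow-list check; raises ValueError on bad input.
    """
    allowed_units = {"celsius", "fahrenheit"}
    location = args.get("location")
    if not isinstance(location, str) or not location.strip():
        raise ValueError("location must be a non-empty string")
    unit = args.get("unit", "celsius")
    if unit not in allowed_units:
        raise ValueError(f"unit must be one of {allowed_units}")
    return {"location": location.strip(), "unit": unit}

print(validate_weather_args({"location": " Tokyo ", "unit": "celsius"}))
# → {'location': 'Tokyo', 'unit': 'celsius'}
```

Only the validated, normalized arguments are passed on to the real function; anything else is rejected before execution.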