Latest Ollama Python Library Update

The latest update to the Ollama Python library introduces several powerful features:

  • Support for Python Functions as Tools: You can now pass Python functions directly to the library as tools.
  • Full Typing Support: Enhanced type annotations improve usability and reliability.
  • New Examples: Fresh examples are available to help you get started more quickly.

Getting Started

Begin by installing or updating the Ollama Python library:

pip install -U ollama

Passing Python Functions as Tools

1. Define a Python Function

Create a standard Python function with type annotations for its parameters and return value. A Google-style docstring is optional but recommended, since the library parses it to fill in the description fields of the generated schema.

def add_two_numbers(a: int, b: int) -> int:
    """
    Add two numbers.

    Args:
        a: The first integer number.
        b: The second integer number.

    Returns:
        int: The sum of the two numbers.
    """
    return a + b
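Because it is a plain Python function, add_two_numbers can be exercised directly before it is ever handed to a model:

```python
def add_two_numbers(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

# Call it like any other function to sanity-check the behavior.
print(add_two_numbers(10, 10))  # 20
```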

2. Pass the Function as a Tool to Ollama

Use the tools parameter to pass the function reference:

import ollama

response = ollama.chat(
    'llama3.1',
    messages=[{'role': 'user', 'content': 'What is 10 + 10?'}],
    tools=[add_two_numbers],  # Reference to the defined function
)

3. Invoke the Function from the Model’s Response

Retrieve the function call from the model’s output and execute it:

available_functions = {
    'add_two_numbers': add_two_numbers,
}

for tool in response.message.tool_calls or []:
    function_to_call = available_functions.get(tool.function.name)
    if function_to_call:
        print('Function output:', function_to_call(**tool.function.arguments))
    else:
        print('Function not found:', tool.function.name)
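The dispatch loop above contains nothing Ollama-specific: it maps a tool name to a callable and unpacks the arguments dict as keyword arguments. A minimal sketch of the same pattern, using a hand-built stand-in for a model tool call (the SimpleNamespace objects here only mimic the shape of the response, not the library's actual classes):

```python
from types import SimpleNamespace

def add_two_numbers(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

available_functions = {'add_two_numbers': add_two_numbers}

# Hand-built stand-in for response.message.tool_calls.
tool_calls = [
    SimpleNamespace(function=SimpleNamespace(
        name='add_two_numbers',
        arguments={'a': 10, 'b': 10},
    )),
]

for tool in tool_calls:
    function_to_call = available_functions.get(tool.function.name)
    if function_to_call:
        # ** unpacks the arguments dict into keyword arguments.
        print('Function output:', function_to_call(**tool.function.arguments))
    else:
        print('Function not found:', tool.function.name)
```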

Using Existing Functions as Tools

You can also pass functions from external libraries or SDKs. For instance, here’s how to use the requests library to fetch content from a website:

import ollama
import requests

available_functions = {
    'request': requests.request,
}

response = ollama.chat(
    'llama3.1',
    messages=[{'role': 'user', 'content': 'Get the ollama.com webpage?'}],
    tools=[requests.request], 
)

for tool in response.message.tool_calls or []:
    function_to_call = available_functions.get(tool.function.name)
    if function_to_call == requests.request:
        # Perform the HTTP request
        resp = function_to_call(
            method=tool.function.arguments.get('method'),
            url=tool.function.arguments.get('url'),
        )
        print(resp.text)
    else:
        print('Function not found:', tool.function.name)

How It Works: Automatic JSON Schema Generation

The Ollama library leverages Pydantic and docstring parsing to automatically generate JSON schemas for functions. For example, the add_two_numbers function generates the following schema:

{
    "type": "function",
    "function": {
        "name": "add_two_numbers",
        "description": "Add two numbers",
        "parameters": {
            "type": "object",
            "required": ["a", "b"],
            "properties": {
                "a": {
                    "type": "integer",
                    "description": "The first integer number"
                },
                "b": {
                    "type": "integer",
                    "description": "The second integer number"
                }
            }
        }
    }
}

This schema, which previously had to be created manually, is now handled automatically by the library.
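For illustration only (the library's internals differ and also draw on Pydantic and docstring parsing), the core idea can be sketched with the standard inspect and typing modules: map each annotated parameter to a JSON-schema type, and mark parameters without defaults as required.

```python
import inspect
from typing import get_type_hints

# Illustrative mapping from a few Python types to JSON-schema type names.
TYPE_MAP = {int: 'integer', float: 'number', str: 'string', bool: 'boolean'}

def function_to_schema(fn) -> dict:
    """Build a JSON-schema-style tool description from a function signature."""
    sig = inspect.signature(fn)
    hints = get_type_hints(fn)
    doc = inspect.getdoc(fn)
    properties = {}
    required = []
    for name, param in sig.parameters.items():
        properties[name] = {'type': TYPE_MAP.get(hints.get(name), 'string')}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value -> required
    return {
        'type': 'function',
        'function': {
            'name': fn.__name__,
            'description': doc.splitlines()[0] if doc else '',
            'parameters': {
                'type': 'object',
                'required': required,
                'properties': properties,
            },
        },
    }

def add_two_numbers(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

print(function_to_schema(add_two_numbers))
```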


Additional Updates in Version 0.4

  • Updated Examples: The GitHub repository now includes refreshed and detailed examples.
  • Comprehensive Typing Support: Enhanced type annotations enable direct attribute access on responses (such as response.message.tool_calls) without sacrificing compatibility with existing code.