Working with your local tools

Last updated 4 months ago

Toolhouse gives you the flexibility to use your local tools alongside any tools you install from Toolhouse itself. When you run your local tools with Toolhouse, you can skip the boilerplate logic required to execute tools and convert their results into valid completion objects.

To execute your local tools, follow these two steps:

Decorate your existing functions

In your code, add the register_local_tool() decorator to the function that returns the response to the LLM. Specify the name of the tool as the decorator's parameter.

  • Your function's signature should accept all the arguments the LLM will pass, both required and optional. Toolhouse may throw an exception if the function signature does not match the required LLM arguments.

  • Your function should always return a string. Toolhouse will throw an exception if the function does not return a string or string-like value.
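As a sketch of the string-return rule, a tool that produces structured data can serialize it before returning, so the function always hands the LLM a string (the function and its fields below are hypothetical, not part of the Toolhouse API):

```python
import json

# Hypothetical tool function; the name and fields are illustrative only.
def get_store_hours(day: str) -> str:
    # Build a structured result...
    result = {"day": day, "open": "09:00", "close": "17:00"}
    # ...then serialize it, so the function always returns a string.
    return json.dumps(result)
```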

Ensure that the arguments you need from the LLM are mapped in the function signature:

import requests
from toolhouse import Toolhouse
from anthropic import Anthropic

client = Anthropic(api_key="YOUR_API_KEY")
MODEL = "claude-3-5-sonnet-20240620"

th = Toolhouse(provider="anthropic")

# The parameter must match the name of the tool in your tool definition
@th.register_local_tool("get_current_weather")
def get_weather_forecast(
   # These arguments must match the name of the parameters 
   # in your tool definition
   latitude: float, 
   longitude: float) -> str:
   
    url = f"https://api.open-meteo.com/v1/forecast?latitude={latitude}&longitude={longitude}&hourly=temperature_2m&forecast_days=1"
    
    response = requests.get(url)
    
    if response.status_code == 200:
        return response.text
    else:
        return f"Error: {response.status_code} - {response.text}"
import requests
from toolhouse import Toolhouse
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o"

# If you don't specify an API key, Toolhouse will expect you
# to specify one in the TOOLHOUSE_API_KEY env variable.
th = Toolhouse()

# The parameter must match the name of the tool in your tool definition.
# See the "name" field in my_local_tools.
# The function name in your code can differ from that value.
@th.register_local_tool("get_current_weather")
def get_weather_forecast(
   # Must match the name of the parameters in your tool definition
   latitude: float, 
   longitude: float) -> str:
   
    url = f"https://api.open-meteo.com/v1/forecast?latitude={latitude}&longitude={longitude}&hourly=temperature_2m&forecast_days=1"
    
    response = requests.get(url)
    
    if response.status_code == 200:
        return response.text
    else:
        return f"Error: {response.status_code} - {response.text}"
import requests
from toolhouse import Toolhouse

th = Toolhouse()

# The parameter must match the name of the tool in your tool definition
@th.register_local_tool("get_current_weather")
def get_weather_forecast(
   # Must match the name of the parameters in your tool definition
   latitude: float, 
   longitude: float) -> str:
   
    url = f"https://api.open-meteo.com/v1/forecast?latitude={latitude}&longitude={longitude}&hourly=temperature_2m&forecast_days=1"
    
    response = requests.get(url)
    
    if response.status_code == 200:
        return response.text
    else:
        return f"Error: {response.status_code} - {response.text}"

my_local_tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Retrieves the current weather for the location you specify.",
            "parameters": {
                "type": "object",
                "properties": {
                    "latitude": {
                        "type": "number",
                        "description": "The latitude of the location.",
                    },
                    "longitude": {
                        "type": "number",
                        "description": "The longitude of the location.",
                    },
                    "required": [
                        "latitude",
                        "longitude",
                    ],
                },
            },
        },
    }
]

Pass your local tools

Define your local tools in a variable, for example my_local_tools.

Append your tools to the th.get_tools() call, then use th.run_tools() to run both your local and cloud tools.

my_local_tools = [
    {
            "name": "get_current_weather",
            "description": "Retrieves the current weather for the location you specify.",
            "input_schema": {
                "type": "object",
                "properties": {
                    "latitude": {
                        "type": "number",
                        "description": "The latitude of the location.",
                    },
                    "longitude": {
                        "type": "number",
                        "description": "The longitude of the location.",
                    },
                },
                "required": [
                    "latitude",
                    "longitude",
                ]
            }
    }
]

messages = [
  {
    "role": "user",
    "content": "What's the weather in Oakland, CA?",
  }
]

response = client.messages.create(
  model=MODEL,
  messages=messages,
  max_tokens=1000,
  tools=th.get_tools() + my_local_tools,
)

# Runs your local tool, gets the result, 
# and appends it to the context
tool_run = th.run_tools(response)
messages = messages + tool_run

response = client.messages.create(
  model=MODEL,
  messages=messages,
  max_tokens=1000,
  tools=th.get_tools(),
)

print(response.content[0].text)
my_local_tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Retrieves the current weather for the latitude and longitude belonging to a place you know. When you specify the latitude and longitude, IT IS CRUCIAL THAT YOU DO NOT SPECIFY ANYTHING ELSE. Before you use this tool, ensure that you only send the latitude and longitude of the place you know. If you are sending any other information, think again and remove any information that is not latitude and longitude.",
            "parameters": {
                "type": "object",
                "properties": {
                    "latitude": {
                        "type": "number",
                        "description": "The latitude of the location.",
                    },
                    "longitude": {
                        "type": "number",
                        "description": "The longitude of the location.",
                    }
                },
                "required": [
                    "latitude",
                    "longitude",
                ],
                "additionalProperties": False
                },
            },
        }
    ]

messages = [
  {
    "role": "user",
    "content": "What's the weather in Oakland, CA?",
  }
]

response = client.chat.completions.create(
  model=MODEL,
  messages=messages,
  tools=th.get_tools() + my_local_tools,
)

# Runs your local tool, gets the result, 
# and appends it to the context
tool_run = th.run_tools(response)
messages += tool_run

response = client.chat.completions.create(
  model=MODEL,
  messages=messages,
  tools=th.get_tools() + my_local_tools,
)

print(response.choices[0].message.content)
import os
from anthropic import AnthropicBedrock

client = AnthropicBedrock(
  aws_access_key=os.environ.get("AWS_ACCESS_KEY"),
  aws_secret_key=os.environ.get("AWS_SECRET_KEY"),
  aws_session_token=os.environ.get("AWS_SESSION_TOKEN"),
  aws_region=os.environ.get("AWS_REGION"),
)

MODEL = "anthropic.claude-3-5-sonnet-20240620-v1:0"

messages = [
  {
    "role": "user",
    "content": "What's the weather in Oakland, CA?",
  }
]

response = client.messages.create(
  model=MODEL,
  messages=messages,
  max_tokens=1000,
  tools=th.get_tools() + my_local_tools,
)

# Runs your local tool, gets the result, 
# and appends it to the context
tool_run = th.run_tools(response)
messages += tool_run

response = client.messages.create(
  model=MODEL,
  messages=messages,
  max_tokens=1000,
  tools=th.get_tools(),
)

print(response.content[0].text)
import os
from openai import AzureOpenAI

client = AzureOpenAI(
  api_key=os.environ.get("AZURE_OPENAI_API_KEY"),  
  api_version="2024-02-01",
  azure_endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT")  
)

DEPLOYMENT_ID = "REPLACE_WITH_YOUR_DEPLOYMENT_NAME"

messages = [
  {
    "role": "user",
    "content": "What's the weather in Oakland, CA?",
  }
]

response = client.chat.completions.create(
  model=DEPLOYMENT_ID,
  messages=messages,
  # Passes your Toolhouse tools and your local tools
  tools=th.get_tools() + my_local_tools,
)

# Runs your local tool, gets the result, 
# and appends it to the context
tool_run = th.run_tools(response)
messages += tool_run

response = client.chat.completions.create(
  model=DEPLOYMENT_ID,
  messages=messages,
  tools=th.get_tools(),
)

print(response.choices[0].message.content)
import os
from groq import Groq

client = Groq(api_key=os.environ.get('GROQ_API_KEY'))
MODEL = 'llama3.1-70b-8192'

messages = [
  {
    "role": "user",
    "content": "What's the weather in Oakland, CA?",
  }
]

response = client.chat.completions.create(
  model=MODEL,
  messages=messages,
  tools=th.get_tools() + my_local_tools,
)

# Runs your local tool, gets the result, 
# and appends it to the context
tool_run = th.run_tools(response)
messages += tool_run

response = client.chat.completions.create(
  model=MODEL,
  messages=messages,
  tools=th.get_tools(),
)

print(response.choices[0].message.content)
import os
from together import Together

client = Together(api_key=os.environ.get('TOGETHER_API_KEY'))
MODEL = 'mistralai/Mixtral-8x7B-Instruct-v0.1'

messages = [
  {
    "role": "user",
    "content": "What's the weather in Oakland, CA?",
  }
]

response = client.chat.completions.create(
  model=MODEL,
  messages=messages,
  # Passes your Toolhouse tools and your local tools
  tools=th.get_tools() + my_local_tools,
)

# Runs your local tool, gets the result, 
# and appends it to the context
tool_run = th.run_tools(response)
messages += tool_run

response = client.chat.completions.create(
  model=MODEL,
  messages=messages,
  tools=th.get_tools(),
)

print(response.choices[0].message.content)

Toolhouse always executes tools in the order chosen by the LLM. Even if your LLM supports parallel tool calls, tools are executed one at a time, in the order the LLM specified.
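Conceptually, ordered execution works like a dispatch loop over the tool calls the LLM requested. The sketch below is a simplified illustration of that idea, not Toolhouse's actual implementation; the tool registry and call format are hypothetical:

```python
# Hypothetical registry mapping tool names to local functions.
local_tools = {
    "get_current_weather": lambda latitude, longitude: f"{latitude},{longitude}",
}

def run_tool_calls(tool_calls: list[dict]) -> list[dict]:
    """Execute each requested tool call, preserving the LLM's ordering."""
    results = []
    for call in tool_calls:  # iteration order == the order the LLM chose
        fn = local_tools[call["name"]]
        output = fn(**call["arguments"])
        results.append({"role": "tool", "name": call["name"], "content": output})
    return results
```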

Overriding cloud tools

Your local tools always take priority over cloud tools. In other words, if a local tool has the same name as a cloud tool, Toolhouse executes your local tool.

You can use this override to bypass cloud execution when you need to. This is useful when you want to cache or memoize a tool's results, or when you want to change a tool's behavior for a specific completion call.

Treat tool definitions as prompts

Describe your tools like you would describe the behavior of your assistant. Ensure each tool description contains precise, detailed instructions. This is particularly important for models with fewer parameters, which tend to be less accurate in generating a valid tool call for your user prompts.
