Using Vercel AI

You can pass Toolhouse tools into Vercel AI. This allows you to extend the functionality of Vercel AI by bringing any of your Toolhouse tools into your apps.

To use Toolhouse with Vercel AI, simply instantiate the Toolhouse SDK by specifying Vercel AI as your provider:

import { Toolhouse } from '@toolhouseai/sdk';
import { CoreTool } from 'ai';

const toolhouse = new Toolhouse({
    apiKey: process.env.TOOLHOUSE_API_KEY,
    provider: "vercel",
});

const tools = await toolhouse.getTools() as Record<string, CoreTool<any, any>>;

This configures Toolhouse to return tools that can be used in your AI functions. From there, simply pass your tools to generateText or streamText. You can use any LLM supported by the Vercel AI SDK.

Here's the complete example:

import { Toolhouse } from "@toolhouseai/sdk";
import { anthropic } from "@ai-sdk/anthropic";
import { CoreMessage, CoreTool, generateText } from "ai";
import * as dotenv from "dotenv";

dotenv.config()

async function main() {
  const toolhouse = new Toolhouse({
    apiKey: process.env.TOOLHOUSE_API_KEY,
    provider: "vercel",
  });
  
  const tools = await toolhouse.getTools() as Record<string, CoreTool<any, any>>;
  const history = [{ 
    role: "user", 
    content: "Get the contents of https://toolhouse.ai and summarize its key value propositions in a few bullet points"
  }] as CoreMessage[];
  
  const { text, toolResults } = await generateText({
    model: anthropic("claude-3-5-sonnet-latest"),
    tools,
    messages: history,
  });
  
  console.log(`text: "${text}"`);
  console.log("toolResults");
  console.log(JSON.stringify(toolResults));
}

main();
You can also pass a Bundle name to the getTools() call. This way, you can give each AI function a specific collection of tools, which enhances their precision and helps you optimize their responses:

const tools = await toolhouse.getTools("web") as Record<string, CoreTool<any, any>>;

Last updated 6 months ago