Using Vercel AI

You can use Toolhouse with the Vercel AI SDK. This allows you to extend the functionality of Vercel AI by bringing any of your Toolhouse tools into your apps.

To use Toolhouse with Vercel AI, simply instantiate the Toolhouse SDK by specifying Vercel AI as your provider:

import { Toolhouse } from '@toolhouseai/sdk';
import { CoreTool } from 'ai';

const toolhouse = new Toolhouse({
    apiKey: process.env.TOOLHOUSE_API_KEY,
    provider: "vercel",
});

const tools = await toolhouse.getTools() as Record<string, CoreTool<any, any>>;

This configures Toolhouse to return tools that can be used in your AI functions. From there, simply pass your tools to generateText or streamText. You can use any LLM supported by the Vercel AI SDK.

Here's the complete example:

import { Toolhouse } from "@toolhouseai/sdk";
import { anthropic } from "@ai-sdk/anthropic";
import { CoreMessage, CoreTool, generateText } from "ai";
import * as dotenv from "dotenv";

dotenv.config()

async function main() {
  const toolhouse = new Toolhouse({
    apiKey: process.env.TOOLHOUSE_API_KEY,
    provider: "vercel",
  });
  
  const tools = await toolhouse.getTools() as Record<string, CoreTool<any, any>>;
  const history = [{ 
    role: "user", 
    content: "Get the contents of https://toolhouse.ai and summarize its key value propositions in a few bullet points"
  }] as CoreMessage[];
  
  const { text, toolResults } = await generateText({
    model: anthropic("claude-3-5-sonnet-latest"),
    tools,
    messages: history,
  });
  
  console.log(`text: "${text}"`);
  console.log("toolResults");
  console.log(JSON.stringify(toolResults));
}

main();
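The same tools also work with streamText if you prefer to stream the model's response as it is generated. Here's a minimal sketch, assuming the same setup and environment variables as the example above (the exact shape of the streamText call may vary slightly between versions of the ai package):

```typescript
import { Toolhouse } from "@toolhouseai/sdk";
import { anthropic } from "@ai-sdk/anthropic";
import { CoreMessage, CoreTool, streamText } from "ai";
import * as dotenv from "dotenv";

dotenv.config();

async function main() {
  const toolhouse = new Toolhouse({
    apiKey: process.env.TOOLHOUSE_API_KEY,
    provider: "vercel",
  });

  const tools = await toolhouse.getTools() as Record<string, CoreTool<any, any>>;
  const messages = [{
    role: "user",
    content: "Get the contents of https://toolhouse.ai and summarize it",
  }] as CoreMessage[];

  // Stream the response instead of waiting for the full completion
  const result = streamText({
    model: anthropic("claude-3-5-sonnet-latest"),
    tools,
    messages,
  });

  // Print each chunk of text as it arrives
  for await (const textPart of result.textStream) {
    process.stdout.write(textPart);
  }
}

main();
```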

You can also pass a Bundle to the getTools() call. This way, you can give each AI function a specific collection of tools, which enhances their precision and helps you optimize their responses:

const tools = await toolhouse.getTools("web") as Record<string, CoreTool<any, any>>;
