Using LlamaIndex

You can use Toolhouse to equip your LlamaIndex agents with any Toolhouse tools. This allows you to use Toolhouse either alone or alongside any of your existing LlamaIndex tools across any agent type.

To use Toolhouse with LlamaIndex, simply instantiate the Toolhouse SDK by specifying LlamaIndex as your provider:

from llama_index.core.agent import ReActAgent
from llama_index.llms.groq import Groq
from toolhouse import Toolhouse, Provider
import dotenv

dotenv.load_dotenv()

th = Toolhouse(provider=Provider.LLAMAINDEX)

# Here we use a model hosted on Groq, but you can use any LLM
# supported by LlamaIndex
llm = Groq(model="llama-3.2-11b-vision-preview")
agent = ReActAgent.from_tools(th.get_tools(), llm=llm, verbose=True)

response = agent.chat(
    "Search the internet for 3 medium-sized AI companies and for each one, "
    "get the contents of their webpage. When done, give me a short executive "
    "summary in bullet points."
)
print(str(response))

You can also pass a Bundle to the th.get_tools() call. This way, you can give each agent a specific collection of tools, which improves its precision and helps you optimize its responses:

tools = th.get_tools(bundle="search_and_get_page_contents")
agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)
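
You can then run the bundle-scoped agent exactly as before. The prompt below is only an illustration:

response = agent.chat(
    "Search the internet for 3 medium-sized AI companies and get the "
    "contents of their webpages. Summarize each in one bullet point."
)
print(str(response))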

Using local tools

If you want to build and execute a tool locally, you won't need Toolhouse's local tools feature. Instead, define a LlamaIndex tool and pass it alongside your Toolhouse tools:

from llama_index.core.tools import FunctionTool
from llama_index.core.agent import ReActAgent
from llama_index.llms.groq import Groq
from toolhouse import Toolhouse, Provider

th = Toolhouse(provider=Provider.LLAMAINDEX)

def get_weather(location: str) -> str:
    """Useful for getting the weather for a given location."""
    # Replace this stub with your own implementation.
    return f"It is sunny in {location}."


llamaindex_tool = FunctionTool.from_defaults(get_weather)
th_tools = th.get_tools()

# As before, you can use any LLM supported by LlamaIndex.
llm = Groq(model="llama-3.2-11b-vision-preview")

# Toolhouse tools are returned as a list, so append your local tool to it.
agent = ReActAgent.from_tools(th_tools + [llamaindex_tool], llm=llm, verbose=True)
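
The combined agent can now call both your local get_weather tool and your Toolhouse tools. The prompt below is just an illustration:

response = agent.chat("What's the weather in San Francisco right now?")
print(str(response))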
