You can use Toolhouse to equip your LlamaIndex agents with any Toolhouse tools. This allows you to use Toolhouse either alone or alongside any of your existing LlamaIndex tools across any agent type.
To use Toolhouse with LlamaIndex, simply instantiate the Toolhouse SDK by specifying LlamaIndex as your provider:
```python
from llama_index.core.agent import ReActAgent
from llama_index.llms.groq import Groq
from toolhouse import Toolhouse, Provider
import dotenv

dotenv.load_dotenv()

th = Toolhouse(provider=Provider.LLAMAINDEX)

# Here we use a model hosted on Groq, but you can use any LLMs
# supported by LlamaIndex
llm = Groq(model="llama-3.2-11b-vision-preview")

agent = ReActAgent.from_tools(th.get_tools(), llm=llm, verbose=True)

response = agent.chat(
    "Search the internet for 3 medium-sized AI companies and for each one, "
    "get the contents of their webpage. When done, give me a short executive "
    "summary in bullet points."
)
print(str(response))
```
You can also pass a Bundle to the th.get_tools() call. This way, you can give each agent a specific collection of tools, which improves its precision and helps you optimize its responses:
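For example, here is a minimal sketch that assumes you have created a Bundle named web-research in your Toolhouse dashboard (the name is illustrative) and that get_tools accepts the Bundle name as shown:

```python
from llama_index.core.agent import ReActAgent
from llama_index.llms.groq import Groq
from toolhouse import Toolhouse, Provider

th = Toolhouse(provider=Provider.LLAMAINDEX)
llm = Groq(model="llama-3.2-11b-vision-preview")

# "web-research" is a hypothetical Bundle name; create your own Bundle
# in the Toolhouse dashboard and pass its name here.
agent = ReActAgent.from_tools(
    th.get_tools(bundle="web-research"),
    llm=llm,
    verbose=True,
)
```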
If you want to build and execute a tool locally, you won't need to use Toolhouse's local tools. Instead, define a LlamaIndex tool and pass it alongside your Toolhouse tools:
```python
from llama_index.core.tools import FunctionTool
from llama_index.core.agent import ReActAgent
from llama_index.llms.groq import Groq
from toolhouse import Toolhouse, Provider

th = Toolhouse(provider=Provider.LLAMAINDEX)
llm = Groq(model="llama-3.2-11b-vision-preview")

def get_weather(location: str) -> str:
    """Useful for getting the weather for a given location."""
    # Implement your weather lookup here
    pass

llamaindex_tool = FunctionTool.from_defaults(get_weather)

th_tools = th.get_tools()
# Wrap the local tool in a list so it can be concatenated with the Toolhouse tools
agent = ReActAgent.from_tools(th_tools + [llamaindex_tool], llm=llm, verbose=True)
```
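As with the first example, you can then chat with the agent, and the ReAct agent will choose between your local get_weather tool and the Toolhouse tools as needed. The prompt below is purely illustrative:

```python
# The agent picks the appropriate tool (local or Toolhouse) for the request
response = agent.chat("What's the weather like in San Francisco right now?")
print(str(response))
```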