Module 4: Tools, Plugins, and Integrations
This module explores how to extend the capabilities of your AI agents by integrating external tools, plugins, and services. We will cover building custom tools, handling external APIs, and leveraging existing powerful integrations to enhance agent functionality.
1. Building Custom Tools for Agents
CrewAI allows agents to interact with the outside world and perform specific tasks by utilizing tools. You can define custom tools to encapsulate specific functionalities.
Defining a Custom Tool
A custom tool in CrewAI is essentially a Python function that performs a specific action. You then wrap this function with the `@tool` decorator from `crewai_tools`.
Example: A simple tool to add two numbers.
```python
from crewai_tools import tool

@tool
def add_numbers(num1: int, num2: int) -> int:
    """Adds two numbers and returns the result."""
    return num1 + num2
```
Using Custom Tools in an Agent
Once defined, you can assign these tools to an agent when creating its profile.
```python
from crewai import Agent, Task, Crew
from your_tools_module import add_numbers  # Assuming your tool is in 'your_tools_module.py'

# Define your agent
calculator_agent = Agent(
    role="Calculator Assistant",
    goal="Perform simple arithmetic calculations",
    backstory="An AI specialized in math.",
    tools=[add_numbers],  # Assign the custom tool
    verbose=True
)

# Define a task that uses the tool
calculation_task = Task(
    description="Calculate the sum of 5 and 7.",
    expected_output="The number 12",
    agent=calculator_agent
)

# Create and run the crew
crew = Crew(
    agents=[calculator_agent],
    tasks=[calculation_task],
    verbose=2
)
crew.kickoff()
```
2. Handling External APIs and Browser-Based Actions
Agents can interact with external systems and web content through API calls and browser automation.
Interacting with External APIs
This involves making HTTP requests to external services. For instance, fetching data from a weather API or a stock market API. CrewAI agents can be equipped with tools that perform these API calls.
Example: A tool to fetch weather information (requires the `requests` library).
```python
import requests
from crewai_tools import tool

@tool
def get_weather(location: str) -> str:
    """Fetches the current weather for a given location."""
    api_key = "YOUR_WEATHER_API_KEY"  # Replace with your actual API key
    base_url = f"http://api.openweathermap.org/data/2.5/weather?q={location}&appid={api_key}&units=metric"
    try:
        response = requests.get(base_url, timeout=10)
        response.raise_for_status()  # Raise an exception for bad status codes
        data = response.json()
        temp = data['main']['temp']
        description = data['weather'][0]['description']
        return f"The weather in {location} is {temp}°C with {description}."
    except requests.exceptions.RequestException as e:
        return f"Error fetching weather for {location}: {e}"
```
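In practice, API-backed tools also need to tolerate transient network failures rather than fail on the first error. Below is a minimal, library-agnostic sketch of retrying with exponential backoff; `call_with_retries` and `flaky_fetch` are illustrative helpers (a stand-in for any HTTP call), not part of CrewAI.

```python
import time

def call_with_retries(fetch, retries=3, base_delay=1.0):
    """Call `fetch` (a zero-argument callable), retrying transient
    failures with exponential backoff between attempts."""
    for attempt in range(retries):
        try:
            return fetch()
        except ConnectionError:
            if attempt == retries - 1:
                raise  # Out of retries: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))

# A flaky stand-in for an HTTP call: fails twice, then succeeds
attempts = {"count": 0}
def flaky_fetch():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConnectionError("transient network error")
    return "ok"

result = call_with_retries(flaky_fetch, retries=3, base_delay=0.01)
```

The same wrapper could be used inside a tool body such as `get_weather` above, so that one dropped connection does not surface as a hard failure to the agent.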
Browser-Based Actions
For tasks requiring interaction with web pages, tools can be built to browse, scrape, and interact with web content. CrewAI integrates well with tools that can perform these actions.
- **Web Scraping:** Extracting specific information from web pages.
- **Web Navigation:** Visiting URLs, clicking links, filling forms.

CrewAI's `ScrapeWebsiteTool` (from `crewai_tools`) is a ready-made tool for these browser-based actions.

Example: Using `ScrapeWebsiteTool` to get website content.
```python
from crewai import Agent, Task
from crewai_tools import ScrapeWebsiteTool

# Initialize the tool
web_scraper = ScrapeWebsiteTool()

# Assign to an agent
researcher_agent = Agent(
    role="Web Researcher",
    goal="Find information on the internet",
    backstory="An expert researcher adept at navigating the web.",
    tools=[web_scraper],
    verbose=True
)

# Define a task
research_task = Task(
    description="Scrape the content of 'https://en.wikipedia.org/wiki/Artificial_intelligence' and summarize the main points about its history.",
    expected_output="A summary of the history of Artificial Intelligence from the Wikipedia page.",
    agent=researcher_agent
)
```
3. Integration with LangChain, OpenAI, HuggingFace APIs
CrewAI leverages powerful underlying technologies and APIs to provide its capabilities.
LangChain Integration
CrewAI is built on top of LangChain, providing a robust framework for building LLM applications. This means you can leverage LangChain's extensive features and integrations within your CrewAI workflows.
OpenAI API
CrewAI primarily uses OpenAI's language models (e.g., GPT-4, GPT-3.5 Turbo) for agent reasoning and task execution. You'll need to configure your OpenAI API key for CrewAI to function.
```python
import os
from crewai import Agent

# Set your OpenAI API key as an environment variable
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"

# Example of an agent that might be powered by OpenAI models
data_analyst_agent = Agent(
    role="Data Analyst",
    goal="Analyze financial reports and identify trends",
    backstory="An experienced data analyst with a keen eye for detail.",
    verbose=True,
    # By default, CrewAI uses OpenAI models if no LLM is specified.
    # You can explicitly specify an LLM from LangChain if needed.
)
```
Hugging Face APIs
For models not available through OpenAI or for specific fine-tuned models, Hugging Face provides a vast repository. You can integrate Hugging Face models with CrewAI, often by using them with LangChain's integrations.
Example: Using a Hugging Face model via LangChain.
```python
import os
from crewai import Agent, Task, Crew
from langchain_community.llms import HuggingFaceHub

# Set your Hugging Face API token as an environment variable
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "YOUR_HUGGINGFACE_HUB_API_TOKEN"

# Initialize an LLM from Hugging Face
hf_llm = HuggingFaceHub(
    repo_id="google/flan-t5-large",
    model_kwargs={"temperature": 0.7, "max_length": 512}
)

# Define an agent using the Hugging Face LLM
hugging_agent = Agent(
    role="Text Summarizer",
    goal="Summarize provided text accurately",
    backstory="A helpful AI assistant focused on condensing information.",
    llm=hf_llm,  # Specify the Hugging Face LLM
    verbose=True
)

# Define a task
summarization_task = Task(
    description="Summarize the following article: [Paste Article Content Here]",
    expected_output="A concise summary of the article.",
    agent=hugging_agent
)

# Create and run the crew
crew = Crew(
    agents=[hugging_agent],
    tasks=[summarization_task],
    verbose=2
)
crew.kickoff()
```
4. Tool Abstraction in CrewAI
CrewAI provides a layer of abstraction over various tools, making it easier to use them consistently. This abstraction simplifies how agents interact with different functionalities.
Examples of Abstracted Tools:

- **Calculator:** Performing mathematical operations.
- **Web Search:** Querying search engines to find information online.
- **File Reader:** Reading content from local files.

These tools are often pre-built and available through libraries like `crewai_tools`.

Using the `@tool` decorator: As shown in Section 1, the `@tool` decorator is the primary way to define and abstract custom tools.

Pre-built Tools: `crewai_tools` offers several out-of-the-box tools for common tasks.
```python
from crewai import Agent, Task
from crewai_tools import FileReadTool

# Example using FileReadTool to read local files
file_read_tool = FileReadTool()

reader_agent = Agent(
    role="Document Reader",
    goal="Read and process documents",
    backstory="An AI that specializes in extracting information from files.",
    tools=[file_read_tool],
    verbose=True
)

read_task = Task(
    description="Read the content of 'my_document.txt' and extract the key dates mentioned.",
    expected_output="A list of key dates found in the document.",
    agent=reader_agent
)
```
5. Using Vector Databases (FAISS, Pinecone) in a Crew Setup
Vector databases are crucial for enabling agents to perform semantic search and retrieve relevant information from large datasets. CrewAI can integrate with vector databases to enhance agents' knowledge base.
Semantic Search with Vector DBs
When an agent needs to access information beyond its immediate prompt, it can query a vector database. The query is converted into a vector embedding, and the database returns the most semantically similar content.
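To make that retrieval step concrete, here is a minimal, dependency-free sketch of what a vector database does under the hood: embed texts as vectors, then rank stored items by cosine similarity to the query vector. The toy `embed` function (word counts over a tiny fixed vocabulary) is purely illustrative; real systems use learned embedding models and libraries like FAISS or services like Pinecone for fast indexing at scale.

```python
import math

VOCAB = ["weather", "stock", "marketing", "campaign", "price", "forecast"]

def embed(text):
    """Toy embedding: count occurrences of each vocabulary word.
    Real systems use learned embedding models instead."""
    words = text.lower().split()
    return [words.count(w) for w in VOCAB]

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

documents = [
    "quarterly marketing campaign results",
    "weather forecast for tomorrow",
    "stock price movements this week",
]
index = [(doc, embed(doc)) for doc in documents]  # the "vector database"

def semantic_search(query, top_k=1):
    """Embed the query and return the top_k most similar documents."""
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine_similarity(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:top_k]]

best = semantic_search("latest marketing campaign strategy")[0]
```

Note that the query shares no exact phrase with the stored documents; ranking by vector similarity is what lets retrieval match on meaning rather than keywords.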
Integrating FAISS and Pinecone
- **FAISS (Facebook AI Similarity Search):** An open-source library for efficient similarity search and clustering of dense vectors. It's suitable for local deployments and experimentation.
- **Pinecone:** A managed cloud-based vector database that offers scalability and ease of use for production environments.
General Integration Pattern:
1. **Load and Embed Data:** Load your documents, split them into chunks, and use an embedding model (e.g., from OpenAI or Hugging Face) to generate vector embeddings for each chunk.
2. **Index Embeddings:** Store these embeddings in your chosen vector database (FAISS or Pinecone).
3. **Create a Retrieval Tool:** Build a custom tool that takes a query, generates its embedding, and queries the vector database for similar content.
4. **Assign Tool to Agent:** Assign this retrieval tool to your agent.
Example (Conceptual - requires setup of vector DB and embeddings):
```python
# This is a conceptual example. Actual implementation requires
# setting up the vector DB and embedding model.
from crewai import Agent, Task
from crewai_tools import tool

# Assume you have a vector_db_client configured for FAISS or Pinecone
# from your_vector_db_setup import vector_db_client

# Placeholder for a tool that searches a vector database
@tool
def semantic_search_vector_db(query: str) -> str:
    """Performs a semantic search in the knowledge base and returns relevant snippets."""
    # Placeholder for actual vector database search logic
    # For example, using a Pinecone client:
    # results = vector_db_client.query(vector=query_embedding, top_k=3)
    # return "\n".join([match['metadata']['text'] for match in results['matches']])
    return f"Searching for '{query}' in the knowledge base..."

# Assign this tool to an agent that needs to access a knowledge base
knowledge_agent = Agent(
    role="Knowledge Navigator",
    goal="Find and synthesize information from the company's knowledge base.",
    backstory="An AI with access to a vast repository of internal documents.",
    tools=[semantic_search_vector_db],
    verbose=True
)

# Define a task that requires knowledge retrieval
knowledge_task = Task(
    description="Find information about our latest marketing campaign strategy using the knowledge base.",
    expected_output="Key strategies and findings related to the latest marketing campaign.",
    agent=knowledge_agent
)
```
By integrating these tools and services, you can significantly enhance the capabilities of your AI agents, allowing them to perform a wider range of complex tasks and interact with the external world effectively.