Most Large Language Models can’t access new information because their training data stops at a fixed cutoff date. Today, we’ll change that. Instead of just building a smarter chatbot, we’ll build an agent that reaches out to the internet for real-time answers. In this tutorial, I’ll show you how to connect a free, local AI model to the internet using Python.
Connect AI Agent to the Internet: Getting Started
We’ll run everything locally on your own computer. This keeps your data private and doesn’t cost anything.
Step 0: The Setup
To start, you’ll need a local backend for the AI model. Begin by downloading and installing Ollama.
Next, open your terminal or command prompt and download the Llama 3.2 model:
ollama pull llama3.2
This model is lightweight and runs quickly.
Now, let’s bring everything together. We’ll use LangChain as the framework, LangGraph for the agent’s logic, and DuckDuckGo as the search engine.
Run this in your terminal:
pip install langchain-ollama langchain-community langgraph duckduckgo-search
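Before writing the agent file, you can optionally confirm the installation worked by running these imports in a Python shell; they should complete without errors:
# Optional sanity check: all three packages should import cleanly after the install.
from langchain_ollama import ChatOllama
from langchain_community.tools import DuckDuckGoSearchRun
from langgraph.prebuilt import create_react_agent
print("Environment ready.")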
Let’s start coding. Create a file named my_agent.py and follow the steps below.
Step 1: Imports and The Brain
First, set up the LLM. We’ll use temperature=0 so the model gives factual and precise answers. We want accuracy, not creativity:
from langchain_ollama import ChatOllama
from langchain_community.tools import DuckDuckGoSearchRun
from langgraph.prebuilt import create_react_agent
# 1. LLM (Brain)
llm = ChatOllama(
    model="llama3.2:latest",
    temperature=0,
)
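Before adding any tools, it helps to confirm the model actually responds. Here is an optional, minimal check; it assumes Ollama is running locally (ollama serve) and that you pulled llama3.2 earlier:
# Optional: quick check that Ollama is reachable and the model responds.
print(llm.invoke("Reply with one word: ready").content)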
Step 2: The Internet Access
Now, define the tool. DuckDuckGoSearchRun is useful because it lets you search the web without needing an API key or a Google Cloud account:
# 2. Tool (Internet Access)
search_tool = DuckDuckGoSearchRun()
tools = [search_tool]
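If you are curious what the agent will actually see, you can optionally call the tool on its own. The result is a plain string of search-result snippets:
# Optional: run the search tool directly to inspect its raw output.
print(search_tool.invoke("latest stable Python version"))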
Step 3: The System Instructions
This is where we use prompt engineering. We need to clearly tell the model that it is more than just a chatbot now.
We’ll define a System Prompt that makes the model use the search tool for current events. This helps prevent the model from making things up:
# 3. System Instruction (FORCE SEARCH WHEN NEEDED)
SYSTEM_PROMPT = """
You are an AI agent with access to a live web search tool.

RULES:
- For any question about current events, software versions, prices, news, weather, sports, or dates — you MUST use web search.
- Do NOT answer from memory when information could have changed after training.
- If unsure, search first.
"""
Step 4: The Agent
We’ll use create_react_agent from LangGraph. This follows the ReAct pattern, short for Reasoning and Acting: the agent reads the user input, decides whether it needs a tool, calls the tool, and then reasons over the result before answering (you can watch this cycle in the optional trace example after the code below):
# 4. Create Agent
agent_executor = create_react_agent(
    llm,
    tools,
    prompt=SYSTEM_PROMPT,  # `prompt` sets the system instructions (older LangGraph versions call this `state_modifier`)
).with_config(
    {
        "recursion_limit": 8,  # prevents infinite loops
    }
)
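Before wiring up the interactive loop, you can optionally run a single query and print the full message trace to watch the ReAct cycle. This is a minimal sketch; it assumes the code above is in the same script, and the exact messages may vary between LangGraph versions:
# Optional: one-off query that prints the whole ReAct message trace.
result = agent_executor.invoke(
    {"messages": [{"role": "user", "content": "What is the latest stable version of Python?"}]}
)
for message in result["messages"]:
    # Typically: human question -> AI tool call -> tool result -> final AI answer.
    print(f"[{message.type}] {str(message.content)[:200]}")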
Step 5: The Interaction Loop
Finally, we need a way to interact with the agent. This loop takes your input, sends it to the agent, and shows when the agent decides to use a tool:
# 5. Chat Loop
def run_chat():
    print("🌐 Smart AI Agent Online (Type 'quit' to exit)")
    print("-" * 60)

    while True:
        user_input = input("You: ")
        if user_input.lower() in ["quit", "exit", "q"]:
            print("👋 Goodbye!")
            break

        print("🤖 Thinking...", end="", flush=True)

        try:
            response = agent_executor.invoke({
                "messages": [{"role": "user", "content": user_input}]
            })

            # ---------------------------
            # DEBUG: Show tool usage
            # ---------------------------
            for m in response["messages"]:
                if m.type == "tool":
                    print(f"\n🔎 Tool used: {m.name}")

            # ---------------------------
            # Get Final AI Message
            # ---------------------------
            for m in reversed(response["messages"]):
                if m.type == "ai":
                    print(f"\r🤖 Agent: {m.content}")
                    break

        except Exception as e:
            print(f"\nError: {e}")
            print("Make sure Ollama is running: `ollama serve`")


# Run
if __name__ == "__main__":
    run_chat()

Now, run your code. Here’s an example of a question I tried:
🌐 Smart AI Agent Online (Type 'quit' to exit)
------------------------------------------------------------
You: What is the latest version of Python available today?
🤖 Thinking...
🔎 Tool used: duckduckgo_search
🤖 Agent: The current and latest version of Python is Python 3.14, which is the latest stable release. It is recommended to test this version in a staging or development environment before deploying it in production to identify any refactoring required. You can download the latest Python 3.14 version from the official Python website: python.org/downloads.
Closing Thoughts
That’s how you connect your first AI agent to the internet.
With this, we’re moving from an era of static AI to an era of active AI. The models aren’t just collections of information anymore; they’re tools that can take action.
If you found this article helpful, you can follow me on Instagram for daily AI tips and practical resources. You may also be interested in my latest book, Hands-On GenAI, LLMs & AI Agents, a step-by-step guide to prepare you for careers in today’s AI industry.