If your AI agent only runs in the terminal, only people comfortable with setting up Python can use it. Turning an experiment into a real product means adding a user interface. In this article, I’ll show you how to build a web UI for your local AI agent using Python and Streamlit.
Building a Web UI for a Local AI Agent
We’ll use the data analysis AI agent I built earlier, which runs Mistral locally with Ollama (find it here), and turn it into an interactive web app with Streamlit.
Before we start building the interface, make sure you have these libraries installed:
- Streamlit for the web UI
- LangChain to interact with the LLM
- Ollama to run the local model
- Pandas, Matplotlib, Seaborn for data analysis
Install everything using pip:
```bash
pip install streamlit pandas matplotlib seaborn langchain langchain-community
```
You also need Ollama installed locally with the Mistral model. If you haven’t already pulled the model, run:
```bash
ollama pull mistral
```
Step 1: Setting Up the UI and File Upload
Rather than hardcoding the CSV file import, we want our web app to handle any dataset the user uploads.
We’ll use Streamlit’s layout tools to add a title and a file uploader, so users can easily drop any dataset into the app:
```python
import streamlit as st
import pandas as pd
from langchain_community.llms import Ollama

# Configure the page layout
st.set_page_config(page_title="AI Data Analyst", layout="wide")
st.title("📊 AI Data Analyst Agent")
st.markdown("Upload a CSV file and let local AI generate EDA code and statistical insights.")

# Sidebar for controls
with st.sidebar:
    st.header("Upload Data")
    uploaded_file = st.file_uploader("Choose a CSV file", type=["csv"])
```

Step 2: Extracting Schema and Initializing the LLM
After a file is uploaded, we’ll read it with Pandas, just like in the original script from my last article. Then, we’ll start up the local Mistral model.
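One detail that differs from the hardcoded-path version: `st.file_uploader` hands back a file-like object rather than a path, but `pd.read_csv` accepts that directly. A quick standalone check, with `io.StringIO` standing in for an upload:

```python
import io

import pandas as pd

# st.file_uploader returns a file-like object, not a path on disk;
# pd.read_csv accepts it directly. io.StringIO mimics the upload here.
fake_upload = io.StringIO("a,b\n1,2\n3,4\n")
df = pd.read_csv(fake_upload)
print(df.shape)  # (2, 2)
```

This is why the same `pd.read_csv` call works unchanged in the app below.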
Using st.spinner() is a helpful UX trick. It lets users know the app is working, instead of just appearing to freeze:
```python
if uploaded_file is not None:
    # Load dataset
    df = pd.read_csv(uploaded_file)
    st.subheader("Raw Data Preview")
    st.dataframe(df.head())  # Renders an interactive table in the browser

    # Extract structural info (from your original logic)
    schema_info = {
        "columns": df.columns.tolist(),
        "dtypes": df.dtypes.astype(str).to_dict(),
        "missing_values": df.isnull().sum().to_dict(),
        "shape": df.shape,
    }
```
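To see what the model will actually receive, here is the same schema extraction run on a toy DataFrame (a standalone sketch, separate from the app code):

```python
import pandas as pd

# The same schema extraction as in the app, applied to a tiny DataFrame
# with one missing value per column.
toy = pd.DataFrame({"age": [25, None, 31], "city": ["Oslo", "Lima", None]})
schema_info = {
    "columns": toy.columns.tolist(),
    "dtypes": toy.dtypes.astype(str).to_dict(),
    "missing_values": toy.isnull().sum().to_dict(),
    "shape": toy.shape,
}
# columns: ['age', 'city']; age becomes float64 because of the NaN;
# missing_values counts 1 per column; shape is (3, 2)
print(schema_info)
```

Passing this compact dictionary instead of the full dataset keeps the prompt small and avoids sending raw data to the model.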
```python
    # Initialize the local LLM
    llm = Ollama(model="mistral")
```

Step 3: Generating the Python Code
We only want to start the heavy processing when the user is ready. So, we’ll add a button. When the user clicks it, the app sends the schema to Mistral and asks for the EDA code:
````python
    if st.button("Generate Analysis Code & Insights"):
        with st.spinner("Writing EDA code..."):
            prompt = f"""
            You are a data scientist.
            Here is dataset metadata: {schema_info}
            Write Python code using pandas, matplotlib, and seaborn to:
            1. Generate summary statistics
            2. Plot distributions for numerical columns
            3. Plot correlation heatmap
            4. Identify missing values visually
            Only return executable Python code. Do not include markdown formatting like ```python.
            """
            generated_code = llm.invoke(prompt)

        st.subheader("💻 Generated EDA Code")
        st.code(generated_code, language="python")
````
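Despite the "no markdown" instruction, local models occasionally wrap their answer in ``` fences anyway. A small defensive helper you could run on `generated_code` before displaying it (`strip_code_fences` is my own name, not part of the original script):

```python
def strip_code_fences(text: str) -> str:
    """Remove a leading ``` or ```python line and a trailing ``` fence, if present."""
    cleaned = text.strip()
    if cleaned.startswith("```"):
        # Drop the opening fence line (``` or ```python)
        cleaned = cleaned.split("\n", 1)[1] if "\n" in cleaned else ""
    if cleaned.rstrip().endswith("```"):
        # Drop the closing fence
        cleaned = cleaned.rstrip()[:-3]
    return cleaned.strip()

raw = "```python\nprint('hi')\n```"
print(strip_code_fences(raw))  # print('hi')
```

For example, `st.code(strip_code_fences(generated_code), language="python")` renders cleanly whether or not the model obeyed the instruction.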
Step 4: Generating Human-Readable Insights
Right after generating the code, we will now focus on getting human-readable insights from the statistical summary:
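For context on what the model is given here: `df.describe(include="all")` summarizes numeric and categorical columns together, so the string contains counts, uniques, and quantiles side by side. A toy illustration:

```python
import pandas as pd

# What the insight prompt is built from: a text dump of describe(include="all")
# on a mixed numeric/categorical DataFrame.
toy = pd.DataFrame({"price": [10.0, 12.5, 9.0], "color": ["red", "red", "blue"]})
summary = toy.describe(include="all").to_string()
print(summary)
```

The output includes rows like `count`, `unique`, `top`, and `mean`, which gives the model enough statistical context without ever seeing the raw rows.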
```python
        with st.spinner("Analyzing statistics..."):
            # Generate statistical summary
            analysis_summary = df.describe(include="all").to_string()

            insight_prompt = f"""
            You are a senior data scientist.
            Here are summary statistics: {analysis_summary}
            Provide:
            - Key patterns
            - Potential data quality issues
            - Interesting correlations
            - Recommendations for modeling
            Format the response nicely using Markdown.
            """
            insights = llm.invoke(insight_prompt)

        st.subheader("🧠 AI Insights")
        st.markdown(insights)
```

To launch your new UI, open your terminal and run:
```bash
streamlit run your_file_name.py
```
A browser window will open. You now have a fully working, local, privacy-focused AI data analysis app. Here’s an example of what it looks like:
[Screenshot: the AI Data Analyst app running in the browser]
Closing Thoughts
That’s how you can build a web UI for your local AI agent using Python and Streamlit.
Keep building projects you can interact with. It makes learning more hands-on and gives you something cool to show for your effort.
If you found this article helpful, you can follow me on Instagram for daily AI tips and practical resources. You may also be interested in my latest book, Hands-On GenAI, LLMs & AI Agents, a step-by-step guide to prepare you for careers in today’s AI industry.