Agentic AI Deployment: Production-Ready in a Docker Container
AI is changing how developers interact with databases, and one of the most powerful use cases is turning natural language into executable SQL queries. In this project, I built and deployed an Agentic AI system, packaged in a Docker container, that does exactly that.
Here’s the full breakdown of the solution and the deployment approach.
What I Built and Deployed
A complete Agentic NL→SQL AI system with:
1) Gemini LLM (via LangChain)
Generates SQL from natural language.
2) MCP Server (FastAPI)
Executes SQL queries safely and exposes:
/schema → returns the database schema
/run → executes SQL and returns results + updated schema
3) Gradio Front-End UI
A simple user interface where users type natural language and receive the generated SQL and the query results.
4) LangGraph Workflow
Implements a 3-node pipeline: fetch schema → generate SQL → execute SQL.
This makes the system truly "agentic" — reacting, reasoning, and deciding dynamically.
Architecture Overview
User → Gradio UI → LangGraph Agent → MCP Server → MySQL
Components: Gradio UI (front end), LangGraph agent (orchestration), FastAPI MCP server (SQL execution), MySQL (data store).
Pre-Requisites for Production-Grade Deployment
To deploy the Agentic AI system in a secure, scalable production environment, you must prepare the following:
1) Domain Name (Required)
A valid fully qualified domain name (FQDN), e.g.:
sqlagent.shanmaha.com
This domain must point (via DNS A record) to the static public IP of your server.
2) Static External IP (Required)
Your VM/server must have a static public IP address. (A dynamic IP will break HTTPS, DNS routing, and certificate renewal.)
3) HTTPS with SSL Certificates (Required)
To ensure secure access, all production traffic must be encrypted.
We use Let’s Encrypt to automatically generate and renew SSL certificates.
4) Reverse Proxy (Required: NGINX)
A reverse proxy is required to terminate SSL and route incoming requests to the right internal service (the Gradio UI and the FastAPI backend).
We use NGINX.
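As a sketch of what that routing looks like, here is a hypothetical NGINX configuration. The domain follows the article's example; the upstream ports (7860 for Gradio, 8080 for FastAPI), the /api/ prefix, and the certificate paths are illustrative assumptions, not the project's actual file.

```nginx
# Hypothetical reverse-proxy config; adjust names, ports, and paths
# to match your actual deployment.
server {
    listen 80;
    server_name sqlagent.shanmaha.com;
    # Let's Encrypt HTTP-01 challenges pass through on port 80;
    # everything else is redirected to HTTPS.
    location /.well-known/acme-challenge/ { root /var/www/certbot; }
    location / { return 301 https://$host$request_uri; }
}

server {
    listen 443 ssl;
    server_name sqlagent.shanmaha.com;
    ssl_certificate     /etc/letsencrypt/live/sqlagent.shanmaha.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/sqlagent.shanmaha.com/privkey.pem;

    # Gradio UI
    location / {
        proxy_pass http://127.0.0.1:7860;
        proxy_set_header Host $host;
    }
    # FastAPI MCP server (illustrative /api/ prefix)
    location /api/ {
        proxy_pass http://127.0.0.1:8080/;
        proxy_set_header Host $host;
    }
}
```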
5) Firewall Rules (Required)
Open these ports:
Port 80 (HTTP): needed for Let's Encrypt verification
Port 443 (HTTPS): production UI & API
Port 22 (SSH): admin access
(Ports 7860 and 8080 do NOT need to be public if the reverse proxy routes correctly.)
6) Docker & Docker Compose (Required)
Your GCP VM/server must have Docker Engine and Docker Compose installed.
7) MySQL Database (Required)
Your system must have a production-grade MySQL database, whether self-hosted or managed. The database must be reachable from the agent container.
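A quick way to verify that reachability from inside the agent container is a plain TCP check. This is a minimal stdlib sketch (the host name is a placeholder); it only confirms the port is open, not that credentials are valid.

```python
import socket

def can_reach_db(host: str, port: int = 3306, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the MySQL host/port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical host): run from inside the agent container.
# print(can_reach_db("db.internal.example", 3306))
```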
8) .env configuration (Required)
You must configure .env with your database connection details and the Gemini API key.
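A hedged sketch of what that .env might contain; the variable names below are illustrative, so match them to whatever the application code actually reads.

```env
# Illustrative .env layout; variable names are assumptions.
GEMINI_API_KEY=your-gemini-api-key
MYSQL_HOST=your-db-host
MYSQL_PORT=3306
MYSQL_USER=agent_user
MYSQL_PASSWORD=change-me
MYSQL_DATABASE=your-database
```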
Docker Deployment of Agentic AI - Step-By-Step
1) Prepare Project Structure
app/
├── gradio_agentic_ui.py
├── mysql_mcp_server.py
├── langgraph_schema_graph.py
requirements.txt
docker-compose.production.yml
Dockerfile
.env
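For orientation, here is a minimal sketch of what docker-compose.production.yml could look like. The service name, image tag, and localhost-only port bindings are assumptions based on the article, not the exact file.

```yaml
# Illustrative docker-compose.production.yml sketch.
services:
  agentic_app:
    build: .
    image: agentic-ai-mysql:latest
    env_file: .env
    ports:
      - "127.0.0.1:7860:7860"   # Gradio UI (reached via reverse proxy)
      - "127.0.0.1:8080:8080"   # FastAPI MCP server
    restart: unless-stopped
```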
2) Build the Docker Image
docker compose -f docker-compose.production.yml build --no-cache
3) Deploy with Docker Compose
docker compose -f docker-compose.production.yml up -d
This starts the application stack in detached mode: the Gradio UI, the FastAPI MCP server, and the NGINX reverse proxy.
The application will then be available at:
https://<your-domain>
with full SSL (HTTPS) support.
The Agentic Workflow: LangGraph (3 Nodes)
✔ Node 1 — Fetch Schema
The app calls:
GET /schema
…and retrieves live MySQL structure.
✔ Node 2 — Generate SQL
Gemini LLM turns natural language into SQL using LangChain + Prompt engineering.
✔ Node 3 — Execute SQL
FastAPI MCP server runs:
POST /run
…and returns the query results plus the updated schema.
This entire flow is orchestrated with LangGraph, enabling deterministic execution and state transitions.
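The 3-node flow above can be sketched as a plain sequential pipeline over a shared state dict. The real project wires these as LangGraph nodes; here the HTTP and LLM calls are stubbed out, and the node names and state keys are illustrative.

```python
# Sketch of the 3-node flow over a shared state dict (LangGraph-style).
# The stub bodies stand in for GET /schema, the Gemini call, and POST /run.

def fetch_schema(state: dict) -> dict:
    # Real node: GET /schema on the MCP server
    state["schema"] = "employees(id INT, name TEXT, salary INT, department TEXT)"
    return state

def generate_sql(state: dict) -> dict:
    # Real node: prompt Gemini (via LangChain) with the question + schema
    state["sql"] = f"-- SQL for: {state['question']}"
    return state

def execute_sql(state: dict) -> dict:
    # Real node: POST /run with state["sql"], capture rows + new schema
    state["results"] = []
    return state

def run_pipeline(question: str) -> dict:
    state = {"question": question}
    for node in (fetch_schema, generate_sql, execute_sql):
        state = node(state)
    return state

result = run_pipeline("Show total salary of employees grouped by department")
```

In LangGraph the same shape becomes three `add_node` calls connected by edges, which is what gives the system deterministic state transitions.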
End-User Experience
From the browser UI, a user can simply type:
“Show total salary of employees grouped by department”
The system automatically fetches the live schema, generates the SQL with Gemini, executes it through the MCP server, and displays the results.
No SQL knowledge required.
Packaging the Agentic AI System as a Reusable, Portable Docker Image
This was one of the most powerful parts of the setup.
I created a portable package so the entire Agentic AI system can be shared and deployed on any machine in minutes.
✔ Step 1 — Save the running container as a standalone Docker image
docker commit agentic_app agentic-ai-mysql:latest
✔ Step 2 — Export the Docker image to a TAR file
docker save -o agentic-ai-mysql.tar agentic-ai-mysql:latest
✔ Step 3 — Create a portable folder and zip it.
agentic-ai-mysql-portable/
├── agentic-ai-mysql.tar
├── docker-compose.yml
├── .env.example
└── README.md
✔ Step 4 — Copy the portable folder to the target machine and load the TAR file
docker load -i agentic-ai-mysql.tar
✔ Step 5 — Redeploy the container anywhere
docker compose -f docker-compose.production.yml up -d
This means anyone can redeploy the full Agentic AI system anywhere.
Perfect for demos, clients, training environments, or offline setups.
NOTE: A Dockerfile is not required here, because we load and run the existing image rather than building a new one.
✔ Step 6 — Demo:
Redeploy the Agentic AI container from the pre-built image on a local Windows machine (non-enterprise).
This demo is a non-enterprise redeployment: DNS, HTTPS, and NGINX are not used in the docker-compose file.
NOTE: The MySQL database is already running on the local Windows machine.
.env file (contents on the local Windows machine):
docker-compose file for the non-enterprise setup (contents on the local Windows machine):
Load the TAR file and run the container:
docker load -i agentic-ai-mysql.tar
docker compose -f docker-compose.production.yml up -d
Access the UI from the browser (Gradio UI).
Run a SQL operation (insert records) from the Gradio UI.
Note: the before and after status of the record insertion is shown (from the DB console).
The Role of FastAPI:
1) Provides the MCP Server API Endpoints
FastAPI exposes these REST APIs:
GET /schema
Returns the live MySQL schema so the LLM (Gemini) understands the structure.
POST /run
Executes the SQL query generated by the LLM.
GET /
A simple heartbeat endpoint to confirm the server is running.
This backend is the “brain” that interacts with MySQL.
2) Ensures Secure, Validated, Schema-Aware SQL Execution
FastAPI handles:
✔ Secure request handling
✔ Input validation
✔ JSON request parsing
✔ JSON response formatting
✔ Error handling
✔ Clean communication between Gradio → LangGraph → backend → MySQL
3) Acts as the LangGraph + LLM Connection Bridge
The LangGraph workflow calls these endpoints at each step: GET /schema to ground the LLM, then POST /run to execute the generated SQL.
So FastAPI is the bridge between the LLM and the MySQL database.
4) Runs Inside Uvicorn for High Performance
The Docker container launches:
uvicorn mysql_mcp_server:app --host 0.0.0.0 --port 8080
Uvicorn is a high-speed ASGI server. FastAPI + Uvicorn gives:
✔ Very high performance
✔ Async request handling
✔ A production-ready backend
5) Enables the App to Work in Any Environment
Because the FastAPI server listens on a fixed port inside the container and speaks plain HTTP/JSON, the Gradio UI, the LangGraph workflow, and even external systems can all call it. That’s why the portable Docker image works anywhere.
To keep it simple: Gradio, LangGraph, FastAPI, and MySQL all work together inside Docker.
Final Thoughts
This project demonstrates how modern AI + databases + DevOps can combine to create truly intelligent systems.
Key innovations: Gemini-powered NL→SQL generation, a LangGraph agentic workflow, a FastAPI MCP execution server, and a fully portable Docker package.
With this architecture, teams can:
✔ Turn non-technical users into powerful data explorers
✔ Reduce dependency on manual SQL
✔ Deploy AI-driven data access securely
✔ Share the full system as an easy-to-run package
If you are building AI-powered developer tools or data interfaces, this pattern is extremely powerful.
Conclusion:
By packaging an entire Agentic AI + MySQL automation system into a single, portable, production-ready Docker container, we’ve eliminated the biggest barriers to real-world adoption: complex setup, fragile dependencies, environment mismatches, and manual configuration.
This deployment shows that a multi-component AI system (LLM, agent workflow, API server, and UI) can be packaged and run as a single container.
More importantly, this project proves how AI agents can be transformed from “demo-only” prototypes into practical, operational tools that deliver real outcomes — schema-aware SQL generation, automated execution, and interactive UI… all running inside one container.
This is not just about deploying an AI app: it’s about enabling enterprises to adopt Agentic AI safely, consistently, and at scale. The future of AI-driven automation is containerized, portable, and production-ready, and this deployment is one major step toward that future.
Written by Shanmugavelu Munivelu
I am a database professional with a passion for modern technology. My expertise spans cloud infrastructure and DevOps methodologies, and I've recently focused on integrating AI into my work to solve complex data challenges.
https://github.com/mshan0181/AI_project/tree/main/agentic_ai_nltosql