Agentic AI Deployment: Production-Ready Using a Docker Container


AI is changing how developers interact with databases — and one of the most powerful use cases is turning natural language into executable SQL queries. In this project, I built and deployed an Agentic AI system using the Docker container that:

  • Understands natural language
  • Generates correct MySQL SQL queries
  • Executes them safely
  • Visualizes results in a clean UI
  • Runs fully containerized in production
  • Can be packaged and reused in ANY environment


Here’s the full breakdown of the solution and the deployment approach.

What I Built and Deployed

A complete Agentic NL→SQL AI system with:

1) Gemini LLM (via LangChain)

Generates SQL from natural language.

2) MCP Server (FastAPI)

Executes SQL queries safely and exposes two endpoints:

  • /schema → returns the database schema
  • /run → executes the SQL and returns results plus the updated schema
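
To make the contract between the agent and the MCP server concrete, here is a plain-Python sketch of the two endpoints' request/response shapes. The real server wraps logic like this in FastAPI route handlers; the field names and the sample table are illustrative assumptions, not taken from the actual implementation.

```python
# Hypothetical sketch of the /schema and /run JSON contracts, written as
# plain functions so the shapes are easy to see. Field names are assumptions.

def get_schema() -> dict:
    # GET /schema -> the live database structure the LLM is prompted with
    return {"tables": {"employees": ["id", "name", "department", "salary"]}}

def run_sql(payload: dict) -> dict:
    # POST /run -> execute the generated SQL, return results + fresh schema
    sql = payload.get("sql", "").strip()
    if not sql:
        return {"error": "empty query"}
    rows = []  # stand-in for the cursor.fetchall() result
    return {"rows": rows, "row_count": len(rows), "schema": get_schema()}
```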

3) Gradio Front-End UI

A simple user interface where users type natural language and receive:

  • Auto-generated SQL
  • Execution results
  • Updated database schema

4) LangGraph Workflow

Implements a 3-node pipeline:

  1. Fetch Schema
  2. Generate SQL
  3. Execute SQL

This makes the system truly "agentic" — reacting, reasoning, and deciding dynamically.
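
The three-node flow above can be sketched without the LangGraph dependency. In the real project each function below is a LangGraph graph node; the HTTP calls to the MCP server and the Gemini call are stubbed here so only the state-passing control flow is shown.

```python
# Dependency-free sketch of the 3-node pipeline. State flows through the
# nodes as a dict, mirroring how LangGraph threads state between nodes.

def fetch_schema(state: dict) -> dict:
    # Real node: GET {MCP_URL}/schema
    state["schema"] = "employees(id, name, department, salary)"
    return state

def generate_sql(state: dict) -> dict:
    # Real node: Gemini via LangChain, prompted with the schema + question
    state["sql"] = "SELECT department, SUM(salary) FROM employees GROUP BY department;"
    return state

def execute_sql(state: dict) -> dict:
    # Real node: POST {MCP_URL}/run with the generated SQL
    state["result"] = {"row_count": 0, "rows": []}
    return state

state = {"question": "Show total salary of employees grouped by department"}
for node in (fetch_schema, generate_sql, execute_sql):
    state = node(state)
```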


Architecture Overview

User → Gradio UI → LangGraph Agent → MCP Server → MySQL
        

Components:

  • FastAPI MCP Server on port 8080
  • Gradio UI on port 7860
  • MySQL container
  • Nginx Reverse Proxy + Let’s Encrypt
  • Docker Compose Production Deployment

Prerequisites for a Production-Grade Deployment

To deploy the Agentic AI system in a secure, scalable production environment, you must prepare the following:

1) Domain Name (Required)

A valid fully qualified domain name (FQDN), e.g.:

sqlagent.shanmaha.com
        

This domain must point (via DNS A record) to the static public IP of your server.

2) Static External IP (Required)

Your VM/server must have a static public IP address. (Dynamic IP will break HTTPS, DNS routing, and certificates.)

3) HTTPS with SSL Certificates (Required)

To ensure secure access, all production traffic must be encrypted.

We use Let’s Encrypt to automatically generate and renew SSL certificates.

4) Reverse Proxy (Required: NGINX Proxy)

A reverse proxy is required for:

  • Routing https://your-domain.com to internal containers
  • Handling HTTPS termination
  • Automatic SSL certificate management
  • Load balancing (if scaling later)
  • Protecting backend services

We use:

  • nginx-proxy (jwilder)
  • letsencrypt-nginx-proxy-companion
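
With this pair, routing and certificates are driven by environment variables on the application container rather than hand-written nginx config. A minimal compose fragment is sketched below; the service name, domain, and port are placeholder assumptions, not values from this project:

```yaml
services:
  agentic_app:
    image: agentic-ai-mysql:latest
    environment:
      VIRTUAL_HOST: sqlagent.example.com        # nginx-proxy routes this hostname here
      VIRTUAL_PORT: "7860"                      # internal port to proxy to (Gradio UI)
      LETSENCRYPT_HOST: sqlagent.example.com    # companion issues a cert for this host
      LETSENCRYPT_EMAIL: admin@example.com
```

nginx-proxy watches the Docker socket and regenerates its configuration whenever a container carrying VIRTUAL_HOST starts or stops.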

5) Firewall Rules (Required)

Open these ports:

  • 80 → HTTP, needed for Let's Encrypt domain verification
  • 443 → HTTPS, for the production UI & API
  • 22 → SSH, for admin access

(Ports 7860/8080 do NOT need to be public if the reverse proxy routes correctly.)

6) Docker & Docker Compose (Required)

Your GCP VM/server must have:

  • Docker Engine
  • Docker Compose Plugin
  • Ability to load your .tar container image

7) MySQL Database (Required)

Your system must have a production-grade MySQL database deployed in one of these ways:

  • ✔ Installed on the same VM (the method chosen here)
  • ✔ Installed on another internal VM
  • ✔ Cloud SQL for MySQL (recommended for enterprise use later)

The database must be reachable from the agent container.

8) .env configuration (Required)

You must configure .env with:

  • DOMAIN name
  • LetsEncrypt email
  • MySQL credentials
  • Private DB host IP
  • API keys for LLM
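
For illustration, a .env file along these lines covers the items above. Every variable name and value here is a placeholder assumption; match the names to whatever your compose file actually reads:

```
# Illustrative .env file: all names and values below are placeholders.
DOMAIN=sqlagent.example.com
LETSENCRYPT_EMAIL=admin@example.com
MYSQL_HOST=10.128.0.5          # private IP of the DB host
MYSQL_USER=agent
MYSQL_PASSWORD=change-me
MYSQL_DATABASE=employees
GOOGLE_API_KEY=your-gemini-api-key
```

Keep this file out of version control; ship a .env.example with the same keys and dummy values instead.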


Docker Deployment of Agentic AI - Step-By-Step

1) Prepare Project Structure

app/
 ├── gradio_agentic_ui.py
 ├── mysql_mcp_server.py
 ├── langgraph_schema_graph.py
requirements.txt
docker-compose.production.yml
Dockerfile
.env
        

2) Build the Docker Image

docker compose -f docker-compose.production.yml build --no-cache

3) Deploy with Docker Compose

docker compose -f docker-compose.production.yml up -d

This starts:

  • Nginx reverse proxy
  • Let’s Encrypt SSL companion
  • MySQL DB
  • Agentic AI application (FastAPI + Gradio)

The containerized AI application will then be available at:

https://<your-domain>
        

with full SSL (HTTPS) support.


The Agentic Workflow: LangGraph (3 Nodes)

✔ Node 1 — Fetch Schema

The app calls:

GET /schema
        

…and retrieves the live MySQL structure.

✔ Node 2 — Generate SQL

Gemini LLM turns natural language into SQL using LangChain + Prompt engineering.



✔ Node 3 — Execute SQL

FastAPI MCP server runs:

POST /run

and returns:

  • Query results
  • Row count
  • Updated schema

This entire flow is orchestrated with LangGraph, enabling deterministic execution and state transitions.

End-User Experience

From the browser UI, a user can simply type:

“Show total salary of employees grouped by department”

The system automatically:

  1. Reads schema
  2. Generates SQL: SELECT department, SUM(salary) FROM employees GROUP BY department;
  3. Executes the query
  4. Shows results + schema changes

No SQL knowledge required.


Packaging the Agentic AI System as a Reusable, Portable Docker Image

This was one of the most powerful parts of the setup.

I created a portable package so the entire Agentic AI system can be shared and deployed on any machine in minutes.


✔ Step 1 — Save the running container as a standalone Docker image

docker commit agentic_app agentic-ai-mysql:latest

✔ Step 2 — Export the Docker image to a TAR file

docker save -o agentic-ai-mysql.tar agentic-ai-mysql:latest

✔ Step 3 — Create a portable folder and zip it.

agentic-ai-mysql-portable/
 ├── agentic-ai-mysql.tar
 ├── docker-compose.yml
 ├── .env.example
 └── README.md

✔ Step 4 — Copy the portable folder anywhere and load the TAR file

docker load -i agentic-ai-mysql.tar
        

✔ Step 5 — Redeploy the container anywhere

docker compose -f docker-compose.production.yml up -d
        

This means anyone can redeploy the full Agentic AI system anywhere:

  1. Prepare the .env file and the docker-compose file.
  2. Load the TAR file and run the container from the pre-built Docker image.

Perfect for demos, clients, training environments, or offline setups.

NOTE: A Dockerfile is not required here, because we load and run the existing Docker image rather than building a new one.

✔ Step 6 — Demo

Redeploy the Agentic AI container from the pre-built image on a local Windows machine (non-enterprise)

This demo is a non-enterprise redeployment: DNS, HTTPS, and nginx are not used in the docker-compose file.

  1. Transfer the portable folder from the GCP VM to the local Windows machine via SCP.
  2. The portable folder contains agentic-ai-mysql.tar, the docker-compose file, and the .env file.
  3. Edit the docker-compose file and the .env file for the LOCAL Windows environment.
  4. Load the TAR file and run the container.
  5. This brings up the Gradio UI, where SQL can be triggered through natural language.

NOTE: MySQL is already running on the local Windows machine.

.env file (contents on the local Windows machine)

docker-compose file for the non-enterprise setup (contents on the local Windows machine)

Load the TAR file and run the container:

docker load -i agentic-ai-mysql.tar

docker compose -f docker-compose.production.yml up -d


Access the UI from a browser (Gradio UI)

Run a SQL operation (insert records) from the Gradio UI

Note: The before and after status of the record insertion is shown (from the DB console).

Role of FastAPI:

1) Provides the MCP Server API Endpoints

FastAPI exposes these REST APIs:

GET /schema

Returns the live MySQL schema so the LLM (Gemini) understands the structure.

POST /run

Executes the SQL query generated by the LLM.

GET /

Simple heartbeat endpoint to confirm the server is running.

This backend is the “brain” that interacts with MySQL.


2) Ensures Secure, Validated, Schema-Aware SQL Execution

FastAPI handles:

  • ✔ Secure request handling
  • ✔ Input validation
  • ✔ JSON request parsing
  • ✔ JSON response formatting
  • ✔ Error handling
  • ✔ Clean communication between Gradio → LangGraph → Backend → MySQL
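
As a stdlib-only sketch of the kind of pre-execution checks involved: FastAPI/Pydantic models provide much of this for free in the real server, so the function name and the specific rules below are illustrative assumptions rather than the project's actual validation logic.

```python
# Hypothetical pre-execution checks for a /run request. The real MCP server
# relies on FastAPI/Pydantic for parsing; these rules are illustrative only.

def validate_run_request(payload: dict) -> str:
    if not isinstance(payload, dict) or "sql" not in payload:
        raise ValueError("request body must be JSON with a 'sql' field")
    sql = str(payload["sql"]).strip()
    if not sql:
        raise ValueError("empty SQL statement")
    # crude guard: reject stacked statements like "SELECT ...; DROP ..."
    if ";" in sql.rstrip(";"):
        raise ValueError("only a single SQL statement is allowed")
    return sql
```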


3) Acts as the LangGraph + LLM Connection Bridge

LangGraph workflow:

  1. Fetch schema → calls FastAPI /schema
  2. Generate SQL → uses Gemini
  3. Execute SQL → calls FastAPI /run

So FastAPI is the bridge between the LLM and the MySQL database.


4) Runs Inside Uvicorn for High Performance

Docker container launches:

uvicorn mysql_mcp_server:app --host 0.0.0.0 --port 8080
        

Uvicorn is a high-speed ASGI server. FastAPI + Uvicorn gives:

  • ✔ Very high performance
  • ✔ Async request handling
  • ✔ Production-ready backend

5) Enables the App to Work in Any Environment

Because the FastAPI server:

  • exposes schema as HTTP API
  • exposes SQL execution as HTTP API

The Gradio UI, the LangGraph workflow, or even external systems can call it. That’s why the portable Docker image works anywhere.

To keep it simple:

  • FastAPI = Backend API Server that handles schema + SQL execution
  • Gradio = Front-end UI
  • LangGraph + Gemini = Reasoning + SQL generation
  • MySQL = Database

All working together inside Docker.

Final Thoughts

This project demonstrates how modern AI + databases + DevOps can combine to create truly intelligent systems.

Key Innovations:

  • Natural-language SQL generation using Gemini
  • Schema-aware reasoning with LangGraph
  • Safe SQL execution via an MCP server
  • Secure cloud deployment with SSL
  • Fully portable Docker packaging

With this architecture, teams can:

✔ Turn non-technical users into powerful data explorers

✔ Reduce dependency on manual SQL

✔ Deploy AI-driven data access securely

✔ Share the full system as an easy-to-run package

If you are building AI-powered developer tools or data interfaces, this pattern is extremely powerful.

Conclusion:

By packaging an entire Agentic AI + MySQL automation system into a single, portable, production-ready Docker container, we’ve eliminated the biggest barriers to real-world adoption: complex setup, fragile dependencies, environment mismatches, and manual configuration.

This deployment shows that:

  • Agentic AI can run anywhere — cloud, on-prem, or local
  • Infrastructure becomes predictable — one image, same behavior
  • Natural Language → SQL intelligence is fully automated
  • Reusable, sharable, and scalable — teams can deploy instantly
  • Secure & isolated — database and LLM keys stay in environment configs

More importantly, this project proves how AI agents can be transformed from “demo-only” prototypes into practical, operational tools that deliver real outcomes — schema-aware SQL generation, automated execution, and interactive UI… all running inside one container.

This is not just about deploying an AI app — it’s about enabling enterprises to adopt Agentic AI safely, consistently, and at scale. The future of AI-driven automation is containerized, portable, and production-ready. And this deployment is one major step toward that future.

Written by Shanmugavelu Munivelu

I am a database professional with a passion for modern technology. My expertise spans cloud infrastructure and DevOps methodologies, and I've recently focused on integrating AI into my work to solve complex data challenges.
