Custom MCP Server Setup and Calling it from Microsoft Copilot Client

This project implements an MCP (Model Context Protocol) server, along with client examples, in Python, centered on a BMI-calculation service combined with SQL Server connectivity for diagnostics and data access. The stack is Python-first: core code uses standard Python 3.x, the MCP server relies on the FastMCP/mcp packages for protocol handling and tool registration, and SQL connectivity uses the pyodbc driver to talk to Microsoft SQL Server. For local testing and repeatability the repo includes Docker support (docker-compose.yml) to run SQL Server locally, and the test suite contains a dedicated Docker-based integration test that validates configuration switching and connectivity. The project is designed with Azure Functions deployment in mind: files like host.json, local.settings.json, and AZURE_DEPLOYMENT.md document how to run and deploy the solution on Azure, and mcp_server.py contains reusable logic intended for use within an Azure Functions app.
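
As a rough orientation, the sketch below shows how a BMI tool of this kind might be registered with FastMCP and exposed to an MCP client such as a Copilot agent. The server name, tool signature, and the stdio transport choice are illustrative assumptions, not a copy of the repo's mcp_server.py.

```python
# Minimal FastMCP server sketch (names and signature are assumptions for illustration).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("bmi-sql-server")

@mcp.tool()
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Return the body mass index for a given weight (kg) and height (m)."""
    if height_m <= 0:
        raise ValueError("height_m must be positive")
    return round(weight_kg / (height_m ** 2), 2)

if __name__ == "__main__":
    # Runs the server over stdio so an MCP client (e.g. a Copilot agent)
    # can discover the registered tools and call them with structured arguments.
    mcp.run()
```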

Frameworks and libraries used:

  • MCP stack: mcp and fastmcp provide the agent/tool model, resource registration, and a local server runtime for tooling and interaction.
  • OpenAI client usage: integration points for calling LLMs are present (OpenAI SDK usage), but keys are configured via environment variables to avoid leaking secrets.
  • SQL connectivity: pyodbc for ODBC-based connections to SQL Server; the code supports ActiveDirectory* authentication modes (interactive, integrated, managed identity) as well as SQL password-based auth (see the connection-string sketch after this list).
  • Dev/test tooling: docker-compose for a repeatable local SQL Server environment and a small test suite to exercise configuration and query flows.
  • Deployment considerations: Azure Functions runtime conventions and guidance for using Managed Identity / Key Vault for secrets and secure connection patterns.
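
To make the pyodbc and authentication points above concrete, here is a minimal sketch of how a connection string might be assembled per auth mode from environment variables. Apart from SQL_PASSWORD, the variable names (SQL_SERVER, SQL_DATABASE, SQL_AUTH_MODE, SQL_USER) and the mode labels are assumptions for illustration, not the repo's actual configuration keys.

```python
import os
import pyodbc

def build_connection_string() -> str:
    """Assemble an ODBC connection string based on the configured auth mode.

    Secrets are read from the environment, never hardcoded. Only SQL_PASSWORD
    mirrors the article; the other variable names are illustrative.
    """
    server = os.environ["SQL_SERVER"]
    database = os.environ["SQL_DATABASE"]
    mode = os.environ.get("SQL_AUTH_MODE", "sql_password")

    base = (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};Encrypt=yes;"
    )
    if mode == "managed_identity":
        # Azure Managed Identity: no secret material in the connection string.
        return base + "Authentication=ActiveDirectoryMsi;"
    if mode == "ad_interactive":
        # Prompts the developer through Azure AD when run interactively.
        return base + "Authentication=ActiveDirectoryInteractive;"
    # SQL authentication, also usable as a local Docker-style fallback.
    user = os.environ.get("SQL_USER", "sa")
    password = os.environ["SQL_PASSWORD"]
    return base + f"UID={user};PWD={password};TrustServerCertificate=yes;"

# conn = pyodbc.connect(build_connection_string(), timeout=10)
```

In cloud deployments the managed-identity branch keeps secret material out of the connection string entirely, which pairs naturally with the Key Vault guidance in the deployment docs.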

Key technical challenges and how they were addressed:

  • Secrets management and safety: the code originally contained hardcoded API keys and passwords in examples and tests, which created both a security and an operational risk. The repo was refactored to read sensitive values from environment variables (OPENAI_API_KEY, SA_PASSWORD, SQL_PASSWORD), and the documentation was updated to recommend Azure Key Vault or Function App settings for production. Tests and Docker healthchecks were changed to depend on environment variables rather than literal secrets.
  • Multiple authentication patterns for SQL Server: enterprise Azure SQL requires different auth methods (Managed Identity, Active Directory interactive, integrated, or SQL auth). The code was built to support these modes, generate ODBC connection strings accordingly, and provide a docker_test fallback for local development. The complexity of supporting AD interactive flows and managed identities required careful separation of runtime behaviors and clear guidance in docs.
  • Network and environment variability: Azure SQL access is often blocked by corporate networks, VPNs, or firewall rules. The repository includes diagnostics (sql_diagnostics.py, test_network_connectivity) and a relaxed configuration to help troubleshoot certificate or network validation issues. Providing Docker-based testing reduces the friction of verifying local behavior.
  • Redaction and developer ergonomics: to stay useful during debugging without exposing secrets, connection strings and outputs are redacted or masked in logs and responses. Tools were added to show whether env vars are set (e.g., SQL_PASSWORD presence shown as SET/NOT_SET) so developers can quickly find missing configuration while keeping secrets hidden; a minimal sketch follows this list.
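
The sketch below illustrates the diagnostics and redaction ideas above: a reachability check against the SQL endpoint, SET/NOT_SET reporting for configuration, and password masking before logging. The function names, and any checked variables beyond SQL_PASSWORD, SA_PASSWORD, and OPENAI_API_KEY mentioned in this article, are assumptions rather than the actual sql_diagnostics.py API.

```python
import os
import re
import socket

def env_presence(*names: str) -> dict:
    """Report whether each environment variable is configured without revealing its value."""
    return {name: "SET" if os.environ.get(name) else "NOT_SET" for name in names}

def redact_connection_string(conn_str: str) -> str:
    """Mask password values (PWD=... / Password=...) before a connection string is logged."""
    return re.sub(r"(PWD|Password)=[^;]*", r"\1=***", conn_str, flags=re.IGNORECASE)

def check_tcp_reachability(host: str, port: int = 1433, timeout: float = 5.0) -> bool:
    """Quick network diagnostic: can a TCP socket be opened to the SQL Server endpoint?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print(env_presence("SQL_PASSWORD", "SA_PASSWORD", "OPENAI_API_KEY"))
    print(redact_connection_string("Server=localhost;UID=sa;PWD=s3cret;"))
    print("sql reachable:", check_tcp_reachability("localhost"))
```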

Learnings and outcomes:

  • Security-first defaults matter. Moving from hardcoded secrets to environment-driven configuration and explicit guidance (Key Vault, Managed Identity) dramatically improves safety and deployability.
  • Build testable fallbacks. Providing a Docker-based SQL Server for local testing makes it practical to validate connection logic and SQL-related tools without needing Azure connectivity or credentials.
  • Support multiple authentication flows but prefer managed/host-native options. For cloud deployments, Managed Identity and Key Vault integration simplify lifecycle management and reduce secret sprawl.
  • Good documentation reduces support overhead. Updating README.md, SQL_SERVER_SETUP.md, and AZURE_DEPLOYMENT.md with explicit env var and Key Vault instructions makes it easier for other developers or operators to run and secure the system.
  • Clear diagnostics and redaction are essential. Developers need actionable diagnostic outputs (timeouts, connection errors) but logs must never contain raw secrets.

In short, the project demonstrates a practical integration of an LLM client, an MCP tool/service surface, and robust SQL connectivity while emphasizing secure secret handling, testability with Docker, and cloud-friendly authentication patterns.
