🚀 Batch Processing Made Easy: Azure Functions + Azure SQL
A Practical Guide to Automated Data Processing with TimerTrigger
Have you ever needed to process large amounts of data on a schedule? Whether it's generating daily reports, sending notifications, or aggregating analytics, batch processing is a common requirement in modern applications. Today, I'll show you how to implement this using Azure Functions and Azure SQL, a powerful, serverless combination! ⚡
🎯 What We'll Build
We'll create a Timer-triggered Azure Function that:
✅ Runs automatically on a daily schedule
✅ Queries Azure SQL for unprocessed sales records
✅ Processes them in manageable batches
✅ Sends customer notifications and a summary report
✅ Marks the records as processed
🏗️ Architecture Overview
┌─────────────────┐
│ Timer Trigger │ ──> Runs on schedule (cron)
└────────┬────────┘
│
▼
┌─────────────────┐
│ Azure Function │ ──> Queries Azure SQL
└────────┬────────┘
│
▼
┌─────────────────┐
│ Azure SQL │ ──> Returns batch data
└────────┬────────┘
│
▼
┌─────────────────┐
│ Send Reports/ │ ──> Email, Teams, etc.
│ Notifications │
└─────────────────┘
📋 Prerequisites
Before we start, make sure you have:
✅ An Azure subscription
✅ An Azure SQL database (a serverless tier works fine for testing)
✅ The .NET SDK and Azure Functions Core Tools installed
✅ Visual Studio or VS Code with the Azure Functions extension
💻 Step 1: Setting Up the Database
First, let's create a simple sales table in Azure SQL:
-- Create Sales table
CREATE TABLE Sales (
SaleId INT PRIMARY KEY IDENTITY(1,1),
ProductName NVARCHAR(100),
SaleAmount DECIMAL(10,2),
SaleDate DATETIME,
CustomerEmail NVARCHAR(100),
Region NVARCHAR(50),
Processed BIT DEFAULT 0
);
-- Insert sample data
INSERT INTO Sales (ProductName, SaleAmount, SaleDate, CustomerEmail, Region)
VALUES
('Laptop Pro', 1299.99, GETDATE()-1, 'customer1@email.com', 'North'),
('Wireless Mouse', 29.99, GETDATE()-1, 'customer2@email.com', 'South'),
('Keyboard RGB', 79.99, GETDATE()-1, 'customer3@email.com', 'East');
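As the Sales table grows, the batch query's WHERE Processed = 0 filter benefits from an index. A possible filtered index that covers the query's columns (the name IX_Sales_Unprocessed is just a suggestion):

```sql
-- Filtered index matching the batch query's WHERE clause
CREATE NONCLUSTERED INDEX IX_Sales_Unprocessed
ON Sales (SaleDate DESC)
INCLUDE (ProductName, SaleAmount, CustomerEmail, Region)
WHERE Processed = 0;
```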
⚙️ Step 2: Configure Connection String
Add your Azure SQL connection string to local.settings.json (for local development) or Application Settings (for Azure):
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"SqlConnectionString": "Server=tcp:yourserver.database.windows.net,1433;Database=yourdb;User ID=yourusername;Password=yourpassword;Encrypt=True;"
}
}
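The function reads this setting at runtime with Environment.GetEnvironmentVariable. A defensive version of that lookup (the Config class here is illustrative):

```csharp
using System;

public static class Config
{
    public static string GetConnectionString()
    {
        // Reads from local.settings.json locally, or Application Settings in Azure
        var cs = Environment.GetEnvironmentVariable("SqlConnectionString");
        if (string.IsNullOrWhiteSpace(cs))
            throw new InvalidOperationException(
                "SqlConnectionString is not configured. Add it to local.settings.json " +
                "or the Function App's Application Settings.");
        return cs;
    }
}
```

Failing fast with a clear message beats a cryptic SqlException later.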
🔧 Step 3: Create the Azure Function
Here's our complete TimerTrigger function:
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
namespace BatchProcessingFunction
{
public class SalesReportFunction
{
[FunctionName("DailySalesReportProcessor")]
public async Task Run(
[TimerTrigger("0 0 8 * * *")] TimerInfo myTimer, // Runs daily at 8 AM
ILogger log)
{
log.LogInformation($"🚀 Sales Report Processor started at: {DateTime.Now}");
try
{
// Get unprocessed sales
var sales = await GetUnprocessedSales(log);
log.LogInformation($"📊 Found {sales.Count} unprocessed sales records");
if (sales.Count == 0)
{
log.LogInformation("✅ No new sales to process");
return;
}
// Process in batches
await ProcessSalesBatch(sales, log);
// Mark as processed
await MarkSalesAsProcessed(sales, log);
// Send summary report
await SendSummaryReport(sales, log);
log.LogInformation("✅ Batch processing completed successfully!");
}
catch (Exception ex)
{
log.LogError(ex, "❌ Error processing batch"); // pass the exception so the stack trace reaches Application Insights
throw;
}
}
private async Task<List<Sale>> GetUnprocessedSales(ILogger log)
{
var sales = new List<Sale>();
var connectionString = Environment.GetEnvironmentVariable("SqlConnectionString");
using (var connection = new SqlConnection(connectionString))
{
await connection.OpenAsync();
var query = @"
SELECT SaleId, ProductName, SaleAmount, SaleDate,
CustomerEmail, Region
FROM Sales
WHERE Processed = 0
AND SaleDate >= DATEADD(day, -1, GETDATE())
ORDER BY SaleDate DESC";
using (var command = new SqlCommand(query, connection))
using (var reader = await command.ExecuteReaderAsync())
{
while (await reader.ReadAsync())
{
sales.Add(new Sale
{
SaleId = reader.GetInt32(0),
ProductName = reader.GetString(1),
SaleAmount = reader.GetDecimal(2),
SaleDate = reader.GetDateTime(3),
CustomerEmail = reader.GetString(4),
Region = reader.GetString(5)
});
}
}
}
return sales;
}
private async Task ProcessSalesBatch(List<Sale> sales, ILogger log)
{
// Process in batches of 100
const int batchSize = 100;
for (int i = 0; i < sales.Count; i += batchSize)
{
var batch = sales.GetRange(i, Math.Min(batchSize, sales.Count - i));
log.LogInformation($"📦 Processing batch {(i / batchSize) + 1}: {batch.Count} records");
foreach (var sale in batch)
{
// Your business logic here
// Example: Send notification to customer
await SendCustomerNotification(sale, log);
}
// Small delay between batches to avoid throttling
await Task.Delay(100);
}
}
private async Task SendCustomerNotification(Sale sale, ILogger log)
{
// Simulate sending email/notification
log.LogInformation($"📧 Sending notification to {sale.CustomerEmail} for {sale.ProductName}");
// Here you would integrate with:
// - SendGrid for emails
// - Microsoft Graph for Teams messages
// - Azure Communication Services
// etc.
await Task.Delay(10); // Simulate API call
}
private async Task MarkSalesAsProcessed(List<Sale> sales, ILogger log)
{
var connectionString = Environment.GetEnvironmentVariable("SqlConnectionString");
// SaleId values are database-generated integers, so interpolating them is injection-safe;
// for user-supplied values, always use SQL parameters instead
var saleIds = string.Join(",", sales.ConvertAll(s => s.SaleId.ToString()));
using (var connection = new SqlConnection(connectionString))
{
await connection.OpenAsync();
var query = $"UPDATE Sales SET Processed = 1 WHERE SaleId IN ({saleIds})";
using (var command = new SqlCommand(query, connection))
{
var rowsAffected = await command.ExecuteNonQueryAsync();
log.LogInformation($"✔️ Marked {rowsAffected} sales as processed");
}
}
}
private async Task SendSummaryReport(List<Sale> sales, ILogger log)
{
var totalAmount = sales.Sum(s => s.SaleAmount);
var reportDate = DateTime.Now.ToString("yyyy-MM-dd");
var summary = $@"
📊 Daily Sales Report - {reportDate}
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Total Sales: {sales.Count}
Total Revenue: ${totalAmount:N2}
Average Sale: ${(totalAmount / sales.Count):N2}
Top Region: {sales.GroupBy(s => s.Region)
.OrderByDescending(g => g.Count())
.First().Key}
";
log.LogInformation(summary);
// Send via email/Teams/Slack
await Task.CompletedTask;
}
}
public class Sale
{
public int SaleId { get; set; }
public string ProductName { get; set; }
public decimal SaleAmount { get; set; }
public DateTime SaleDate { get; set; }
public string CustomerEmail { get; set; }
public string Region { get; set; }
}
}
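One refinement worth calling out: MarkSalesAsProcessed interpolates the ID list straight into the SQL string. That's safe for database-generated integers, but for anything user-supplied you'd build a parameterized IN clause instead. A sketch (the helper name is mine):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class SqlHelpers
{
    // Builds "UPDATE Sales SET Processed = 1 WHERE SaleId IN (@p0, @p1, ...)"
    // plus the matching parameter name/value pairs.
    public static (string Sql, List<(string Name, int Value)> Parameters)
        BuildMarkProcessedQuery(IEnumerable<int> saleIds)
    {
        var parameters = saleIds
            .Select((id, i) => (Name: $"@p{i}", Value: id))
            .ToList();
        var placeholders = string.Join(", ", parameters.Select(p => p.Name));
        var sql = $"UPDATE Sales SET Processed = 1 WHERE SaleId IN ({placeholders})";
        return (sql, parameters);
    }
}
```

Each pair is then attached with command.Parameters.AddWithValue(name, value) before executing.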
⏰ Understanding the Timer Trigger
The cron expression "0 0 8 * * *" uses the six-field NCRONTAB format (seconds come first) and means:
0 0 8 * * *
│ │ │ │ │ │
│ │ │ │ │ └── Day of week (any)
│ │ │ │ └──── Month (any)
│ │ │ └────── Day of month (any)
│ │ └──────── Hour (8 AM)
│ └────────── Minute (0)
└──────────── Second (0)
Other useful schedules:
"0 */5 * * * *" → every 5 minutes
"0 0 * * * *" → every hour
"0 30 9 * * 1-5" → 9:30 AM on weekdays
"0 0 0 1 * *" → midnight on the first of every month
🎨 Best Practices
1️⃣ Idempotency
Always make your functions idempotent: they should produce the same result even if run multiple times:
// ✅ Good: Check if already processed
WHERE Processed = 0 AND SaleDate >= @StartDate
// ❌ Bad: Process everything
SELECT * FROM Sales
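The same guarantee can be sketched in plain C#: filtering on the processed flag makes a second run over identical data a no-op (the types and names here are illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class SaleRow
{
    public int Id;
    public bool Processed;
}

public static class IdempotencyDemo
{
    // Mirrors the WHERE Processed = 0 filter: a rerun over the same
    // data finds nothing left to do, so reruns are harmless.
    public static int ProcessPending(List<SaleRow> sales)
    {
        var pending = sales.Where(s => !s.Processed).ToList();
        foreach (var s in pending)
            s.Processed = true;   // equivalent of UPDATE ... SET Processed = 1
        return pending.Count;     // number of records handled this run
    }
}
```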
2️⃣ Batch Size Optimization
Don't process all records at once. Use batching:
const int batchSize = 100; // Adjust based on your needs
for (int i = 0; i < sales.Count; i += batchSize)
{
var batch = sales.GetRange(i, Math.Min(batchSize, sales.Count - i));
await ProcessBatch(batch);
}
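On .NET 6 and later, LINQ's Enumerable.Chunk expresses the same slicing without the index arithmetic:

```csharp
using System;
using System.Linq;

public static class BatchDemo
{
    public static void Run()
    {
        var ids = Enumerable.Range(1, 250).ToList();
        const int batchSize = 100;

        // Chunk yields arrays of up to batchSize elements: 100, 100, 50
        foreach (var batch in ids.Chunk(batchSize))
        {
            Console.WriteLine($"Processing batch of {batch.Length} records");
        }
    }
}
```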
3️⃣ Error Handling
Implement proper error handling and retry logic:
try
{
await ProcessSales();
}
catch (SqlException ex)
{
log.LogError($"Database error: {ex.Message}");
// Implement retry logic or dead-letter queue
}
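For the retry logic, a minimal exponential-backoff helper might look like this (in production you'd likely reach for a library such as Polly; the helper name and delays are my own):

```csharp
using System;
using System.Threading.Tasks;

public static class Retry
{
    public static async Task<T> WithBackoffAsync<T>(
        Func<Task<T>> action, int maxAttempts = 3, int baseDelayMs = 200)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                return await action();
            }
            catch (Exception) when (attempt < maxAttempts)
            {
                // Exponential backoff: 200ms, 400ms, 800ms, ...
                await Task.Delay(baseDelayMs * (1 << (attempt - 1)));
            }
        }
        // On the final attempt the exception propagates to the caller,
        // which lets the Functions runtime record the failure.
    }
}
```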
4️⃣ Monitoring
Use Application Insights for monitoring:
log.LogMetric("SalesProcessed", sales.Count);
log.LogMetric("ProcessingDuration", stopwatch.ElapsedMilliseconds);
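The stopwatch used for the duration metric is just a System.Diagnostics.Stopwatch started before the batch work; a small sketch (MetricsDemo is an illustrative name):

```csharp
using System;
using System.Diagnostics;

public static class MetricsDemo
{
    public static long TimeProcessing(Action process)
    {
        var stopwatch = Stopwatch.StartNew();
        process();                            // the batch work being measured
        stopwatch.Stop();
        return stopwatch.ElapsedMilliseconds; // value for LogMetric("ProcessingDuration", ...)
    }
}
```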
📊 Monitoring & Troubleshooting
You can monitor your function in Azure Portal:
✅ The function's Monitor tab shows recent invocations and their logs
✅ Application Insights captures traces, failures, and custom metrics
✅ Live Metrics streams executions in real time
Common Issues:
🔴 Function not triggering: Check cron expression and time zone settings
🔴 SQL timeout: Reduce batch size, add indexes, or increase the command timeout
🔴 Memory issues: Process smaller batches
🔴 Connection errors: Verify firewall rules and connection string
🎯 Real-World Use Cases
This pattern is perfect for:
✅ Daily or weekly report generation
✅ Scheduled notifications and reminders
✅ Aggregating analytics data
✅ Data cleanup and archiving jobs
✅ Syncing data between systems
🚀 Next Steps
To enhance this solution, consider:
✅ Using the Azure SQL input/output bindings instead of hand-rolled ADO.NET
✅ Authenticating with a managed identity instead of a SQL password
✅ Storing secrets in Azure Key Vault
✅ Adding retry policies or a dead-letter queue for failed records
✅ Using Durable Functions for long-running or fan-out workloads
📚 Conclusion
Azure Functions with TimerTrigger provides a powerful, serverless way to handle batch processing. Combined with Azure SQL, you get:
✅ No infrastructure management
✅ Pay only for execution time
✅ Automatic scaling
✅ Built-in monitoring
✅ Easy integration with Azure services
The code above is a solid starting point that follows common best practices. Feel free to adapt it to your specific needs!
What batch processing challenges are you facing? Drop a comment below! 💬
Found this helpful? Give it a 👍 and share with your network!