Building a React Frontend with Python Backend for Raspberry Pi-powered AutoCAD Robotic Arm for 2D Drawing
L. P. Harisha Lakshan Warnakulasuriya, BSc in Computer Science (OUSL)
Introduction
The fusion of web technologies, physical computing, and robotics allows for innovative applications. This article guides you through creating a full-stack application integrating a React frontend with a Python backend, running on a Raspberry Pi, to control a robotic arm capable of drawing AutoCAD-style 2D graphics.
Table of Contents
1. Hardware & Software Setup
1.1 Hardware Requirements
1.2 Software Requirements
2. Python Backend to Control Robotic Arm
2.1 Installing Required Libraries
sudo apt-get update
sudo apt-get install python3-flask
pip install flask flask-cors RPi.GPIO
2.2 Sample Code: Servo Control Logic
import RPi.GPIO as GPIO
import time

# Use BCM (Broadcom) pin numbering
GPIO.setmode(GPIO.BCM)

SERVO_X = 17  # GPIO pin driving the X-axis servo
SERVO_Y = 18  # GPIO pin driving the Y-axis servo
GPIO.setup(SERVO_X, GPIO.OUT)
GPIO.setup(SERVO_Y, GPIO.OUT)

# 50 Hz PWM is the standard frequency for hobby servos
pwm_x = GPIO.PWM(SERVO_X, 50)
pwm_y = GPIO.PWM(SERVO_Y, 50)
pwm_x.start(7.5)  # 7.5% duty cycle is roughly the 90-degree centre position
pwm_y.start(7.5)

def move_servo(pwm, angle):
    # Map 0-180 degrees linearly onto a 2-12% duty cycle
    duty = 2 + (angle / 18)
    pwm.ChangeDutyCycle(duty)
    time.sleep(0.5)  # give the servo time to reach the target position
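The angle-to-duty-cycle mapping can be pulled out into a pure function so it can be tested off-device, without GPIO hardware. The helper name below is an assumption for illustration; the formula is the same one used in move_servo:

```python
def angle_to_duty(angle):
    # Same linear mapping as move_servo: 0 deg -> 2.0% duty, 180 deg -> 12.0% duty
    return 2 + (angle / 18)

# The centre position (90 deg) lands on the 7.5% duty cycle used at start-up
print(angle_to_duty(90))  # 7.0 plus the base offset check below
```

Note that angle_to_duty(90) returns 7.0, slightly below the 7.5% start-up value; exact pulse widths vary between servo models, so calibrate against your own hardware.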
2.3 Flask API to Receive Drawing Commands
from flask import Flask, request
from flask_cors import CORS

app = Flask(__name__)
CORS(app)  # allow the React dev server (a different origin) to call this API

@app.route('/api/draw', methods=['POST'])
def draw():
    data = request.get_json()
    # Replay each coordinate pair as a pair of servo angles
    for coord in data['coordinates']:
        x_angle = coord['x']
        y_angle = coord['y']
        move_servo(pwm_x, x_angle)
        move_servo(pwm_y, y_angle)
    return {"status": "Drawing executed"}, 200

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
3. Building the React Frontend
3.1 Initialize the React App
npx create-react-app robotic-arm-ui
cd robotic-arm-ui
npm install axios
3.2 Canvas for Drawing
import React, { useRef, useState } from 'react';
import axios from 'axios';

function Canvas() {
  const canvasRef = useRef(null);
  const [drawing, setDrawing] = useState(false);
  const [coordinates, setCoordinates] = useState([]);

  const handleMouseDown = (e) => {
    // Start a new stroke at the cursor, otherwise the first lineTo
    // would draw a line from the canvas origin (0, 0)
    const canvas = canvasRef.current;
    const ctx = canvas.getContext('2d');
    const rect = canvas.getBoundingClientRect();
    ctx.beginPath();
    ctx.moveTo(e.clientX - rect.left, e.clientY - rect.top);
    setDrawing(true);
  };

  const handleMouseUp = () => setDrawing(false);

  const handleMouseMove = (e) => {
    if (!drawing) return;
    const canvas = canvasRef.current;
    const ctx = canvas.getContext('2d');
    const rect = canvas.getBoundingClientRect();
    const x = e.clientX - rect.left;
    const y = e.clientY - rect.top;
    ctx.lineTo(x, y);
    ctx.stroke();
    // Functional update avoids stale state during rapid mouse moves;
    // dividing by 5 scales the 400 px canvas down to a 0-80 range
    setCoordinates((prev) => [...prev, { x: x / 5, y: y / 5 }]);
  };

  const sendDrawing = async () => {
    await axios.post('http://<raspberry-pi-ip>:5000/api/draw', {
      coordinates,
    });
  };

  return (
    <div>
      <canvas
        ref={canvasRef}
        width={400}
        height={400}
        onMouseDown={handleMouseDown}
        onMouseUp={handleMouseUp}
        onMouseMove={handleMouseMove}
        style={{ border: '1px solid black' }}
      />
      <button onClick={sendDrawing}>Send Drawing</button>
    </div>
  );
}

export default Canvas;
4. Establishing React-to-Backend Communication
5. Testing and Deployment
5.1 Test Steps:
5.2 Deployment Tips:
6. System Architecture Flowchart
graph TD
A[User draws on Canvas] --> B[React stores coordinates]
B --> C[POST to Flask API]
C --> D[Python Backend receives data]
D --> E[Coordinate Translation to Servo Angles]
E --> F[Raspberry Pi GPIO Controls Arm]
F --> G[Robotic Arm Draws on Surface]
7. Enhancements and Conclusion
Possible add-ons are explored in the Advanced Features section below.
Conclusion
This project demonstrates a practical example of integrating a React frontend, a Python/Flask backend, and Raspberry Pi GPIO control. Together, they create a seamless bridge from virtual drawing to physical output using a robotic arm, opening avenues for educational robotics, rapid prototyping, and automated plotting. With this foundation, users can expand to 3D drawing, CNC-style engraving, and beyond.
8. Advanced Features
8.1 Inverse Kinematics (IK)
Inverse Kinematics allows precise control of the robotic arm by computing joint angles from end-effector positions.
Python Example:
import math

def inverse_kinematics(x, y, L1, L2):
    # Two-link planar arm: L1 and L2 are link lengths, (x, y) is the target point
    cos_angle2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
    # acos raises ValueError if the target is outside the reachable workspace
    angle2 = math.acos(cos_angle2)
    angle1 = math.atan2(y, x) - math.atan2(L2 * math.sin(angle2),
                                           L1 + L2 * math.cos(angle2))
    return math.degrees(angle1), math.degrees(angle2)
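One way to sanity-check the IK routine off-device is a forward-kinematics round trip: compute joint angles for a target, feed them back through the forward equations, and confirm the original point comes out. The link lengths below are arbitrary example values, not measurements from a specific arm:

```python
import math

def inverse_kinematics(x, y, L1, L2):
    # Same two-link IK as above (elbow-up solution)
    cos_angle2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
    angle2 = math.acos(cos_angle2)
    angle1 = math.atan2(y, x) - math.atan2(L2 * math.sin(angle2),
                                           L1 + L2 * math.cos(angle2))
    return math.degrees(angle1), math.degrees(angle2)

def forward_kinematics(a1_deg, a2_deg, L1, L2):
    # Standard two-link forward kinematics: a2 is relative to the first link
    a1, a2 = math.radians(a1_deg), math.radians(a2_deg)
    x = L1 * math.cos(a1) + L2 * math.cos(a1 + a2)
    y = L1 * math.sin(a1) + L2 * math.sin(a1 + a2)
    return x, y

# Round trip: target (8, 5) with 6 cm links should reproduce itself
a1, a2 = inverse_kinematics(8.0, 5.0, 6.0, 6.0)
x, y = forward_kinematics(a1, a2, 6.0, 6.0)
```

If the recovered (x, y) does not match the target to within floating-point tolerance, the angle conventions of the two functions disagree.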
8.2 SVG to Servo Translation
You can use svgpathtools or svg.path libraries to extract points from SVG paths:
pip install svgpathtools
from svgpathtools import svg2paths

paths, _ = svg2paths('drawing.svg')
for path in paths:
    for segment in path:
        # svgpathtools represents points as complex numbers: real = x, imag = y
        print(segment.start.real, segment.start.imag)
Then map SVG coordinates to robotic arm angles using inverse kinematics.
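Before the IK step, the extracted points usually need rescaling from the SVG's coordinate space into the arm's physical drawing area. A minimal sketch of that mapping, where the function name and the example dimensions are assumptions for illustration:

```python
def svg_to_workspace(points, svg_w, svg_h, work_w, work_h):
    # Linearly rescale SVG-space (x, y) pairs into the arm's drawing area
    return [(x * work_w / svg_w, y * work_h / svg_h) for x, y in points]

# e.g. a 200x100 SVG viewBox mapped onto a 20x10 cm drawing surface
scaled = svg_to_workspace([(100.0, 50.0)], 200, 100, 20, 10)
```

Each scaled point can then be passed to inverse_kinematics to obtain the joint angles.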
8.3 Smoothing Algorithms for Curves
Apply Chaikin’s algorithm or a moving average filter:
def smooth_path(points):
    # Three-point moving average; the two endpoints are dropped
    smoothed = []
    for i in range(1, len(points) - 1):
        avg_x = (points[i-1][0] + points[i][0] + points[i+1][0]) / 3
        avg_y = (points[i-1][1] + points[i][1] + points[i+1][1]) / 3
        smoothed.append((avg_x, avg_y))
    return smoothed
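The code above implements the moving-average option; Chaikin's corner-cutting algorithm, the other technique mentioned, can be sketched in a single pass like this (each input segment is replaced by points at 25% and 75% along its length):

```python
def chaikin(points):
    # One pass of Chaikin corner cutting over a list of (x, y) tuples;
    # repeated passes converge toward a smooth quadratic B-spline curve
    result = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        result.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
        result.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
    return result
```

Running two or three passes before sending coordinates to the arm rounds off sharp corners produced by jittery mouse input.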
8.4 Real-time Feedback & Error Correction
Use potentiometers or rotary encoders to track joint angles, and compare them to intended angles:
tolerance = 2.0  # acceptable error in degrees; tune for your servos

def check_feedback(expected_angle, actual_angle):
    error = expected_angle - actual_angle
    if abs(error) > tolerance:
        adjust_motor(error)  # adjust_motor: your motor-correction routine
You can use I2C/ADC to read analog feedback and log discrepancies in real time.
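The correction loop can be prototyped without hardware by simulating the joint. In the sketch below the proportional gain, tolerance, and step limit are all assumed example values, and the in-place update stands in for the adjust_motor call:

```python
TOLERANCE = 0.5  # degrees; assumed value for the simulation
GAIN = 0.5       # proportional gain; assumed value

def correct(expected, actual, max_steps=50):
    # Repeatedly nudge the simulated joint toward the expected angle,
    # stopping once the error falls within tolerance
    for _ in range(max_steps):
        error = expected - actual
        if abs(error) <= TOLERANCE:
            break
        actual += GAIN * error  # stands in for adjust_motor(error)
    return actual
```

With a gain of 0.5 the error halves each iteration, so a 90-degree initial error converges inside the tolerance band in well under 50 steps; on real hardware the encoder reading replaces the simulated state.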
Diagram: Hardware Connection Overview
Canvas (React) ---> Flask Server ---> Python Script
|--> GPIO Pins ---> Servo Motors ---> Robotic Arm
|<-- Feedback Pins <--- Rotary Encoders
🔗 GitHub Repository (Demo)
You can access the complete source code here: 👉
Stay tuned for more real-world AI-powered hardware integration projects!
This lesson series is compiled and taught by experienced software engineer L. P. Harisha Lakshan Warnakulasuriya.
My Personal Website: https://www.harishalakshanwarnakulasuriya.com
My Portfolio Website: https://main.harishacrypto.xyz
My Newsletter Series: https://newsletter.harishacrypto.xyz
My Email Address: uniconrprofessionalbay@gmail.com
My GitHub Portfolio: Sponsor @harishalakshan on GitHub Sponsors