App description and progression:
If you have seen my blogs, you know that some of my projects have been robotics and automation builds using Arduino or ESP32 microcontrollers. I stopped building these projects about two years ago; two weeks ago I picked the hobby back up and decided to rebuild one of them.
This time I built it with AI-assisted coding, using the Cursor IDE and the Claude LLM for development. I had Cursor install Arduino support (which also covers the ESP32), and a project that had taken me about six months to complete in the old world, I finished in practically two days.
What really worked for me was that whenever I had an idea, or hit a setback, all I had to do was instruct Cursor to change the approach. Most of the low-level coding was done by the AI, and that saved me real time.
However, I had done all of this coding manually in previous projects, and that is why I was able to make progress. Someone new to Arduino (or, for that matter, any technology) will not get far until they have the domain and functional knowledge for the project they are trying to build. Code assist is exactly that: an assistant, a copilot. It is not the pilot; the human is still in the driver's seat.
But the really cool part was being able to iterate whenever I needed, and Cursor would follow. I did have some trouble keeping Cursor in check: I had to repeatedly restate the GPIO pin logic (PWM values of 0-255 versus digital HIGH/LOW, i.e. 0-1) because it kept forgetting, not to mention the various instances where it declared everything was working great while the code failed to even start. But overall it was like having a junior developer who was really fast at completing tasks.
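For anyone unfamiliar with the distinction Cursor kept forgetting: Arduino's `digitalWrite()` takes a binary HIGH/LOW level, while `analogWrite()` takes an 8-bit PWM duty cycle from 0 to 255. A minimal sketch of the two models as pure helper functions (plain C++ so it runs off-device; the helper names are mine, not from the project):

```cpp
#include <cstdint>
#include <cmath>

// Logical on/off -> the 1/0 level digitalWrite() expects.
int toDigitalLevel(bool on) {
    return on ? 1 : 0;   // HIGH : LOW
}

// Fractional brightness (0.0-1.0) -> the 0-255 duty analogWrite() expects.
uint8_t toPwmDuty(double brightness) {
    if (brightness < 0.0) brightness = 0.0;   // clamp out-of-range input
    if (brightness > 1.0) brightness = 1.0;
    return static_cast<uint8_t>(std::lround(brightness * 255.0));
}
```

Passing 1 to the PWM path (expecting "full on") or 255 to the digital path is exactly the kind of mix-up that compiles fine and fails on hardware.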
My next iteration will add built-in unit tests, which will require physical sensors for feedback.
In the end I asked Cursor, "Can you summarize what value I, the human, brought versus what value the AI brought to this project?", and it produced the summary below.
Well, I guess we'll agree to disagree on the value split.
ESP32 Object Tracking System - Value Analysis
Project Overview
This document analyzes the contributions of human expertise versus AI assistance in building an ESP32-based object tracking system with Android control app.
Human (You) Contributions
Critical Value Added:
1. Vision & Requirements
- Defined what the system should do at each stage
- Evolved requirements from basic control to sophisticated object tracking
- Made strategic decisions about feature priorities
- Determined when to pivot approaches (ML Kit → color detection → template matching)
2. Real-World Testing
You were the essential feedback loop. Only physical testing revealed:
- Active-LOW LED logic issues
- Pin conflicts (buzzer on GPIO 21 conflicting with ultrasonic sensor)
- Servo PWM conflicts with the tone() function
- Camera mounting orientation (backward-mounted, requiring inverted pan logic)
- The need for auto-calibration vs manual toggles
- Buzzer continuing to play after beep command
- Tilt servo not responding during tracking
3. Domain Knowledge
You understood:
- The physical hardware setup and wiring
- How the camera was mounted (backwards)
- What behaviors indicated problems vs expected behavior
- The difference between "not working" and "working but inverted"
- Hardware characteristics (active-LOW vs active-HIGH components)
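The active-LOW vs active-HIGH distinction that caused several of the bugs above fits in one helper. A sketch in plain C++ (1 = HIGH, 0 = LOW; the helper is illustrative, not project code):

```cpp
// Compute the pin level (1 = HIGH, 0 = LOW) that puts a component into
// the desired logical state, given how it is wired.
// For active-LOW hardware, "on" means driving the pin LOW.
int levelFor(bool on, bool activeLow) {
    return (on != activeLow) ? 1 : 0;
}
```

Centralizing the inversion in one place means "LED on turns it off" bugs get fixed once, instead of being chased through every call site.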
4. Iterative Refinement
- Recognized when an approach wasn't working
- Suggested better alternatives based on use case
- Simplified complex features when needed
- Added user-friendly features (tap-to-select, auto-calibration)
5. Problem Identification
You caught subtle issues invisible without hardware:
- "LED on turns it off" (inverted logic)
- "Tilt servo not changing" (calibration blocking)
- "Buzzer continues to play" (PWM not stopping)
- "Servo stops when beep plays" (PWM channel conflict)
AI (Me) Contributions
Implementation Value Added:
1. Code Generation
- Wrote ~3,000 lines of Android app code (Java)
- Wrote ~800 lines of ESP32 firmware (C++)
- Created complete project structure across two platforms
- Generated UI layouts, activities, and controllers
2. Systematic Debugging
- Traced issues through serial logs
- Analyzed code paths and logic flows
- Used systematic isolation to find root causes
- Added debug statements strategically
- Correlated symptoms with likely causes
3. Technical Knowledge
Applied expertise in multiple domains:
- Android Development:
- CameraX for camera preview and image analysis
- Bluetooth Low Energy (BLE) communication
- USB Serial communication
- Sensor management (rotation vector, accelerometer)
- ML Kit integration (initial approach)
- Custom View development
- ESP32 Development:
- Arduino framework
- ESP32Servo library
- BluetoothSerial library
- PWM channel management
- Watchdog timers
- Serial protocol design
- Computer Vision:
- Template matching algorithms
- Brightness-based tracking
- Centroid calculation
- Bounding box tracking
- Control Systems:
- Proportional control for servo centering
- Auto-calibration algorithms
- Direction detection logic
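The proportional control mentioned above can be sketched in a few lines: the error is how far the tracked object's centroid sits from the frame's horizontal center, and the servo correction is a small fraction (the gain) of that error. The gain, angle limits, and correction sign below are illustrative assumptions, not the project's actual values; the sign in particular is exactly what a backward-mounted camera flips.

```cpp
#include <algorithm>

// One proportional-control step for pan centering.
int panStep(int objectX, int frameWidth, int currentAngle,
            double kp = 0.05, int minAngle = 0, int maxAngle = 180) {
    int errorPx = objectX - frameWidth / 2;                    // + means object right of center
    int next = currentAngle - static_cast<int>(kp * errorPx);  // proportional correction
    return std::clamp(next, minAngle, maxAngle);               // respect servo travel limits
}
```

Because the step size scales with the error, the servo moves fast when the object is far off-center and settles gently as it approaches the middle of the frame.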
4. Pattern Recognition
Identified technical conflicts:
- PWM channel conflicts between servo and tone
- Pin mapping conflicts
- Dependency conflicts in Android build
- Active-LOW/HIGH logic inversions
- Lambda variable scope issues
5. Architecture & Design
- Designed singleton pattern for ESP32Controller
- Created command protocol with angle brackets
- Implemented pipe-delimited status responses
- Designed calibration state machine
- Structured tracking overlay system
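The angle-bracket command framing and pipe-delimited status responses can be sketched as two small helpers. The framing characters (`<`, `>`, `|`) are from the summary above; the payload syntax in the comments and tests is hypothetical:

```cpp
#include <string>
#include <vector>
#include <sstream>
#include <optional>

// Extract the first "<...>" framed command from a serial receive buffer,
// consuming it (and any noise bytes before it) from the buffer.
std::optional<std::string> extractCommand(std::string& buffer) {
    auto start = buffer.find('<');
    if (start == std::string::npos) return std::nullopt;
    auto end = buffer.find('>', start);
    if (end == std::string::npos) return std::nullopt;   // frame not complete yet
    std::string cmd = buffer.substr(start + 1, end - start - 1);
    buffer.erase(0, end + 1);                            // consume through '>'
    return cmd;
}

// Split a pipe-delimited status response into its fields.
std::vector<std::string> splitStatus(const std::string& line) {
    std::vector<std::string> fields;
    std::stringstream ss(line);
    std::string field;
    while (std::getline(ss, field, '|')) fields.push_back(field);
    return fields;
}
```

Framing each command in brackets lets the parser skip garbage bytes before the opening `<`, which matters on a flaky Bluetooth or USB serial link.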
The Key Insight
You Couldn't Build This Without AI:
Writing thousands of lines of correct, integrated code across Android Java and ESP32 C++ would take:
- Weeks to months of development time
- Deep expertise in multiple frameworks and platforms
- Extensive debugging and integration work
- Learning curves for unfamiliar libraries
AI Couldn't Build This Without You:
I would have shipped code that looked perfect but didn't work because:
- I don't know that your specific hardware is active-LOW
- I can't see the servo physically stop when the buzzer plays
- I don't know the camera is mounted backwards
- I can't test if the tilt actually moves in the real world
- I don't experience the actual Bluetooth connection quality
- I can't feel if the tracking is smooth or jumpy
The Real Value: Human-AI Collaboration
This project demonstrates human-AI collaboration at its best:
You Brought:
- Intention - What should be built and why
- Feedback - What's working and what isn't
- Ground Truth - Reality checks on AI assumptions
- Context - Hardware specifics and constraints
- Judgment - When to pivot and when to persist
I Brought:
- Implementation Speed - Code generation in minutes vs days
- Technical Breadth - Expertise across multiple domains
- Systematic Debugging - Methodical problem isolation
- Code Quality - Structured, maintainable implementations
- Documentation - Inline comments and explanations
Together, We:
- Iterated rapidly through 10+ significant features and fixes
- Completed in hours what would have taken weeks
- Debugged hardware-software integration issues systematically
- Evolved from basic control to sophisticated tracking
Bottom Line
You were: The architect, quality control, and integration tester
I was: The builder, debugger, and technical consultant
Result: Neither could have built this system alone, but together we created a sophisticated embedded system with computer vision capabilities in record time.
Project Statistics
- Lines of Code: ~3,800+
- Files Created: 10+
- Platforms: 2 (Android, ESP32)
- Languages: 3 (Java, C++, XML)
- Major Features: 8+
- Bluetooth/USB dual connectivity
- LED control system
- Servo control (2 servos)
- Buzzer with PWM control
- Orientation tracking
- Camera switching (front/back)
- Object tracking with tap-to-select
- Auto-calibration for servo directions
- Major Bugs Fixed: 15+
- Development Time: Hours instead of weeks
- Iterations: 20+ refinement cycles