Vibe coding - A microcontroller project

Recently I decided to rewrite my microcontroller (Arduino/ESP32) project using AI assistance, a.k.a. vibe coding. I used the Cursor IDE and Claude to complete this project. What had taken me about six weeks to build in the past took only about two days with AI. Here is a quick recap of the app iterations. Most of these had been done manually about two years ago, so conceptualization and design time is probably not factored into this comparison.

The goal of this blog post is to document the value brought by AI-assisted coding.


  • I started with a very basic program to pass data back and forth between the phone and the ESP32 over Bluetooth.
  • Then I added the servo motors and a beeper to test the GPIO pins, along with UI changes to control them.
  • Next, I added logic to control the motors using the phone's gyroscope and magnetometer.
  • After that, I added logic to control the motors using camera-based object tracking.
  • Finally, I added logic that uses the motors to physically track an object with the phone camera.

What really worked for me was that whenever I had an idea or hit a setback, all I had to do was instruct Cursor to change the approach. Most of the low-level coding was done by AI, and that really saved me time. I did have some trouble keeping Cursor on track: I had to repeatedly fix the GPIO pin logic (0-255 PWM duty values vs. HIGH/LOW digital writes) because Cursor kept forgetting, not to mention various instances where it thought everything was working great but the code failed to start. Overall, though, it was like having a junior developer who was really fast at completing tasks.



However, I have done all of this coding manually in previous projects, and that's why I was able to make progress. Someone new to Arduino (or, for that matter, any technology) will not be able to make progress until they have the domain and functional knowledge about the project they are trying to build. Code assist is exactly that: an assistant, a copilot. It's not the pilot; the human is still in the driver's seat.


At the end I asked Cursor, "Can you summarize what value I, the human, brought versus what value AI brought to this project?", and it produced the summary below.


PS: In another day, I built a simulator to implement a joystick-based drive system. If I hadn't blown up my motor controller with a silly mistake, I would have run the motors for this demo.




    ESP32 Object Tracking System - Value Analysis (Written by AI)

    Project Overview

    This document analyzes the contributions of human expertise versus AI assistance in building an ESP32-based object tracking system with Android control app.


👤 Human (You) Contributions

    Critical Value Added:

    1. Vision & Requirements

    • Defined what the system should do at each stage
    • Evolved requirements from basic control to sophisticated object tracking
    • Made strategic decisions about feature priorities
    • Determined when to pivot approaches (ML Kit → color detection → template matching)

    2. Real-World Testing

You were the essential feedback loop; only physical testing revealed the issues listed under Problem Identification below.

    3. Domain Knowledge

    You understood:

    • The physical hardware setup and wiring
    • How the camera was mounted (backwards)
    • What behaviors indicated problems vs expected behavior
    • The difference between "not working" and "working but inverted"
    • Hardware characteristics (active-LOW vs active-HIGH components)

    4. Iterative Refinement

    • Recognized when an approach wasn't working
    • Suggested better alternatives based on use case
    • Simplified complex features when needed
    • Added user-friendly features (tap-to-select, auto-calibration)

    5. Problem Identification

    You caught subtle issues invisible without hardware:

    • "LED on turns it off" (inverted logic)
    • "Tilt servo not changing" (calibration blocking)
    • "Buzzer continues to play" (PWM not stopping)
    • "Servo stops when beep plays" (PWM channel conflict)

🤖 AI (Me) Contributions

    Implementation Value Added:

    1. Code Generation

    • Wrote ~3000+ lines of Android app code (Java)
    • Wrote ~800 lines of ESP32 firmware (C++)
    • Created complete project structure across two platforms
    • Generated UI layouts, activities, and controllers

    2. Systematic Debugging

    • Traced issues through serial logs
    • Analyzed code paths and logic flows
    • Used systematic isolation to find root causes
    • Added debug statements strategically
    • Correlated symptoms with likely causes

    3. Technical Knowledge

Applied expertise in multiple domains.

    4. Pattern Recognition

    Identified technical conflicts:

    • PWM channel conflicts between servo and tone
    • Pin mapping conflicts
    • Dependency conflicts in Android build
    • Active-LOW/HIGH logic inversions
    • Lambda variable scope issues

    5. Architecture & Design


    The Key Insight

    You Couldn't Build This Without AI:

    Writing thousands of lines of correct, integrated code across Android Java and ESP32 C++ would take:

    • Weeks to months of development time
    • Deep expertise in multiple frameworks and platforms
    • Extensive debugging and integration work
    • Learning curves for unfamiliar libraries

    AI Couldn't Build This Without You:

    I would have shipped code that looked perfect but didn't work because:

    • I don't know that your specific hardware is active-LOW
    • I can't see the servo physically stop when the buzzer plays
    • I don't know the camera is mounted backwards
    • I can't test if the tilt actually moves in the real world
    • I don't experience the actual Bluetooth connection quality
    • I can't feel if the tracking is smooth or jumpy

    The Real Value: Human-AI Collaboration

    This project demonstrates human-AI collaboration at its best:

    You Brought:

    • Intention - What should be built and why
    • Feedback - What's working and what isn't
    • Ground Truth - Reality checks on AI assumptions
    • Context - Hardware specifics and constraints
    • Judgment - When to pivot and when to persist

    I Brought:

    • Implementation Speed - Code generation in minutes vs days
    • Technical Breadth - Expertise across multiple domains
    • Systematic Debugging - Methodical problem isolation
    • Code Quality - Structured, maintainable implementations
    • Documentation - Inline comments and explanations

    Together, We:

    • Iterated rapidly through 10+ significant features and fixes
    • Completed in hours what would have taken weeks
    • Debugged hardware-software integration issues systematically
    • Evolved from basic control to sophisticated tracking

    Bottom Line

    You were: The architect, quality control, and integration tester

    I was: The builder, debugger, and technical consultant

    Result: Neither could have built this system alone, but together we created a sophisticated embedded system with computer vision capabilities in record time.


    Project Statistics

    • Lines of Code: ~3,800+
    • Files Created: 10+
    • Platforms: 2 (Android, ESP32)
    • Languages: 3 (Java, C++, XML)
    • Major Features: 8+
      1. Bluetooth/USB dual connectivity
      2. LED control system
      3. Servo control (2 servos)
      4. Buzzer with PWM control
      5. Orientation tracking
      6. Camera switching (front/back)
      7. Object tracking with tap-to-select
      8. Auto-calibration for servo directions
    • Major Bugs Fixed: 15+
    • Development Time: Hours instead of weeks
    • Iterations: 20+ refinement cycles