Full Stack Hardware Development in an Afternoon

Building an AI-assisted biosensor control system — from firmware to REST API — on a walk through the neighborhood.


It’s three o’clock on a Friday afternoon, a lovely day for a walk in Berkeley. I have my earbuds in, and I’m dictating into an app I built for exactly this purpose — it feeds into a transcription backend running on a server in my closet that cost $200 on Amazon. From this transcription, AI will generate a software specification, a tutorial, and this blog post.

By the time I sit down at my desk this evening, I’ll have a working embedded control system: microcontroller firmware, a Python application, a touchscreen UI, a REST API, Docker deployment, and a CI/CD pipeline. The kind of thing that would have taken a team of engineers a week — maybe a month, if they were learning some of the technologies for the first time.

I’m about to build the full stack in an afternoon.


The Backstory

I’ve been doing embedded system development for over twenty years, since I was a kid programming microcontrollers and browsing the aisles of Radio Shack, flipping through Forrest Mims cookbooks. I studied computer and electrical engineering, then did a PhD in biomedical engineering. What’s always fascinated me about this work is the layers of abstraction we get to traverse — from the physics of semiconductors to object-oriented programming to sensors touching living tissue.

At Mosaic Design Labs, the company I founded after years in industry, we build biomedical instruments. Devices that combine microfluidics, electrochemistry, embedded systems, and biology. Our “full stack” starts at the physics of a sensor and goes all the way up through firmware, embedded Linux, REST APIs, touchscreen UIs, Docker containers, and CI/CD pipelines.

For decades, this stack has been too broad for any one person to hold. As technology expanded, engineers specialized into narrower and narrower slices — the firmware person, the web person, the DevOps person, the UI person. There simply wasn’t enough time or cognitive bandwidth to be fluent across all of it.

AI changes that.


What We’re Building

We’re developing a pathogen detection system — an instrument that takes an air sample, captures airborne pathogens, and identifies them using electrochemical DNA sensors on a microfluidic cartridge. The science is fascinating: aerosolized particles are collected and concentrated into a liquid sample, pathogens in the sample are lysed (broken open) using ultrasonic energy, and then their DNA is detected electrochemically on miniaturized sensors.

The instrument needs precise fluid control. Sample fluid, reagents, and wash buffers are routed through the cartridge by peristaltic pumps — eight of them, each driving a different fluid channel. The pumps are controlled by stepper motors, which are controlled by a microcontroller, which is controlled by software running on an embedded Linux computer, which exposes a touchscreen interface and a network API.

That’s a lot of layers. And today, I’m going to build the software for all of them.


The Approach

Here’s what’s on the table:

  • Arduino firmware in C++ that drives eight stepper motors via a text-based serial protocol.
  • A Python hardware abstraction layer that translates “dispense 500 microliters” into the right number of motor steps at the right speed.
  • A controller module that orchestrates sequences, manages state, and coordinates everything.
  • A touchscreen UI built with HTML, CSS, and JavaScript, bridged to Python via Eel.
  • A REST API so the instrument can be controlled remotely or integrated into larger automation workflows.
  • A command-line interface for bench troubleshooting and automated testing.
  • Docker-based deployment so that setting up a new instrument is as simple as flashing an SD card.
  • A CI/CD pipeline that automatically versions the firmware and application code on every push.
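
To make the hardware abstraction layer concrete: its core job is translating a requested volume into motor steps. Here's a minimal sketch of that translation for one pump channel. The calibration constants (microliters per revolution, steps per revolution, microstepping factor) are illustrative placeholders, not values from the actual instrument.

```python
# Sketch of a volume-to-steps translation for one peristaltic pump channel.
# All calibration constants below are hypothetical placeholders.

from dataclasses import dataclass


@dataclass
class PumpChannel:
    ul_per_rev: float         # microliters dispensed per motor revolution (calibrated)
    steps_per_rev: int = 200  # full steps per revolution (typical 1.8-degree stepper)
    microsteps: int = 16      # driver microstepping factor

    def steps_for_volume(self, volume_ul: float) -> int:
        """Convert a requested volume to the nearest whole microstep count."""
        revs = volume_ul / self.ul_per_rev
        return round(revs * self.steps_per_rev * self.microsteps)

    def rate_to_step_hz(self, ul_per_sec: float) -> float:
        """Convert a flow rate to a microstep frequency for the motor driver."""
        return ul_per_sec / self.ul_per_rev * self.steps_per_rev * self.microsteps


pump = PumpChannel(ul_per_rev=50.0)
print(pump.steps_for_volume(500))  # 500 uL -> 10 revolutions -> 32000 microsteps
```

The point of isolating this math in one layer is that calibration lives in one place; everything above it speaks in microliters, and everything below it speaks in steps.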

I’m not going to write all of this by hand. I’m going to describe what I want, have AI generate a structured specification from my description, and then build each module by working through the spec with an AI coding assistant in Cursor. I’ll review every line of code, test on real hardware, and iterate.

The specification itself — a detailed, multi-page technical document — was generated from the transcription of this walk. That’s not a gimmick. It’s a genuine workflow. As the lead engineer, I can describe the system architecture while walking the dog, and my team can start building from a polished spec within the hour.


Why These Technology Choices

Every choice here optimizes for the same thing: speed to working prototype with a clear path to production.

Raspberry Pi because it’s ubiquitous, has a seamless path from prototype (standard Pi) to production (Compute Module on a custom PCB), and is extensively represented in AI training data. When I ask an AI to help me configure a service on a Pi, it has seen thousands of examples.

Arduino (specifically the Due) because we need real-time motor control that Linux can’t provide, and the two-processor pattern — microcontroller for timing-critical tasks, Linux for everything else — is the most common architecture in professional instrument design for good reason.
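
The glue between the two processors is the text-based serial protocol from the component list. As a sketch of the pattern, here is host-side framing and reply parsing for an invented line-oriented command grammar (`MOVE <motor> <steps> <speed>` with `OK`/`ERR` replies); it is not the instrument's actual protocol.

```python
# Host-side framing and reply parsing for a hypothetical line-oriented
# motor-control protocol. Commands are newline-terminated ASCII, e.g.
#   MOVE 3 32000 3200\n  -> move motor 3 by 32000 microsteps at 3200 steps/s
# The firmware would reply "OK" or "ERR <reason>".

def frame_move(motor: int, steps: int, speed_hz: int) -> bytes:
    if not 0 <= motor <= 7:
        raise ValueError("motor index must be 0-7")
    return f"MOVE {motor} {steps} {speed_hz}\n".encode("ascii")


def parse_reply(line: bytes) -> tuple[bool, str]:
    """Return (success, detail) for a firmware reply line."""
    text = line.decode("ascii").strip()
    if text == "OK":
        return True, ""
    if text.startswith("ERR"):
        return False, text[3:].strip()
    raise ValueError(f"unrecognized reply: {text!r}")


print(frame_move(3, 32000, 3200))  # b'MOVE 3 32000 3200\n'
```

On the Pi side this framing would sit behind a serial port library like pyserial; on the Arduino side, the main loop parses the same lines. Human-readable ASCII makes the protocol trivially debuggable with a terminal emulator.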

Python because it’s interpreted (no compile-flash cycle), has an extraordinary ecosystem (pyserial, Flask, Eel, PyYAML), and is the language AI assistants are most fluent in. The days when Python was “too slow for embedded” are over. A Raspberry Pi has more computing power than the server rooms of the 1990s. Use it.

HTML/CSS/JS for the UI because it’s free (no Qt licensing), debuggable from a remote browser, and every AI model can build it fluently.

Docker because reproducible deployment environments matter even more on embedded systems than on cloud servers. Especially when you’re setting up dozens of instruments in the field.
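
As a sketch of what that looks like in practice, a compose file along these lines pins the application version, exposes the microcontroller's serial port to the container, and mounts per-instrument calibration. The image name, device path, and port here are assumptions for illustration, not the project's actual configuration.

```yaml
# Hypothetical docker-compose.yml for one instrument. Image name, device
# path, and port are placeholders.
services:
  pump-controller:
    image: registry.example.com/pump-controller:1.4.2
    restart: unless-stopped
    devices:
      - /dev/ttyACM0:/dev/ttyACM0   # Arduino serial port
    ports:
      - "8080:8080"                 # REST API
    volumes:
      - ./config:/app/config:ro     # per-instrument calibration
```

Provisioning a new unit then reduces to flashing the SD card and pulling the pinned image.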

None of these choices are exotic. That’s the point. At Mosaic, we optimize for boring, “settled technology” that AI tools can help us work with effectively. The innovation is in the science — the biosensor, the microfluidics, the electrochemistry. The software infrastructure should be invisible.


What This Means

Here’s the thing I want to convey — the reason I’m writing this on a walk instead of at a whiteboard with a team of six:

The nature of engineering is changing. Fast.

Twenty years ago, building this system would have required a firmware engineer, a Python developer, a web designer, a DevOps specialist, and a project manager to coordinate them. Each person would have spent a week or more on their piece. The integration phase — where all the pieces come together and discover they don’t fit — would have taken at least another week.

Today, one engineer with a clear architectural vision and AI assistance can hold the entire stack in their head, build it cohesively, and have a working prototype by dinner. Not because the work is easier — the system is just as complex — but because the cognitive load of implementation has been redistributed. The engineer focuses on architecture, decisions, and judgment. The AI handles the syntax, the boilerplate, and the vast library of patterns it's been trained on. This particular example is deliberately simple, since the goal is to demonstrate speed and breadth, but the same approach scales to far more complex systems.

This is what we mean at Mosaic when we say we’re AI-native. We didn’t bolt AI onto an existing workflow. We designed the workflow around it. From how we write documents (Markdown in Git, published to Notion by an automated pipeline) to how we develop firmware (voice dictation to spec to working code in an afternoon), every layer of how we operate assumes AI is in the loop.


The Bigger Picture

The pump controller is one subsystem of a much larger instrument. There’s the aerosol collection system, the ultrasonic lysis module, the potentiostat for electrochemical sensing, temperature control, and eventually a data pipeline that feeds results to a cloud dashboard and a building automation pipeline. We’re building this in collaboration with university research partners, and the science is genuinely at the frontier of what’s possible.

But today’s exercise isn’t about the biosensor per se. It’s about demonstrating a way of working that applies to any embedded instrument project. If you’re building a lab automation system, a diagnostic device, an environmental sensor, a manufacturing quality control station — the architecture and the approach are the same.

Prototype with off-the-shelf hardware. Validate the software. Then design custom hardware around proven software. Use AI to move fast across the full stack. Keep the technology choices boring so you can focus your creativity where it matters — on the science, the product, the problem you’re actually trying to solve.


Try It Yourself

We’ve published a tutorial and a system specification alongside this post. The tutorial walks through the thinking behind the architecture — why these platforms, why this module structure, how to approach AI-assisted development. The spec provides the implementation details you’d hand to an AI coding assistant and say “build this.”

Both are available in our open documentation repository.

If you’re an engineer interested in this kind of work — biomedical devices, embedded systems, AI-native development — we’d love to hear from you. We’re always looking for curious, capable people who want to build things that matter.


Frankie Myers is the founder of Mosaic Design Labs, a biomedical product development studio in the San Francisco Bay Area.