Galton’s Goalie

Upgrading Mark Rober’s CrunchLabs Plinko Machine to Prove the Math in Style!

The Spark

My son and I have been doing Mark Rober’s CrunchLabs Build Boxes, and they’re an absolute blast. For the uninitiated: each month introduces an engineering concept with a ~20-minute video, and then you build something cool that demonstrates it hands-on. We just started Season 2, and the first box was all about the bell curve – demonstrated through “Galton’s Goalie,” basically a desktop Plinko machine.

Watching the balls bounce randomly through the pegs and settle into a pretty decent bell curve is mesmerizing. But somehow watching wasn’t proof enough, and I couldn’t help thinking: what if we could PROVE it’s actually creating a bell curve? With computer vision layered on top, we could track every ball and visualize the distribution in real time.
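The math behind that claim is easy to sanity-check in a few lines of NumPy before pointing a camera at anything. Each ball makes a series of independent left/right bounces, so its final bin follows a binomial distribution – which is exactly what converges to a bell curve. A quick standalone simulation (not part of the tracker itself):

```python
import numpy as np

# Simulate a Galton board: each ball makes `rows` independent
# left/right bounces, so its final bin follows a binomial
# distribution -- which approaches a bell curve as rows grow.
def simulate_galton(n_balls=10_000, rows=12, seed=0):
    rng = np.random.default_rng(seed)
    # Each bounce shifts the ball right (1) or left (0) with p = 0.5.
    bounces = rng.integers(0, 2, size=(n_balls, rows))
    return bounces.sum(axis=1)  # final bin index, 0..rows

bins = simulate_galton()
counts = np.bincount(bins, minlength=13)
# The counts pile up around the middle bin (rows/2 = 6),
# mirroring the shape the physical board produces.
```

Run it and the histogram peaks in the middle and tapers symmetrically – the same shape we wanted to see emerge live on the desk.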

So I thought: what a fun vibe coding project! Let’s see if we can do that. And spoiler alert: We did.

Real-time ball tracking and histogram visualization

The Journey

This was my “OMG I just did months of work in a couple of hours” moment.

Back in the day, I built Still Life in our Symphony engine – a long-exposure visualization system. We spent literally months building the signal processing and computer vision foundations. Detecting motion, tracking objects, creating trails – it was serious engineering work.

Fast forward to 2026: I described the problem to Claude, we talked back and forth about the approach, and it baked the first working version in a matter of hours. Real-time ball detection. Histogram updates. Motion trails. The whole thing.

But the journey didn’t end there! I showed it to a colleague who asked if we could improve the UI. One conversation with Claude later, we had a complete PyQt5 interface with multiple visualization modes, calibration controls, the works.

The updated PyQt5 interface with visualization controls

What I Learned

The coolest technical takeaway? The “why not both?!” moment.

With Still Life, adding a visualization mode (long exposure, particle effects, etc.) was a significant decision. We’re talking days of careful coding, testing, integration. You had to really commit.

Now, when Claude suggested a couple of different visualization options, I could say “why not both?!” and have both cooked by that afternoon. Long-exposure mode? Done. Motion trails with color cycling? Done. Per-bucket sensitivity tuning? Why not!
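To make that concrete: two of those modes boil down to tiny per-pixel accumulators. These are illustrative sketches, not the project’s actual code:

```python
import numpy as np

# Two visualization modes as frame accumulators (illustrative).
def long_exposure(accum, frame):
    # Brightest-pixel blend: a moving ball leaves a permanent streak.
    return np.maximum(accum, frame)

def motion_trail(accum, frame, decay=0.9):
    # Exponential fade: older positions dim, the newest stays bright.
    return np.maximum(accum * decay, frame.astype(np.float32))
```

The entire difference between the two modes is one multiplication – keep the brightest pixel forever versus let it fade – which is part of why “why not both?!” is now a perfectly reasonable answer.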

What used to take days now takes minutes. The limiting factor isn’t implementation anymore, it’s the idea itself.

Would I Recommend?

Absolutely. I do wonder, though, how important it was that I’d done it before. I knew what questions to ask Claude. I had intuition for which approaches would work when it gave me options. The computer vision vocabulary was already in my head.

But regardless – you don’t need months of CV experience to build something like this anymore. You need curiosity, willingness to iterate, and the ability to recognize when something’s working (or not).

This was the project that inspired me to start this whole “Side Quests” series. If I can knock out a real-time computer vision app in an afternoon, what else becomes possible?

Try It Yourself

The full code is on GitHub: GaltonGoalieViz

Watch an earlier version in action on YouTube, or check out the sample videos showing the system tracking balls in real-time:

Sample visualization in action

You’ll need:

  • Python 3.7+
  • A webcam
  • Something that drops balls through obstacles (Galton board, Plinko, whatever)
  • About 5 minutes to get it running

The README has full setup instructions. Hit ‘C’ to calibrate, ‘R’ to reset, ‘T’ to cycle through visualization modes. Watch probability theory come to life on your desk.
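If you’re curious how bindings like those get wired up, a dispatch like this is the typical shape inside an OpenCV event loop (illustrative only – the repo’s actual handling may differ):

```python
# Sketch of a keyboard dispatch for the controls above. The key code
# would come from cv2.waitKey(); mode names are illustrative.
MODES = ["histogram", "long_exposure", "trails"]

def handle_key(key, state):
    k = chr(key & 0xFF).lower()
    if k == "c":
        state["calibrating"] = True       # 'C': enter calibration
    elif k == "r":
        state["counts"] = {}              # 'R': reset the histogram
    elif k == "t":                        # 'T': cycle visualization modes
        state["mode"] = (state["mode"] + 1) % len(MODES)
    return state
```

Masking with `0xFF` is the usual trick because `cv2.waitKey()` can return extra platform-dependent bits above the low byte.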


Built with: Python, OpenCV, NumPy, PyQt5, and a lot of “what if we tried…”