In this week's newsletter, Aaron has a conversation with Matt Puchalski, founder and CEO of Bucket Robotics, a Y Combinator-backed startup developing vision systems for flexible manufacturing. Matt brings deep experience in autonomous vehicle integration from companies like Argo AI.
I have yet to see a customer that comes to Bucket Robotics that hasn't been sold a deeply expensive and overpowered camera when it's actually a software and inspection issue.
In this episode:
How STEP file simulation eliminates waiting for physical parts and turns 30 golden samples into 30,000 synthetic training images
Why software-first vision systems cut integration time from two weeks to two hours while maintaining hardware flexibility
How frozen iguanas falling on LiDAR sensors taught the importance of empathy in reliability engineering
Why writing technical documentation accelerates engineering problem-solving and knowledge transfer
Bonus Content:
Quality Control at Scale - Using Automated Inspection and Acceptable Defect Rates
S6E40 Matt Puchalski | Vision Inspection, Autonomous Vehicles, & Graduating Y Combinator
Matt Puchalski brings an unusual combination to manufacturing automation: self-driving car vision expertise meets deep frustration with manual quality inspection. His company, Bucket Robotics, addresses a persistent problem in injection molding and casting—the expensive, time-consuming process of training vision systems with physical "golden sample" parts. By starting with CAD files instead of manufactured parts, Bucket generates synthetic training data that eliminates the traditional bottleneck. The conversation reveals how Y Combinator taught him to show up at 6:30 AM with Krispy Kreme donuts, why Apple Watches caused autonomous vehicle shutdowns, and how sweeping floors builds trust that leads to better products. This episode challenges assumptions about what vision system deployment should cost and how long it should take.
>If YouTube isn’t your thing, check out this episode and all of our past episodes on Apple, Spotify, and all the rest.

“With the EZ Motion product, we’re able to do more automation and custom programming for our test needs. That’s helped us improve some tests 30%, some even 80% faster.”
Test Engineering Lab Manager, Array Technologies
As Array’s test volume and complexity increased, their team needed a flexible system that could adapt to every kind of testing - from new product development to field service validation.
With EZ Motion, the lab now runs faster, smarter, and around the clock:
⚙️ Automated motion + sensor control tailored to each test
⏱️ Up to 80% faster test cycles, freeing up engineering time
🌙 24/7 operation for higher throughput and efficiency
“By testing faster and more efficiently, we’re able to support more projects and customer needs for the company. EZ Motion allows us to accommodate those various needs.”
EZ Motion empowers engineering teams to scale testing without scaling effort: intuitive control, rapid automation, and real ROI.
>Watch how Array Technologies uses EZ Motion in their test lab.
>Visit the Pipeline Design & Engineering products page for more information on EZ Motion.

Simulation-Based Training Eliminates the Golden Sample Bottleneck
Traditional vision system deployment starts with a problem: you need perfect physical parts to train the system before you can inspect production parts. Bucket Robotics approaches this differently by beginning with the engineering reference that already exists.
The problem with the golden sample method is, one, it's predicated on waiting for those parts to be manufactured. And number two, 30 golden samples can very quickly turn into 300 or 3,000 as soon as you enter into differences in colorways or variability in how those are produced.
The cost and timeline implications extend beyond just acquiring parts. Someone has to wait for those golden samples, program the robot, and create the feedback loop. Each variation multiplies the effort.
By starting from the STEP file, you have your perfect golden sample - you can't get anything better than that engineering reference. And then from there we can generate not just 30 or 300 or 3,000 but 30,000 or 300,000 images of what good and bad look like.
This approach flips the traditional sequence. Instead of manufacturing parts to train inspection systems, the inspection system trains on the same CAD models used to design the manufacturing process. The synthetic data generation happens before physical production begins.
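To make the workflow concrete, here is a minimal sketch of synthetic training-image generation from a CAD reference, using the open-source trimesh and pyrender libraries. It assumes the STEP file has already been converted to a mesh format such as STL; the file name, the toy defect perturbation, and every parameter below are illustrative assumptions, not Bucket Robotics' actual pipeline.

```python
# Minimal sketch: generate "good" and "bad" training images from a CAD mesh.
# Assumes the STEP file was converted to a mesh (e.g., STL). File name,
# defect model, and all parameters are illustrative assumptions.
import numpy as np
import trimesh
import pyrender

mesh = trimesh.load("part.stl")          # hypothetical exported reference
mesh.apply_translation(-mesh.centroid)   # center the part at the origin
rng = np.random.default_rng(0)

def render_sample(part):
    """Render one image with a randomized pose and light (domain randomization)."""
    scene = pyrender.Scene()
    # Random orientation stands in for variability in part presentation.
    scene.add(pyrender.Mesh.from_trimesh(part),
              pose=trimesh.transformations.random_rotation_matrix())
    # Randomized intensity mimics changing shop-floor lighting.
    scene.add(pyrender.DirectionalLight(intensity=rng.uniform(1.0, 10.0)))
    cam_pose = np.eye(4)
    cam_pose[2, 3] = 2.0 * part.scale    # back the camera off along +z
    scene.add(pyrender.PerspectiveCamera(yfov=np.pi / 4.0), pose=cam_pose)
    renderer = pyrender.OffscreenRenderer(640, 480)
    color, _depth = renderer.render(scene)
    renderer.delete()
    return color

# "Good" class: renders of the pristine engineering reference.
good = [render_sample(mesh) for _ in range(30)]

# "Bad" class: a toy surface perturbation standing in for a molding
# defect such as sink or flash.
defective = mesh.copy()
defective.vertices += rng.normal(0.0, 0.0005 * mesh.scale,
                                 defective.vertices.shape)
bad = [render_sample(defective) for _ in range(30)]
```

Scaling from 30 renders per class to 30,000 is just a larger loop bound, which is the point: the marginal cost of another labeled image is compute, not another manufactured part.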
We are hardware agnostic. The thing that we have seen is there is always a chance that you need to change your lighting conditions, your shutter rate, or things like that. But we have yet to see a customer that comes to Bucket Robotics that hasn't been sold a deeply expensive and overpowered camera when it's actually a software and inspection issue.
The integration timeline reflects this software-first philosophy. Where traditional vision systems require days or weeks of on-site programming and adjustment, simulation-based training compresses the process by removing the dependency on physical parts and extensive hardware configuration.

Quality Control at Scale - Using Automated Inspection and Acceptable Defect Rates

LEGO produces 75 billion elements annually, each requiring verification that its dimensions hold a tolerance of 0.002 millimeters. The company cannot inspect every dimension on every part. The math doesn't work. At typical injection molding cycle times, a single molding machine produces thousands of parts per hour. Measuring every critical dimension on each part would require inspection time exceeding production time by an order of magnitude. The elements would stack up faster than any inspection system could process them. Yet consumers expect bricks manufactured in different facilities, in different years, to connect with identical clutch force. The challenge is confirming precision without creating an inspection bottleneck that defeats the purpose of automated manufacturing.
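A quick back-of-envelope check makes the mismatch concrete. Every number below is an illustrative assumption, not a LEGO figure:

```python
# Back-of-envelope: 100% dimensional inspection vs. production rate.
# All numbers are illustrative assumptions, not LEGO's actual figures.
cycle_time_s = 10         # seconds per injection-molding shot
cavities = 8              # parts produced per shot
dims_per_part = 10        # critical dimensions to verify
s_per_dimension = 5       # seconds per touch-probe measurement

production_s = cycle_time_s / cavities          # 1.25 s of production per part
inspection_s = dims_per_part * s_per_dimension  # 50 s of inspection per part

print(f"production: {production_s:.2f} s/part")
print(f"inspection: {inspection_s} s/part "
      f"({inspection_s / production_s:.0f}x production time)")
```

Even with generous assumptions, full inspection runs tens of times slower than production, which is why the article pairs automated inspection with acceptable defect rates instead of measuring everything.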
For more, visit the full article on The Wave.
