SANTA CLARA, Calif.—The three keys to successful product development are: keep it simple, take time, and test, test, test.
That was my take-away from Mike Aldred’s keynote address Tuesday (May 12) here at the Embedded Vision Summit. Aldred, electronics lead for Dyson, delivered a master class in product-development processes for a rapt audience of at least 500. Exhibit A was the Dyson 360 Eye robotic vacuum cleaner, due out in Japan later this year and in Europe and the U.S. next year, which has been 10 years in the making.
It has been a development in which 30 percent of the effort went into the concept phase, 10 percent into implementation, and 60 percent into testing and optimization.
“On the 360 Eye, we tested for eight years,” Aldred said. “We probably have done 100,000 runs to get to a system. We’re still testing even as we’re in production.” Aldred admitted that had he guessed at the project’s outset, he would have flipped the effort for those three phases: 60 percent conceptualization, 10 percent implementation, and 30 percent testing and optimization.
Aldred, whose doctoral thesis was on visual methods for robot navigation, acknowledges that he works for a man—company founder James Dyson—who understands the value of patience, focus, and learning from mistakes.
Indeed, the 360 Eye had its origin in an earlier Dyson effort from 350 engineers and scientists, the DC06, the company’s first autonomous product. “It went nowhere,” according to Aldred.
But concepts and lessons from that effort were paid forward into subsequent product development.
Aldred emphasized that the company’s mission was not to build a robotic vacuum cleaner but a machine that could clean better than humans and free up human time and energy. Robotics is a way to achieve that, and a vision system (as opposed to lasers) is the right enabling technology because it provides a richness and quantity of information to get the job done.
But there were challenges. First and foremost was energy: The 360 Eye runs for about 45 minutes before it needs to return to its charging station. That meant the engineering team needed to figure out ways to optimize the machine’s navigation and pathways to conserve power where possible.
During the concept phase, engineers fixed on an imaging system mounted atop the vacuum that could see in panorama and up to an angle of 45 degrees (not much useful information can be gleaned from imaging the ceiling, Aldred noted). They chose this approach because a forward-facing camera on a robotic vacuum cleaner faces a fundamental problem when it runs into something: it loses its perspective because it can no longer see. With a panoramic capability, the system can recalibrate where it is based on what it sees to the sides and behind.
“The information content is so large, the chances of being blinded are massively reduced,” he said.
They next decided on a SLAM system (simultaneous localization and mapping). To oversimplify, the system looks for edges, straight lines, and corners, estimates their positions in 3D space, and then uses math and Kalman filtering to estimate the robot’s position. (Think of a sailor’s dead reckoning by stars, only in this project the stars are objects in the room against which the system figures out how to localize itself.)
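The predict-and-correct cycle behind that dead-reckoning analogy can be sketched as a toy one-dimensional Kalman filter: predict position from commanded motion, then pull the estimate toward what a landmark measurement implies. The noise values and measurements below are invented for illustration and have nothing to do with Dyson's actual system.

```python
# Minimal 1D Kalman-filter sketch of the localization step in SLAM:
# predict position from commanded motion (dead reckoning), then
# correct it with a noisy distance measurement to a known landmark.
# All numbers here are illustrative, not Dyson's.

def kalman_step(x, p, motion, z, q=0.1, r=0.5):
    """One predict/update cycle.
    x: position estimate, p: its variance
    motion: commanded displacement, z: measured distance from a landmark at 0
    q: motion noise variance, r: measurement noise variance
    """
    # Predict: move, and grow uncertainty by the motion noise.
    x, p = x + motion, p + q
    # Update: blend the prediction with the landmark measurement.
    k = p / (p + r)          # Kalman gain: trust measurement more when p >> r
    x = x + k * (z - x)      # pull the estimate toward the measurement
    p = (1 - k) * p          # uncertainty shrinks after each correction
    return x, p

x, p = 0.0, 1.0  # start at the origin with high uncertainty
for motion, z in [(1.0, 1.2), (1.0, 1.9), (1.0, 3.1)]:
    x, p = kalman_step(x, p, motion, z)
# After three steps the estimate tracks the true path (~3.0) and
# the variance p has fallen well below its starting value.
```

A real SLAM backend estimates a full 3D pose plus the landmark positions themselves, but the same predict/correct structure applies.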
So the 360 Eye boiled down to three core concepts: vision, panoramic view, and SLAM. Getting to that point in the conceptualization phase took four years.
“We didn’t hit on those three ideas right off. What I haven’t shown you is 3,000 ideas that went nowhere,” Aldred said. “It’s not a failure; it’s a lesson learned.”
During product development, they didn’t start with vision but with a virtual environment into which they fed formulas to simulate where features would be if they had a perfect vision system.
“Once we had confidence in the math backend, then real-world testing started and we got a machine and drove it around by remote control,” Aldred said. The team fed the captured images into the SLAM system that they were now confident worked, he said.
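That staged approach, synthetic observations first and cameras later, can be sketched as follows. The point is that a backend fed by a simulated "perfect vision system" should recover the true pose exactly, so any error is the backend's fault, not the camera's. The virtual room, landmarks, and brute-force grid-search localizer below are invented stand-ins, not Dyson's implementation.

```python
# Sketch of the "virtual environment first" idea: before any camera exists,
# feed the math backend perfect synthetic observations and check that it
# recovers the true pose. Landmarks and poses are invented for illustration.
import math

LANDMARKS = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]  # known map of a virtual room

def observe(pose):
    """A 'perfect vision system': exact ranges from pose to each landmark."""
    x, y = pose
    return [math.hypot(x - lx, y - ly) for lx, ly in LANDMARKS]

def localize(ranges):
    """Backend under test: brute-force search over a 5 cm grid for the pose
    whose predicted ranges best match the observations."""
    best, best_err = None, float("inf")
    for i in range(81):          # x from 0.0 to 4.0
        for j in range(61):      # y from 0.0 to 3.0
            cand = (i / 20, j / 20)
            err = sum((a - b) ** 2 for a, b in zip(observe(cand), ranges))
            if err < best_err:
                best, best_err = cand, err
    return best

true_pose = (2.0, 1.5)
estimate = localize(observe(true_pose))  # with perfect data, matches exactly
```

Once the backend passes in simulation, the synthetic `observe` function is swapped for real captured images, which is exactly the hand-off Aldred describes.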
Engineers and scientists at Dyson’s Malmesbury, U.K., headquarters were delighted with early results until Aldred took a prototype to his mother’s home. It failed.
“Our image data set was too small and we’d effectively tuned the system to a limited number of homes we’d tested,” all of those in Malmesbury. His mother didn’t live in the village.
“That was our first bitter lesson in the need to have extensive tests and data sets. We hadn’t realized how essential it was to have that breadth of testing.”
The fix was relatively easy, but Aldred emphasized throughout his talk the importance of testing and simulation. And he held up simulation for special scrutiny:
“A simulator is only as good as the effort that has been put into its development. A simulator needs to be continually developed. As you learn things in the real world, you need to fold those back into your simulator; otherwise you haven’t learned the lesson.”
The other golden rule that Aldred shared was: keep it simple.
“Just because you can doesn’t mean you should,” he said.
As an example, he described the challenge of the device moving from a light room to a dark room. How do you manage exposure and gain control? Should the team maximize SLAM features or work on histogram equalization?
“What we ended up doing is maximizing image entropy,” he said. “It’s a simple measure but that’s a key lesson. We started thinking we’d have to do something complex and we ended up probably doing the simplest measure.”
He added: “If it works, stop there. Do the simplest things you need to achieve for what you want to do.”
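The "simple measure" Aldred describes is the Shannon entropy of the image's intensity histogram: a well-exposed frame spreads pixels across many brightness levels, while an over- or under-exposed one crushes them into a few. A minimal sketch of choosing exposure this way, with an invented stand-in for the camera grab:

```python
# Sketch of exposure selection by maximizing image entropy. The "capture"
# below simulates a camera grab at a given exposure on a fixed toy scene;
# a real system would pull frames from the sensor. Values are illustrative.
import math

def entropy(pixels, bins=8):
    """Shannon entropy (bits) of an 8-bit image's intensity histogram."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

def capture(exposure):
    """Stand-in for a camera grab: scale a fixed scene by the exposure,
    clipping to [0, 255] (over/under-exposure destroys information)."""
    scene = [10, 40, 90, 160, 220] * 20
    return [max(0, min(255, int(p * exposure))) for p in scene]

# Pick the exposure whose frame carries the most information.
best = max([0.1, 0.5, 1.0, 2.0, 8.0], key=lambda e: entropy(capture(e)))
```

Under-exposed frames (everything near black) score near zero entropy, so the search naturally settles on the exposure that preserves the most detail, without any histogram-equalization machinery.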
For more information on Aldred, check out Kean Walmsley’s blog Q&A with his former school colleague.