Vision System Makes Robot a Guitar Hero
Pete Nikrin graduated from Minnesota West Community and Technical College in Pipestone, Minn., in 2008 and now works as a manufacturing engineer at Meier Tool & Engineering (www.meiertool.com) in Anoka, Minn. He designed a robot to compete with a friend he had introduced to the Guitar Hero game, and who, after just two weeks of playing, had surpassed him at it.
Bill Manor, robotics instructor at Minnesota West, suggested Nikrin incorporate a vision sensor with a right-angle lens from Banner Engineering, which Minnesota West had purchased in a startup education kit.
To develop his Guitar Hero robot, Nikrin used a mannequin—complete with Minnesota West sweatshirt, hat and painted fingernails—and installed the camera lens as the robot's left eye, which would be positioned toward the TV or computer screen.
The robot, named Roxanne, identified the notes to be played by using an Edge vision tool, which detects, counts and locates the transition between bright and dark pixels in an image area.
"We set up five Edge tools that ran horizontally across the screen, one for every fret, and positioned the tools to focus on the notes at the bottom of each," said Nikrin. "The Edge tools sent a constant signal as the five vertical fret lines progressed, and when a bright white dot appeared in the middle of a dark colored circle, the Edge tool allowed the sensor to detect it."
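The edge-detection behavior Nikrin describes can be sketched in a few lines. The function below is a hypothetical illustration, not Banner's actual tool: it scans one horizontal row of grayscale pixels and reports every transition between dark and bright, so a bright white dot inside a dark circle shows up as extra crossings. The threshold value and function name are assumptions.

```python
def find_edges(row, threshold=128):
    """Return indices where pixel brightness crosses the threshold
    in either direction (dark-to-bright or bright-to-dark)."""
    edges = []
    for i in range(1, len(row)):
        dark_to_bright = row[i - 1] < threshold <= row[i]
        bright_to_dark = row[i - 1] >= threshold > row[i]
        if dark_to_bright or bright_to_dark:
            edges.append(i)
    return edges

# A dark circle with a bright dot in its center, on a bright background,
# produces four crossings: into the circle, into the dot, out of the dot,
# and back out of the circle.
row = [200]*5 + [40]*3 + [250]*2 + [40]*3 + [200]*5
print(find_edges(row))  # [5, 8, 10, 13]
```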
Jeff Curtis, senior applications engineer at Banner (www.bannerengineering.com), worked with Nikrin and Manor to ensure the robot's processing time was fast enough to keep up with the video game. Once a note was identified, communicating that signal efficiently required substantial programming, as well as Ethernet communication through a Modbus register. A PLC was programmed to constantly monitor the vision sensor's register. When an Edge tool sensed a note, the PLC detected the change in the register, and its logic fired a solenoid that activated the robot's finger. Just as a human player would react, the robot's finger then pressed down on the appropriate note on the guitar. This setup achieved a 9-msec processing time.
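The PLC's role can be sketched as a polling loop: read the sensor's register each scan, compare it to the previous value, and fire a solenoid for any fret bit that just turned on. This is a minimal stand-in sketch; the register layout (one bit per fret) and all class and function names are assumptions, not the actual Modbus map Banner's sensor exposes.

```python
class VisionSensor:
    """Stand-in for the vision sensor's Modbus register map."""
    def __init__(self):
        self.registers = {0: 0}  # register 0: fret bitmask, one bit per fret
    def read_register(self, addr):
        return self.registers[addr]

def scan(sensor, last_value, fire_solenoid):
    """One PLC scan: read the register, find bits that changed 0 -> 1,
    and fire the solenoid for each fret that just lit up."""
    value = sensor.read_register(0)
    rising = value & ~last_value  # rising-edge detection on each bit
    for fret in range(5):
        if rising & (1 << fret):
            fire_solenoid(fret)
    return value

fired = []
sensor = VisionSensor()
last = scan(sensor, 0, fired.append)   # no notes yet
sensor.registers[0] = 0b00100          # Edge tool sees a note on fret 2
last = scan(sensor, last, fired.append)
print(fired)  # [2]
```

Rising-edge detection matters here: without comparing against the last value, the PLC would re-fire the solenoid on every scan while the note remained visible.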
The team also needed to ensure Roxanne could play within a range of lighting conditions, as well as confirm the robot was correctly oriented with the monitor displaying the video game. They solved this problem by using a Locate tool, an edge-based vision tool that finds the absolute or relative position of the target in an image by finding its first edge.
"We honed a Locate tool and gave it a fixed point—a piece of reflective tape on the PC monitor—to focus on," said Curtis.
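A Locate tool of this kind can be sketched as finding the first edge in a row of pixels and using it as a reference point, so the other tools' regions can shift with the monitor. The sketch below is illustrative only; the function name, threshold, and pixel values are assumptions.

```python
def locate_reference(row, threshold=128):
    """Return the index of the first dark-to-bright transition
    (e.g. the reflective tape), or None if no edge is found."""
    for i in range(1, len(row)):
        if row[i - 1] < threshold <= row[i]:
            return i
    return None

# If the monitor shifts, every tool's region can be offset by the
# difference between the current and calibrated reference positions.
calibrated = locate_reference([20]*10 + [240]*5)  # tape found at index 10
current = locate_reference([20]*13 + [240]*5)     # monitor moved 3 pixels
print(current - calibrated)  # 3
```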