Banner Engineering Helps Student Develop Guitar Hero-Playing Robot

Oct. 16, 2009
Banner Engineering Corp. partnered with an engineering student and robotics instructor at Minnesota West Community and Technical College to help them develop a robot that can play the popular Guitar Hero video game.

Pete Nikrin, who graduated from Minnesota West in 2008 and now works as a manufacturing engineer at Meier Tool & Engineering, designed the robot to compete with a friend he had introduced to the game.

Bill Manor, robotics instructor at Minnesota West, suggested Nikrin incorporate a PresencePLUS P4 OMNI vision sensor with a right-angle lens from Banner Engineering. Minnesota West purchased the vision system from Banner at a discount as part of an educational kit.

To develop his Guitar Hero robot, Nikrin equipped a mannequin with a camera lens in the eye area. The robot, named Roxanne, uses Edge vision tools to identify the notes to be played. The vision tools detect, count and locate transitions between bright and dark pixels in an image area.

“We set up five Edge tools that ran horizontally across the screen, one for every fret, and positioned the tools to focus on the notes at the bottom of each,” Nikrin says. “The Edge tools sent a constant signal as the five vertical fret lines progressed, and when a bright white dot appeared in the middle of a dark-colored circle, the Edge tools allowed the sensor to detect it.”
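The detection approach Nikrin describes can be approximated in software. Below is a minimal sketch, assuming a grayscale video frame and NumPy; the region positions and brightness threshold are illustrative guesses, not the sensor's actual Edge tool configuration.

```python
# Minimal sketch of the note-detection idea behind the five Edge tools,
# approximated with NumPy thresholding on a grayscale frame. Region positions
# and the brightness threshold are assumptions for illustration only.
import numpy as np

FRET_COLUMNS = [80, 160, 240, 320, 400]   # assumed x-centers of the five frets
STRIKE_ROW = 440                          # assumed y-position of the strike line
WINDOW = 10                               # half-width of each inspection region
BRIGHT_THRESHOLD = 200                    # "bright white dot" cut-off (0-255)

def detect_notes(frame: np.ndarray) -> list:
    """Return one flag per fret: True if a bright note fills its region."""
    hits = []
    for x in FRET_COLUMNS:
        region = frame[STRIKE_ROW - WINDOW:STRIKE_ROW + WINDOW,
                       x - WINDOW:x + WINDOW]
        # An Edge tool looks for bright-to-dark transitions; here we simply
        # check whether enough bright pixels appear inside the region.
        hits.append(bool(np.mean(region > BRIGHT_THRESHOLD) > 0.2))
    return hits

# Example with a synthetic all-dark frame: no notes detected.
blank = np.zeros((480, 640), dtype=np.uint8)
print(detect_notes(blank))   # [False, False, False, False, False]
```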

Jeff Curtis, senior applications engineer at Banner, worked with Nikrin and Manor to ensure the robot's processing time was fast enough to keep up with the video game. Once a note is identified, communicating the signal efficiently depends on extensive programming as well as Ethernet communication through a Modbus register. A PLC was programmed to continuously poll the vision sensor's register. When an Edge tool senses a note, the PLC detects the change in the register, and its logic fires a solenoid that activates the robot's finger. Just as a human player would, the robot's finger then presses down on the corresponding fret button on the guitar.
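The register-watching arrangement Curtis describes can be sketched in simplified form. The Python example below assumes the pymodbus library, a made-up sensor address and register layout, and a hypothetical fire_solenoid helper in place of real PLC outputs; it illustrates the polling logic rather than the team's actual ladder-logic implementation.

```python
# Sketch of a polling loop that watches the vision sensor's Modbus register
# and fires a finger on each newly detected note. Addresses, bit layout and
# timing are illustrative assumptions.
import time
from pymodbus.client import ModbusTcpClient   # pymodbus 3.x import path

SENSOR_IP = "192.168.0.1"    # assumed address of the vision sensor
RESULT_REGISTER = 0          # assumed register holding the Edge tool results

def fire_solenoid(fret: int) -> None:
    """Placeholder for the output that actuates one of the robot's fingers."""
    print(f"press fret {fret}")

client = ModbusTcpClient(SENSOR_IP)
client.connect()

previous = 0
while True:
    # Poll the result register; assume each bit maps to one Edge tool/fret.
    response = client.read_holding_registers(RESULT_REGISTER, count=1)
    current = response.registers[0]

    # Fire a finger only on the rising edge of each bit, mimicking the PLC logic.
    for fret in range(5):
        if current & (1 << fret) and not previous & (1 << fret):
            fire_solenoid(fret)

    previous = current
    time.sleep(0.005)   # poll fast enough to keep pace with the game
```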

The team used a Locate tool—an Edge-based vision tool that finds the absolute or relative position of the target in an image by finding its first edge—to ensure Roxanne could play in a range of lighting conditions.
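The idea behind a Locate-style tool can be illustrated with a short sketch: find the first strong pixel-to-pixel transition along a scan line and use that position as a reference point for the inspection regions. The threshold and the synthetic scan line below are assumptions for illustration only.

```python
# Rough sketch of locating a target by its first edge along a scan line.
import numpy as np

def first_edge(scan_line: np.ndarray, threshold: int = 50):
    """Index of the first pixel-to-pixel jump larger than the threshold."""
    diffs = np.abs(np.diff(scan_line.astype(int)))
    edges = np.nonzero(diffs > threshold)[0]
    return int(edges[0]) if edges.size else None

# Example: a dark-to-bright step between pixels 5 and 6.
line = np.array([10, 10, 10, 10, 10, 10, 200, 200, 200], dtype=np.uint8)
print(first_edge(line))   # 5
```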

Roxanne achieved 100% accuracy on the game's medium mode and 95% on hard mode. On expert mode, she hit 80% of the notes.

“Students have used Banner vision sensors in many projects over the years to inspect containers, for example, as they come down a conveyor,” Manor says.