Researchers at Universitat Autònoma de Barcelona (UAB) have recently developed a face-following robotic arm with emotion detection, inspired by Pixar Animation Studios’ Luxo Jr. lamp. The robot was presented by Vernon Stanley Albayeros Duarte, a computer science graduate at UAB, in his final thesis. The project aims to create a face-following robot arm that responds to user emotions, using a Raspberry Pi as the main controller.
“The idea behind our robot is largely based on Pixar’s Luxo Jr. lamp shorts,” Albayeros Duarte said. “I wanted to build a robot that mimicked the lamp’s behavior in the shorts. I’m very interested in the maker scene and have been 3D printing for a few years, so I set out to build a ‘pet’ of sorts to demonstrate some interesting human-machine interactions. This is where the whole ‘face following emotion detection’ theme comes from, as having the lamp jump around like those in the Pixar shorts proved very difficult to implement, but still retained the ‘pet-toy’ feel about the project.”
As this study was part of Albayeros Duarte’s coursework, he had to meet certain requirements outlined by UAB. For instance, the main objective of the thesis was for students to learn about Google’s cloud services and how they can be used to offload computation from projects whose hardware is not powerful enough to run it locally.
The Raspberry Pi is a tiny, affordable computer with substantial computational limitations. These limitations make it the perfect candidate for exploring the use of Google’s cloud platform for computationally intensive tasks, such as emotion detection.
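The article does not specify which Google service the thesis used or how it was called, but offloading emotion detection from a Raspberry Pi could look roughly like the sketch below. It assumes the Cloud Vision face-detection endpoint and its emotion likelihood fields, and the file name and function are illustrative; treat it as an illustration rather than the project’s actual code.

```python
# Minimal sketch: offloading emotion detection to Google Cloud Vision from a
# Raspberry Pi. Assumes the google-cloud-vision client library is installed and
# GOOGLE_APPLICATION_CREDENTIALS points to a valid service-account key.
from google.cloud import vision


def detect_emotions(jpeg_bytes):
    """Send one camera frame to Cloud Vision and return per-face emotion likelihoods."""
    client = vision.ImageAnnotatorClient()
    image = vision.Image(content=jpeg_bytes)
    response = client.face_detection(image=image)

    results = []
    for face in response.face_annotations:
        results.append({
            "joy": face.joy_likelihood,
            "sorrow": face.sorrow_likelihood,
            "anger": face.anger_likelihood,
            "surprise": face.surprise_likelihood,
        })
    return results


if __name__ == "__main__":
    # "frame.jpg" is a placeholder for a frame captured by the arm's camera.
    with open("frame.jpg", "rb") as f:
        print(detect_emotions(f.read()))
```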
Albayeros Duarte thus decided to use a Raspberry Pi to develop a small robot with emotion detection capabilities. His robot’s main body is the LittleArm 2C, a robotic arm created by Slant Concepts founder Gabe Bentz.
“I reached out to Slant Concepts to ask for permission to modify their robot arm so that it would hold a camera at the end, then created the electronics enclosure and base myself,” Albayeros Duarte said.
The robot designed by Albayeros Duarte ‘sweeps’ a camera from left to right, capturing photos and using OpenCV, a programming library often used for computer vision applications, to detect a face within the frame. When the robot reaches either end of the sweep, it raises or lowers the camera by a couple of degrees and resumes its sweeping movement.
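The thesis code itself is not included in the article, but the sweep-and-detect stage it describes can be sketched roughly as follows. The `camera` and `arm` helpers, the angle ranges, and the use of OpenCV’s stock Haar cascade are assumptions made for illustration, not Albayeros Duarte’s actual implementation.

```python
# Minimal sketch of the sweep-and-detect loop, assuming OpenCV's bundled Haar
# cascade for frontal faces and hypothetical `camera` (returns BGR frames) and
# `arm` (drives the pan/tilt servos) objects that stand in for the project's code.
import cv2

PAN_MIN, PAN_MAX, PAN_STEP = 20, 160, 5   # sweep range in degrees (assumed values)
TILT_STEP = 3                             # tilt change at each end of a sweep (assumed)

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def find_face(frame):
    """Return the first detected face as (x, y, w, h), or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return faces[0] if len(faces) else None


def sweep(camera, arm):
    """Pan the camera back and forth; return a frame and face once one is spotted."""
    pan, tilt, direction = PAN_MIN, 90, 1
    while True:
        arm.move(pan=pan, tilt=tilt)
        frame = camera.capture()
        face = find_face(frame)
        if face is not None:
            return frame, face
        pan += direction * PAN_STEP
        if pan <= PAN_MIN or pan >= PAN_MAX:
            # End of a sweep: reverse direction and step the tilt (simplified here
            # to always step upward).
            direction = -direction
            tilt += TILT_STEP
```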
“When it finds a face, the robot stops the sweeping movement and checks if the face stays within the field of view for more than a handful of frames,” Albayeros Duarte explained. “This ensures that it doesn’t ‘play’ with false positives in face detection. If the robot confirms that it has in fact found a face, it switches to the ‘face following’ part of the algorithm, where it tries to keep the face centered within its field of view. To do this, it pans and tilts according to the movements of the person it’s observing.”
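Building on the same assumptions as the previous sketch (and reusing its `find_face` helper), the confirmation and centering behavior described in the quote above might look roughly like this; the frame-count threshold and proportional gain are illustrative values, not figures from the thesis.

```python
# Minimal sketch of the "face following" stage. CONFIRM_FRAMES and GAIN are
# assumed values; `camera` and `arm` are the same hypothetical helpers as before.
CONFIRM_FRAMES = 5      # face must persist this many consecutive frames to count
GAIN = 0.05             # degrees of servo correction per pixel of error (assumed)


def confirm_face(camera, frames_needed=CONFIRM_FRAMES):
    """Return True only if a face is seen in several consecutive frames."""
    for _ in range(frames_needed):
        if find_face(camera.capture()) is None:
            return False
    return True


def follow_face(camera, arm):
    """Pan/tilt so the detected face stays centered in the field of view."""
    while True:
        frame = camera.capture()
        face = find_face(frame)
        if face is None:
            return                          # lost the face: fall back to sweeping
        x, y, w, h = face
        frame_h, frame_w = frame.shape[:2]
        err_x = (x + w / 2) - frame_w / 2   # horizontal offset from frame center
        err_y = (y + h / 2) - frame_h / 2   # vertical offset from frame center
        arm.adjust(pan=-GAIN * err_x, tilt=GAIN * err_y)
```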
Edited and contributed by: S. Marjani