
This Hummingbird Robot Can Hover & Fly Like An Actual Hummingbird

Did you know that birds are the primary inspiration for drones? One bird in particular, however, has always been ten steps ahead of them: the hummingbird. It is super-agile and can not only hover but also make sharp turns in an instant. Researchers at Purdue University have now built a hummingbird robot modeled on the real bird and taught it to fly using an algorithm trained on the natural flight patterns of hummingbirds.

Previous attempts at hummingbird robots have fallen short; the results were either too slow or entirely human-controlled. The Purdue robot, however, comes remarkably close to the real thing. It has a wingspan of 6.7 inches, weighs the same as an average hummingbird at 12 grams, and can lift about twice its own weight, roughly 27 grams. Its 3D-printed body carries wings made from carbon fiber and membranes that can flap at frequencies of up to 40 Hz.

What sets this robot apart is its ability to fly like an actual hummingbird. Hummingbirds pull off some of the most impressive aerial maneuvers in the bird kingdom, including 180-degree turns in just 0.2 seconds. The researchers built their control algorithms from observations of real hummingbirds, trained them in simulation, and then used them to teach the robot to fly.
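The article does not describe the training pipeline in detail, but the broad idea of fitting a controller to recorded hummingbird flight data and then checking it in simulation can be sketched roughly as below. This is a minimal Python illustration: the state variables, the linear policy, and the toy hover dynamics are all assumptions made for the example, not the Purdue team's actual method.

```python
import numpy as np

# Minimal sketch (illustrative assumptions only): fit a linear flight policy
# to "recorded" hummingbird trajectories, then test it in a toy 1-D hover
# simulation. The data, state definition, and dynamics are placeholders.

rng = np.random.default_rng(0)

# Pretend these came from motion capture of real hummingbirds:
# state = [height error, vertical-velocity error], action = flapping amplitude.
states = rng.normal(size=(500, 2))
expert_actions = states @ np.array([1.8, 0.9]) + rng.normal(scale=0.05, size=500)

# Behaviour cloning: least-squares fit of a linear policy a = K @ s.
K, *_ = np.linalg.lstsq(states, expert_actions, rcond=None)

def simulate_hover(K, steps=200, dt=0.005):
    """Toy hover model: flapping amplitude maps crudely to extra thrust."""
    z, vz = -0.1, 0.0            # start 10 cm below the target height
    g, mass = 9.81, 0.012        # ~12 g vehicle, as reported in the article
    for _ in range(steps):
        s = np.array([-z, -vz])              # error state
        amplitude = float(K @ s)             # learned policy output
        thrust = mass * g + 0.01 * amplitude # crude thrust model
        vz += (thrust / mass - g) * dt
        z += vz * dt
    return z

print(f"final height error: {simulate_hover(K):+.4f} m")
```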

The hummingbird robot will never be the fastest flier, nor will it have the longest range. However, its maneuverability and small size make it ideal for navigating spaces too narrow or tight for other robots, for instance searching a collapsed building for survivors or assessing damage. The robot currently has no cameras and cannot see; instead, it relies on an electrical sense of touch, with AI algorithms that interpret those contacts.

Xinyan Deng, the lead researcher on the study, said, ‘The robot can essentially create a map without seeing its surroundings. This could be helpful in a situation when the robot might be searching for victims in a dark place – and it means one less sensor to add when we do give the robot the ability to see.’ The team is currently working on increasing the robot's lifting capacity and adding sensors and onboard batteries, since for now it has to stay tethered for power.
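As a rough illustration of mapping by touch alone, the sketch below keeps a coarse occupancy grid that marks cells as free when the robot passes through them and as occupied when a contact is registered. The TouchMapper class, the grid representation, and the contact log are hypothetical, loosely inspired by the description above rather than taken from the actual Purdue system.

```python
import numpy as np

# Minimal sketch (assumptions, not the Purdue implementation): build a coarse
# occupancy grid purely from touch events, the way a camera-less robot might
# map a dark space. Visited cells are marked free; contact cells are occupied.

class TouchMapper:
    def __init__(self, size=20, resolution=0.1):
        self.resolution = resolution            # metres per grid cell
        self.grid = np.zeros((size, size))      # 0 unknown, -1 free, +1 occupied

    def _cell(self, x, y):
        return int(round(x / self.resolution)), int(round(y / self.resolution))

    def update(self, x, y, touched):
        """Record one time step: robot position (x, y) plus a contact flag."""
        i, j = self._cell(x, y)
        if 0 <= i < self.grid.shape[0] and 0 <= j < self.grid.shape[1]:
            self.grid[i, j] = 1.0 if touched else -1.0

mapper = TouchMapper()
# Hypothetical flight log: (x, y, wing_contact) samples along a short sweep.
for x, y, touched in [(0.1, 0.1, False), (0.2, 0.1, False), (0.3, 0.1, True),
                      (0.2, 0.2, False), (0.3, 0.2, True)]:
    mapper.update(x, y, touched)

print("occupied cells:", np.argwhere(mapper.grid > 0).tolist())
```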

The research will be presented next week in three papers at the IEEE International Conference on Robotics and Automation in Montreal.
