Autonomous robotic plane flies indoors at MIT

August 31, 2019


[MUSIC PLAYING] Micro air vehicles capable of operating in constrained environments without the use of an external motion capture system are typically limited to slow and conservative flight. As a consequence, almost all of this research is done with rotorcraft in the hover regime. In the Robust Robotics Group in CSAIL at MIT, we've developed a fixed-wing vehicle capable of flying at high speeds through obstacles using only onboard sensors.

The vehicle is equipped with an inertial measurement unit and a laser range scanner. All the computation for state estimation and control is done onboard using an Intel Atom processor, similar to what is found in a commercially available netbook. We designed a custom airplane to carry the sensing and computation payload while still being able to maneuver in confined spaces. Our platform has a 2-meter wingspan and weighs approximately 2 kilograms.

At any given time the laser can only see a two-dimensional picture of the environment. Laser scans are depicted with yellow points representing obstacles and blue representing free space. Even with a pre-computed map, individual 2D scans don't contain enough information to uniquely determine the 3D position, velocity, and orientation of the vehicle. To overcome this difficulty, we aggregate successive scans and combine the laser information with the inertial measurement unit to perform state estimation.
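The video describes this fusion only at a high level. Purely to illustrate the flavor of the idea (propagate the state with IMU measurements between scans, then correct it with poses recovered by matching each scan against the map), here is a minimal planar sketch. The class name, the constant-gain blend, and the 2D simplification are assumptions for illustration, not the group's actual estimator, which maintains the full 3D state.

    import numpy as np

    class LaserImuFusion:
        """Toy planar (x, y, yaw) estimator: IMU dead-reckoning plus laser corrections.

        Illustrative sketch only; the real system estimates the full 3D state and
        uses a proper filter rather than this constant-gain blend.
        """

        def __init__(self, gain=0.05):
            self.pose = np.zeros(3)   # [x, y, yaw]
            self.vel = np.zeros(2)    # planar velocity in the world frame
            self.gain = gain          # how strongly laser corrections pull the state

        def predict(self, accel_body, yaw_rate, dt):
            """Dead-reckon between scans using IMU measurements."""
            self.pose[2] += yaw_rate * dt
            c, s = np.cos(self.pose[2]), np.sin(self.pose[2])
            accel_world = np.array([c * accel_body[0] - s * accel_body[1],
                                    s * accel_body[0] + c * accel_body[1]])
            self.vel += accel_world * dt
            self.pose[:2] += self.vel * dt

        def correct(self, scan_match_pose):
            """Blend in an absolute pose obtained by matching the latest scan to the map."""
            err = np.asarray(scan_match_pose, dtype=float) - self.pose
            err[2] = (err[2] + np.pi) % (2 * np.pi) - np.pi   # wrap the yaw error
            self.pose += self.gain * err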
Another technical challenge is efficiently generating trajectories for the vehicle to follow. The complicated vehicle dynamics create substantial computational difficulties in determining a path to fly from point A to point B. To overcome this difficulty, we use an approximate dynamics model that makes it easy to map the control inputs (elevator, rudder, aileron, and throttle) to corresponding XYZ trajectories.
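For a rough sense of how an approximate model can turn a handful of control-like inputs into an XYZ path, the toy coordinated-flight kinematic model below integrates roll, pitch, and throttle commands into positions. The parameterization, constants, and Euler integration are assumptions for the sketch, not the approximate model used on the vehicle, which works from the actual control surfaces.

    import numpy as np

    G = 9.81  # gravitational acceleration, m/s^2

    def rollout(controls, dt=0.02, v0=10.0):
        """Integrate a toy fixed-wing kinematic model into an XYZ path.

        `controls` is a sequence of (roll_cmd, pitch_cmd, throttle_cmd) tuples,
        angles in radians and throttle in arbitrary units. Illustrative only.
        """
        x = y = z = psi = 0.0            # position (m) and heading (rad)
        v = v0                           # airspeed (m/s)
        path = [(x, y, z)]
        for roll, pitch, throttle in controls:
            psi += (G / v) * np.tan(roll) * dt      # coordinated-turn heading rate
            v += (throttle - 0.1 * v) * dt          # crude thrust/drag balance
            x += v * np.cos(psi) * np.cos(pitch) * dt
            y += v * np.sin(psi) * np.cos(pitch) * dt
            z += v * np.sin(pitch) * dt             # climb rate from pitch attitude
            path.append((x, y, z))
        return np.array(path)

    # Example: a 5-second climbing left turn at roughly 10 m/s.
    path = rollout([(np.radians(20), np.radians(5), 1.5)] * 250)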
We start by connecting a set of high-level waypoints with line and arc segments. We then use our approximate model to construct dynamically feasible paths by parameterizing an offset from this underlying trajectory.
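To make the lines-and-arcs plus offset idea concrete, the planar sketch below joins waypoints with straight segments and circular-arc fillets, then shifts the result laterally by an offset parameterized along the path. The function names, the fixed radius, and the 2D simplification are illustrative assumptions rather than the planner used in the video.

    import numpy as np

    def line(p0, p1, n):
        """n evenly spaced points along the segment from p0 to p1 (excluding p0)."""
        t = np.linspace(0.0, 1.0, n + 1)[1:, None]
        return list(p0 + t * (p1 - p0))

    def fillet_path(waypoints, radius=8.0, n=20):
        """Connect 2D waypoints with straight segments joined by circular-arc fillets."""
        wps = [np.asarray(w, dtype=float) for w in waypoints]
        pts, prev = [wps[0]], wps[0]
        for a, b, c in zip(wps, wps[1:], wps[2:]):       # each interior corner b
            u1 = (b - a) / np.linalg.norm(b - a)
            u2 = (c - b) / np.linalg.norm(c - b)
            cross = u1[0] * u2[1] - u1[1] * u2[0]
            if abs(cross) < 1e-9:                        # collinear: no arc needed
                pts += line(prev, b, n)
                prev = b
                continue
            theta = np.arccos(np.clip(np.dot(u1, u2), -1.0, 1.0))
            d = radius * np.tan(theta / 2.0)             # corner-to-tangent distance
            t1, t2 = b - d * u1, b + d * u2              # where the arc meets each line
            center = t1 + radius * np.sign(cross) * np.array([-u1[1], u1[0]])
            pts += line(prev, t1, n)
            a0 = np.arctan2(t1[1] - center[1], t1[0] - center[0])
            angles = np.linspace(a0, a0 + np.sign(cross) * theta, n + 1)[1:]
            pts += list(center + radius * np.column_stack([np.cos(angles), np.sin(angles)]))
            prev = t2
        pts += line(prev, wps[-1], n)
        return np.array(pts)

    def offset_path(path, offsets):
        """Shift each sample laterally (normal to the path) by the given offsets."""
        tangents = np.gradient(path, axis=0)
        tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
        normals = np.column_stack([-tangents[:, 1], tangents[:, 0]])
        return path + np.asarray(offsets)[:, None] * normals

    # Example: a nominal path through two corners, then a small sinusoidal lateral offset.
    nominal = fillet_path([(0, 0), (40, 0), (40, 40), (80, 40)], radius=6.0)
    wiggled = offset_path(nominal, 0.5 * np.sin(np.linspace(0, 4 * np.pi, len(nominal))))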
Here we demonstrate the accuracy and reliability of this system flying through a parking garage. In places, the parking garage is less than 2.5 meters from floor to ceiling, creating extremely tight tolerances for our 2-meter vehicle. Our algorithms allowed the vehicle to complete a 7-minute flight through the environment, traveling at over 10 meters per second, or 22 miles per hour, covering almost 3 miles of distance, and repeatedly coming within a few centimeters of obstacles.

Only registered users can comment.

  1. Umm…

    You couldn't have made it look like a carrier pigeon?

    Now how can we use this in a hack?

    Greetings to all MacGregorites and Class of '75 members!

  2. I'd love to see this tech as a replacement for classic rollercoasters. Imagine sitting in a computer-controlled plane that flies crazy fast through the forest, avoiding trees by a tenth of an inch!
    What a thrill that would be!

  3. Yeah, especially if one of the sensors failed or the program crashed! Because I absolutely love crashing into solid objects at over 15G!

  4. At MIT, students do engineering to make new inventions and discoveries… But in India, students do engineering because:
    1. their parents want them to get an engineering degree
    2. they didn't get admission anywhere else
    Pity them…

  5. Humans need the most supervision. When left on their own, humans have put nuclear bombs on devices such as these.

  6. I'll get really excited when biology/nanotech advances to the point where we can make cyborgs. Compare the relatively gigantic netbook brain to that of a fruit fly.

  7. It's only because he happened to nut out the idea (he actually bothered to do it). Sometimes back-yarders do better than the big-wigs. He could sell it. Look at my channel for "$3,000" speakers that you can make from speakers from the dump. Videos show you how you can make the best speakers in the world for basically nothing! Why should only the rich have the listening pleasure? (I'm another example of beating the big-wigs)! lol

  8. Awesome… how do you combine the camera info with the laser unit to localize it with respect to the environment?

  9. I think the camera is just for observation; the range finder and the inertial measurement unit are gathering all the data.

  10. With all due respect, I didn't find it that great at all… It's not such a complex algorithm either; I also work in the AI (Artificial Intelligence) and NN (Neural Network) field. I expect a lot more from MIT. 🙂 Cheers!!!

  11. You mean you want to cut and paste the code and see if you can be on par with MIT engineers… The guy pretty much told you how to do it in the video.

  12. You bet. Bootstrap like crazy.
    What he said in the video was an extremely high-level description of something much more finicky, mostly the state estimation. Marrying the LIDAR with accelerometers and gyroscopes sounds like a huge pain; however, a system like this would be a huge boon to something like APM, the Arduino-based open-source autopilot project.

  13. Marrying LIDAR, accelerometers, and gyroscopes is already implemented in ROS. Look for laser_scan_matcher.

  14. I've been meaning to look at ROS; I didn't realize it already had such varied features. That's only half the story though; the other half is the state estimation using that simplified aerodynamics model, although that might be possible using a rudimentary physics engine for games.

  15. Fermat's last Theorem is the transformer bumblebee Robot.
    Original equation:
    z^n=x^n+y^n.
    Mean:
    z^(n-3)*z^3=x^(n-3)*x^3+y^(n-3)*y^3.
    Using the formula z^3=[z(z+1)/2]^2 – [z(z-1)/2]^2 to convert z^3 become the exponent 2. Then using the formula [z(z+1)/2]^2=1^3+2^3+……..+z^3 to convert the exponent 2 become the exponent 3.
    Repeated several times with the same method.
    The transformer bumblebee Robot was created according to your own structure.

    Simplest format about Transformer  Bumblebee Robot.
    Using two formulas:
    z^3=[z(z+1)/2]^2 – [z(z-1)/2]^2 
    And define x<x+a<y.
    x^3+y^3=[y(y+1)/2]^2 – [x(x-1)/2]^2  – [(x+1)^3+(x+2)^3+……..+(x+a-1)^3+(x+a)^3+(x+a+1)^3+……..+(y-1)^3] 
    Because:
    (x+a)^3= [(x+a)(x+a+1)/2]^2 – [(x+a)(x+a-1)/2]^2
    So also:
    x^3+y^3=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+a)(x+a+1)/2]^2 + [(x+a)(x+a-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+a-1)^3+(x+a+1)^3+……..+(y-1)^3] 

    Original equation:
    z^3=x^3+y^3.
    According to above method, the  transformer bumblebee Robot system was created:
    [z(z+1)/2]^2 – [z(z-1)/2]^2=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+a)(x+a+1)/2]^2 + [(x+a)(x+a-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+a-1)^3+(x+a+1)^3+……..+(y-1)^3] 

    [z(z+1)/2]^2 – [z(z-1)/2]^2=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+b)(x+b+1)/2]^2 + [(x+b)(x+b-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+b-1)^3+(x+b+1)^3+……..+(y-1)^3] 

    [z(z+1)/2]^2 – [z(z-1)/2]^2=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+c)(x+c+1)/2]^2 + [(x+c)(x+c-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+c-1)^3+(x+c+1)^3+……..+(y-1)^3] 

    [z(z+1)/2]^2 – [z(z-1)/2]^2=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+d)(x+d+1)/2]^2 + [(x+d)(x+d-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+d-1)^3+(x+d+1)^3+……..+(y-1)^3].

    ……..
    Flood robots on the planet in the Orion galaxy are looking for integers to eat.
    Certainly there are not enough integers for the large number of transformer bumblebee Robots.

    ADIEU.

  16. This is only possible with today's fast processors, lightweight lasers, and advanced batteries. Only more proof that self-driving cars will be in our future.

  17. This is one of the coolest things I've seen from MITnews. I'm going to look into building one of these, but with some modifications!

  18. Can the problem of dynamic navigation be fixed by adding some kind of motor that gives the tail more pivoting motion, or would that make the plane lose balance?

  19. I'm very interested in this type of autonomous flight for my quadcopter. Which sensors are used? Do I need sensors costing hundreds of dollars? Is it open-source? Thanks.
