
Controlling a bio-inspired miniature blimp using a depth sensing neural-network camera.

Huy Q Pham, Shreyansh Singh, Matthew Garratt, Dipanjan Majumdar
Published in: Bioinspiration & Biomimetics (2024)
Miniature blimps are lighter-than-air vehicles that have become an increasingly common unmanned aerial system research platform due to their extended endurance and collision-tolerant design. The UNSW-C bio-inspired miniature blimp consists of a 0.5 m spherical mylar envelope filled with helium. Four fins placed along the equator provide control over the three translatory axes and yaw rotation. A gondola attached to the bottom of the blimp contains all the electronics and the flight controller. Here, we focus on using the UNSW-C blimp as a platform to achieve autonomous flight in GPS-denied environments. The majority of unmanned flying systems rely on GPS or multi-camera motion-capture systems for position and orientation estimation. However, such systems are expensive, difficult to set up, and not compact enough to be deployed in real indoor environments. Instead, we seek to achieve basic flight autonomy for the blimp using a low-priced and portable solution. We use a low-cost embedded neural-network stereoscopic camera (OAK-D-PoE) to detect and position the blimp, while an onboard IMU provides orientation estimation. Flight tests and analysis of trajectories revealed that position hold as well as basic waypoint navigation could be achieved with variance (<0.1 m) comparable to that obtained when a conventional multi-camera positioning system (VICON) was used for localizing the blimp. Our results highlight the potentially favorable tradeoffs offered by such low-cost positioning systems in extending the operational domain of unmanned flight systems.
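The abstract does not give implementation details, so the following is only a rough sketch of how 3D position estimates can be obtained from an OAK-D-class camera using the DepthAI Python API. It assumes a hypothetical single-class blimp detector blob (`blimp_detector.blob`) run through a MobileNet-SSD-style spatial detection node; the model, preview size, and thresholds are illustrative assumptions, not the authors' configuration.

```python
import depthai as dai

# Minimal DepthAI pipeline: RGB preview + stereo depth + spatial detection.
pipeline = dai.Pipeline()

# RGB camera feeding the detector (MobileNet-SSD-style networks expect 300x300 input).
cam_rgb = pipeline.create(dai.node.ColorCamera)
cam_rgb.setPreviewSize(300, 300)
cam_rgb.setInterleaved(False)

# Stereo pair for depth estimation.
mono_left = pipeline.create(dai.node.MonoCamera)
mono_right = pipeline.create(dai.node.MonoCamera)
mono_left.setBoardSocket(dai.CameraBoardSocket.LEFT)
mono_right.setBoardSocket(dai.CameraBoardSocket.RIGHT)

stereo = pipeline.create(dai.node.StereoDepth)
stereo.setDepthAlign(dai.CameraBoardSocket.RGB)  # align depth with the RGB preview
mono_left.out.link(stereo.left)
mono_right.out.link(stereo.right)

# Spatial detection network: fuses 2D detections with depth to give XYZ per object.
detector = pipeline.create(dai.node.MobileNetSpatialDetectionNetwork)
detector.setBlobPath("blimp_detector.blob")   # hypothetical single-class blimp model
detector.setConfidenceThreshold(0.5)
detector.setDepthLowerThreshold(100)          # ignore depth closer than 0.1 m
detector.setDepthUpperThreshold(10000)        # ignore depth beyond 10 m
cam_rgb.preview.link(detector.input)
stereo.depth.link(detector.inputDepth)

# Stream detections back to the host.
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("detections")
detector.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue(name="detections", maxSize=4, blocking=False)
    while True:
        for det in q.get().detections:
            # spatialCoordinates are reported in millimetres in the camera frame.
            x = det.spatialCoordinates.x / 1000.0
            y = det.spatialCoordinates.y / 1000.0
            z = det.spatialCoordinates.z / 1000.0
            print(f"blimp position: x={x:.2f} m, y={y:.2f} m, z={z:.2f} m")
```

In a setup like the one described, such camera-frame coordinates would still need to be transformed into a fixed world frame and combined with the onboard IMU's yaw estimate before being used by the flight controller for position hold and waypoint tracking.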
Keyphrases
  • low cost
  • neural network
  • high speed
  • convolutional neural network
  • high resolution
  • machine learning
  • deep learning