
Solutions for GPS-denied near-Earth autonomy

Below are photographs and a video of the operational hardware we are currently using for in-house work and providing to our partners. Our nano “drone” is based on a Crazyflie platform and is designed for “supervised autonomy” applications, in which the operator provides high-level commands and the drone safely implements them. With no human input, the drone defaults to full autonomy. The system is small: it rests in the palm of your hand and weighs about 38 grams total. Yes, that’s 38 grams for the quadrotor, battery, and all sensing and control electronics!


Centeye nano unmanned aircraft system (UAS) with 360-degree stereo vision (2016)

  • 360-degree optical flow and obstacle perception
  • Operation from daylight to no light
  • Automatic launch to hover
  • Moves in direction commanded by human operator
  • Holds a position when control sticks released
  • Detects and avoids approaching obstacles, including textureless walls
  • 100% GPS-denied
  • Built on a Crazyflie platform
  • Size: Rotors form 6 cm square
  • Mass: 38 g including battery

 


Centeye Multi-Mode Stereo Sensor: This is the sensor we are using for our work in 2016.

  • 2 x Centeye RockCreek™ vision chips, with two-mode pixels and analog contrast enhancement/edge detection circuitry
  • Custom wide field of view (up to 150°) optics
  • Optical flow
  • Stereo vision
  • Laser ranging
  • LED illumination
  • Mass: 1.0 grams
  • Power: as little as 60 mA at 4 V (about 0.24 W)
  • Neural network-based attention modulation

 

Above is our older nano quadrotor, based on a modified Crazyflie, used in 2015.

 

 


3.0 gram omnidirectional optical flow ring (2009)

 


0.2 gram wide field of view camera for nano UAS hover (2011). We have actually flown this; it demonstrates how low we can potentially go in terms of mass.

 

New applications for small and nano UAS, such as inspection and parcel delivery, will involve flying close to the ground in near-Earth environments, and in some cases entering and flying through a building. For a small or nano UAS to be practical, its flight control system must have the following characteristics:

Non-GPS reliant: GPS is really only suitable when flying tens or hundreds of meters above the ground, or when taking off and landing in an open area. When flying close to the ground amidst buildings, trees, or other obstacles, GPS is neither reliable nor adequate: it is slow, inaccurate, and susceptible to dropouts and jamming. In indoor environments GPS is worthless.

Ad-hoc perception on-board: The sensor system must instead be able to perceive the environment in an ad-hoc manner, with little or no a priori knowledge of the environment. Furthermore, all sensing and perception algorithms must run on-board. It is misguided and dangerous to choose architectures in which sensory data is sent down to a ground station for processing, since the reliability of the system then additionally depends on the downlink, uplink, and base station, any of which may be jammed or hacked.

Low mass: The sensor suite must be light enough not to weigh down the UAS platform. For moderately sized systems, the entire sensor suite needs to weigh on the order of hundreds of grams. For a nano UAS, the weight budget drops from several grams to under a gram.

Fast: The sensor suite must be able to process imagery at a high temporal bandwidth. A UAS flying at speed may travel a meter or more in a 100 millisecond period, so visual processing should occur at rates of hundreds of Hz or more. Slower rates are adequate only for hovering or station-keeping tasks.
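To make the bandwidth requirement concrete, here is a small sketch of the distance a vehicle covers between consecutive visual updates; the speeds and rates are illustrative, not Centeye specifications.

```python
# Sketch: distance a UAS covers per visual-processing frame
# at various speeds and update rates (illustrative numbers).

def distance_per_frame(speed_m_s: float, rate_hz: float) -> float:
    """Distance traveled between consecutive visual updates, in meters."""
    return speed_m_s / rate_hz

# At 10 m/s with a 10 Hz pipeline, the vehicle is "blind" for a full
# meter between updates; at 200 Hz the gap shrinks to 5 cm.
print(distance_per_frame(10.0, 10.0))   # 1.0
print(distance_per_frame(10.0, 200.0))  # 0.05
```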

Robust to light levels: When any visual modality is being used, the system must be able to operate in light levels ranging from full daylight such as outdoors under the sun to full darkness such as indoors at night. Light levels can range six or more orders of magnitude from less than a tenth of a lux to over 100k lux.
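The stated illuminance span can be turned into a dynamic-range figure directly; this short calculation (an illustration, not a spec of the RockCreek chip) shows why six decades of light is a demanding target for a linear sensor.

```python
import math

# Sketch: dynamic range implied by the illuminance span quoted above
# (0.1 lux to 100,000 lux), expressed in decades and equivalent bits.

lo_lux, hi_lux = 0.1, 100_000.0
decades = math.log10(hi_lux / lo_lux)  # orders of magnitude of light level
bits = math.log2(hi_lux / lo_lux)      # equivalent linear bit depth

print(f"{decades:.0f} decades, about {bits:.1f} bits of linear dynamic range")
```

A six-decade span corresponds to roughly 20 bits of linear range, far beyond a typical 8- or 10-bit imager, which is why adaptive or multi-mode pixels are needed.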

Shared autonomy: Based on years of flight experience, we recommend a control architecture in which full control of the UAS is shared between a human operator and the on-board control system. Ideally the human operator enters high-level commands (“take-off”, “go left”, etc.) and the control system takes care of the fine details of executing the maneuvers.
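A minimal sketch of such a shared-autonomy command layer is shown below. The command names and velocity values are hypothetical, chosen only to illustrate the idea that coarse operator commands map to setpoints and that no input defaults to hover.

```python
# Sketch of a shared-autonomy command layer (hypothetical names/values):
# the operator issues coarse commands; with no input the vehicle hovers.

HOVER = (0.0, 0.0, 0.0)  # (vx, vy, vz) body-frame velocity setpoint, m/s

COMMANDS = {
    "take-off": (0.0, 0.0, 0.5),   # climb gently
    "go left":  (0.0, -0.5, 0.0),
    "go right": (0.0, 0.5, 0.0),
    "forward":  (0.5, 0.0, 0.0),
}

def velocity_setpoint(operator_cmd):
    """Map a high-level command to a velocity setpoint; default to hover."""
    if operator_cmd is None:
        return HOVER  # no human input -> autonomous hold
    return COMMANDS.get(operator_cmd, HOVER)
```

The on-board controller then closes the loop on these setpoints, so the operator never has to stabilize the vehicle directly.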

Station keeping: The system should allow the UAS to hover in one location for an indefinite period of time (limited by battery life), for example when a human operator releases the control sticks. Solutions involving only an IMU are inadequate. The system should allow recovery from perturbations due to wind or air currents.
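One common way to realize station keeping is a proportional-derivative loop on visually estimated drift; the sketch below assumes position error and velocity are already estimated (e.g. from optical flow with known height) and uses illustrative gains, not Centeye's implementation.

```python
# Sketch: one-axis PD station-keeping loop. pos_err is the drift from the
# hold point (m), vel_est the drift rate (m/s); output is a corrective
# acceleration command. Gains are illustrative.

KP, KD = 1.2, 0.4

def hold_command(pos_err: float, vel_est: float) -> float:
    """Corrective acceleration opposing drift on one axis."""
    return -(KP * pos_err + KD * vel_est)
```

The derivative term is what lets the vehicle damp out perturbations from wind or air currents rather than oscillating around the hold point; an IMU alone cannot supply `pos_err`, which is why a visual reference is required.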

Obstacle avoidance: The system should allow the UAS to avoid obstacles in the flight path, including overriding the control input provided by a human operator. In particular for hovering platforms, the system should avoid obstacles in all directions.
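The override behavior can be sketched as a simple arbitration step between the operator's commanded velocity and the sensed obstacle ranges; the function and threshold below are hypothetical illustrations of the principle, not the flight code.

```python
# Sketch: obstacle-avoidance override. Any velocity component pointing
# toward a too-close obstacle is zeroed, overriding the operator.
# Names and the threshold are hypothetical.

SAFETY_MARGIN_M = 0.5

def arbitrate(operator_vel, ranges_m):
    """Zero velocity components that point toward a too-close obstacle.

    operator_vel: dict axis -> signed velocity (m/s), e.g. {"x": 0.5}
    ranges_m: dict direction -> nearest obstacle range (m),
              with directions like "+x", "-x", "+y", "-y".
    """
    safe = {}
    for axis, v in operator_vel.items():
        direction = ("+" if v > 0 else "-") + axis
        if v != 0 and ranges_m.get(direction, float("inf")) < SAFETY_MARGIN_M:
            safe[axis] = 0.0  # override: obstacle within safety margin
        else:
            safe[axis] = v
    return safe
```

Because a hovering platform can drift in any direction, the range map must cover all directions, which is the motivation for the 360-degree stereo described above.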

Centeye has a long history of implementing vision systems to address the above concerns. For more information, contact us.
