
Centeye Nano Drone Video

Enjoy our most recent video showing our nano drone with 360-degree obstacle avoidance and auto-hover!

 


Centeye Autonomous Nano Drone with 360 Degree Stereo Vision

As part of our efforts to develop practical vision-based flight control, Centeye has developed a new version of our nano drone with 360-degree vision and obstacle avoidance. The system is built on a modified Crazyflie. Below are pictures and specifications for this system, which we are using for internal work and providing to our partners. The system is operational, and new videos will be released in the coming months.


Centeye nano unmanned aircraft system (UAS) with 360-degree stereo vision

  • 360-degree optical flow and obstacle perception
  • Operation from daylight to no light
  • Automatic launch to hover
  • Moves in direction commanded by human operator
  • Holds a position when control sticks released
  • Detects and avoids approaching obstacles, including textureless walls
  • 100% GPS-denied
  • Built on a Crazyflie platform
  • Size: rotors form a 6 cm square
  • Mass: 38 g including battery

 


Centeye Multi-Mode Stereo Sensor:

  • 2 x Centeye RockCreek™ vision chips, with two-mode pixels and analog contrast enhancement/edge detection circuitry
  • Custom wide field of view (up to 150°) optics
  • Optical flow
  • Stereo vision
  • Laser ranging
  • LED illumination
  • Mass: 1.0 gram
  • Power: as little as 60 mA @ 4 V (≈240 mW)
  • Neural network-based attention modulation

 


As part of an Air Force-funded project, Centeye has prototyped a vision-based system that allows small drones to both hover in place without GPS and visually detect nearby objects to avoid collisions. The video below shows sample flights in an indoor residence, taken in November 2015; it contains annotations that are best viewed on a laptop or desktop computer. A more detailed write-up is available on the site of our good friends at Tandem NSI.

 


Internet of Things (IoT) car traffic counter camera using an ArduEye and Xively

Eager to make a little foray into the “Internet of Things”, I decided to experiment with using an ArduEye as an “Eye for the IoT”. My house is on a fairly busy street, and my attic home office gives me a good vantage point over it. A car traffic counter seemed like a good choice for a first project.

 

I programmed an ArduEye Aphid to detect cars using a very simple algorithm: the ArduEye grabs a small block of pixels ten times a second, and if any pixel in that block changes by more than a threshold, that counts as a car detection. The block of pixels sits on the path of cars headed northbound on my street. To prevent the same car from being counted twice, I added the requirement that the pixels quiet down and stay quiet for a certain interval (about half a second) before another detection can be registered.
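
Here is roughly what that logic looks like as an Arduino sketch. To be clear, this is a minimal reconstruction rather than my actual code: readBlock() is a placeholder for the ArduEye acquisition call, and the block size, threshold, and quiet time are illustrative.

    // Minimal sketch of the detection logic described above.
    // readBlock() is a placeholder for the real ArduEye acquisition
    // call; block size, threshold, and quiet time are illustrative.

    const int BLOCK_SIZE = 16;           // pixels in the monitored block
    const int THRESHOLD = 20;            // per-pixel change that counts
    const unsigned long QUIET_MS = 500;  // required quiet time between cars

    int prevBlock[BLOCK_SIZE];
    int currBlock[BLOCK_SIZE];
    unsigned long lastActivity = 0;
    bool armed = true;                   // ready to count the next car
    unsigned int carCount = 0;

    // Placeholder: replace with the ArduEye pixel-block readout.
    void readBlock(int *dest, int n) {
      for (int i = 0; i < n; i++) dest[i] = analogRead(A0);
    }

    void setup() {
      Serial.begin(9600);
      readBlock(prevBlock, BLOCK_SIZE);
    }

    void loop() {
      readBlock(currBlock, BLOCK_SIZE);

      bool active = false;               // did any pixel exceed the threshold?
      for (int i = 0; i < BLOCK_SIZE; i++) {
        if (abs(currBlock[i] - prevBlock[i]) > THRESHOLD) active = true;
        prevBlock[i] = currBlock[i];
      }

      if (active) {
        if (armed) {                     // first activity after a quiet spell
          carCount++;
          Serial.println(carCount);      // one line per detected car
          armed = false;                 // ignore until the block quiets down
        }
        lastActivity = millis();
      } else if (!armed && millis() - lastActivity > QUIET_MS) {
        armed = true;                    // quiet long enough; re-arm
      }

      delay(100);                        // ~10 frames per second
    }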

 

Next was to upload the data feed to COSM (formerly Pachube, now Xively). I made use of an excellent book, Building Internet of Things with the Arduino by Charalampos Doukas, and used an Arduino Uno to bridge the ArduEye Aphid to the Internet via a WiFly shield. COSM’s API was simple enough to follow, and for the first time one of my image sensor chips was feeding real-time data to the Internet!

 

Here is a link to my data feed on COSM. I’m uploading three datastreams. The first is a simple count of how many cars are detected per minute. The second shows the raw intensity value of one pixel in the window- a fun way to monitor the change from day to night. (This datastream is a bit glitchy- I haven’t figured that out yet.) The third shows the largest “delta intensity” change of any pixel between two successive frames. The picture below shows a screenshot of the car-count datastream over a six-hour period on February 19, 2013. You can clearly see traffic rise with rush hour towards the end of the plot.
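
For the curious, COSM’s v2 API accepted simple CSV updates, so each upload from the Uno amounted to an HTTP request roughly like the one below. The feed ID, API key, and datastream names are placeholders, and I’m writing the format from memory, so treat it as a sketch rather than gospel.

    PUT /v2/feeds/FEED_ID.csv HTTP/1.1
    Host: api.cosm.com
    X-ApiKey: YOUR_API_KEY
    Content-Length: 50

    cars_per_minute,3
    raw_intensity,512
    max_delta,87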

 

 

The datastream you see may differ, depending on the current traffic and whether the camera is alive at the moment you look.

 

This first traffic monitor is very simple and certainly far from perfect. But it was fun to build, and it gives me a few ideas for other things to try. Here’s an interesting question: what if I could make a version of the Stonyman image sensor chip that drew just a few hundred microwatts of power? Could I then hook it up to a low-power microcontroller and monitor traffic for, say, a year on a single battery charge? I think it would be fun to try that out…
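
A quick back-of-the-envelope check, assuming a 2000 mAh battery (an assumed figure):

    \frac{2000\ \mathrm{mAh}}{365 \times 24\ \mathrm{h}} \approx 228\ \mu\mathrm{A}

So the whole camera, sensor plus microcontroller, would have to average a couple hundred microamps, which at around 3 V is well under a milliwatt- hence the sensor itself needing to get down to a few hundred microwatts.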


BCIT students make eye tracking computer input device, for ALS patients, using Centeye chip

This past January, I was contacted by Alex Sayer, Alan Kwok, and Benny Chick, students at the British Columbia Institute of Technology (BCIT) in Vancouver, who wanted to use some of our Tam2 chips for a class project: a human-computer interface (HCI) for people who suffer from ALS (Lou Gehrig’s disease) and cannot operate a computer with their hands. Their idea: use a low-resolution image sensor, mounted on a pair of eyeglasses and pointed at the eye, to track where the eye is looking and generate signals that emulate a mouse. The user would move his or her eye in different directions, causing the mouse cursor to move accordingly, and specific eye-movement patterns would generate mouse clicks and so forth. Intrigued, I sent them some Tam2 chips to play with…
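
To sketch the idea (my own illustrative take, not the students’ code): with a low-resolution sensor like the Tam2, one approach is to track the pupil as the centroid of the dark pixels and map its offset from a calibrated rest position into cursor motion. The resolution, threshold, and readout call below are all assumptions.

    // Illustrative eye-tracking idea: find the centroid of dark (pupil)
    // pixels and treat its offset from a calibrated rest position as a
    // gaze direction. Not the BCIT team's actual implementation.

    const int W = 16, H = 16;            // low-res image (illustrative)
    const int DARK = 40;                 // pixels darker than this = pupil

    int img[H][W];
    float cx0 = W / 2.0, cy0 = H / 2.0;  // calibrated rest position

    // Placeholder: replace with the real Tam2 readout.
    void grabImage() {
      for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
          img[y][x] = analogRead(A0) >> 2;   // stand-in, 8-bit range
    }

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      grabImage();
      long sx = 0, sy = 0, n = 0;
      for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
          if (img[y][x] < DARK) { sx += x; sy += y; n++; }
      if (n > 0) {
        float dx = (float)sx / n - cx0;  // horizontal gaze offset, pixels
        float dy = (float)sy / n - cy0;  // vertical gaze offset, pixels
        // dx, dy would drive mouse-cursor velocity over serial/USB
        Serial.print(dx); Serial.print(' '); Serial.println(dy);
      }
      delay(50);                         // ~20 updates per second
    }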

 

Four months later, they had a working prototype they call the “eyeSelect”, and Alan Janzen, a patient suffering from ALS, was using this device to play solitaire and other games on a computer! These three students won the top prize in the 2012 Dr. Jim McEwen Excellence in Engineering design competition. Nice!!

 

Below are links to news media articles on their achievement, with pictures and video!

Metro News Canada

The Province

24hrs Canada

Global BC

 

 


What can you do with an ArduEye?

We’ve had a lot of inquiries about our ArduEye system, and we’ve just prototyped a smaller, completely self-contained ArduEye (more on this in another post). So I figure it makes sense to discuss what is actually possible with one of these devices. True, the ATmega328 processing engine of an Arduino is limited compared to more advanced DSPs, but the reality is that many applications really don’t need a whole lot of pixels. If you can get by with the specs of the ‘328 (16 MHz as an Arduino, 32 kB flash, and 2 kB SRAM), the small size of an ArduEye (we’ve gotten down to around 350 mg) and the ease of prototyping new sensors with one (I can prototype a new sensor with just a few hours of coding) make it a good development platform and reference design. So below are some sample applications:

  • Monitor road traffic- count cars going each direction, measure speed, see if traffic is stopped
  • Count pedestrians walking down a hallway or through a gate
  • Monitor an environment for intruders- are there people entering an area you’d rather keep private, or that is dangerous? Are there rabbits invading your vegetable garden?
  • Give the Internet some eyes- Connect the ArduEye to the Internet, and use a service such as COSM to let anyone in the world pull data off your ArduEye. You can do this a thousand times in an area and have some fun. Hello Web 3.0!
  • Monitor presence of people in a room- Beyond a motion sensor, are there people in a room, or entering/leaving it?
  • Control a robot- You can do a lot with limited resolution- detect collisions with walls or other objects, use two to make an RC car drive down a tunnel, implement a fast line follower, etc.
  • Control a drone- One ArduEye facing downward is adequate for controlling the height of a fixed-wing RC aircraft using optical flow. Two facing horizontally in opposite directions is barely enough to control the height of a quad. Using four horizontally you should be able to make a quad hover in place.
  • Make a laser range-finder. You’ll need a laser module (as light as 2 grams) and ideally an optical bandpass filter (see the triangulation sketch after this list).
  • Make an eye-tracker. A group of bright students at BCIT made an eye tracker using an ArduEye that allowed a person with ALS to control a computer using just his eyes!
  • Detect a ball being thrown.
  • Measure relative speed of two objects: Mount the ArduEye on one object, and image another object, and measure the optical flow. This can support all sorts of industrial control applications. For added accuracy you can place markings on the surface being imaged and even count markings.
  • Monitor your pet- Is your pet awake or asleep? Moving or staying put?
  • Baby monitor- Same for your baby.
  • Interactive art installation- why not make a sculpture that interacts with people rather than just blinking lights on its own?
  • Track the sun- If using flat printed optics, you can track the sun from just after sunrise to just before sunset, and know if it is cloudy or not. Perhaps you can use this information to steer a movable solar panel.
  • Make a game, or game controller- between measuring motion, detecting whether someone has crossed a line, and detecting whether something has moved, there are many games that could be constructed with an ArduEye.

Some of the above examples have actually been prototyped, either in our lab or by others. The rest are examples we are pretty sure are doable within the memory constraints of an Arduino, though we haven’t actually attempted them yet.
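
To make one of these concrete, the laser range-finder item comes down to simple triangulation: mount the laser a known baseline from the lens, aimed parallel to the optical axis, find the bright spot in the image, and the spot’s pixel offset from the image center gives the range. A sketch, with the baseline and focal length as assumed values:

    // Triangulation for the laser range-finder idea. A laser mounted a
    // baseline B from the lens, parallel to the optical axis, images at
    // pixel offset x from the image center; the target distance is then
    // d = f * B / x. The constants below are illustrative.

    const float BASELINE_M = 0.05;   // laser-to-lens baseline: 5 cm (assumed)
    const float FOCAL_PX   = 100.0;  // focal length in pixel units (assumed)

    // Find the brightest pixel in a 1-D scanline; with an optical
    // bandpass filter in place, this should be the laser spot.
    int findSpot(const int *row, int n) {
      int best = 0;
      for (int i = 1; i < n; i++)
        if (row[i] > row[best]) best = i;
      return best;
    }

    // Convert the spot's pixel offset from the image center to meters.
    float rangeFromSpot(int spotIdx, int centerIdx) {
      int x = spotIdx - centerIdx;
      if (x <= 0) return -1.0;       // spot at or past center: no reading
      return FOCAL_PX * BASELINE_M / x;
    }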


A 350mg, Omnidirectional 6DOF Optical IMU/Optical Flow Sensor

In a previous post, I demonstrated that the ArduEye platform could be used to prototype a 6DOF vision system for optical flow odometry. The goal is to make a vision system for the Harvard University Robobee Project.

 

After the success of the prototype, the next step was to design a board that was as small and light as possible. The result is shown below:

 


The ArduEye prototype (left) and the finished sensor (right)

 


Main components of vision sensor

 

The vision system consists of two back-to-back Stonyman vision chips, an Atmel ATmega328P microcontroller, a 16 MHz oscillator, and a voltage regulator. The chips have flat printed optics (as described previously) with slits, in order to take one-dimensional images of the environment. Even better, the ATmega runs the Arduino bootloader, so the sensor is an Arduino clone and can be programmed through the Arduino IDE. The entire system weighs approximately 300-350 milligrams and measures 8×11 millimeters.
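
To give a flavor of the processing, here is one common way to estimate 1-D optical flow from a pair of successive slit images: search for the pixel shift that minimizes the sum of absolute differences (SAD). This is an illustrative method, not necessarily the algorithm running on this sensor.

    // Illustrative 1-D optical flow: find the integer pixel shift that
    // minimizes the sum of absolute differences between the previous and
    // current slit images. Not necessarily the sensor's actual algorithm.

    const int N = 32;          // pixels per slit image (illustrative)
    const int MAX_SHIFT = 4;   // search range, in pixels

    // Returns the estimated shift of curr relative to prev, in pixels;
    // positive means the image moved toward higher indices.
    int flow1D(const int *prev, const int *curr) {
      long bestSAD = 2147483647L;
      int bestShift = 0;
      for (int s = -MAX_SHIFT; s <= MAX_SHIFT; s++) {
        long sad = 0;
        for (int i = MAX_SHIFT; i < N - MAX_SHIFT; i++) {
          long d = (long)curr[i] - prev[i - s];
          sad += (d < 0) ? -d : d;
        }
        if (sad < bestSAD) { bestSAD = sad; bestShift = s; }
      }
      return bestShift;
    }

With several such 1-D flows arranged around the body, sums and differences of flows from opposite-facing eyes help separate the rotational motion components from the translational ones.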

 

The following video shows that motion along all six axes can be distinguished. Some axes are stronger than others, and the Y translation, in particular, is weak. However, the results are promising and with a little optimization this could be a useful addition to a sensor suite.

 

I’d like to gauge interest in an integrated Arduino-clone vision sensor similar to this, but maybe not as compact and minimal. This would most likely be a one-sided vision chip with optics and an Arduino-clone processor integrated on a small, single board. It would be about the size of a penny and weigh about half a gram. The user would have control over which pixels are read and how they are processed, through the Arduino environment.

 

 


New ArduEye using Stonyman image sensor chips

A while ago we (Centeye) started ArduEye, a project to implement an open-source programmable vision sensor built around the Arduino platform. The first ArduEye version used a simple Tam image sensor chip and a plastic lens attached directly to the chip. After much experimentation and some feedback from users, we now have a second-generation ArduEye.

The second-generation ArduEye is meant to be extremely flexible, ultimately allowing one to implement a wide variety of different sensor configurations. A basic, complete ArduEye is shown below and contains the following basic components:

(Image: ArduEye with a Stonyman breakout board on an Arduino UNO)

An Arduino- Currently we support Arduino UNO-sized boards (e.g. UNO, Duemilanove, Pro) and the Arduino MEGA. When the ARM-based DUE comes out, we will surely support that as well.

A shield board- This board plugs into the Arduino and has a number of places to mount one or more image sensor breakout boards. The shield also has places for an optional external ADC as well as additional power supply capacitors, if desired.

A Stonyman image sensor on a breakout board- The Stonyman is a Centeye-designed 112×112-pixel image sensor chip with an extremely simple interface: five digital lines in, which are pulsed in predefined sequences, and one analog line out, which carries the pixel value. The Stonyman chips are wirebonded directly to a 1-inch-square breakout board, which plugs into the shield. (A bare-bones readout sketch appears after this list.)

Optics- Possibilities include printed pinholes, printed slits, and cell-phone camera lenses, depending on what you want to do.

Example application- The “application” is an Arduino sketch programmed into the Arduino. This sketch determines what the ArduEye does: one sketch can make it track bright lights, another can measure optical flow, and so on. Initially we are releasing a base sketch that demonstrates light tracking, optical flow, and odometry. Let us know what other example applications you would like to see.

ArduEye libraries- These libraries are installed in your Arduino IDE’s “libraries” folder, and include functions to operate the Stonyman image sensor chip as well as to acquire and process images, including measuring optical flow.

GUI- Finally, we created a basic GUI that serves as a visual dump terminal for the ArduEye. You can communicate with the ArduEye via either the GUI or the Arduino IDE’s serial terminal. The GUI was written in Processing.
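
To give a feel for the Stonyman’s five-line interface mentioned above, here is a bare-bones raster readout sketch. The pin numbers and the pointer/value register conventions are illustrative; the ArduEye libraries and the Stonyman datasheet are the authoritative reference.

    // Bare-bones Stonyman readout, illustrating the five-digital-lines-in,
    // one-analog-line-out interface. Pin numbers and the pointer/value
    // register conventions here are illustrative; see the Stonyman
    // datasheet and ArduEye libraries for the authoritative sequences.

    const int RESP = 2;    // reset register pointer
    const int INCP = 3;    // increment register pointer
    const int RESV = 4;    // reset selected register's value
    const int INCV = 5;    // increment selected register's value
    const int INPHI = 6;   // amplifier control (unused in this sketch)
    const int ANALOG_OUT = A0;

    void pulse(int pin) { digitalWrite(pin, HIGH); digitalWrite(pin, LOW); }

    // Point at on-chip register `ptr` and load it with `val`.
    // This sketch assumes register 0 = column select, 1 = row select.
    void setRegister(int ptr, int val) {
      pulse(RESP);
      for (int i = 0; i < ptr; i++) pulse(INCP);
      pulse(RESV);
      for (int i = 0; i < val; i++) pulse(INCV);
    }

    // Read one row of pixels; the pointer is left on the column-select
    // register, so pulsing INCV walks across the row.
    void readRow(int row, int *dest, int numCols) {
      setRegister(1, row);     // row select
      setRegister(0, 0);       // column select = 0; pointer stays here
      for (int c = 0; c < numCols; c++) {
        dest[c] = analogRead(ANALOG_OUT);
        pulse(INCV);           // advance to the next column
      }
    }

    void setup() {
      for (int p = RESP; p <= INPHI; p++) pinMode(p, OUTPUT);
    }

    void loop() {
      int row[112];
      readRow(56, row, 112);   // grab the middle row of the 112x112 array
      // ...process the row...
    }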

We designed the system to allow easy hacking, so that a wide variety of vision sensors can be implemented by exploring combinations of optics, image sensing, and image processing. I personally find it useful and actually use this system for prototyping at Centeye- I can put together a new vision sensor in just a couple of hours. The target applications are quite broad and include just about anything that may use embedded vision: robotics, sensor nets, industrial controls, interactive electronic sculptures (yes, this has come up), and so forth.

The video at the top shows some of the basic things you can do with this ArduEye. You’ll see the ArduEye interfacing with a host PC using both the Arduino IDE’s serial terminal and the ArduEye GUI. For more details, including links to the hardware design files and source code, go to the ArduEye wiki site. The site is a work in progress, but should be adequate to get people started. The sample “first application” and GUI are what were used to generate the above video.

Right now we have 200 Stonyman breakout boards being assembled- they should be ready within a month. We’ll make more if this is well received. We can also assemble a few in-house at Centeye- I’ll do that if enough people twist my arm and promise to really play with the hardware. 🙂

Please let me know your thoughts. In particular, are there any other “sample application” sketches you’d like us to implement?


RC micro helicopter hover (yaw and height) using millimeter-thick vision camera

As part of Centeye’s participation in the NSF-funded Harvard University Robobee project, we are trying to see just how small we can make a vision system that can control a small flying vehicle. For the Robobee project our weight budget will be on the order of 25 milligrams. The vision system for our previous helicopter hover demonstration weighed about 3 to 5 grams (two orders of magnitude more!), so we have a ways to go!

We recently showed that we can control the yaw and height (heave) of a helicopter using just a single sensor. This is an improvement over the eight-sensor version used previously. The above video gives an overview of the helicopter (a hacked eFlite Blade mCX2) and the vision system, along with two sample flights in my living room. Basically, a human pilot (Travis Young in this video) flies the helicopter around with standard control sticks (left stick = yaw and heave, right stick = swash plate servos) and, upon letting go of the sticks, the helicopter with the vision system holds yaw and heave. Note that there was no sensing on this helicopter other than vision- no IMU or gyro- and all sensing/image processing was performed on board the helicopter. (The laptop is for setup and diagnostics only.)
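
In control terms, the behavior boils down to two proportional loops: when the pilot releases a stick, the corresponding optical flow measurement is servoed toward zero. The toy code below is my own sketch of that idea- the gains, sign conventions, and flow inputs are placeholders, not the actual flight code.

    // Toy version of the two stick-release control loops: hold yaw by
    // nulling horizontal optical flow, hold heave by nulling vertical
    // flow. Gains and sign conventions are placeholders; this is not
    // the actual flight code.

    float KP_YAW = 1.0;      // yaw gain (placeholder)
    float KP_HEAVE = 1.0;    // heave gain (placeholder)

    // Placeholders: replace with the sensor's measured flow values.
    float horizontalFlow() { return 0.0; }  // + = scene drifting right
    float verticalFlow()   { return 0.0; }  // + = scene drifting down

    // Called once per frame; a "released" flag means that stick is
    // centered, so vision takes over that axis.
    void controlStep(bool yawReleased, bool throttleReleased,
                     float *yawCmd, float *throttleCmd) {
      if (yawReleased)      *yawCmd       = -KP_YAW   * horizontalFlow();
      if (throttleReleased) *throttleCmd -=  KP_HEAVE * verticalFlow();
    }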

The picture below shows the vision sensor itself- the image sensor and the optics together weigh about 0.2 g. Image processing was performed on a separate board with an Atmel AVR32 processor- that was overkill, and an 8-bit device could have been used.

(Image: the millimeter-thick Centeye camera board)

A bit more about the optics: In 2009 we developed a technique for “printing” optics on a thin plastic sheet, using the same photoplot process used to make masks for, say, printed circuit boards. We can print thousands of optics on a standard letter-size sheet of plastic for about $50. The simplest version is a plain pinhole, which can be cut out of the plastic and glued directly onto an image sensor chip- pretty much any clear adhesive should work. The picture below shows a close-up of a piece of printed optics next to an image sensor (the one below is a different sensor, the 125-milligram TinyTam we demonstrated last year).

(Image: close-up of printed optics next to an image sensor)

The principle of the optics is easy to understand- a cross-section is shown below. The plastic sheet has a higher index of refraction than air, so light from a near-hemisphere field of view can be focused onto a confined region of the image sensor chip. You won’t grab megapixel images this way, but it works well for the hundreds of pixels needed by hovering systems like this one.
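
The numbers behind that claim follow from Snell’s law. Taking a typical plastic index of refraction n ≈ 1.5 (an assumed value):

    \sin\theta_{\mathrm{air}} = n\,\sin\theta_{\mathrm{plastic}}
    \quad\Rightarrow\quad
    \theta_{\mathrm{plastic,max}} = \arcsin(1/n) \approx 41.8^\circ

So even light arriving at grazing incidence travels through the sheet within about 42° of the normal, which is what squeezes a near-hemispheric field of view onto a small patch of pixels beneath the pinhole.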

(Image: cross-section of the printed optics)

We are actually working on a new ArduEye system, using our newer Stonyman vision chips, to allow others to hack together sensors using this type of optics. A number of variations are possible, including using slits to sense 1D motion or pinhole arrays to make a compound eye sensor. If you want more details on this optics technique, you can pull up US patent application 12/710,073 on Google Patents.

(Sponsor Credit: “This work was partially supported by the National Science Foundation (award # CCF-0926148). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.”)
