We finally got around to flying and recording video this past weekend. We put out targets and recorded two videos, one about eight minutes long and the other slightly longer. Unfortunately, the second file seems to have become corrupted, so we can't view it for now. We've tried hex-editing the header of the AVI file to repair it, but we haven't had any luck so far.
The flight tests were going very well until one of the flight batteries powering the motor failed on us. The motor shut off, forcing a rather rough emergency landing, so we won't fly for another one to two weeks while the plane is being repaired.
The video files are very large (several GB), and I haven't compressed them for upload yet.
We did realize one very important thing, though: we need to deinterlace our video. We use a Sony FCB-HD11 camera, which can capture either HD or SD. Right now our transmitter is SD-only, so we have to use the NTSC format, which is interlaced video at 29.97 frames (59.94 fields) per second.
This video looks great when viewed on a device designed to display interlaced video, like an older television. However, when viewed on a progressive display, like a computer monitor, it looks terrible.
Here's an example of what interlaced video looks like when displayed progressively:
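To see where the artifact comes from, here is a minimal NumPy sketch (a toy illustration, not our pipeline): the two fields of an interlaced frame are captured about 1/60 s apart, so when an object moves between field captures and the fields are simply woven together, the even and odd rows disagree, producing the "combing" effect.

```python
import numpy as np

H, W = 8, 8

def scene(x):
    """Toy scene: a vertical bright bar at column x, standing in for a moving object."""
    img = np.zeros((H, W), dtype=np.uint8)
    img[:, x] = 255
    return img

# Field A (even rows) sees the bar at column 2; ~1/60 s later,
# field B (odd rows) sees it at column 5.
field_a = scene(2)[0::2]
field_b = scene(5)[1::2]

# Weave the two fields into one progressive frame.
woven = np.zeros((H, W), dtype=np.uint8)
woven[0::2] = field_a
woven[1::2] = field_b
# Even rows light up column 2, odd rows column 5: the bar appears
# "combed" into alternating offset lines on a progressive display.
```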
We can use video editing software to deinterlace the recorded footage, but we need a real-time solution to use until we can acquire an HD transmitter and switch to 720p.
There is a lot of literature on the subject, with approaches ranging widely in complexity and effectiveness. This site has a good overview, and Deinterlacing—An Overview is a great paper surveying many approaches to the problem.
Our first effort will be to implement a simple deinterlacing scheme that keeps either the odd or the even field lines and interpolates the values in between (code to do this can be found for both OpenCV and GLSL). Our videos have a lot of motion, so we will likely need something more robust.
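The simple scheme can be sketched in a few lines of NumPy (a minimal illustration; the function name and the choice of linear averaging are ours, and a shader or OpenCV version would follow the same logic): keep one field's rows and rebuild each discarded row as the average of its kept neighbors.

```python
import numpy as np

def deinterlace_linear(frame: np.ndarray, keep_even: bool = True) -> np.ndarray:
    """Single-field deinterlace: drop one field, linearly interpolate its rows.

    `frame` is a (rows, cols[, channels]) array. Rows of the discarded
    field are replaced by the average of the kept rows above and below;
    border rows simply copy their nearest kept neighbor.
    """
    out = frame.astype(np.float32)
    h = out.shape[0]
    # Rows belonging to the field we discard.
    drop = range(1, h, 2) if keep_even else range(0, h, 2)
    for r in drop:
        above = out[r - 1] if r > 0 else out[r + 1]
        below = out[r + 1] if r < h - 1 else out[r - 1]
        out[r] = 0.5 * (above + below)
    return out.astype(frame.dtype)
```

This trades half the vertical resolution for a comb-free image, which is why the post notes it may not hold up under heavy motion.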
More advanced techniques use both temporal and motion information to deinterlace, such as those found in A Method of De-Interlacing with Motion Compensated Interpolation, Motion and Edge Adaptive Interpolation De-interlacing Algorithm, or Deinterlacing with Motion-Compensated Anisotropic Diffusion.
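The core idea behind the motion-adaptive family can be sketched simply (this is an illustrative toy, not an implementation of any of the papers above; the function name and the fixed threshold are assumptions): where a pixel hasn't changed since the previous frame, keep the woven field data at full resolution; where it has moved, fall back to spatial interpolation to avoid combing.

```python
import numpy as np

def deinterlace_motion_adaptive(prev_frame: np.ndarray,
                                cur_frame: np.ndarray,
                                keep_even: bool = True,
                                thresh: float = 16.0) -> np.ndarray:
    """Toy motion-adaptive deinterlacer.

    `cur_frame` is the woven interlaced frame. For each row of the
    opposite field, a per-pixel motion mask (absolute difference versus
    `prev_frame`) decides between weaving (static: keep the field line)
    and bobbing (moving: average the neighboring kept rows).
    """
    cur = cur_frame.astype(np.float32)
    prev = prev_frame.astype(np.float32)
    h = cur.shape[0]
    out = cur.copy()
    drop = range(1, h, 2) if keep_even else range(0, h, 2)
    for r in drop:
        above = out[r - 1] if r > 0 else out[r + 1]
        below = out[r + 1] if r < h - 1 else out[r - 1]
        interp = 0.5 * (above + below)
        motion = np.abs(cur[r] - prev[r]) > thresh  # per-pixel motion mask
        out[r] = np.where(motion, interp, cur[r])   # bob if moving, weave if not
    return out.astype(cur_frame.dtype)
```

Full motion-compensated methods go further, estimating motion vectors and interpolating along them rather than just detecting change, which is what makes them both better and considerably more expensive.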
Deinterlacing concerns aside, I was not too happy with the video of the targets we did acquire. Currently we are using a fixed, downward-looking camera and have no control over zoom once the plane has taken off. In about a week we will be able to control the camera via a wireless serial interface, but actually pointing the camera is still a ways off. We should also gain access to telemetry data at the same time, which will let us perform image rectification.
At this point it is premature to discuss object/character recognition algorithms in detail, since what our final imagery will look like is still in flux. Since we won't be flying for about two weeks, we will likely acquire "still" shots of the targets on campus from a high vantage point so we can make some progress on the recognition front.