
Categories

Category 1


In this category, responders would upload image files (either RGB or thermal) to the CVER Toolkit and then pick one or more of three common enhancements. This lets them “see” the signal in the noise or separate out areas of interest. OpenCV has good implementations of these functions that can be adapted with appropriate attribution.

Basic Operations: In this category, coders provide the following three basic algorithms. These are straightforward, entry-level programming projects and a good introduction to computer vision; a minimal OpenCV sketch follows the list.

 

  • image enhancement through histogram equalization, which can be applied to electro-optical (visible light cameras) and thermal imagery

  • color segmentation for a range

  • grayscale segmentation for a range in a thermal image

 
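A minimal sketch of what these three operations might look like with OpenCV in Python is below. The file names and threshold values are placeholders, not part of the challenge specification.

```python
# Sketch of the three Category 1 operations using OpenCV.
# File paths and threshold ranges below are placeholders.
import cv2
import numpy as np

# 1. Histogram equalization. For RGB imagery, equalize the luminance
#    channel so colors are not distorted; thermal imagery can be
#    equalized directly as a single grayscale channel.
rgb = cv2.imread("example_rgb.jpg")                                 # placeholder path
ycrcb = cv2.cvtColor(rgb, cv2.COLOR_BGR2YCrCb)
ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
equalized = cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)

# 2. Color segmentation for a range (HSV is easier to threshold than BGR).
hsv = cv2.cvtColor(rgb, cv2.COLOR_BGR2HSV)
lower, upper = np.array([20, 80, 80]), np.array([35, 255, 255])     # example range
color_mask = cv2.inRange(hsv, lower, upper)

# 3. Grayscale segmentation for a range in a thermal image.
thermal = cv2.imread("example_thermal.jpg", cv2.IMREAD_GRAYSCALE)   # placeholder path
thermal_mask = cv2.inRange(thermal, 200, 255)                       # example "hot" band

cv2.imwrite("equalized.jpg", equalized)
cv2.imwrite("color_mask.png", color_mask)
cv2.imwrite("thermal_mask.png", thermal_mask)
```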

The dataset for testing is here.

Category 2
 

SUAS often fly mapping missions in which the camera points straight down (nadir) and collects a series of images. Responders then examine those images in detail and, based on what they see, may want to send someone into the field to that location, so they need to get a GPS location. Or they might want to grid out an area to specify the boundaries of a search. However, most groups do not use GPS coordinates but rather the US National Grid (USNG).

 

GIS operations: In this category, coders must provide a system that, for each nadir image in a set, does the following (a geometry sketch follows the list):

  • given the camera FOV and altitude, overlays a USNG grid at a user-specified interval (e.g., every 50, 100, or 200 meters)

  • gives the GPS coordinates of the pixel under the cursor as the cursor is rolled over the image

  • given a set of images and a GPS or USNG coordinate with a bounding box, finds all images in the set that have a pixel intersecting that location (this would be used if a responder thought they saw something in one image but wanted other views to help confirm or disambiguate)

  • dataset link
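To make the geometry concrete, below is a minimal sketch of the pixel-to-GPS step for a single nadir image. It assumes a level camera pointed straight down, flat terrain, and a simple flat-earth conversion from meters to degrees; the function name and parameters are illustrative, not part of the challenge specification.

```python
# Sketch of mapping an image pixel to a GPS coordinate for a nadir image,
# assuming a level, straight-down camera and a flat ground plane.
import math

def pixel_to_gps(px, py, img_w, img_h,
                 center_lat, center_lon, altitude_m,
                 hfov_deg, heading_deg=0.0):
    """Return (lat, lon) of pixel (px, py) in a nadir image.

    hfov_deg    horizontal field of view of the camera
    altitude_m  height above ground level
    heading_deg aircraft yaw, measured clockwise from north
    """
    # Ground footprint width from FOV and altitude, then meters per pixel (GSD).
    ground_w = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    gsd = ground_w / img_w                          # assumes square pixels

    # Offset of the pixel from the image center, in meters (x right, y down).
    dx = (px - img_w / 2.0) * gsd
    dy = (py - img_h / 2.0) * gsd

    # Rotate by heading so the offsets align with east/north.
    theta = math.radians(heading_deg)
    east = dx * math.cos(theta) - dy * math.sin(theta)
    north = -(dx * math.sin(theta) + dy * math.cos(theta))

    # Simple flat-earth conversion from meters to degrees.
    lat = center_lat + north / 111_320.0
    lon = center_lon + east / (111_320.0 * math.cos(math.radians(center_lat)))
    return lat, lon
```

The same meters-per-pixel (ground sample distance) value can be used to space the USNG grid overlay, and the resulting latitude/longitude can be converted to USNG with an existing MGRS/USNG library, since USNG shares MGRS notation over the United States.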


DJI Mavic Pro (DJI Mavic Pro Specs):

  • FOV 78.8°

  • 26 mm (35 mm format equivalent) f/2.2 

  • Distortion < 1.5% 

  • Focus from 0.5 m to ∞

 

DJI M210 (DJI M210 Specs):

  • FOV: 63.7°(Wide) – 2.3°(Tele)


This category requires more mathematical analysis and helps students connect computer vision to the camera payload and to the fact that different small UAS carry different cameras. It also introduces programmers to USNG and to the fact that different groups use different geographic coordinate systems. The program will need to read the geotag embedded in each .jpg file.
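A minimal sketch of reading that geotag with Pillow's EXIF support is below; the helper name is illustrative, and it assumes the images carry standard EXIF GPSInfo tags, as DJI aircraft normally write.

```python
# Sketch of reading the EXIF GPS geotag from a .jpg (helper name is illustrative).
from PIL import Image
from PIL.ExifTags import GPSTAGS

def read_geotag(path):
    """Return (latitude, longitude) in decimal degrees from a geotagged .jpg."""
    exif = Image.open(path).getexif()
    gps_ifd = exif.get_ifd(0x8825)                      # 0x8825 = GPSInfo IFD
    gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_ifd.items()}

    def dms_to_degrees(dms, ref):
        degrees, minutes, seconds = (float(v) for v in dms)
        decimal = degrees + minutes / 60.0 + seconds / 3600.0
        return -decimal if ref in ("S", "W") else decimal

    lat = dms_to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"])
    lon = dms_to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"])
    return lat, lon
```

From there, converting to USNG could be delegated to an MGRS/USNG conversion library rather than implemented by hand.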

Category 3


Open: In this category, entrants can submit any algorithm they believe will be of value. For example, they may use the CRASAR imagery datasets (hrail.crasar.org) for wilderness search and rescue to train a machine learning algorithm. This allows more advanced students to be creative or implement algorithms from papers.

 

A challenging example, a video from a thermal camera in which a person is walking, is here.

 

Other still image and video datasets from Hurricanes Harvey, Irma, and Michael and from the Kilauea volcanic eruption are available at hrail.crasar.org.


DJI Mavic Pro (DJI Mavic Pro Specs):

  • FOV 78.8°

  • 26 mm (35 mm format equivalent) f/2.2 

  • Distortion < 1.5% 

  • Focus from 0.5 m to ∞

 

DJI Mavic Air (DJI Mavic Air Specs):

  • FOV: 85°

  • 35 mm (Format Equivalent: 24 mm)

  • Aperture: f/2.8

  • Shooting Range: 0.5 m to ∞

 

DJI Phantom 4 (DJI Phantom 4 Specs):

  • FOV 94°

  • 20 mm (35 mm format equivalent)

  • f/2.8 focus at ∞

 

DJI Phantom 4 Pro (DJI Phantom 4 Pro Specs):

  • FOV 84° 

  • 8.8 mm/24 mm (35 mm format equivalent) 

  • f/2.8 - f/11 auto focus at 1 m - ∞


DJI M210 (DJI M210 Specs):

  • FOV: 63.7°(Wide) – 2.3°(Tele)

