Lituya Bay’s Apocalyptic Wave (Credit: NASA)
The flood event detection contest, organized by the NASA Interagency Implementation and Advanced Concepts Team in collaboration with the IEEE GRSS Earth Science Informatics Technical Committee, seeks to develop approaches for delineating open-water flood areas and thereby mapping the extent of flooding, an impactful disaster that occurs frequently throughout the world. The competition is a supervised learning task: participants develop algorithms to identify flood pixels after training against a set of labeled synthetic aperture radar (SAR) images. Participants are required to submit binary classification maps, and performance is evaluated using the intersection over union (IOU) score.
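For concreteness, here is a minimal sketch of how an IOU score can be computed for a pair of binary masks with NumPy; the organizers' exact evaluation code is not reproduced here, and the array names are illustrative only.

```python
import numpy as np

def iou_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection over union for binary (0/1) masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    # Convention assumed here: two empty masks count as a perfect match.
    return 1.0 if union == 0 else float(intersection) / float(union)

# Example with two random 256x256 masks:
rng = np.random.default_rng(0)
pred = rng.integers(0, 2, size=(256, 256), dtype=np.uint8)
truth = rng.integers(0, 2, size=(256, 256), dtype=np.uint8)
print(f"IOU: {iou_score(pred, truth):.4f}")
```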
This challenge aims to promote innovation in the detection of flood events and water bodies, as well as to provide objective and fair comparisons among methods. The ranking is based on quantitative accuracy metrics computed on undisclosed test samples. Participants will be given a limited time to submit their segmentation maps after the competition starts. The contest will consist of two phases:
Date | Phase | Event |
---|---|---|
April 14th | Development (Phase 1) | Contest opening |
April 15th | Development (Phase 1) | Release of training data with references and validation data without references; evaluation of submissions for the validation data set begins |
May 15th | Test (Phase 2) | Release of test data and validation references for model tuning; evaluation of test submissions begins |
June 15th | Test (Phase 2) | Evaluation of submissions stops |
July 16th | | Winner announcement |
The contest dataset is composed of 66,810 tiles (33,405 × 2, for the VV and VH polarizations) of 256×256 pixels, distributed across the training, validation, and test sets as follows: 33,405, 10,400, and 12,348 tiles per polarization, respectively. Each tile is a 3-channel RGB image produced by tiling 54 labeled GeoTIFF files generated from Sentinel-1 C-band synthetic aperture radar (SAR) imagery using the Hybrid Pluggable Processing Pipeline ("hyp3"). Training tiles correspond to intensity values for the VV and VH polarizations and are named with the following attributes:
```
<region>_<datetime>*_x-*_y-*_<vv | vh>.png
```
The provided training data is split across 29 root folders named `<region>_<datetime>*`, where `region` is the region and `datetime` the date and time of the flood event. Each root folder includes 4 sub-folders: `vv`, `vh`, `flood_label`, and `water_body_label`, with 2,068 files each. `vv` and `vh` contain the satellite images listed earlier, and the images in the `flood_label` and `water_body_label` folders provide the reference ground truth (see the sketch below).
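As an illustration, the following sketch pairs each `vv` tile with its `vh` counterpart and the two reference masks. The root path and the exact label file naming are assumptions inferred from the convention above, not official starter code.

```python
from pathlib import Path

# Hypothetical location of the extracted training data.
DATA_ROOT = Path("train")

for region_dir in sorted(DATA_ROOT.iterdir()):
    if not region_dir.is_dir():
        continue
    for vv_path in sorted((region_dir / "vv").glob("*_vv.png")):
        # Derive sibling file names from the shared tile prefix,
        # e.g. <region>_<datetime>*_x-*_y-*_vv.png -> ..._vh.png
        stem = vv_path.name[: -len("_vv.png")]
        vh_path = region_dir / "vh" / f"{stem}_vh.png"
        # Label file naming is assumed; adjust to the actual dataset layout.
        flood_path = region_dir / "flood_label" / f"{stem}.png"
        water_path = region_dir / "water_body_label" / f"{stem}.png"
        # ... load the four images and feed them to your pipeline ...
```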
Please note that the labeling has been performed as follows: the Hybrid Pluggable Processing Pipeline ("hyp3") takes the Sentinel-1 archive and runs a set of processes that generate the VV/VH amplitude or power imagery in a consistent way. The imagery is then converted to a 0–255 grayscale image. In total, 54 labeled GeoTIFF files from these regions were converted into tiles:
Each GeoTIFF is converted into multiple 256×256 tiles (scenes). Each reference ground truth file is likewise 256×256 pixels, and its pixel values (0 or 1) correspond to the two classes: 0 for no water and 1 for water (flood water in `flood_label`, open water bodies in `water_body_label`).
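A quick way to sanity-check a mask is to load it and inspect its shape and unique values. This sketch assumes the Pillow library and a hypothetical file path, and thresholds the image in case the classes are stored as 0/255 grayscale rather than raw 0/1 indices.

```python
import numpy as np
from PIL import Image

# Hypothetical path to one reference mask.
mask_path = "train/some_region/flood_label/example_tile.png"

mask = np.array(Image.open(mask_path).convert("L"))
print("shape:", mask.shape)         # expected: (256, 256)
print("values:", np.unique(mask))   # expected: a subset of {0, 1} or {0, 255}

# Normalize to the 0/1 classes described above.
binary = (mask > 0).astype(np.uint8)
```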
Visual examples of the training data and the high-resolution segmentation map used in the contest are shown in Figure 1.
Figure 1: Visual examples of the satellite images, (a) Sentinel-1 VV (grayscale) and (b) Sentinel-1 VH (grayscale), and the reference data, (c) flood and (d) water body, used in ETCI 2021.
We expect participants to provide a binary segmentation of the region of interest (ROI), i.e. 256×256 pixels, as a NumPy array with the byte (uint8) data type:
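For example, here is a minimal sketch of assembling per-tile predictions into such an array. Stacking all tiles into a single `.npy` file and the `submission.npy` name are illustrative assumptions; adapt them to the format specified on the submission portal.

```python
import numpy as np

# Suppose `predictions` is a list of per-tile binary masks from your model.
predictions = [np.zeros((256, 256), dtype=np.uint8) for _ in range(3)]

submission = np.stack(predictions).astype(np.uint8)
assert submission.dtype == np.uint8
assert set(np.unique(submission)) <= {0, 1}  # binary classes only

np.save("submission.npy", submission)  # hypothetical file name
```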
The reference data for the validation set remains undisclosed until Phase 1 concludes; the reference data for the test set remains undisclosed throughout and is used for the evaluation of the results.
The first, second, third, and fourth-ranked teams in the ETCI competition will be declared winners after a code review.
The ETCI competition chairs would like to thank the NASA Earth Science Data Systems Program, NASA Digital Transformation AI/ML thrust, and IEEE GRSS.
The data are provided for the purpose of participation in the 2021 ETCI competition. Any scientific publication using the data shall include a section “Acknowledgement”. This section shall include the following sentence: “The authors would like to thank the NASA Earth Science Data Systems Program, NASA Digital Transformation AI/ML thrust, and IEEE GRSS for organizing the ETCI competition”.
Q. Some cropped VV/VH images are almost completely white; is this expected?
A. Yes. The cropped images are acquired from GeoTIFFs that don't exactly align with a sliding crop window at the edges, which causes artifacts in those cropped VV/VH images, resulting in almost completely white tiles. However, they may still have a flood / water body label because the labels are produced by a different process. You may treat these images as noise.
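Since such tiles can be treated as noise, one common workaround is to filter out nearly white tiles before training. The 99% mean-intensity threshold below is an illustrative assumption, not a value prescribed by the organizers.

```python
import numpy as np
from PIL import Image

def is_mostly_white(path: str, threshold: float = 0.99) -> bool:
    """Flag tiles whose mean intensity is close to pure white (255)."""
    img = np.array(Image.open(path).convert("L"), dtype=np.float32)
    return img.mean() / 255.0 >= threshold

# Usage while building the training set:
#     if is_mostly_white(vv_path):
#         continue  # skip noisy edge tiles
```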
Q. How many submissions can be made?
A. For Phases 1 & 2: the maximum submission limit is 10 per day, and a total of 100 submissions can be made.
Position | Participant or Team name | IOU |
---|---|---|
1st | Team Arren (Xidian University) | 0.7681 |
2nd | Siddha Ganju (NVIDIA) & Sayak Paul (Carted) | 0.7654 |
3rd | Shagun Garg (GFZ Potsdam) | 0.7506 |