The challenge tasked us with examining the “potential impacts on reduced human traffic” in protected environments such as beaches and parks, impacts which could manifest in different forms: reduced land degradation, changes in water quality, or changes in vegetation growth. Our project provides data relating to each of these three factors (surface albedo, sea surface temperature (SST), and the Enhanced Vegetation Index (EVI), respectively) and allows users to track changes in these factors over time, so that they can pinpoint potential deviations from the trend in data collected after lockdown measures were implemented.
The impact of COVID-19 is one that every member of our team has felt in some way. Although we consider ourselves lucky, all of us have missed out on some of the amazing experiences, such as graduation, prom, and recitals, that we would have gotten to enjoy in a world without the virus. We are thankful for the opportunity to participate in the NASA SpaceApps Challenge because it helps us feel like we are contributing to the fight against the coronavirus rather than being helpless bystanders as the disease changes life as we know it.
We were inspired to choose this challenge not only because our work could be applied on a global scale, but because we could use data that related directly to a location near us. Long Island Sound, the area on which we chose to focus our study, has historically been impacted greatly by human traffic. The challenge not only allowed us to visualize the effect that a reduction in human traffic had on this area, but also provided insight into these effects on a global scale.
We started by downloading our data from the NASA Giovanni visualizer and NASA Earthdata Search in HDF format (for EVI) and netCDF format (for SST and surface albedo) for May of 2016, 2017, 2018, 2019, and 2020. We then used Panoply to visualize the data and zoomed in on the Long Island region, ensuring that we were consistent with coordinates. The visualization showed the level of EVI, SST, or surface albedo on a nine-color scale, with more vibrant colors depicting higher levels of the given variable and paler colors depicting lower levels. After saving each visualization as a PNG, we ran it through a Python image processor that we wrote, which counted the frequency of each color on the scale. We displayed our data on a website made with ReactJS using the Material-UI library. On this site, we show plots of the frequency of each color versus time for all three of our data sets and also provide sample images of the data we processed. The site also contains additional information about the variables we used.
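The color-counting step described above can be sketched roughly as follows. This is a minimal illustration, not our exact processor: the nine RGB values in PALETTE are placeholders standing in for the actual Panoply color-scale values, and each pixel is simply assigned to its nearest palette color.

```python
from collections import Counter
from PIL import Image  # Pillow

# Hypothetical nine-color scale, pale -> vibrant. The real values would be
# sampled from the Panoply color bar used for the exported PNGs.
PALETTE = [
    (247, 251, 255), (222, 235, 247), (198, 219, 239),
    (158, 202, 225), (107, 174, 214), (66, 146, 198),
    (33, 113, 181), (8, 81, 156), (8, 48, 107),
]

def nearest_bin(pixel):
    """Index of the palette color closest to the pixel (squared RGB distance)."""
    return min(range(len(PALETTE)),
               key=lambda i: sum((p - q) ** 2 for p, q in zip(pixel, PALETTE[i])))

def color_frequencies(image):
    """Count how many pixels of the image fall into each of the nine bins."""
    counts = Counter(nearest_bin(px) for px in image.convert("RGB").getdata())
    return [counts.get(i, 0) for i in range(len(PALETTE))]
```

Plotting each bin's count against the year then gives the frequency-versus-time curves shown on the site.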
Some problems we faced during development included learning how to navigate NASA data and understanding how to properly interpret and extract data from certain file types (such as HDFs). Time was an issue as well, and we scrambled to build the aspects of our app that we considered crucial rather than worrying about some of the finer details. We are high schoolers, and this is our first hackathon, so we anticipated that the collaboration aspect of the challenge would be difficult; luckily, with the help of VS Code Live Share and GitHub, we were able to share code and help each other out effectively. Every new commit was a huge accomplishment for our team, and we shared status updates every now and then in our team group chat, supporting one another in our work.
Water temperature: NASA Aqua MODIS SST data from Giovanni
Surface Albedo: NASA Goddard Earth Sciences Data and Information Services Center surface albedo data from Giovanni
EVI: Terra MODIS Vegetation Indices from Earthdata Search