A New Perspective

Due to the COVID-19 pandemic, protected areas and other forms of wilderness areas (e.g., arboretums, beaches, parks, marine monuments) have been closed worldwide. Your challenge is to lead the effort to examine any potential impacts of reduced human traffic in such local protected natural environments.

OutDoorsy

Summary

The first tab, the environmental tab, gives the user essential information about their surroundings so that they can always be prepared before venturing out into nature. We used the provided NASA databases to collect data and serve it to the user; the tab returns humidity, air quality index, temperature, forecast, UV index, and cloud count. The second tab is a social media feed where users can post their favorite camping and nature spots so that everyone can appreciate the renewed abundance of nature. The third tab, the identifier, is the main feature of our app: it lets the user take pictures of plants and animals, and the app returns an identification. Our app uses machine learning to identify organisms.
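As a rough illustration of how the environmental tab could consume this data, the sketch below shows a client-side request for a set of coordinates and the shape of the payload it might expect back. The endpoint URL, field names, and helper function are assumptions for illustration only; they are not the actual OutDoorsy backend interface.

    # Hypothetical sketch of the environmental tab's request/response cycle.
    # The URL and field names are placeholders, not the real backend.
    import requests

    def fetch_environment(lat: float, lon: float) -> dict:
        """Ask the backend for environmental conditions at the given coordinates."""
        resp = requests.get(
            "https://example-outdoorsy-backend.appspot.com/environment",  # placeholder URL
            params={"lat": lat, "lon": lon},
            timeout=10,
        )
        resp.raise_for_status()
        # Expected payload shape, based on the fields listed above:
        # {"humidity": 0.62, "aqi": 41, "temperature_c": 18.5,
        #  "forecast": "partly cloudy", "uv_index": 3, "cloud_count": 2}
        return resp.json()

    if __name__ == "__main__":
        print(fetch_environment(34.05, -118.24))  # example coordinates (Los Angeles)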

How We Addressed This Challenge

Our project aims to make the user more aware of their environment in the wake of increased wildlife activity and the potentially dangerous encounters that come with it. Decreased human traffic would lead to growth in wildlife of all types, and with that growth comes increased danger. The boost in environmental liveliness comes at a cost: we all must be more careful. Invasive species may overrun areas that were once harmless. Our app lets the user recognize harmful animals and plants with a machine learning algorithm that can detect and identify potentially dangerous species, as seen in the demonstration video. This ML model can help protect users from what could otherwise be a dangerous encounter with a harmful organism. Decreased use of an area might also lead to increased pollen, a serious trigger for asthma attacks. Pollen levels are just one of the many things our app detects and notifies the user about, helping them stay aware of their surroundings. Users can always be prepared, since our app reports the weather conditions in their area.

How We Developed This Project

We were inspired by our desire to go outside and enjoy nature once again after lockdown passes. James had many bad experiences with nature during his time as a Cub Scout, and we wanted to create an app that could address that problem. We used the datasets given to us by NASA to build the weather page: the client sends the user's coordinates to the backend, and the backend retrieves the corresponding data from the datasets to return environmental information about the user's surroundings. For the machine learning algorithm behind the identifier, we built a deep convolutional neural network (DCNN). The DCNN was trained with 500 images in each plant and animal category to refine its detection accuracy. It was then deployed on Google Cloud to establish an exchange between the client and the backend: the client sends in an image via a POST request, and the backend returns the identification data. We ran into trouble configuring the datasets into the proper format to serve to the client, which we solved by running two endpoints, one for raw data and one for diluted data. We also had trouble training the DCNN: it took a long time, and a bug at the end of the first trial meant the network had to be trained for another couple of hours. However, we pulled it together in the end. We are proud to have achieved a working neural network for organism identification, since that was our goal from the beginning.
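For concreteness, here is a minimal sketch of what the identification endpoint described above could look like. The route name, model filename, class labels, and use of Flask and Keras are assumptions for illustration; the actual backend deployed on Google Cloud may be structured differently.

    # Minimal sketch of an image-identification endpoint (assumed structure).
    import io

    import numpy as np
    from flask import Flask, jsonify, request
    from PIL import Image
    from tensorflow import keras

    app = Flask(__name__)
    model = keras.models.load_model("organism_dcnn.h5")        # trained DCNN (assumed filename)
    LABELS = ["poison_ivy", "black_bear", "harmless_fern"]      # placeholder class names

    @app.route("/identify", methods=["POST"])
    def identify():
        # The client uploads the photo as multipart form data under the "image" key.
        file = request.files["image"]
        img = Image.open(io.BytesIO(file.read())).convert("RGB").resize((224, 224))
        batch = np.expand_dims(np.asarray(img, dtype="float32") / 255.0, axis=0)

        probs = model.predict(batch)[0]
        best = int(np.argmax(probs))
        return jsonify({"label": LABELS[best], "confidence": float(probs[best])})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)

A client would then send a photo with a POST request, for example requests.post(url, files={"image": open("photo.jpg", "rb")}), and read the returned JSON identification.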

Project Demo

https://www.youtube.com/watch?v=TRyw0L-mesg&feature=youtu.be

Data & Resources
Tags
#environmentalawareness #Outdoorsy #nature #identifier
Global Judging
This project was submitted for consideration during the Space Apps Global Judging process.