Our project aims to make users more aware of their environment in the wake of increased wildlife activity and potentially dangerous encounters. Decreased human traffic leads to a growth in wildlife of all types, and with that growth come increased dangers. The boost in environmental liveliness comes at a cost: we all must be more careful. Invasive species may overrun areas that used to be harmless. Our app lets the user recognize harmful animals and plants using a machine learning model that detects and identifies potentially dangerous species, as seen in the demonstration video. This model can be employed to help protect millions of users from what could otherwise be deadly encounters with harmful organisms. Decreased use of an area can also lead to increased pollen, which can trigger severe asthma attacks. Pollen levels are just one of the many things our app detects and notifies the user about, helping them stay aware of their surroundings. Users can always be prepared, since the app also reports the current weather conditions in their area.
We were inspired by our desire to go outside and enjoy nature once again after lockdown passes. James has had many bad experiences with nature from his time as a Cub Scout, and we wanted to create an app that could solve that problem. We used the datasets provided by NASA to build the weather page: the client sends the user's coordinates to the backend, which retrieves the corresponding data from the datasets and returns environmental information about the user's surroundings. For the machine learning algorithm used in the identifiers, we built a deep convolutional neural network (DCNN) and trained it with roughly 500 images in each plant and animal category to refine its detection. The DCNN was then deployed on Google Cloud to establish the interchange between the client and the backend: the client sends in an image via a POST request, and the backend returns the identification data.

We ran into trouble getting the datasets into the proper format to serve to the client, which we solved by running two endpoints: one for the raw data and one for a diluted (simplified) version. Training the DCNN was also a challenge because it took a long time, and a bug at the end of the first run meant the network had to be trained for another couple of hours. However, we pulled it together in the end, and we are proud to have achieved a working neural network for organism identification, since that was our goal from the beginning.
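To make the setup above concrete, here is a minimal sketch of how the identifier could be trained, assuming TensorFlow/Keras; the folder name, layer sizes, and hyperparameters are illustrative rather than the exact configuration we describe above.

```python
# A minimal training sketch, assuming TensorFlow/Keras and an image folder
# with one subdirectory per species category (~500 images each).
# "species_images/" and all layer sizes/hyperparameters are illustrative.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (224, 224)

# Load labelled images; subdirectory names become the class labels.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "species_images/",
    image_size=IMG_SIZE,
    batch_size=32,
)
num_classes = len(train_ds.class_names)

# A small deep convolutional network: stacked conv/pool blocks followed by
# a dense classifier over the plant and animal categories.
model = models.Sequential([
    tf.keras.Input(shape=IMG_SIZE + (3,)),
    layers.Rescaling(1.0 / 255),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
model.save("species_dcnn.keras")
```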
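The client-backend interchange can likewise be sketched as a small Flask service, assuming hypothetical endpoint names and a CSV table derived from the NASA datasets: one route accepts the POSTed image and returns the DCNN's prediction, and the other takes the user's coordinates and returns the nearest environmental record.

```python
# A minimal serving sketch, assuming Flask on Google Cloud. The endpoint
# names, model file, and "nasa_environment_grid.csv" (a table derived from
# the NASA datasets with lat/lon/pollen/weather columns) are hypothetical.
import io

import numpy as np
import pandas as pd
import tensorflow as tf
from flask import Flask, jsonify, request
from PIL import Image

app = Flask(__name__)

model = tf.keras.models.load_model("species_dcnn.keras")
CLASS_NAMES = ["black_widow", "copperhead", "poison_ivy", "poison_oak"]  # example labels
ENV_DATA = pd.read_csv("nasa_environment_grid.csv")  # columns: lat, lon, pollen, temperature, ...

@app.route("/identify", methods=["POST"])
def identify():
    """The client POSTs an image; the DCNN returns the most likely species."""
    raw = request.files["image"].read()
    image = Image.open(io.BytesIO(raw)).convert("RGB").resize((224, 224))
    batch = np.expand_dims(np.asarray(image, dtype="float32"), axis=0)
    probs = model.predict(batch)[0]
    return jsonify({
        "species": CLASS_NAMES[int(np.argmax(probs))],
        "confidence": float(np.max(probs)),
    })

@app.route("/environment", methods=["GET"])
def environment():
    """The client sends its coordinates; the backend returns the nearest record."""
    lat = float(request.args["lat"])
    lon = float(request.args["lon"])
    distances = (ENV_DATA["lat"] - lat) ** 2 + (ENV_DATA["lon"] - lon) ** 2
    nearest = ENV_DATA.loc[distances.idxmin()]
    # Convert numpy scalars to plain Python types so jsonify can serialize them.
    return jsonify({k: (v.item() if hasattr(v, "item") else v) for k, v in nearest.items()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

For brevity the sketch shows a single environment route; the raw/diluted split described above would simply be a second route reading from the same table.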
https://www.youtube.com/watch?v=TRyw0L-mesg&feature=youtu.be
https://earthdata.nasa.gov/learn/sensing-our-planet/volatile-trees
https://www.nasa.gov/centers/langley/news/researchernews/rn_SummerSafety.html
https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19930073077.pdf
https://www.ssc.nasa.gov/environmental/resource_mngmnt/natural_resources/poison/poison.html
https://www.fs.usda.gov/detail/okawen/alerts-notices?cid=fsm9_019210