Our challenge addresses the emergence of websites, dashboards, and online services that deliver quality information during the pandemic to combat fake news. It also points to the lack of quality online services relating environmental factors to the coronavirus. Our project integrates environmental data with epidemiological data in an intelligent dashboard that analyzes the impact of human activities on the emergence of epidemics and can potentially predict and monitor future outbreaks.
What inspired us was the possibility of helping prevent a new pandemic using data provided by agencies worldwide, for the good of humanity.
We first defined the scope of the project to determine what was and was not feasible to implement, and what fell outside the MVP and could be left for later. We then split into development fronts, each of us taking the tasks where we have the most expertise.
We use agency data and display it on the map.
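As a minimal sketch of how agency records can be prepared for the map, the function below converts plain records into GeoJSON features that a Leaflet layer (`L.geoJSON`) can render. The record shape (`lat`, `lon`, `metric`, `value`) is an assumption for illustration, not the agencies' real schema.

```javascript
// Convert hypothetical agency records into a GeoJSON FeatureCollection
// suitable for Leaflet's L.geoJSON layer.
function toGeoJSON(records) {
  return {
    type: "FeatureCollection",
    features: records.map((r) => ({
      type: "Feature",
      geometry: { type: "Point", coordinates: [r.lon, r.lat] }, // GeoJSON order is [lon, lat]
      properties: { source: r.source, metric: r.metric, value: r.value },
    })),
  };
}

// One hypothetical fire-detection record (illustrative values only)
const sample = [{ source: "FIRMS", lat: -3.1, lon: -60.0, metric: "fire_count", value: 12 }];
const geojson = toGeoJSON(sample);
// In the browser: L.geoJSON(geojson).addTo(map);
console.log(geojson.features.length); // 1
```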
Development Fronts:
"Api" - Responsible for finding the data, validating, and its necessary API's.
"Design" - Responsible for creating screens and video pitch.
"Backend" - Responsible for creating the server, creating request methods, integrating APIS data, and integrating IBM-Watson for the chatbot.
"Frontend" - Responsible for passing the defined design, creating behaviors for the website, and playfully organizing data on the screen.
Languages:
CSS, HTML5, JavaScript, Node.js.
Frameworks, libraries, and tools:
jQuery, Leaflet, IBM Watson, Express, Tableau, GitHub, Trello, Postman.
Our main difficulty was defining the data and integrating it in real time, as some agencies provide only outdated data, and in non-intuitive formats.
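As an illustration of the kind of cleanup this required, the helper below normalizes dates arriving in different shapes into ISO `YYYY-MM-DD` before merging. The two input formats shown (`YYYYMMDD` and `DD/MM/YYYY`) are examples of the problem, not an exhaustive list of what each agency actually publishes.

```javascript
// Normalize date strings from heterogeneous sources to ISO "YYYY-MM-DD".
function normalizeDate(raw) {
  if (/^\d{8}$/.test(raw)) {
    // "20200501" -> "2020-05-01"
    return `${raw.slice(0, 4)}-${raw.slice(4, 6)}-${raw.slice(6, 8)}`;
  }
  const dmy = raw.match(/^(\d{2})\/(\d{2})\/(\d{4})$/);
  if (dmy) {
    // "01/05/2020" (day/month/year) -> "2020-05-01"
    return `${dmy[3]}-${dmy[2]}-${dmy[1]}`;
  }
  return raw; // already ISO, or an unknown format left untouched
}

console.log(normalizeDate("20200501"));   // 2020-05-01
console.log(normalizeDate("01/05/2020")); // 2020-05-01
```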
Data sources: NOAA, FIRMS, NASA, WHO, IBM, EORC.JAXA, CLIMATE.GOV, IBM Cloud Assistant.