An end-to-end situational-awareness solution for FEMA jump-started a daisy-chained workflow worth writing about. In July 2019, we worked with Light Technologies to present a proof of concept for a project they were working on with FEMA (the Federal Emergency Management Agency).
Utilizing Yelp data to estimate the number of businesses in a given locality and categorizing them according to FEMA’s seven lifelines
Problem Statement: Prior to and during a disaster, it is important to understand the projected and actual effects of the event on the community, including its economic effects on critical services. FEMA has identified seven “lifelines” that require attention during a disaster:
· Safety and Security
· Food, Water, Sheltering
· Health and Medical
· Energy (Power, Fuel)
· Communications
· Transportation
· Hazardous Materials
This tool will utilize Yelp to estimate the effects of the event on each of the seven lifelines. This can include the number of businesses or services in each category and, where available, their status (as reported by users and reviews on Yelp). The tool will search for relevant data and categorize it according to a list of impacted neighborhoods or affected zip codes. It will provide an estimate of the potential impact of the event, at least according to the data available on Yelp.
Above you’ll find the project statement. Seems simple enough: scrape the web, aggregate the results, and display them. However, behind every simple solution there is another layer to explore.
As we worked on this project, we quickly realized it was more involved than we had anticipated. Several days were spent looking through the project requirements, and another few on what the data capture entailed (the completed project — presentation, code, and live visualizations — is available at the end of this article). Nonetheless, the complexity of the problem lies in the three action items mentioned earlier: scraping the web, aggregating/cleaning the results, and displaying the final data. Let me give you a quick premise of the Lifelines project.
Scraping the Web
During our discovery stage, we learned more about what a FEMA Lifeline is and its corresponding components.
Lifelines: A lifeline enables the continuous operation of government functions and critical business, and is essential to human health and safety or economic security.
Lifeline Components: Each lifeline comprises multiple components and the essential elements of information needed to stabilize the incident.
We scraped the web not for the Lifelines themselves but primarily for their corresponding components, which we then mapped back to the Lifelines. In total, the seven Lifelines were covered through their 31 corresponding components; a detailed list of the components scraped is available in the project repository below. During discovery, we also decided to expand our search beyond Yelp to include Google Places, so we used both the Yelp and Google APIs for our scraping. The Yelp API returned only locations with some matching comment on the location, in contrast to the Google API, which returned locations whether they had comments or not.
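To make the scraping step concrete, here is a minimal sketch of the two calls involved. The endpoints are the public Yelp Fusion business search and Google Places text search; the component terms, the lifeline mapping, and the environment-variable names are placeholders for illustration, not the project’s actual configuration:

```python
import os
import requests

# Hypothetical mapping of Lifelines to component search terms; the full
# list of 31 components lives in the project repository linked below.
LIFELINE_COMPONENTS = {
    "Health and Medical": ["hospital", "pharmacy"],
    "Food, Water, Sheltering": ["grocery store", "emergency shelter"],
    # ...remaining lifelines and components...
}

def search_apis(term, location):
    """Query Yelp Fusion and Google Places for one component term.
    API keys are read from placeholder environment variables."""
    yelp = requests.get(
        "https://api.yelp.com/v3/businesses/search",
        headers={"Authorization": f"Bearer {os.environ['YELP_API_KEY']}"},
        params={"term": term, "location": location, "limit": 50},
    ).json().get("businesses", [])

    google = requests.get(
        "https://maps.googleapis.com/maps/api/place/textsearch/json",
        params={"query": f"{term} in {location}",
                "key": os.environ["GOOGLE_API_KEY"]},
    ).json().get("results", [])
    return yelp, google
```

The asymmetry described above shows up here: Yelp’s results carry review activity, while Google’s text search returns matches with or without reviews.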
A complete and detailed description of the FEMA Lifelines can be found on FEMA’s website.
Aggregating & Cleaning the Data
During our discovery period, we did some preliminary exploratory data analysis (EDA). We brought in some sample data and looked through our results. Both APIs turned out to be excellent, capturing essential data that was pretty much ready to use, so not much EDA was needed; the APIs are well fleshed out.
We used Python (and relevant libraries) in Jupyter Notebooks to scrape, clean, and organize our data. The scraping proved the easiest step, and our efforts then turned to displaying our results.
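As a sketch of what that cleaning and organizing could look like, the step below normalizes the two APIs’ differently shaped responses into one tagged table. The field names match the public response schemas; the function and column names are my own illustration, not the project’s code:

```python
import pandas as pd

def normalize(yelp_hits, google_hits, component, lifeline):
    """Flatten both APIs' responses into one DataFrame tagged with the
    component searched and the Lifeline it belongs to."""
    rows = []
    for b in yelp_hits:  # Yelp Fusion schema
        rows.append({"name": b["name"],
                     "lat": b["coordinates"]["latitude"],
                     "lng": b["coordinates"]["longitude"],
                     "source": "yelp"})
    for p in google_hits:  # Google Places schema
        rows.append({"name": p["name"],
                     "lat": p["geometry"]["location"]["lat"],
                     "lng": p["geometry"]["location"]["lng"],
                     "source": "google"})
    df = pd.DataFrame(rows)
    df["component"] = component
    df["lifeline"] = lifeline
    # Drop locations that both APIs returned.
    return df.drop_duplicates(subset=["name", "lat", "lng"])
```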
Displaying Our Findings
This final step was straightforward: develop a few useful-looking diagrams along with eye-catching text to grab the viewer’s attention. However, we decided to take it a bit further:
If FEMA needs situational awareness during a given disaster or emergency, they should be able to enter a location and find all the Lifelines available to the community.
With this in mind, we decided to explore what else we could do with our solution.
Findings
We revisited our findings from a different angle and concluded that our solution should accept a city as input, with the locations of the lifeline components displayed immediately afterward. The results are available below.
Additional
The final solution came in this form for end-user use (sketched in code after the list):
- Enter a city and state (e.g., Houston, TX)
- View the Lifeline Components on the city’s map online
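A minimal sketch of that flow, assuming the hypothetical search_apis and normalize helpers sketched earlier, might look like this:

```python
def lifelines_for(city_state):
    """Scrape every lifeline component for a 'City, ST' string and
    return one tagged DataFrame ready to map."""
    frames = []
    for lifeline, components in LIFELINE_COMPONENTS.items():
        for component in components:
            yelp_hits, google_hits = search_apis(component, city_state)
            frames.append(normalize(yelp_hits, google_hits, component, lifeline))
    return pd.concat(frames, ignore_index=True)

df = lifelines_for("Houston, TX")
```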
The target city was entered on a local machine, which handled the scraping and processing of the data. The findings (locations tagged with their corresponding lifeline components) were categorized, along with their longitude and latitude, and saved to a Google Sheet. That same data was then connected to Tableau to visualize the results.
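The hand-off to the Google Sheet could be done with the gspread library, for example; the sheet name and credentials file below are placeholders, not the project’s actual setup:

```python
import gspread

# Authenticate with a (placeholder) service-account credentials file.
gc = gspread.service_account(filename="service_account.json")
ws = gc.open("FEMA Lifelines").sheet1  # hypothetical sheet name

# Write the tagged table (name, lat, lng, source, component, lifeline)
# as a header row plus data rows; Tableau then reads this sheet.
ws.update([df.columns.values.tolist()] + df.values.tolist())
```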
Feel free to play around with our findings at the link below. You may also find the complete project, presentation, and code at the public repository below.
Visualization: https://ovrflw.digital/femalifelines/
Github: https://github.com/adriancampos1/GA_DSI8_FEMA_Lifelines