If you’ve covered Chapter 10: The Federal Bureaucracy in U.S. Government: An Interactive Approach, then you and your students know that the United States federal government collects an extraordinary amount of data on almost every conceivable dimension of the country. However, did you know that much of this information can be accessed through a single portal: Data.gov? You can get a quick feel for the types of data that are available by flipping through the “topics” section, before diving into the data. As the site’s glossary notes,
Data become “information” when analyzed and possibly combined with other data in order to extract meaning, and to provide context. The meaning of data can vary depending on its context.
So how do you go about using this wealth of data to generate information? Here are several ways to use the site with your students:
- Students can use the “prepackaged” data in the Applications area (data for which extraction tools and applications have already been built) to build a profile of a particular place in the United States, with measures ranging from demographic data from the U.S. Census Bureau to the normal background radiation in the community, monitored by RadNet at the Environmental Protection Agency.
- Students can also import the raw data (usually provided in XML, Text/CSV, KML/KMZ, Feeds, XLS, or ESRI Shapefile formats) into the software program of their choice, explore the definitions and units of measurement, and make graphs and charts that highlight different values or time periods. For example, the Hospital Compare database provides periodically updated information about care quality, mortality, and readmission rates for every hospital in the United States. Using it, students could figure out where they would ask the ambulance to take them if they (or someone they knew) had a heart attack.
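A downloaded CSV like the Hospital Compare export can be explored with only a few lines of code. The sketch below uses Python's standard `csv` module on a tiny made-up excerpt; the column names (`hospital_name`, `heart_attack_mortality_rate`) and values are hypothetical, so check the real file's header row and codebook before adapting it.

```python
import csv
import io

# Hypothetical excerpt in the spirit of a Hospital Compare CSV export.
# In practice you would open the downloaded file instead of this string.
SAMPLE = """hospital_name,heart_attack_mortality_rate
General Hospital,14.2
City Medical Center,12.8
County Hospital,15.9
"""

# Parse each row into a dict keyed by the header names.
rows = list(csv.DictReader(io.StringIO(SAMPLE)))

# Lower 30-day mortality is better: sort ascending and take the first row.
rows.sort(key=lambda r: float(r["heart_attack_mortality_rate"]))
best = rows[0]["hospital_name"]
print(best)  # City Medical Center
```

The same pattern (parse, convert the column of interest to numbers, sort or filter) carries over to most of the Text/CSV datasets on the site.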
- Once students have explored a particular database, have them evaluate its source materials. How was the data collected? How could it have been manipulated, either by the people reporting it or the people collecting it? Why might those people want to manipulate the data? What pertinent data was left out? This exercise is easiest with a database such as the “Vital Signs Codebook And Sources (2010–2012),” which comes with a codebook, prepared by researchers at the Baltimore Neighborhood Indicators Alliance/The Jacob France Institute, that answers these methodological questions directly.
- Are you or your students capable of programming your own apps? Have them build a visualization tool that they and others can use to make sense of the data. There is more information about applications, mashups, and visualizations in the Developers section.
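For students who do program, even a visualization tool can start very small. Here is a minimal sketch of a text-based bar chart that could be pointed at any (label, value) pairs pulled from a Data.gov dataset; the population figures below are made-up placeholders for illustration.

```python
def bar_chart(data, width=40):
    """Return a list of lines, one text bar per (label, value) pair.

    The largest value gets a full-width bar; the rest are scaled to it.
    """
    top = max(value for _, value in data)
    lines = []
    for label, value in data:
        bar = "#" * round(width * value / top)
        lines.append(f"{label:<20} {bar} {value}")
    return lines

# Hypothetical figures for illustration only -- substitute real Census data.
population = [
    ("Baltimore", 585708),
    ("Annapolis", 40812),
    ("Frederick, MD", 78171),
]
for line in bar_chart(population):
    print(line)
```

From here, students can swap the text bars for a plotting library, or wrap the function in a small web app that fetches a dataset on demand.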
Be aware that with so much data here, there are occasional glitches caused by miscommunication among agencies. For example, the “MyFood-A-pedia” external dataset was for a long time not linked to the site that succeeded it (part of SuperTracker at the USDA). When this happens, a little sleuthing on the agency’s website usually goes a long way toward resolving the problem!
As the open data movement becomes more widespread, many more state, local, and tribal governments are also making data freely and openly available to the public. Data from many of them can be accessed by searching the “organizations” section on Data.gov.
Have ideas for using, interpreting, or analyzing data that weren’t listed above? Share them in the comments section!