
 


 

General Meeting - Annual Federal Agencies Presentations
    

 

 

 


 

 

 

 

CARTOGRAPHIC USERS ADVISORY COUNCIL (CUAC) 2007 AGENCY PRESENTATION MINUTES

APRIL 26-27, 2007

US GEOLOGICAL SURVEY, RESTON, VA

 

Sponsor

 

Richard Huffine, National Library Coordinator, US Geological Survey

 

CUAC Representatives in Attendance

 

Joe Aufmuth, University of Florida, ALA/MAGERT

Michael Fry, University of Maryland, WAML

Anne Graham, Massachusetts Institute of Technology, NEMO

Katie Lage, University of Colorado, WAML

Mary McInroy, University of Iowa, ALA/GODORT

Clara P. McLeod, Washington University, GSIS

Anita Oser, SLA, Social Science Division, G&M

Daniel T. Seldin, Indiana University, NACIS

Wangyal Shawa, Princeton University, ALA/MAGERT

Joy Suh, George Mason University, ALA/GODORT

Thelma Thompson, University of New Hampshire, NEMO

Linda Zellmer, Indiana University, GSIS

 

 

Federal Agency Presenters

 (in order of presentation)

 

Richard Huffine, National Library Coordinator, US Geological Survey

Andrew V. "Drew" Douglas, Customer Relations, DHS Federal Emergency Management Agency Enterprise GIS Solutions

Valerie Martens, Cataloging Supervisor, US Government Printing Office (GPO) – agency discussion session

Betsy Kanalley, Assistant Program Manager, USDA Forest Service, Geospatial Services Group

Eric M. Hubbell, Program Analyst, U.S. Environmental Protection Agency

Sam Wear (for Rob Dollison), USGS Geospatial One-Stop

Billy Tolar, Standards Program Manager, FGDC/USGS

Jenny Runyon, U.S. Board on Geographic Names

Timothy Trainor, Assistant Division Chief for Geographic Areas and Cartographic Data Products, U.S. Census Bureau, Geography Division

Richard Huffine, National Library Coordinator, United States Geological Survey

John Hebert, Chief of the Geography and Maps Division, Library of Congress

Brett Abrams, Electronic Records Archivist, National Archives and Records Administration

 

Written Agency Reports Submitted

Department of Energy

 


Federal Agency Presentation Schedule

Thursday, April 26, 2007

1:15 – 3:45pm: Agency Presentations Session I

1:15 – 1:30 Welcome

CUAC Chairs and Richard Huffine, USGS

Introduction of all members and agencies present

1:30 – 2:00 Andrew V. "Drew" Douglas, Customer Relations

DHS Federal Emergency Management Agency (FEMA) Enterprise GIS Solutions

2:00 – 2:30 Valerie Martens, Cataloging Supervisor, US Government Printing Office – agency discussion session

2:30 – 3:00 Betsy Kanalley, Assistant Program Manager

USDA Forest Service, Geospatial Services Group

3:00 – 3:30 Eric M. Hubbell, Program Analyst, U.S. Environmental Protection Agency

 

Friday April 27, 2007

CUAC Chairs and USGS

Introduction of all members and agencies present

8:45 – 9:15 Sam Wear (for Rob Dollison), USGS Geospatial One-Stop

Billy Tolar, Standards Program Manager, FGDC/USGS

9:15 – 9:45 Jenny Runyon, U.S. Board on Geographic Names

9:45 – 10:15 Tim Trainor, Assistant Division Chief for Geographic Areas and Cartographic Data Products, U.S. Census Bureau, Geography Division

10:30 – 11:00 Richard Huffine, National Library Coordinator, United States Geological Survey, Host of CUAC 2007

11:00 – 11:30 John Hebert, Chief of the Geography and Maps Division, Library of Congress

1:15 – 1:45 CUAC Liaison written agency reports

Member agencies unable to attend

1:45 – 2:15 Brett Abrams, Electronic Records Archivist, National Archives and Records Administration

 

Introductory Session Remarks

 

Richard Huffine, National Library Coordinator, US Geological Survey

 

Agency Presentation Minutes

 

Andrew V. "Drew" Douglas, Customer Relations, DHS Federal Emergency Management Agency Enterprise GIS Solutions, “Disaster Cartographic Products at FEMA”

(submitted by Wangyal Shawa)

 

Andrew Douglas began his presentation with a history of disaster cartography at FEMA, from 1992, when the agency used MapInfo software to map Hurricane Andrew, to the establishment of the Geospatial Management Office after FEMA merged into the Department of Homeland Security (DHS) in 2003. During the 2005 hurricane season, FEMA headquarters produced 3,000 unique map products with only 12 staff members. Mr. Douglas said these maps and data are part of the national record and need to be stored in libraries and made available to the public; however, FEMA has certain concerns about what information, and in which formats, should be made available. He said that FEMA’s primary duty is to help people during disasters. FEMA makes status and logistics maps for decision makers to show where shelters are located, how many people are in each shelter, etc. It also makes disaster declaration maps, which are based on governors’ requests for disaster assistance.

 

FEMA uses different geospatial data products, including the National Geospatial-Intelligence Agency (NGA) base product called Homeland Security Infrastructure Protection (HSIP) Gold, which is made available to all federal agencies involved in homeland security. The base product includes critical infrastructure, schools, medical facilities, utilities, transportation, dams, etc. In addition to the NGA HSIP Gold data (which are not shared with the public), FEMA uses other datasets, such as demographic data from the Census Bureau and meteorological data from the National Meteorological Center, to create hurricane forecasts, map projected and actual hurricane paths, determine the populations likely to be and actually affected by hurricanes, and generate disaster maps. These maps help planners prepare for a disaster and help people recover from it.

 

Mr. Douglas showed samples of maps produced by FEMA. Some of their titles are:

1. 2004 Hurricane Season - Named Storms: Atlantic, Caribbean, and the Gulf of Mexico
2. 2005 Hurricane Season - Named Storms: Atlantic, Caribbean, and the Gulf of Mexico
3. Hurricane Florence - Advisory number 37
4. Hurricane Katrina – Advisory 23 – Elderly Population in Wind Swath
5. Hurricane Katrina Peak Wind Gusts by County
6. Hurricane Katrina – Advisory 23A – Evacuation Orders
7. Hurricane Katrina Damage Overview
8. Hurricane Katrina – New Orleans - Area Road Closures and Probable Flooding Areas as of 8/29/05
9. Hurricane Katrina – Allocated Space for Evacuees as of 1800, Saturday, September 3, 2005
10. Presidential Disaster Declarations: December 24, 1964 to February 27, 2006

 

To access FEMA geospatial data, he suggested visiting www.gismaps.fema.gov.

 

 

Valerie Martens, US Government Printing Office – agency discussion session

(submitted by Michael Fry)

 

In lieu of a formal presentation, Ms. Martens distributed a handout to CUAC members summarizing developments at GPO and addressing topics brought to GPO’s attention by CUAC members prior to the meeting. Items from that handout pertinent to the map librarianship community include:

 

Map-related statistics

From October 2006 through February 2007, GPO distributed 1,685,575 tangible copies of 3,842 titles (print, microfiche, CDs, DVDs, and in-house maps). USGS map distribution during the same period included 59 titles and 12,673 copies. From October 2006 through March 2007, 7,171 online titles and 3,294 PURL links to agency titles outside of GPO Access were added, for a total of 10,465 new online titles. These additions bring the total number of titles to 216,822, and the total number of titles linked from GPO Access to 51,248, for a total of 268,070 titles accessible through GPO Access. From June 1, 2006 to April 15, 2007, GPO cataloged approximately 259 maps (GPO’s chief map cataloger was ill for approximately two months and returned to work on a part-time basis for one month).

 

FDsys

The U.S. Government Printing Office’s Future Digital System (FDsys) will preserve, authenticate, provide version control, and provide access to digital content from all three branches of the U.S. Government. FDsys will be a comprehensive, systematic, and dynamic means for preserving digital content free from dependence on specific hardware or software. The system will automate many lifecycle processes for digital content and make it easier to deliver content in formats suited to customers’ evolving needs. FDsys will be released for agency and public use in late 2007. [For add’l details about FDsys, see http://www.access.gpo.gov/su_docs/fdlp/pubs/proceedings/07spring/fdsys-0407.pdf.]

 

USDA Soil Surveys

The Department of Agriculture's Natural Resources Conservation Service (NRCS) is the publisher of the Soil Survey Reports, and these publications have been available for selection by the libraries in the FDLP for many years. The NRCS has traditionally issued the Soil Surveys as a printed set: one printed book and one printed map, packaged inside a file folder.

 

In 2006, the NRCS made a publishing decision to release some Soil Survey reports with parts in different formats.  This has generated a significant number of inquiries to Library Services and Content Management (LSCM) because libraries think that the FDLP has inadvertently distributed incomplete sets.

 

LSCM is working with the NRCS in an effort to identify which titles are being published with parts in different formats. We have communicated to NRCS that the seemingly random choice of formats for the distribution of each Survey causes confusion in the libraries and may hamper access to these important and useful documents.

 

NRCS has indicated their goal is to publish all Surveys online. Until that goal is realized, NRCS will continue to print parts of Soil Surveys in different formats. For example, the Soil Survey of Anson County, North Carolina, was only printed in book form and the maps were available online only. The book was classed A 57.38/33:AN 8 with Item Number 0102-B-33 and shipped on Shipping List 2006-0035-S.

 

Conversely, the Soil Survey Map of Washington County, Vermont, is currently being processed for shipment to the FDLs. For this Survey, the manuscript that accompanies the map is online only. The class for this title is A 57.38/45:W 27/MAPS and it will appear on an upcoming shipping list.

 

At present, there is no indication in the printed documents that the additional content is available online only. We recommend that libraries consult the NRCS Soil Survey website at http://soils.usda.gov/survey/online_surveys/ to determine the online availability of Soil Survey materials before sending an inquiry to LSCM.

 

Notes on GPO cataloging records will help identify the parts of Soil Surveys that have different formats.  GPO cataloging records will be either a map only record (when a map is in print, but not the book) or a book only record (when there is a book in print, but no map) with a note stating "Book not distributed to depository libraries in tangible form" or "Map not distributed to depository libraries in tangible form," respectively.

 

GPO appreciates the community's patience while we work with the NRCS going forward.

 

In addition to the handout, Ms. Martens fielded questions and comments from CUAC members. She was clear that maps were outside her area of expertise, and she agreed to forward CUAC’s comments [see below] to appropriate parties within GPO. (Policy-related questions, for example, may be directed to Laurie Hall at lhall@gpo.gov.)

 

 

Topics raised by CUAC members included:

 

Geospatial Metadata. CUAC asked for geospatial metadata from federal agencies to be converted to MARC format so the data can be more readily found, and suggested that GPO use an export-friendly metadata format for its digital projects (e.g., FDsys). CUAC expressed continued interest in FDsys’s ability to incorporate geospatial metadata in all of GPO’s relevant digital initiatives. Ms. Martens indicated that geospatial metadata searching can be added to FDsys as a future feature, but clarification is needed as to exactly what is wanted (e.g., latitude-longitude coordinates).

 

FEMA Flood Insurance Rate Maps. CUAC asked GPO to distribute Flood Insurance Rate Maps (FIRMs) through the Depository program.

 

Cataloging digital maps and geospatial data. CUAC asked for more routine identification and cataloging of digital geospatial data, maps, etc. from federal agencies. Existing electronic publications from USGS and EPA, for example, don’t always have cataloging records; federal agencies should be working more closely with GPO to make sure items have records. Ms. Martens noted that GPO’s staff is limited to two map catalogers, plus a cataloger working more than half-time on EPA documents. She directed CUAC to the Proceedings of the 2007 Spring Depository Library Conference, which included a Depository Library Council session on Web harvesting. [See pg. 124 at http://www.access.gpo.gov/su_docs/fdlp/pubs/proceedings/07spring/transcripts-0407.pdf.]

 

Lost Documents. CUAC asked about procedures for notifying GPO about lost docs. Ms. Martens: lost docs are a big priority for GPO, and they’ve made enormous progress in the last couple years. The most efficient way to notify GPO is through AskGPO. Libraries can also send electronic docs to GPO for cataloging.

 

GPO’s Digital Projects. CUAC asked if there was a complete list of GPO’s digital projects and initiatives. CUAC: Is there any way to merge existing digital project indexes and consolidate them into a single repository?

 

Distribution. CUAC noted a continuing “disconnect” between what’s produced by federal agencies (e.g., FEMA’s event-specific maps) and what’s collected by GPO and distributed to depositories. Agencies are still producing items in print and electronically, but what’s distributed to depositories continues to decrease in number, and what’s available online changes over time. CUAC called for GPO to collect items that agencies aren’t motivated to keep in perpetuity. (Legacy publications come through GPO pretty well, but new products and titles seem to be under the radar.) Ms. Martens: If you find items like this, let us know and we’ll look into it. CUAC: Libraries could never keep up with that on an item-by-item basis. We need a comprehensive approach to dealing with how information is being published now.

 

Betsy Kanalley, Assistant Program Manager, USDA Forest Service,

Geospatial Services Group

(submitted by Katie Lage)

 

Ms. Kanalley began her presentation with an overview of the Forest Service structure, land management responsibilities, and programs. Her talk covered strategic goals for Forest Service geospatial programs, the new Forest Service Geodata Clearinghouse, Forest Service data on Google Earth, print on demand mapping services, and the map sales program.

 

The Forest Service geospatial programs are moving towards an integrated business model. They are integrating their mini data centers into three main centers, Portland, Kansas City, and Albuquerque. Kansas City will be the main data center, with Albuquerque working on development and testing of applications and acting as a backup to ensure continuity of operations for Forest Service data centers.

 

Geospatial information is gathered from various resource applications in programs that the Forest Service manages, such as fire, forest management, range, cultural resources, and more.

 

Future mapping efforts will focus on acquiring and producing data to support field needs. Acquisition and production of elevation data and ortho-rectified imagery will continue. The Forest Service is also focusing on keeping data up to standards for content, accuracy, completeness, and documentation (metadata). They will continue to produce thematic maps and maps at 1:24,000 and 1:126,720 (1:63,360 for Alaska) scales.

 

The new Forest Service print-on-demand (FSPOD) mapping capability will be available to the public via the Forest Service Geodata Clearinghouse in the near future. The user will be able to select a 1:24,000 quadrangle extent and print the map or save it in PDF format. FSPOD uses ArcGIS Server 9.2 to produce 7.5' 1:24,000-scale maps over FS lands of the conterminous United States and 15' x 20-22.5' 1:63,360-scale maps for Alaska. These products are based either on the traditional quadrangle footprint or on a user-defined center point. The FS is working with the USGS as they develop a similar map-on-demand capability, in cooperation with states and other partners.
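
To make the quadrangle extent concrete, here is a minimal sketch in Python of the 7.5-minute grid cell containing a given point (an illustration only, not the FSPOD or USGS implementation):

    # Illustrative only: the 7.5-minute (1:24,000) quadrangle grid cell that
    # contains a point. This is a generic sketch, not the FSPOD implementation.
    def quad_7_5_extent(lat, lon):
        """Return (west, south, east, north) of the 7.5' quad containing the point."""
        cell = 7.5 / 60.0                     # 7.5 minutes in decimal degrees (0.125)
        south = (lat // cell) * cell          # snap down to the quad grid
        west = (lon // cell) * cell
        return (west, south, west + cell, south + cell)

    print(quad_7_5_extent(38.95, -77.40))     # a point near Reston, VA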

 

Ms. Kanalley introduced the FSGeodata On-Line Geospatial Clearinghouse (http://fsgeodata.fs.fed.us), for discovering, assessing, and delivering USFS geospatial data. There is a gateway for raster data (coming soon), vector data, maps, and other data, including regional datasets. She referred a question about archiving data in FSGeodata to Dave George, the clearinghouse manager.

 

Forest Service geospatial data can also be found in Google Earth. The FS has partnered with Google to provide forest boundaries and recreation sites and pop-up information windows with links to forest service information and FSGeodata.

 

Ms. Kanalley briefly reported on new prices for USFS printed maps, showed the new plastic material some maps are being printed on, and reminded the group that they can be purchased through the USGS store and from the National Forest Store or Forest Service visitor centers. She brought examples of maps and forest atlases (for Region 5) for CUAC members to look over.

 

Q: Are there maps of just wilderness areas?

A: These should be available in the new print-on-demand mapping. Ms. Kanalley may also be able to help provide something like this.

 

USFS maps are available through the USGS store (http://store.usgs.gov/).

 

 

Eric M. Hubbell, Program Analyst, U.S. Environmental Protection Agency

(submitted by Joy Suh)

 

Eric Hubbell presented “Enterprise GIS at EPA” at the CUAC meeting on Thursday, April 26, 2007. He began by introducing the geospatial teams within EPA whose functions include developing Web applications and enterprise architecture for GIS, and introduced Dave Wolf, the geospatial team leader, who also attended this meeting. Eric’s presentation covered background, GIS development at EPA, GIS public applications, data services offered, technology, and future directions of the geospatial program within the agency.

 

The mission of EPA is to protect human health and the environment. Multiple offices within EPA oversee EPA’s strategic goals (clean air and global climate change, clean and safe water, land preservation and restoration, healthy communities and ecosystems, and compliance and environmental stewardship), which results in a wide range of data sources. The challenges were to get the programs to agree to share the data and then to put the data in a common format. EPA developed the Envirofacts Data Warehouse in 1995 to provide a single public access site for environmental data related to air, water, and land across the United States. Location or place (such as zip code and city) is the key to viewing local community data.

 

GIS applications have been increasingly important within and outside of EPA since GIS was first introduced at EPA in the mid-1980s. Each of the 10 regional offices has a geospatial team. EPA’s Office of Environmental Information develops enterprise architecture solutions. After developing Envirofacts in 1995, the office developed EnviroMapper (EM), its first Web-based GIS application. EM now offers specific programs that answer questions as specific as: “Are there environmental concerns surrounding my construction projects?”, “Is this area a potential environmental justice site?”, or “Are there significant sources of pollution where I live?” The following GIS applications are able to address such concerns:

           Window to My Environment (WME) is a collaborative effort at the local, state, and national levels. This interactive tool generates maps, demographic statistics, and environmental facts and conditions (watersheds, air quality, etc.) for a location of choice. It allows data searches by zip code, city, and state.

           NEPAssist is an EPA-centric program that allows regional visualization of EPA’s automated environmental impact statement submissions. It assists with initial reviews under the National Environmental Policy Act (NEPA). NEPAssist provides reports and reviews of potential environmental concerns at project sites.

           Environmental Justice (EJ) is similar to WME, but assesses regional statistics according to the following topics: health, social, economic, and environmental concerns.

           The EM for Hurricanes Katrina and Rita site, along with mapping, offers images of the areas affected by Hurricane Katrina from Global Explorer.

Two Web sites that provide data services are:

           Geographic Image and Feature services (http://geodata.epa.gov) for Superfund sites, permit application sites, toxic inventory sites, etc.

           Geospatial Data Download services (http://epa.gov/enviro/geo_data.html), available in XML, shapefile, or feature class files, and eventually KML files.

 

Technology used at EPA is based on ArcIMS as the map server, ESRI’s ArcSDE (spatial data engine), and Oracle Spatial (a GIS extension to the database). EPA also uses a service-oriented architecture (Web services, XML), incorporating data from USGS NWIS (National Water Information System), FWS NWI (National Wetlands Inventory), EPA STORET, ESRI ArcWeb Services, and USGS TerraServer aerial photos and topographic maps.

Eric concluded by sharing the future direction of information technology at the EPA, noting the importance of GIS and the intent to share data and offer more GIS services on the Web.

 

Questions and Discussion:

CUAC members asked about the future availability of hard copies of EPA basins to the library community (whether through the depository program or by direct request from the agency). Dave Wolf responded that EPA has been trying to put all the data on the Web; he suggested that libraries regularly download data at their own convenience and noted that they can contact the EPA for historical data. New NLCD (National Land Cover Data, 2001 source) is now available. CUAC members also inquired about the possibility of a formal partnership between EPA and university communities for sharing the Web applications and data created by EPA as backup sources for access and archiving. Further concerns and discussion centered on archiving issues and how these Web applications and data will be accessible 50 years from now; EPA is looking forward to working with NARA on data archiving. CUAC members also appreciated EPA’s development of these Web applications, since they have proved useful for students doing environmental analysis without GIS knowledge.

 

For further information, please contact  Eric Hubbell (Hubbell.eric@epa.gov)

Web Sites for Further Information:

           EnviroMapper - http://epa.gov/enviro/html/em/

           Window to My Environment (WME) - http://www.epa.gov/enviro/wme/

 

 

Sam Wear, USGS Geospatial One-Stop

(submitted by Anne Graham)

 

Geospatial One-Stop (GOS) is an intergovernmental project, managed by the Department of the Interior and USGS in support of the President’s Initiative for E-Government, that encourages collaboration to leverage government geospatial resources and best practices by providing access to national geospatial data. An outcome of the Geospatial One-Stop E-gov project is Geodata.gov, a portal to our nation’s (local, regional, national) digital geographic data.

 

The Geospatial One-Stop portal (www.geodata.gov) provides access to many different kinds of digital geographic information. The actual geographic data does not reside in the portal; rather, the portal is an exploration system built on a collection of pointers that reference different geospatial files, information, and data. Essentially, the portal contains records about the files, like a huge card catalog or a national metadata catalog. These documented data sets contain many layers of information, such as aerial imagery, elevation data, ground control, land cover, surface waters, transportation, and structures.
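
As a sketch of this card-catalog model, the following Python example (with hypothetical records and URLs, not actual GOS content) shows how metadata records can point to externally hosted data and be filtered spatially:

    # Illustrative only: hypothetical metadata records in the style of a national
    # "card catalog" -- the catalog stores pointers, not the data itself.
    catalog = [
        {"title": "County orthoimagery (example)", "theme": "aerial imagery",
         "bbox": (-105.3, 39.9, -105.1, 40.1),        # (west, south, east, north)
         "url": "http://example.gov/data/ortho.zip"}, # pointer to externally hosted file
        {"title": "Statewide hydrography (example)", "theme": "surface waters",
         "bbox": (-109.1, 37.0, -102.0, 41.0),
         "url": "http://example.gov/services/hydro"},
    ]

    def overlaps(a, b):
        """True if two (west, south, east, north) boxes intersect."""
        return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

    search_area = (-105.5, 39.5, -104.5, 40.5)
    for record in catalog:
        if overlaps(record["bbox"], search_area):
            print(record["title"], "->", record["url"])   # follow the pointer to the data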

 

The portal consists of different components: a metadata catalog with a search application; a map viewer; a data partnership marketplace; and community of interest collaboration tools. The National Map provides the primary base map of GOS.   The National Map is a critical asset, providing a seamless base of topographic data upon which other data, discovered in the portal, can be draped.  Interoperability standards allow The National Map to be leveraged by GOS.

 

The National Spatial Data Infrastructure (NSDI) refers to the technology, policies, standards, and human resources necessary to acquire, process, distribute, use, and maintain spatial data by the Federal Government. Geospatial One-Stop is one of the key components in furthering the building of the NSDI. The GOS catalog is built by harvesting copies of the metadata contained in the earlier NSDI collections and expanding the ways governments can publish their data to this national collection.

 

Partners are federal agencies, states, cities, counties (local governments, where the richest and most detailed data is being developed), tribes, academia, and the private sector.  The biggest challenge for the Federal government is to provide sufficient incentives to enable more local government information to be incorporated into the building of the National Spatial Data Infrastructure.

 

State, Local and Federal web map services are a great resource for the public to access the most current data.  GOS is a repository for pointers to these publicly available data services.  Data can be described with metadata and downloaded from GOS.
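
For readers unfamiliar with such services, a minimal Python sketch of a standard OGC Web Map Service (WMS) GetMap request follows (the endpoint and layer name are hypothetical placeholders, not actual GOS or agency services):

    # Illustrative only: building a generic OGC WMS GetMap request URL.
    from urllib.parse import urlencode

    endpoint = "http://example.gov/wms"        # hypothetical service URL
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "roads",                     # hypothetical layer name
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": "-105.5,39.5,-104.5,40.5",     # west,south,east,north
        "WIDTH": "600",
        "HEIGHT": "600",
        "FORMAT": "image/png",
    }
    print(endpoint + "?" + urlencode(params))  # URL a client fetches to get a map image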

 

In addition to searching the entire collection of metadata, the GOS portal is organized around topical themes of information that are organized into data ‘Communities’. In addition to the data themes, the portal contains the following primary organizational tabs to help with navigation:

 

Communities tab – provides a way for users to share information with each other about specific topics, such as fire, local government, and historical collections.

The Communities tab can be a pointer to a web site or to a large amount of downloadable data.

One local community example is the metadata about the Spokane web mapping service.

The Library tab within the Communities gives links to pertinent web sites.

 

Maps tab shows popular maps.

Through The National Map, different kinds of live web mapping services pointed to by GOS can be fused and mapped.

 

Marketplace tab allows you to see what data others are trying to acquire so that you can develop partnerships for acquiring datasets.

 

There are approximately 125,000 records in GOS and the content continues to grow each year.

 

The home page interface is customizable with a login, and maps and searches can be saved. The following enhancements have recently been made:

 

GOS 2.1 Enhancements:

· Improved Harvesting
· Improved Metadata Management tools
· Spatial Ranking of Search Results (better ‘geographic fit’ in search)
· Access Metadata from the Viewer
· Provide More Feedback to publishers

 

Next Steps:

· Publishing content to the web
· Viewer Improvements: better Open Geospatial Consortium Web Mapping Service support, faster base maps, 3-D viewer, possible KML support.

 

Questions/comments:

· Loading from multiple distributed map services can cause viewing and downloading time differences.

 

This interface has been very nice.

How do all the data delivery portals fit together?  Sam: I will provide the group an outline that came out of a meeting of several groups under the NSDI.  The groups worked to get people to understand the difference between all the portals of the NSDI.  GOS is where those different technologies come together.  The hope is that metadata records for all NSDI data will be placed in GOS.

 

 

Jenny Runyon, U.S. Board on Geographic Names

(submitted by Mary McInroy)

 

The Board on Geographic Names (BGN) was established by Executive Order in 1890 and is the longest-standing standards body in the United States.  The BGN’s mission, in 1890 as it remains today, is to oversee decisions affecting “…geographic names and principles of geographic nomenclature and orthography.”  At first interested only in US entities, the BGN gradually expanded its interests to include foreign names and other areas of interest to the United States, a process that accelerated during World War II.  In 1947, the BGN was recreated by Congress in Public Law 80-242.

 

A listing of BGN membership and organization can be found on their web site at http://geonames.usgs.gov/.  Members of the BGN represent federal agencies concerned with U.S. geographic information, population, ecology, and the management of public lands. 

 

 

The BGN’s Domestic Names Committee (DNC) includes multiple members from the Departments of Agriculture, Commerce, Interior, Homeland Security, the Library of Congress, the U.S. Postal Service, and the Government Printing Office.  The BGN also includes an Advisory Committee on Antarctic Names (ACAN).  Staff support for the DNC and ACAN is provided by USGS.

BGN’s Foreign Names Committee agency members are from the Commerce, State, and Defense Departments, as well as the CIA and the Library of Congress.  This committee includes Advisory Committees on Undersea Features and Extraterrestrial Names, with staff support provided by NGA.

 

The BGN deals with the standardization of names, not their regulation.  Standardizing geographic names and locations prevents incorrect, inaccurate, or contradictory feature data from appearing simultaneously in multiple applications, a circumstance which could have serious and potentially catastrophic consequences in such areas as national security, emergency preparedness and response, site selection and analysis, and all levels of communication.

 

Members of the Domestic Names Committee meet each month at the Department of the Interior in Washington, D.C., to agree on the geographic names to be used in federal products.  The full BGN (Domestic and Foreign Names committees) meets quarterly at USGS.  These BGN decisions on official (i.e., BGN-approved) geographic names and locations are mandatory only for federal products, i.e., they are not binding for state and local governments, although most would agree that names should be consistent throughout all levels of government and the private sector.  Although names and locations may have historical listings or variant spellings, there is only one official geographic name for each feature.  The Geographic Names Information System (GNIS) is the authoritative federal source for official domestic geographic names and locations.  GNIS is searchable online at http://geonames.usgs.gov/domestic/index.html and can be downloaded entirely or in user-selected sections.  The GEONet Names Server (GNS), developed and maintained by the National Geospatial-Intelligence Agency (NGA), is the official repository of foreign place-name decisions approved by the BGN.  The GEONet Names Server, like the GNIS, is cumulative, i.e., name listings are not deleted except in cases of obvious duplication.  Names and locations of man-made features are determined by the authoritative local source and are not subject to formal BGN review and decision; however, their names and locations are recorded in the GNIS and as such are considered official for federal use.

 

To build the GNIS database, beginning in the 1970s, the BGN collected names and locations from the 1:24,000-scale USGS topographic maps, then moved on to U.S. Forest Service visitor maps and NOAA charts.  Beginning in 1982 and continuing today, the BGN is in Phase II of a state-by-state data compilation effort, which involves collecting names from other federal sources, state and local sources, and other current and historical maps and documents (the final two states are expected to be completed in 2010).  In addition, since 2002 the BGN has been conducting Phase IIA, which involves updating names and locations (primarily new structures and cultural features) for the 46 most critical urban areas as identified by NGA for homeland security.  The BGN so far has standardized over two million names in 66 feature classes, i.e., broad categories such as summits, streams, canals, rapids, woods, and populated places.  Cultural features are the fastest-growing part of the database.
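
As an illustration of working with a GNIS extract, a minimal Python sketch follows (the file name and column names are assumptions for the example, not a documented GNIS schema; consult the GNIS download documentation for the actual layout):

    # Illustrative only: tallying a pipe-delimited GNIS-style extract by feature class.
    import csv

    counts = {}
    with open("gnis_extract.txt", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f, delimiter="|"):
            cls = row.get("FEATURE_CLASS", "Unknown")   # e.g. Summit, Stream, Populated Place
            counts[cls] = counts.get(cls, 0) + 1

    for cls, n in sorted(counts.items()):
        print(cls, n)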

 

The BGN works closely with a network of fifty State Geographic Names Authorities (SNAs), which solicit local input and provide recommendations to the BGN on name proposals (new names and name changes).  The SNAs, many of which represent state government agencies, also work closely with their GIS communities and other partners to coordinate names activities and to assist in the GNIS data compilation effort.  Some SNAs consist of one individual in academia, while others are formal boards established by state legislatures.  Several SNAs also serve as their state’s archivist or are affiliated with their state’s historical society.  The BGN is also developing partnerships with many tribal authorities and, in compliance with the Executive Order requiring tribal consultation on matters of interest to the federal government, will seek the input of any interested tribal government on any name proposal it receives.  Several tribes are working closely with the BGN to incorporate names of indigenous significance into the GNIS.

 

The work that BGN does supports, among others, the following federal programs:  Geospatial One Stop (GOS), The National Map, the National Atlas, the National Hydrography Dataset, the National Elevation Dataset, and FGDC standards development.

BGN is currently working with ANSI to make the GNIS Feature ID# the “official code” for the nation.  The GNIS Feature ID# is currently official for the federal government, but establishing it as a national standard would permit its usage throughout both the government and private sector and would create a standard within the international community. 

 

Google Earth currently uses GNIS and GEONet as two of its primary sources for names, although it also gathers names from a number of other non-standardized sources.  The official names issue is not a large problem with US names, but the foreign geographic names used on Google Earth are definitely not standardized.  The BGN is attempting to urge Google Earth to indicate that the BGN is the only official source for these names, and to also allow Google Earth’s users to feed any updates/corrections back to the BGN. 

 

The BGN is an active participant in the international arena, primarily through the United Nations Group of Experts on Geographical Names, and also through its annual geographic names training course, conducted under the auspices of the Pan American Institute of Geography and History.

 

The BGN web site at http://geonames.usgs.gov/ includes a brief history of the BGN, as well as links to GNIS and NGA’s GEONet Names Server for domestic and foreign place names, respectively.  A form to propose or change a domestic geographic name can also be found there.  In addition, the BGN site links to other geographic place name sites for US states and a few foreign countries, as well as other general geographic names sites, e.g., ASU’s “Place Name Servers on the Internet” and the “Fuzzy Gazetteer.”

 

Tim Trainor, Assistant Division Chief for Geographic Areas and

Cartographic Data Products, U.S. Census Bureau, Geography Division
(submitted by Joe Aufmuth)
 
Tim Trainor, Assistant Division Chief for Geographic Areas and Cartographic Data Products, U.S. Census Bureau, Geography Division, began with an overview of presentation topics, which included geographic and cartographic products, a 2010 Census update, a review of geographic programs, a FIPS and ANSI transition update, and a MAF/TIGER system status update.

Geographic Products. Tim informed CUAC that TIGER/Line 2006 Second Edition is available and that the TIGER/Line Shapefile, a new product, will be available in fall 2007. The TIGER format is being sunset, and Census is moving forward with the shapefile format. GML is also being considered as a possible format. He reminded CUAC that two editions of the shapefiles will be available each year, spring and fall.
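
For readers unfamiliar with the shapefile format, a minimal Python sketch of opening such a file with the GDAL/OGR bindings follows (the file name is a placeholder; this is a generic illustration, not a Census-provided tool):

    # Illustrative only: reading a shapefile with the GDAL/OGR Python bindings.
    from osgeo import ogr

    ds = ogr.Open("tiger_sample.shp")             # hypothetical shapefile name
    layer = ds.GetLayer(0)
    print("feature count:", layer.GetFeatureCount())
    defn = layer.GetLayerDefn()
    fields = [defn.GetFieldDefn(i).GetName() for i in range(defn.GetFieldCount())]
    print("attribute fields:", fields)            # attributes carried with each geometry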

Cartographic Products. The March 2007 printing of the 110th Congressional District Wall Map is available through a GPO contract. Large-format maps of Congressional district changes in Georgia and Texas, and individual congressional district (CD) maps, are in progress and will be available on the Census website. The CBSA wall map will not be printed; it has been revised and is available online. The Hurricane Mapping effort (http://www.census.gov/Press-Release/www/emergencies/index.html) has produced a series of maps, both location-based and thematic, and has also led to a special redesign of the traditional census tract reference maps; the redesign produced a simpler and more generalized product. Maps in the Statistical Abstract for 2007 are available.

2010 Census Update. Census 2010 is underway. Letters were sent to 40,000 community leaders, who were invited to participate by sharing their address lists to help revise and check the Census Bureau's address list in preparation for questionnaire mailouts for the 2010 Decennial Census. This Local Update of Census Addresses (LUCA) is a massive operation that has produced a software product available to local governments to aid them in their address list review. It is a "low-level GIS" that includes software and Census data and will also allow local governments to update geographic data. The data will then be sent back to Census for inclusion in its database.

Two address canvassing dress rehearsal sites have been chosen in preparation for 2010, the San Joaquin, Stockton area and 9 counties in North Carolina. The Census Bureau is sending out enumerators road by road to capture housing locations using handheld GPS units. These units also will be used following the mailout of questionnaires in a follow up operation to acquire responses from households that did not return their questionnaires.

The American Community Survey (ACS) is taking the place of the Census long form sample questionnaire. ACS surveys will be published annually for communities with populations greater than or equal to 65,000, as a 3-year average for populations greater than or equal to 20,000, and as a 5-year average for every area down to block groups. The data will be published in accordance with Census Bureau confidentiality and nondisclosure thresholds.
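
Expressed as a simple rule for clarity, the thresholds described above work out as follows in Python (a sketch only, not a Census Bureau tool):

    # Illustrative only: the ACS publication thresholds described above.
    def acs_products(population):
        """Return which ACS estimates an area of the given population receives."""
        products = ["5-year average"]        # published for every area, down to block groups
        if population >= 20000:
            products.append("3-year average")
        if population >= 65000:
            products.append("annual (1-year)")
        return products

    print(acs_products(80000))   # ['5-year average', '3-year average', 'annual (1-year)']
    print(acs_products(25000))   # ['5-year average', '3-year average']
    print(acs_products(5000))    # ['5-year average']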

Geographic Programs. LUCA, the Local Update of Census Addresses, was discussed above. A Statistical Areas Federal Register notice presented a draft 2010 proposal for census geography related to census tracts, block groups, census designated places (CDPs), and county subdivisions. The proposal included geographic criteria to accommodate the ACS by proposing the same minimum population threshold for block groups and tracts; it also adds a housing unit threshold. It also modifies CDP definitions, because some CDPs with no population were reported in 2000. Census County Divisions (CCDs) are proposed for elimination; they were originally offered for states that did not have legal subdivisions of counties so that data would be available for lower levels of geography. Comments received on CCDs indicate an interest in keeping them. Minor Civil Divisions (towns, townships, etc.) will remain unchanged. A final Federal Register notice will specify the final criteria. Separate proposals in 2008 will address Alaska Native Village Statistical Areas and Tribal Statistical Areas.

A pilot project with Montana, the Bureau of Land Management, and the USGS centers on identifying issues for incorporating the spatial data for the Public Land Survey System (PLSS) into the MAF/TIGER System. The goal is to take advantage of the conformance of community boundary data with the PLSS, as this is a valuable land reference system in the Midwest and West. The potential PLSS project is slated as a post-2010 activity.

FIPS and ANSI Update. FIPS is no longer being used as a standard. Tim reminded the group that the Census Bureau is responsible for codes for States, Counties, and Congressional Districts. He also stated that the Census works on behalf of OMB to help with CBSAs and related areas. The overall change is a transition from FIPS to ANSI. Data users have expressed concerns about not being able to sort databases on ANSI codes. As a result, Census is maintaining the 5 digit code for places and county subdivisions (formerly FIPS). Census will carry codes for States and Counties until 2012 and reassess. Formal FIPS and ANSI codes are being used for the 2010 census.

MAF/TIGER System Status. Census is working to realign the road network layer to be more positionally accurate in order to have better spatial relationships with the GPS data collected for each housing unit. Street centerline accuracy will be 7.6 meters, and there are independent checks on positional accuracy. Census has been working on the project for the past 4 years. The number of files completed for the MTAIP is approximately 2,600 counties; the remaining 600 counties will be completed by April 2008. The MAF/TIGER project has been on schedule and on budget since it started. Behind the Census products, a new data model has been completed and is going through adjustment. Legacy TIGER data is migrating to the MAF/TIGER database. The new database is currently supporting 2010 Dress Rehearsal activities. The functions provided by the original TIGER software applications are in development.

Most cartography products being produced by the Census are for field operations to conduct the census and are not intended as public products. A data products group is forming to propose and develop post 2010 products. Census is looking at redesigning the American FactFinder.

Questions. CUAC asked several questions concerning the ACS 5-year data, the future of paper census maps, the Urban Atlas, TIGER-to-Shapefile conversion, and GPO distribution, and expressed appreciation for Tim's work on the Census. In answer, Tim responded that the 5-year ACS data is a floating average of the previous 5 years down to the block group level and that 2010 will be the first release of ACS 5-year data. He noted that high variance is anticipated in the data due to the sample size. He commented that Census will be using ~120 plotters to produce office and field maps and that 2010 Census maps will be available in PDF format; if paper maps are desired, Census has a service to plot and ship them for cost. Census sheets are being designed to reduce the number of maps, resulting in lower cost to the consumer. Mr. Trainor commented that while he would like to redo the Urban Atlas series, there is no plan to do so at this time. He reiterated that the Shapefile product will be available twice per year; boundary files will be adjusted during the ACS cycle for communities that provide boundary changes. Lastly, he commented that there are no plans to convert the historic census data to shapefiles.


 

Richard Huffine, National Library Coordinator, United States Geological Survey

David Soller, Geologist, U.S. Geological Survey, and

Chief, National Geologic Map Database Project

(submitted by Linda Zellmer)

 

David Soller reported on the National Geologic Map Database (http://ngmdb.usgs.gov/ngmdb/ngm_catalog.ora.html), an index to geologic maps for the United States. A graphical search interface using Google Maps is available for selected states. In addition, a new feature shows the number of maps that meet the search criteria. Other improvements include links to download GIS data if it is available, links to a scanned image of the map, and links to the scanned image in the Publications Warehouse. Because not everyone has the required plug-in, the images are also available in a version that does not require one. The USGS is keeping track of the number of times a publication from a particular organization is accessed through the site, so that they and the contributing agencies are able to track use statistics. USGS is willing to share information on what they have scanned with others to eliminate duplication of effort. The site also contains links to all of the Digital Mapping Techniques reports that have been issued since the meetings began in 1997 (http://ngmdb.usgs.gov/Info/dmt/). These reports contain information on the development of digital mapping technology in the Earth Sciences. A new site on standards and guidelines is being developed as well.

 

Richard Huffine, the new National Library Coordinator of the USGS Libraries, presented the agency update for the USGS. He spoke about how the information provision side of USGS is evolving and being managed and the various components of the geospatial information office, which includes the USGS Libraries. Several statements over the last few years indicate that science at the USGS is becoming more integrated, rather than divided between various sub-disciplines (hydrology, geology, biology, etc.).

 

Information services drive a lot of the work at USGS. USGS provides answers to questions via the telephone, e-mail, mail, and even Blackberry. The USGS is developing several information resources in the individual science programs, such as the National Water Information System (NWIS) and the National Biological Information Infrastructure (NBII). Information Services includes people, tools, and processes. People are involved in understanding what users need; tools such as the Publications Warehouse, Frequently Asked Questions, and Science Topics help provide access to USGS information. The Natural Science Network consists of Science Information and Library Services, Knowledge Management, and Information Delivery. Information services include the Library as well as the people who respond to questions via e-mail and the telephone (1-888-ASK-USGS).

Tools are being developed to help USGS manage its knowledge (Knowledge Management). These tools include the Frequently Asked Questions, portals, wikis, and other tools to help USGS collect, manage, and create new information resources. Information Delivery includes the Publications Warehouse, the digitization and scanning efforts, and the USGS Store. USGS is working towards providing access to information via print on demand or digital delivery so that users can decide how to use the information on their own. Information Delivery also includes the USGS web site, which is a distributed network of servers located throughout the country. The web site is being revised and upgraded so that information can be located more readily. One of the new parts of the USGS web site is the Science Topics section, which is based on an organized database and thesaurus so that information resources can be organized and identified more readily. An alphabetic index is also available on the Science Topics site for people who want to browse alphabetically. The USGS is also working with Science.gov so that the thesaurus at USGS works with scientific information from other science agencies. The Frequently Asked Questions database and Ask USGS systems are presently separate, but this may change over time.

 

The Publications Warehouse is still evolving, as is the USGS publications program. The USGS is centralizing publication functions so that the work is done by a centralized group. The Warehouse is still growing, and a version 2 is being developed that will have persistent URLs, better links to documents, and other improvements. It is possible to sort by title, report number, date, and author. The Contents link provides information on the number of items in each series and whether the publication is available online. There have been questions, including from GPO, about why the USGS is serving DjVu; part of the reason is file size. The USGS is working towards providing PDF in addition to the LizardTech formats. They are also working on developing a simple, documented standard for USGS digitization so that the standard can be shared with outside organizations that are thinking of scanning USGS publications. They are working on a digital library plan for USGS that will include all of the publications issued by the USGS during its history.

 

The Geospatial Programs Office works with other government agencies to provide leadership and guidance to the agencies that are developing and providing access to geospatial information. The decision on what to print rests with the Science Programs Office. USGS has a process in place to print maps and will continue to maintain that process as long as there is a process in place to produce the maps. The National Map is taking on much of the function of producing updated maps, and USGS may not continue to update maps as it has done in the past. USGS is in the process of partnering and testing with Delaware and Florida to allow state agencies to update The National Map, so that they contribute the information that would update the quadrangles. USGS will continue to do lithographic printing, but will also be distributing data.

 

The historical scanning project for USGS topographic maps is continuing; however, the primary priority at present is to scan the topographic maps for the southeastern United States before hurricane season begins.

 

 

Dr. John Hebert, Chief of the Geography and Maps Division, Library of Congress

(submitted by Dan Seldin)

 

Library of Congress is a collector of cartographic materials and provider of information.

There is a need for scanning standards that set a floor for resolution that all can work with.  LC G&M is scanning for Congress at 300 DPI.

 

LC has been trying to set up a plan to work with USGS to scan the quads.  No one collection, LC, USGS or NARA, has a complete set of quads.  All three need to work together. 

 

At the Library of Congress, the Geography and Map Division collects maps while the Science and Technology Division collects science materials that complement the maps.

 

On Monday, April 30, 2007, German Chancellor Angela Merkel officially transferred ownership of the Waldseemüller map to the United States at a ceremony at the Library of Congress.  The Library of Congress has had the map in its possession since 2001 and acquired it in 2003, but because it is on the German list of national treasures, it has to be formally transferred to the United States.  John Hébert attended and spoke at an official conference honoring Waldseemüller at the University of Freiburg, Germany on April 17, 2007.  At this conference, the German postal service issued a stamp honoring the map, showing all 12 sheets.  The Library of Congress is working with NIST to create a display case to preserve the Waldseemüller map.  The map will go on display in December 2007.

 

The Geography and Map Division has been in contact with various levels of USGS discussing the periodic archiving of the National Atlas and National Map.  LC would probably take a snapshot every 6 months.

 

Several groups have come to G&M to scan maps.  Academia Sinica of Taipei, Taiwan, has been scanning Chinese maps with a camera.  These are all public domain maps from the beginning.  These scans are being cataloged.

 

Nautical charts are being readied to be moved to Fort Meade, Maryland.  The Division has collected about 120,000 sheets of nautical charts from around the world.  A complete inventory had to be created before the move.  The Division will put the inventory online via the online catalog.  If this is successful, G&M will begin inventorying the set map collection.   Pre-1970 materials are not cataloged and are unknown outside the Division. 

 

The Geography and Map Division has signed an agreement with the Korean National Library to preserve Korean atlases and maps.  They will be scanned and put online.  The project will begin in the summer of 2007 and last 2 years.

 

The Geography and Map Division has scanned 10,000 maps in 10 years.  All the scanned maps have been cataloged.  These have included the Waldseemüller map, the Jedediah Hotchkiss Civil War map collection, and World War II maps.  Copyright has limited the scanning of maps.  A group in Barcelona wanted to have a set of German maps of Spain from World War II scanned; it took 4 months to get copyright permission from Germany to scan this set.

 

In reference, the Division has a project to finish converting the 1981 Sanborn fire insurance map guide to an online version this summer.  The scanned Sanborn maps will be attached to the online guide as the scanning is completed.  The University of Texas and Stanford University want to have a cooperative scanning project for Texas and California Sanborns.  The University of Texas will have a 3-week pilot scanning project in May 2007 with their own people.  Stanford is planning a similar project.  Several other Sanborn scanning proposals have not panned out.  The Universities of Colorado and Florida have scanned their Sanborns.

 

Any maps the Library of Congress scans are in the public domain because they were out of copyright and produced with public funds.  All scanned maps are put on the web.  The scanning priorities are set by the G&M Division’s published cartobibliographies and reader demand.

 

LC G&M is acquiring 19th-century county atlases on eBay and encapsulating and post-binding them.  In the process, they are being scanned.

 

 

Dr. Brett Abrams, Electronic Records Archivist,

National Archives and Records Administration

(submitted by Clara McLeod)

 

Dr. Abrams began his discussion by reviewing NARA’s mission and stating that he would focus his remarks on describing what activities NARA had been involved in over the last year.  In reviewing NARA’s mission, he reiterated that NARA, as an archival agency, is still concerned with the preservation of the “original,” which includes geospatial data. He noted that the mission of NARA remains to assist all federal agencies in managing their records, preserving those of “enduring” value during designated retention periods, and assuring that the value of the records is retained.  Dr. Abrams stated that the following three initiatives were targeted for last year’s focus: (1) work with the Open Geospatial Consortium (OGC) on developing application schemas and archival profiles using GML and the Simple Features Profile; (2) working with the Geospatial One-Stop Portal Community to assure access to the historical collections, which is a collective goal of NARA, LC, and others; and (3) increased scanning of historical maps and working toward digitization issues and concerns.  He reported that significant progress had occurred in the first two areas.

 

On the first initiative, he noted that the Historical Data Working Group/FGDC that he chairs succeeded in getting a proposal taken to the Open Geospatial Consortium (OGC) to develop a data preservation working group within its technical committee (OGC TC), and the proposal was accepted.  This created the Data Preservation Working Group (DPWG) of the OGC, which NARA joined in March 2007.  The first meeting of this group was held April 17, 2007.  He further explained that the goal of the OGC TC is to get private industry, international and national government agencies, state and local governments, and universities involved in developing open standards related to geospatial information and to determine what current level of interest exists among the OGC TC members.  Brett stated that the second issue here is a source of funding for this initiative.  Dr. Abrams suggested that an opportunity exists here for universities and the groups that CUAC represents to work with the DPWG.  A GML standards body already exists in the Technical Committee.  The question is how to continue progress in achieving universal geospatial standards, looking at what currently exists: GML, the Simple Features Profile, the Spatial Data Transfer Standard (SDTS), and the FGDC Content Standard for Digital Geospatial Metadata.

 

He mentioned that the electronic records geospatial holdings now include: the Fish and Wildlife Service’s Wetlands Inventory and Wildlife Refuges files; the Forest Service’s Fire Management Maps; the Bureau of Land Management’s Forest Inventory Operations, Oregon; and the Bureau of the Census’s Topologically Integrated Geographic Encoding and Referencing (TIGER/Line) files, 1990 and 1992 issuances, and Geographic Base File/Dual Independent Map Encoding (GBF/DIME) file, 1980.  The TIGER/Line and geographic base files are in ASCII flat-file format, and some data is in shapefile format; the latter was chosen because published specifications for the shapefile format currently exist.  NARA has also endorsed SDTS and the current version of GML with the Simple Features Profile for maintaining the data.  He reiterated that only USGS uses SDTS at this point; that GML and the Simple Features Profile are not currently robust enough to maintain topology; and that the problem remains that the information cannot be maintained as a bundle, so it is available only from the sites where it is stored, a basic reliance on external access that is not valuable for archival purposes.  He stated that the NWME (Custodial Division) continues to gain experience with various types and formats of records.
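
As one illustration of why a published specification matters for archiving (a sketch, not part of the presentation), the 100-byte header of a shapefile’s main (.shp) file is fully documented, so a byte-level reader can be written from the specification alone with Python's standard library. The file name below is hypothetical.

# Sketch: read the fixed 100-byte header of a shapefile main (.shp) file,
# following the published ESRI Shapefile Technical Description.
# "example.shp" is a hypothetical file name.
import struct

with open("example.shp", "rb") as f:
    header = f.read(100)

file_code = struct.unpack(">i", header[0:4])[0]      # big-endian; should be 9994
file_length = struct.unpack(">i", header[24:28])[0]  # length in 16-bit words, big-endian
version = struct.unpack("<i", header[28:32])[0]      # little-endian; should be 1000
shape_type = struct.unpack("<i", header[32:36])[0]   # e.g. 1 = Point, 3 = PolyLine, 5 = Polygon
xmin, ymin, xmax, ymax = struct.unpack("<4d", header[36:68])  # bounding box, little-endian doubles

print(f"file code {file_code}, version {version}, shape type {shape_type}")
print(f"bounding box: ({xmin}, {ymin}) to ({xmax}, {ymax})")
print(f"file length: {file_length * 2} bytes")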

 

The second initiative cited by Dr. Abrams was the development of NARA’s portion of the Historical Collections Community (HCC) on the Geospatial One-Stop portal (GOS), working with LC. Dr. Abrams stressed that the organizations represented by CUAC can also be involved here.  He said that the site would benefit from greater participation by our institutions or organizations establishing communities or links to GOS.  Links could be just to the descriptive information (which is what NARA currently does), to catalogs, or to the data and maps.  He then demonstrated the HCC on GOS at the website geodata.gov.
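
For context (an illustrative sketch, not material from the presentation), the “descriptive information” carried by such a link is typically an FGDC Content Standard for Digital Geospatial Metadata record. A minimal fragment of such a record, built with Python's standard library, might look like the following; the originator, title, and abstract are hypothetical values.

# Sketch: a minimal FGDC CSDGM-style metadata fragment assembled with the
# standard library.  Element short names (idinfo, citation, citeinfo, descript)
# follow the FGDC content standard; the values are hypothetical.
import xml.etree.ElementTree as ET

md = ET.Element("metadata")
idinfo = ET.SubElement(md, "idinfo")
citeinfo = ET.SubElement(ET.SubElement(idinfo, "citation"), "citeinfo")
ET.SubElement(citeinfo, "origin").text = "Example University Map Library"  # hypothetical
ET.SubElement(citeinfo, "pubdate").text = "2007"
ET.SubElement(citeinfo, "title").text = "Scanned County Atlas Collection"  # hypothetical
descript = ET.SubElement(idinfo, "descript")
ET.SubElement(descript, "abstract").text = "Digitized historical maps with links to online images."

print(ET.tostring(md, encoding="unicode"))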

 

In discussing the third initiative, Dr. Abrams noted that the scanning of historical maps was an area that had not seen much movement last year.  It requires critical involvement from all stakeholders in pooling knowledge, effort, and resources; NARA continues to scan slowly.  He mentioned that this initiative is more closely related to some of the activities occurring within the library mapping community.  It also relates to his past contact with CUAC exploring the idea of jointly sponsoring a conference, similar to the Maps in Transition conference held in 2005 at the Library of Congress, that would address issues of archiving and digitization and bring together a community of stakeholders; this might also be accomplished through seminars in various parts of the country. He stated that it is still an objective to promote awareness of the historical dimension of geospatial data, much of which has been financed in whole or in part with federal funds.  He stressed the importance of facilitating the maintenance of historically valuable geospatial data and making it available to future generations.

 

Dr. Abrams concluded his presentation by suggesting that we go to www.fgdc.gov and look at the working groups that are open for membership, and he reminded us that the site provides libraries with materials on the various topics of preserving, archiving, and accessing geographic and geospatial data.  Participation in the discussion groups there would be valuable for the mapping community.

 

Questions asked following the presentation included:

  • How do we go about contributing our material to the HCC in GOS? Will your staff accept URLs to our locations? What is the process to follow if we wanted to contribute material? Would you need to give us login and password information?

GOS can accept institution-specific materials.  The metadata for a linked site would (or could) be developed by NARA.  In conjunction with GOS, Dr. Abrams said that if someone told him what they would like to do and what kind of material was involved, specifics could be worked out. One thing to consider when submitting data is positional accuracy, since not everyone creates data in the same way. Will a standard exist for this? No.  NARA would be responsible for its own data and metadata.

 

  • What does NARA want from the mapping community?

NARA hopes that as institutions (agencies) develop standards, they will find a way to communicate their work, the best practices, to NARA, so that NARA will have something to spearhead. The objective is to gather the best worked-out ideas and promote them as such. The question remains that we first have to discover what standards we are talking about: for GeoTIFFs, digital materials, and so on. What NARA is attempting to do is provide some guidelines or standards, or something along that line. An example might be the Library of Congress working with other interested parties and coming up with scanning guidelines for historic maps: archive the original and keep a copy available as it exists now, so that the copy can be rectified, put into a GIS, and worked with from that standpoint.

 

  • What are agencies now doing when they approach NARA?

There appears to be consideration of an initiative to figure out how to enrich the digitization process so that more things, and a wider variety of things, get digitized. There is outreach, but it is outreach to organizations and small companies to digitize some of the materials that NARA has. There are agencies that have come to NARA wanting NARA to take some of their older material and digitize it, with the process then providing NARA with a reference copy. But in that respect it is something of a duplicative effort, because I imagine that NOAA and some of these other agencies would probably put that material up on their own sites. So, that is where we are now.

 

  • If universities have preservation projects and are more than happy to contribute their URLs to those locations, and/or are looking for backup and storage of the information, will NARA be willing to accept it, given that it is something another institution has done?

No. I would imagine that you would be responsible for your own data, and we would be responsible for accountability and other issues like that. One of the things about the metadata is that that kind of material is obviously described there. This particular portal might not link to the data itself, but it will provide a search mechanism for locating it so that it can be linked to; the data will be stored somewhere else, where it may be accessible.

 

  • What is being done with the comments or suggestions received from groups (North Carolina, the University of California, Stanford) that have projects funded by NARA or LC dealing with the issue of archiving geospatial data itself: trying to figure out formats, the GML option, and open-source standards?

NARA’s funding related to geospatial data has been minimal. We have worked with the San Diego Supercomputer Center, which has taken existing data and built a GIS version with it. What they are doing right now is working with Vancouver city geospatial data and trying to figure out archiving issues related to a live system.

 

  • Is there a membership fee for joining the OGCTC?

Yes.  In order to participate you have to be an OGC member, and to be a member you have to pay dues.

 

  • What is happening with the geospatial line of business and all these business models you had to write for them?

There will eventually be an RFP for a program management office, which is going to run the development of the common solution and target architecture. I believe some of the written documents might already be public; I don’t know whether that is true or not. The initial steps toward funding have been taken, and the next level is trying to bring together all the parties covered by Circular A-16.  From NARA’s perspective, there will be some form of records management built into the architecture of the system. Just postulate for a minute about the machine being able to tag various things: to say that this data set or this set of records will be stored temporarily for 20 years, or 50 years, or permanently stored in this location and not sent to the archives, or sent to some other location after a 25-year period.

 

  • How do you get involved in the historical data working group?

You can just send me (Dr. Abrams) your information and we’ll put you into the group. There are many working groups, including ones for geospatial data, aerial photography, digitization efforts, and paper maps, as well as groups working on questions about material formats and the best practices for these particular sets of information.

 

  • How can we support you as an organization?

All organizations can write letters that would basically state your interest in pursuing this activity (standards or guidelines) and a commitment to attending workshops, conferences, or seminars on the subject to get the goal accomplished.  This information will be taken to NARA’s chief information officer, who will share it through appropriate channels. Citing the need for useful standards or guidelines from NARA concerning geospatial archiving, we could request that NARA take the initiative in organizing a conference or a meeting to talk about these issues, so that some sort of guidelines for standards would result. This would allow presentations by those involved to share their experiences, and we could begin to tackle the question by example. The other issue is that support in the form of funding for this initiative is also needed, and international involvement should be expected.

 

In summary, there is still much work to be done in the realm of geospatial archiving.  There are currently no particular standards or guidelines for geospatial archiving, and the need still exists for a platform that can deal with any software or operating system.  CUAC would like NARA to coordinate the activities of other agencies that are also interested in geospatial archiving so that guidelines can be developed.  Another possibility is that NARA could develop a common location (a repository) for storing foundational material, so that everyone is aware of what work is being done and knowledge about ongoing and past projects can be more easily disseminated.  NARA needs support from the mapping community in its quest to get funding to initiate activities in archiving geospatial data, including locating ongoing projects, sponsoring presentations by those engaged in these activities, and organizing conferences to bring different types of organizations together.

 

Written Agency Reports Submitted

 

Donna Heimiller and Pamela Gray-Hann, Department of Energy,

National Renewable Energy Laboratory

(submitted by Anita Oser)

 

NREL's GIS holdings are focused on renewable resource datasets. Currently our FTP site (http://www.nrel.gov/gis) has geographic shapefiles of annual wind power class (for 35 states and an older national assessment), annual and monthly solar resource at 40 km and a newer 10 km resolution (direct normal and tilt=latitude collector), and biomass resource. We also provide access to 11 stand-alone Geospatial Toolkits that have been created for international projects to provide those countries with some limited GIS querying capability. These toolkits include renewable resource, infrastructure, and other base data for the country as part of the installation package.
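
For users who want to inspect one of these shapefiles without GIS software (an illustrative sketch, not part of NREL's report), the attribute values such as wind power class travel in the shapefile's .dbf companion file, whose dBASE-format header is also publicly documented. The file name below is hypothetical.

# Sketch: list the attribute fields of a shapefile's .dbf companion file using
# the documented dBASE III header layout.  "wind_power_class.dbf" is hypothetical.
import struct

with open("wind_power_class.dbf", "rb") as f:
    head = f.read(32)
    n_records = struct.unpack("<I", head[4:8])[0]    # number of records, little-endian
    header_len = struct.unpack("<H", head[8:10])[0]  # header size in bytes
    record_len = struct.unpack("<H", head[10:12])[0] # size of one record

    # Field descriptors are 32 bytes each and end with a 0x0D terminator byte.
    fields = []
    descriptor_bytes = f.read(header_len - 32)
    for off in range(0, len(descriptor_bytes), 32):
        if descriptor_bytes[off:off + 1] == b"\r":
            break
        desc = descriptor_bytes[off:off + 32]
        name = desc[0:11].split(b"\x00", 1)[0].decode("ascii", "replace")
        ftype = chr(desc[11])
        flen = desc[16]
        fields.append((name, ftype, flen))

print(f"{n_records} records, {record_len} bytes each")
for name, ftype, flen in fields:
    print(f"  {name}: type {ftype}, width {flen}")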

 

There are other datasets that can be provided upon request, but aren't distributed on the FTP site. Some of these datasets require review of need and management approval before they can be sent. These include the original raster power density datasets that the wind power class shapefiles are created from; supplemental/unvalidated wind speed and power information for different heights above ground and time scales; wind measurement data; and solar modeled hourly values.

 

For users who don't have GIS capabilities, our latest internet map server (IMS) site, the "United States Atlas of Renewable Resources," is a dynamic map that allows the user to view solar, wind, biomass, and geothermal resources along with other reference layers such as counties, places, federal lands, etc.  This site is still under development but can be accessed through NREL's http://www.nrel.gov/gis/ web page.

 

 

 
