General Meeting - Annual Federal Agencies Presentations




May 1 & 2, 2003


U.S. Government Printing Office

CUAC Representatives
Paige Andrew, Pennsylvania State University SLA
David Deckelbaum, University of California, Los Angeles WAML
Mike Furlough, University of Virginia ALA/MAGERT
Donna Koepp, Harvard University ALA/GODORT
Mary McInroy, University of Iowa ALA/GODORT
Clara P. McLeod, Washington University, St. Louis GIS
Daniel T. Seldin, Indiana University NACIS
Wangyal Shawa, Princeton University ALA/MAGERT
Christopher J. J. Thiry, Colorado School of Mines WAML
Linda Zellmer, Indiana University GIS

Agency Presenters
Gil Baldwin, Director, Library Programs Service, Government Printing Office
John Hebert, Chief, Geography and Map Division, Library of Congress
Connie Beard, U.S. Bureau of the Census
Jim Lusby, Disclosure and Release Division, National Imagery & Mapping Agency
Carol Brandt, GIS Program Manager, Bureau of Transportation Statistics
Doug Vandegraft, Chief Cartographer, Division of Realty, U.S. Fish and Wildlife Service
Frank Beck, U.S. Geological Survey/Federal Geographic Data Committee
William Effland, Natural Resources Conservation Service, U.S. Department of Agriculture

Betty Jones, Government Printing Office
Jim Flatness, Library of Congress
Jennifer Davis, Government Printing Office
Vi Moorhouse, Government Printing Office
Patricia DuPlantis, Government Printing Office
Robert Morris, Library of Congress
Nick Ellis, Government Printing Office
Lawrence Woodward, Government Printing Office

May 1, 2003
CUAC Co-chairs Dan Seldin and Mike Furlough called the meeting to order and welcomed the attendees.

Government Printing Office
Gil Baldwin, Director, Library Programs Service

Mr. Baldwin welcomed CUAC to GPO and assured attendees that his staff would be available throughout the two days of meetings to help make them comfortable and productive.

In December 2002, the Bush Administration appointed a new Public Printer, Bruce James, who was confirmed by the Senate. Originally from Nevada, Mr. James has an industry background and brings an entrepreneurial spirit and a business approach. His staff is working on a two-year cycle of change in three phases. To some extent all three phases are ongoing, but in most respects they are in the fact-finding phase, with many pilot projects, discussions with different communities, and exploration of various products and services. The next phase is developing consensus on what the future will look like and getting input from all communities on a strategic plan. The final phase will be implementation.

Judith Russell has been appointed Superintendent of Documents. Judith spent several years at GPO before her years at NCLIS and has now returned as the first woman Superintendent of Documents.

Mr. James is very business oriented and is focused on the future and is externally directed. It is clear that the future is not going to be printing. The future is information dissemination. In the beginning, GPO Access was very much driven by paper products that were available digitally. They are now focused on born digital information and have become an information dissemination agency.

Mr. James has appointed William H. Turri Deputy Public Printer and Chief Operating Officer, in charge of Innovations and Partnerships. This is a broader program than the traditional partnership initiative that LPS has had ongoing for several years.

GPO currently employs about 3,100 people. The Library Programs Service has a staff of 108. Most of these are librarians, many of them catalogers, but there are also librarians who are managers and program analysts. There are many more professionals than there used to be, with only about 35 blue-collar workers in LPS.

They are in the process of selecting an integrated library system, and have been in the evaluation phase for the past 6 months. This phase is being directed by professional consultants who have been extremely helpful. They are currently in the contract development phase, working with Ex Libris and PTFS in partnership. They have not yet awarded a contract, but they hope to do so by the end of May.

The new Recommended Specifications for Public Access Workstations in Federal Depository Libraries have been developed based on what LPS sees coming from federal publishers. They represent middle-of-the-road technology rather than the bleeding edge. Mr. Baldwin asked CUAC for input on these recommendations. Cindy Etkin, who is responsible for developing the specifications, will come to the meeting later.

Bonnie Trivizas, Chief of the Library Division has retired and Sheila McGarr is returning from the Department of Education Library to fill Ms. Trivizas's position.

The transition from paper and fiche to electronic has been progressing for many years. Today, two-thirds of the distribution is online electronic format. One-fourth of the remaining tangible products are maps.

OMB issued a directive to executive agencies allowing them to solicit bids from commercial printers rather than printing documents through GPO. This has reduced GPO's sources of information, even though Congress opposed the directive. This was one of Mr. James's first orders of business when he started; when he took over, he spoke with Mitch Daniels of OMB about the issue. It is the public who loses when printing does not come through GPO, because the information then does not get sent out to libraries. Fully 85% of the printing done through GPO is done by outside contractors.

Cataloging staff has been increased by six. They are trying to determine what data and information products will be coming through the program so they will know whether staffing is appropriate. There is a lot of training going on now, both for the new electronic medium and for the new integrated library system.

Two new formats came through the program in the past year: the audio e-book and the mini CD-ROM. This may not indicate a trend, but they were something different that required cataloging.

Several new channels are now available for communicating with GPO. There is the FDLP-L listserv; to sign up, go to the GPO homepage, click on the listserv link, then the listserv archive, and register there. Instructions are also in Administrative Notes. Also available are askLPS (askLPS@gpo.gov) and lostdocs@gpo.gov. All of these sources of assistance from GPO are available to everyone, and depository libraries are encouraged to use them. LPS is also in the process of acquiring help desk software, which will be available in the next few months.

The Interagency Depository Seminar will be held later this month at GPO. This is especially geared towards new government documents librarians. In October the Federal Depository conference will be held in D.C. There will be informational and instructional programs as well as a continuation of the discussion on the future direction of the FDLP.

There is a new program at NARA, Access to Archival Databases (AAD), which will assure the digital archiving of all congressional and regulatory publications.

GPO’s digital archive harvests digital-only data. This is done through their open archives server as well as through partnerships, such as the digital archiving project with OCLC, and they are investigating the possibility of including digital management in their ILS contract.

Among their partnerships are one with the Department of Energy's Office of Scientific and Technical Information for permanent public access to all fiche and online data, and one with the University of Illinois at Chicago for the Foreign Affairs Network of the Department of State.

In response to a question about archiving of publications that are sent out electronically directly from an agency and not through the FDLP, Mr. Baldwin asked that we let LPS know about these cases so that the information can be captured and access can be provided through FDLP.

It was pointed out that CD-ROM products were being cataloged from the cover information instead of from the metadata contained on the CD-ROM. This was noted by the GPO catalogers in attendance.

A question was asked about how broken links on the Web are dealt with. Mr. Baldwin explained the PURL system. Broken links are discovered by an automated system, but the investigation that needs to be done to repair the link has to be done by a person. Broken links should be reported to askLPS@GPO.gov.
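The workflow described above, automated detection followed by human repair, can be sketched in a few lines. This is an illustrative example only; the URLs and the crawler that would supply the HTTP status codes are hypothetical, not GPO's actual system.

```python
def find_broken_links(results):
    """Given (url, http_status) pairs from a link-checking crawler, flag
    the links that need human investigation (4xx/5xx responses)."""
    broken = []
    for url, status in results:
        if status >= 400:
            broken.append((url, status))
    return broken

# Simulated crawl results; in practice a crawler would fetch each URL.
crawl = [
    ("http://example.gov/report.pdf", 200),
    ("http://example.gov/moved.html", 404),
    ("http://example.gov/server-error", 500),
]
for url, status in find_broken_links(crawl):
    # The automated pass only flags; repairing the PURL target is manual.
    print(f"{status}: {url} -> refer to a person for repair")
```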

Council had several cataloging questions. The backlog will be resolved with the increase in the number of catalogers and the assignment of an assistant to help Vi Moorhouse with map cataloging. It has been about 18 months since LPS lost four catalogers, and it has taken this long to bring everyone up to speed. There was some discussion about Antarctica maps and how they should be classified; that was also resolved and should be completed shortly. It was agreed that subject headings could be added for the counties on the Forest Service topos.

In response to questions about CRADAs (Cooperative Research and Development Agreement), Mr. Baldwin explained that when GPO finds out an agency has established a CRADA with a company, GPO contacts the agency and either makes a competing offer or merely explains that the agency is still responsible for getting data to the public. Agencies now are under much pressure to get their information out and still remain solvent. (Minutes submitted by Donna Koepp)

Library of Congress
John Hebert, Chief, Geography and Map Division

John Hebert began with a brief update of recent activities in the Division. The Library has entered into its final year of its agreement with the German Prince Johannes Waldburg-Wolfegg regarding the Waldseemüller Map. The map is a one-of-a-kind from 1507; it is the first published map to use the word “America.” The Library of Congress has given $6.5 million of the $10 million owed to the Prince. The Library is in negotiation with the Discovery Channel for the remaining $3.5 million. The Channel is also considering making a 30-hour program using many of the maps from the Division.

G&M added 3 new catalogers; 2 filled vacant positions. Two new cartographers will be hired soon; their job will be to use GIS to create maps for Congress. These maps will not be available to the public because they are produced specifically for Congress. The Division has put out notices for participants in its Summer Program; it is unknown how many people will attend. Last summer, 2 people from Native American colleges worked in the Division, and a Chinese professor helped analyze the Division's pre-1900 Chinese maps. Currently, G&M is working with a group from Japan that is interested in scanning a set of older Japanese maps; 160 of the maps in this set are found nowhere in the world other than the Division.

The Division’s website has recently added images of maps from WWII and the Lewis and Clark Expedition. The Library will soon be opening an exhibit on the latter topic; a third of the items in the exhibit will be maps. On September 18, 2003, LC will host a conference on Lewis and Clark.

The Phillip Lee Phillips Society recently met in Texas.

There are several large scanning projects going on or planned within the Division. The Chief noted that when items are scanned by the Division, the items are also cataloged. The first project will scan the Vietnam and India 1:50,000 maps. Second, the Division has entered into a contract with Readex where they will scan older maps in the Serial Set; Readex will use Donna Koepp’s index as a reference when selecting the materials. The scans will be made available on LC’s website and will be in the public domain. Readex will sell access to the scanned accompanying materials in the Serial Set.

The move to LC’s new Integrated Library System (ILS) has caused problems with the scanned image display software. Owing to changes in the MrSID licensing structure that may cost LC more money, LC is considering translating its files to JPEG2000 format.

The project to scan the Division's collection of Sanborn maps has fallen apart because Sanborn (who were to pay to have the maps scanned) wanted to re-copyright the maps, even those already in the public domain. Because of this, G&M is examining other ways to scan its 250,000 Sanborn sheets that are in the public domain.

The Chief informed CUAC that items from the former Soviet Union and Soviet Bloc that were thought to be in the public domain might not be.

G&M continues to talk with NIMA about cooperative cataloging. G&M catalogs more items, but NIMA catalogs to the sheet level for sets.

The Division is going to buy some new scanners; they will be able to scan items 2 feet by 5 feet. They are also attempting to purchase a top-mounted scanner, which would be used for atlases. G&M wants to hire a scanning technician, someone who would be responsible for the scanners but not the cataloging. Congress has given LC $5.5 million to work with NARA on digital preservation. (Minutes submitted by Christopher J. J. Thiry)

U.S. Bureau of the Census
Connie Beard, Cartographic Operations Branch
Connie Beard of the Census provided an update on recent map products and the progress of the MAF/TIGER modernization activities at the U.S. Census Bureau.

The recent Census products include maps, data and LandView.

Map Products:
The map products include digital maps on the web, DVD/CD-ROM, printed report maps, and printed wall maps.

Digital Maps:
All the large-format digital maps of Census 2000 are available on the web, and some of them are available on DVD/CD-ROM, as listed below:
* Census Tract Outline Maps (Census 2000)…1 DVD – Available Now
* Entity Based Census 2000 Block Maps…6 DVDs – 1 Available Now, 5 Coming Soon
* American Indian/Alaska Native/Hawaiian Home Lands (Block Maps, Tract Maps & AIANA Wall Map)…1 DVD – Coming Soon
* Recreated 1990 Block and Census Tract/BNA outline maps to fit with 2000 Block and Census Tract/BNA boundaries. These maps were created using the same software as the Census 2000 maps. The outline maps were saved as PDF files; they are available on the Internet now and will later be made available on DVD.

Printed Report Maps:
The printed reports include the Summary Population and Housing Characteristics reports (PHC-1 and PHC-2). All the printed report maps are accessible on the Internet at http://www.census.gov/prod/cen2000/index.html. These printed report maps include state and county outline maps, county subdivision maps, and tribal subdivision maps. The PHC-3 report will come out late in the summer; it will include state-based Metropolitan Area maps showing the 1999 OMB definitions of Metropolitan Areas that were in effect for Census 2000, and state-based urban area maps that show the location and name of the urbanized areas and urban clusters for each state. The large-format urbanized area and urban cluster outline maps are available on the Internet in PDF format. The Census is planning to put these maps on DVD later.

The Census Bureau is currently making the 1% sample (Super-PUMA) maps available on its web page, and later on DVD/CD-ROM. At the end of the summer, 5% sample data maps will be made available on the web. The Census has also made individual state profile maps and information available on its web page.

Printed Wall Maps:
The following printed wall maps are available on the Census web page:
* The American Indians and Alaska Natives in the United States, delineated for Census 2000
* The 108th Congressional District maps

The Census is also in the process of making wall maps of individual Congressional Districts and state-based Congressional District outline maps.

Cartographic Boundary Files:
The generalized boundary files for all levels of Census geography from Block Groups and above are available on the Census web page (http://www.census.gov/geo/www/cob/index.html). These files have recently been re-generated so that they will integrate vertically in a GIS. The boundary files are available in the following formats:
* ArcView Shapefile
* Arc/Info Coverage Export (.e00)
* Arc/Info Ungenerate (ASCII)
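Of the formats listed above, Arc/Info Ungenerate is plain ASCII and simple enough to parse directly. The sketch below is a simplified reader, assuming the common layout of a feature-id line, coordinate lines, a record-ending END, and a final END closing the file; real files may also carry label points or multi-part features.

```python
def parse_ungenerate(text):
    """Parse Arc/Info "ungenerate" ASCII into a dict mapping feature id
    to a list of (x, y) vertices. Simplified: records end with END, and
    a trailing END closes the file."""
    features = {}
    current_id = None
    coords = []
    for raw in text.splitlines():
        line = raw.strip()
        if not line:
            continue
        if line.upper() == "END":
            if current_id is None:   # second END in a row: end of file
                break
            features[current_id] = coords
            current_id, coords = None, []
        elif current_id is None:
            # First token of a record is the feature id (an id line may
            # also carry a label point after the id).
            current_id = line.split()[0]
        else:
            x, y = line.split()[:2]
            coords.append((float(x), float(y)))
    return features

sample = """1
-95.00 38.00
-95.10 38.05
-95.20 38.00
END
END
"""
print(parse_ungenerate(sample))
```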

What’s New (http://www.census.gov/geo/www/maps/index.html) is a good place to check for these products on the web.

The Census is developing LandView version 5, which integrates EPA data, Census data, and the USGS Geographic Names Information System. This version of LandView will be a depository item. For more information on the LandView 5 product, call 301-763-4636.

The MAF/TIGER modernization:
The main goals of the MAF (Master Address File)/TIGER modernization are to replace the old TIGER database system with an open commercial database system such as Oracle and to implement a more flexible, object-oriented development environment. Another objective is to merge the existing separate databases, such as MAF, TIGER, and GEOCAT, into a single integrated database to improve the functionality of the MAF/TIGER system. In addition, the Census is working on improving address and map coordinate accuracy.
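The integration goal, several stand-alone databases folded into one relational store that can be queried jointly, can be illustrated with a toy example. The schema and data below are invented for illustration and bear no relation to the Bureau's actual MAF/TIGER design; SQLite stands in for a commercial database such as Oracle.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Formerly separate stores (address list vs. street features),
# now tables in one integrated database. Columns are hypothetical.
cur.execute("CREATE TABLE addresses (addr_id INTEGER, street TEXT, zip TEXT)")
cur.execute("CREATE TABLE streets (street TEXT, tiger_line_id INTEGER)")
cur.executemany("INSERT INTO addresses VALUES (?,?,?)",
                [(1, "MAIN ST", "20233"), (2, "OAK AVE", "20233")])
cur.executemany("INSERT INTO streets VALUES (?,?)",
                [("MAIN ST", 1001), ("OAK AVE", 1002)])

# Integration pays off: one query joins what used to need two systems.
rows = cur.execute(
    """SELECT a.addr_id, a.street, s.tiger_line_id
       FROM addresses a JOIN streets s ON a.street = s.street
       ORDER BY a.addr_id"""
).fetchall()
print(rows)   # [(1, 'MAIN ST', 1001), (2, 'OAK AVE', 1002)]
```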

This MAF/TIGER modernization program will improve the effectiveness and lower the cost of the 2010 Census, the ACS, and many other Census products. (Minutes submitted by Wangyal Shawa)

National Imagery & Mapping Agency
Jim Lusby, Disclosure and Release Division

Jim Lusby began by reporting that policies regarding public release of NIMA products had not changed in the past year. In the wake of the wars in Afghanistan and Iraq, and ongoing security fears, there are still questions and concerns in the federal government about the types of data that can be released to the public. However, Mr. Lusby noted that NIMA has not withdrawn anything from circulation, except during an initial review period following September 11, 2001.

As an organization, NIMA is in a period of uncertainty, especially with regard to its role since the formation of the Department of Homeland Security. As a matter of federal law, the Defense Department cannot operate inside the United States, but NIMA assists other agencies that take the lead in protecting the United States. Many of the agencies that have cartographic products and needs have been absorbed into Homeland Security. Mr. Lusby acknowledged that a name change for the agency is in the works: the National Imagery and Mapping Agency will become the National Geospatial-Intelligence Agency, or NGA.

Although Mr. Lusby announced last year that he was no longer responsible for customer operations, it has taken some time to find another person at NIMA who can serve as liaison to the map user community. Mary Ford will take on the role Mr. Lusby held prior to September 11, including interaction with GPO. Ms. Ford was unable to attend this year's CUAC meeting owing to prior commitments, but she will attend future meetings. Mr. Lusby promised to train her in the needs of the map user community.

Mr. Lusby commented on some upcoming releases, including some international series of maps, notably covering Peru, Central America, and parts of Africa. The recent release of maps covering Iraq prior to the war was an effort by NIMA to get a common base of information distributed to the media, the public, and internal customers before the war began. He also referred to a series of posters re-printing historical maps from the 19th and 20th centuries. Both these maps and the maps of Iraq are available for public sale through the USGS websites. The NIMA homepage has a list of large-scale products for sale (http://www.nima.mil).

Shuttle Radar Topography Mission (SRTM) data is currently under release and will be completely distributed soon. The U.S. public has access to DTED-1 and DTED-2 level data (3 arc-second and 1 arc-second postings) and can obtain the data through the USGS EROS Data Center web sites. Most of the United States has been processed. Free downloads up to a file size limit are available, with purchase options for large quantities of data.
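As a rough guide to what such arc-second postings mean on the ground, the spacing can be converted to meters with simple spherical-Earth arithmetic. This is an approximation for illustration only; real DTED tiles also thin their longitude sampling at high latitudes.

```python
import math

def arcsec_ground_meters(arcsec, lat_deg=0.0):
    """Approximate ground spacing of an `arcsec` posting, assuming a
    spherical Earth of radius 6,371 km. East-west spacing shrinks with
    the cosine of latitude; north-south spacing does not."""
    meters_per_degree = 2 * math.pi * 6_371_000 / 360
    ns = meters_per_degree * arcsec / 3600
    ew = ns * math.cos(math.radians(lat_deg))
    return ns, ew

# At the equator, a 3 arc-second posting is roughly 93 m;
# a 1 arc-second posting is roughly 31 m.
print(arcsec_ground_meters(3))
print(arcsec_ground_meters(1, lat_deg=45))
```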

Mr. Lusby clarified that public sale maps could be made available through the FDLP program, but understood that participating libraries had not yet been surveyed regarding which of these series they wished to collect. Mr. Lusby suggested pursuing the matter with the GPO representatives to get the maps into the distribution channels. (Minutes submitted by Mike Furlough)

Dan Seldin adjourned the meeting until Friday morning, May 2, at 9:00 a.m.

May 2, 2003
Dan Seldin brought the meeting to order.

Bureau of Transportation Statistics
Carol Brandt, GIS Program Manager
Carol Brandt has been at BTS since 1995 and previously worked at the Census Bureau and the Defense Mapping Agency.

The Bureau of Transportation Statistics is one of ten operating "administrations" within USDOT (the Coast Guard and the Transportation Security Administration were recently moved to the Department of Homeland Security). USDOT creates and maintains transportation-specific spatial data for highways, railroads, transit systems, airport facilities and airspace, and intermodal facilities. USDOT spatial applications take the form of Internet mapping applications, transportation modeling, remote sensing and imagery, and various spatial and cartographic products and data in both hard-copy and digital formats.

Non-BTS spatial data efforts of the other administrations within USDOT mentioned by Ms. Brandt were:
* FHWA – Federal Highway Administration maintains National Highway Planning Network (NHPN), spatial data depicting the National Highway System. The FHWA collects Highway Performance Monitoring System Information from the States and uses spatial modeling to create representations of flow of traffic over the highway system.
* NHTSA – National Highway Traffic Safety Administration is currently developing better means, including geocoding, for identifying accident locations for the Fatal Accident Reporting System (FARS).
* FAA – Federal Aviation Administration creates and maintains aeronautical charts for navigation. FAA is moving to more digital information with increased focus on 3-D modeling.
* FTA – Federal Transit Administration is beginning to use GIS technology to model passenger flow through transit system(s) and encourage greater use of transit. FTA recently completed a data collection effort to acquire spatial data representing transit infrastructure.
* FRA – Federal Railroad Administration maintains rail network spatial data to model commodity flow and is collecting geographic locations using GPS to improve safety.
* Office of Pipeline Safety collects spatial data representing pipelines and facilities. Data from the National Pipeline Mapping System (NPMS) has not been available to the public since September 11. The data will be made available on a case-by-case basis if the request is cleared by the agency (the Office needs information on the requester and the planned use of the data). Data is also collected and sold by vendors (Pennwell and Tobin) and is accurate to within plus or minus 500 feet.
* MARAD – Maritime Administration is using spatial data to model commodity flow through ports and is responsible for developing plans to improve security at ports throughout the US.

Ms. Brandt also drew attention to the “virtual” National Transportation Library (http://ntl.bts.gov), which offers quick links to spatial and other types of transportation data.

Bureau of Transportation Statistics (BTS)
Within the USDOT, Bureau of Transportation Statistics (BTS):
Fills gaps, creating spatial data where no data steward exists;
Distributes spatial data through the National Transportation Atlas Data Program;
Provides cartographic and spatial analysis support for the Department;
Develops internet mapping applications to provide easier access to transportation data;
Works to coordinate geographic efforts in the USDOT.

The Geographic Information Program within BTS is the lead office for geographic information within USDOT. It represents USDOT on the Federal Geographic Data Committee (FGDC), hosts the NSDI clearinghouse node for transportation data, and is coordinating standards development for the transportation portion of the Geospatial One-Stop Initiative. BTS distributes national-level transportation-specific spatial data, such as the National Transportation Atlas Databases (NTAD). NTAD contains the majority of the databases owned and maintained by the various USDOT modes and includes transportation networks, transportation facilities, and geographic reference data. All NTAD databases are available for download via the BTS web site (http://www.bts.gov/gis/ntatlas/index.html), and a data CD-ROM is released annually.

BTS purchased a "vintage road network" from GDT (Geographic Data Technology, Inc.). This data set is available for download (network, area by area) on the BTS website; contact Ms. Brandt to get the whole network at once on a 4-CD set. Examples of BTS filling gaps in data sets include data on intermodal terminals, metropolitan planning organization (MPO) boundaries, and work with the National Bridge Inventory (NBI) to geo-locate bridges. The NBI without geocoding is currently available on CD; contact Ms. Ann Shemaka / FHWA Office of Bridge Technology / HIBT-30, 400 7th St. SW / Washington, D.C. 20590 / 202-366-1575 / ann.shemaka@fhwa.dot.gov.

BTS also produces some paper maps ("Annual Major Transportation Facilities," "Transportation in North America") to support BTS publications and the Crisis Management Center, and maps on request, as indicated on the BTS website. Their Internet mapping applications include the National Highway System, Airline Market Share tracking, Airport Congestion, and the North American Transportation Atlas Databases (NORTAD). Via NORTAD, BTS distributes tri-national transportation-specific spatial data, equivalent to the NTAD, for the U.S., Canada, and Mexico. There are plans to develop relationships to allow for regular release of NORTAD.

After September 11, all geospatial data was removed from the BTS website for approximately two months, and there is continued focus within BTS on what data should be available. Most security concerns center on data showing the geographic locations of possible transportation "choke points," e.g., tunnels and bridges. For example, the National Bridge Inventory (NBI) is basically a tabular dataset that BTS is working to geocode, but it is undecided at this point whether this data will be made available to the public.
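Geocoding a tabular inventory like the NBI amounts to joining each record to a source of coordinates. The sketch below is purely illustrative: the route/milepost lookup table is invented, and a real effort would match records against a spatial road network rather than a hand-built dictionary.

```python
# Hypothetical reference coordinates: (route, milepost) -> (lon, lat)
reference = {
    ("I-70", 12.4): (-94.60, 39.10),
    ("US-50", 3.1): (-95.25, 38.97),
}

# Tabular bridge records without coordinates (ids and values invented)
bridges = [
    {"id": "B001", "route": "I-70", "milepost": 12.4},
    {"id": "B002", "route": "US-50", "milepost": 3.1},
    {"id": "B003", "route": "K-10", "milepost": 7.7},   # no match
]

geocoded, unmatched = [], []
for rec in bridges:
    key = (rec["route"], rec["milepost"])
    if key in reference:
        lon, lat = reference[key]
        geocoded.append({**rec, "lon": lon, "lat": lat})
    else:
        unmatched.append(rec["id"])   # left for manual resolution

print(len(geocoded), "geocoded;", unmatched, "unmatched")
```

The unmatched list is the interesting output in practice: records that cannot be located automatically are exactly the ones that make a geocoding effort slow.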

Geospatial One-Stop
BTS is participating in Geospatial One-Stop, an OMB e-government initiative to create a comprehensive web portal providing easier, and timelier, access to geospatial data. The lead agency for Geospatial One-Stop is the Department of the Interior; USDOT is the lead agency for the transportation area, and BTS is handling the core data content standards development activities for USDOT. Successful implementation of this initiative will require participation from all levels and types of government (perhaps two-thirds of the participation from non-federal sources) plus the academic and private sectors. At the time of the CUAC meeting, draft content standards existed for roads and rails; standards for air and transit were coming soon, and those for waterways would follow. Other geospatial data themes are scheduled to be available in September. The comprehensive web portal is scheduled for preview in early June. Check the BTS web site for Geospatial One-Stop at http://www.bts.gov/gis/geospatial_onestop/index.html. (Minutes submitted by Mary McInroy)

U.S. Fish and Wildlife Service
Doug Vandegraft, Chief Cartographer, Division of Realty

Mr. Vandegraft reported on collaboration between USGS and FWS to produce a new map of the National Wildlife Refuge System for the National Atlas of the United States. The map is unique because it presents refuge boundaries derived from an entirely digital format. There are now 541 national wildlife refuges, soon to be 542, and more than 100 million acres in the system. Mr. Vandegraft explained that as a result of the digitization process, FWS was able to identify an additional 6 million acres of refuge area. The scale of the map is 1:7,500,000; both Hawaii and Alaska are depicted at this constant scale.

In the future, look for all FWS maps to be produced in a new format; the goal is to have all maps produced by the agency look alike. Digital orthophotoquads will be used as the base map. There will not be a consistent scale, due to the relative sizes of the geography being represented. New maps will begin to appear on the Division of Realty website (http://realty.fws.gov/carto-resources.html). Not all regions will set distributing maps on the web as a priority goal, and data availability will vary by region.

Digital land status maps are being produced. These maps will show lands already owned by the FWS as well as lands the Service would like to acquire. Approved acquisition boundaries identify lands that are viable habitat but not necessarily owned by the FWS. Within the FWS, both AutoDesk and an array of ESRI products are being utilized. Mr. Vandegraft reported that he has not attended any Department of Homeland Security meetings. The Service still plans to connect its Real Property Database with its digital boundary files; presently the Real Property Database is being converted to an Oracle database.

GIS boundary layers can be downloaded from the FWS website (http://fwsgis.fws.gov/website/nwrbnd/run.htm). For the lower 48 states the scale is 1:24,000; for Alaska the scale varies from 1:250,000 to 1:63,360, and the Alaska files contain some attribute data not available for the other states. Mr. Vandegraft responded to a question about including trails on maps that are available to the public; he said that some maps do indicate where trails are, but it is not a responsibility or priority for the agency. (Minutes submitted by David Deckelbaum)

U.S. Geological Survey
Frank Beck, National Mapping Division

Frank Beck, USGS National Mapping Division, gave the USGS report, substituting for Dan Cavanaugh, who had a conflict that prevented him from attending the meeting. Mr. Beck reported on several projects, including the National Map, which will revolutionize the National Mapping Discipline; the National Atlas; and the Global GIS Dataset, DDS-62, a concern of CUAC.

The National Map is a major redirection for the National Mapping Division. Most people are familiar with the USGS's basic product, the 7.5-minute quadrangle. The USGS completed once-over coverage at 1:24,000 in the late 1990s; replicating that effort today would cost $2 billion to $3 billion. There is a tremendous amount of information on the 1:24,000 topographic maps. However, USGS has realized in the past few years, based on comments from users, that the maps are out of date. Despite its best efforts and pleas for funding to keep them current, there is a strong realization that USGS is fighting a losing battle trying to maintain the maps on its own, and budgets have been decreasing. The revision program, which has existed for a number of years in an attempt to keep the maps up to date, at best is able to revise 1,200 to 1,500 maps a year.

The National Map
The National Map grew out of a study done a few years ago to address the problem of salvaging the fundamental base-mapping program. The edict USGS received from Barbara Ryan, the USGS's Associate Director of Geography, stated: "I am committed to a dramatic improvement in our revision program as one of the major components of a healthy and scientifically sound geographic discipline." The key characteristics of the National Map are that it be:
* current and continuously revised;
* seamless, with no arbitrary edges;
* complete and consistently classified;
* built on the best available data, with varying resolution to reflect geographic reality;
* integrated within and between themes of data (positional and logical consistency), with no cartographic offsets;
* a temporal record, meaning there will be versioning and transactional updates;
* documented with metadata at both the data set and feature levels.

USGS has come to the realization that it cannot do this alone, so the National Map will rely heavily on partnerships with federal agencies; state, regional, and local governments; private industry; universities and libraries; and the public. Everyone is aware of data in various organizations that could help USGS maintain its maps. The National Map will be a system of related databases combined to build and maintain a map covering the United States coast to coast and border to border, showing the information that USGS used to collect on its own to produce its topographic maps. The USGS role will be to organize the information; be responsible for awareness, availability, and utility; serve as a catalyst and collaborator for creating and stimulating data partnerships; partner in standards development; integrate data from other participants; and produce and own data only when no other source exists.

Most recently, the big emphasis in the National Mapping Division, for better or worse, has been the 133 Urban Areas. A tremendous percentage of the population dwells in the major metropolitan areas of our country, and those areas are extremely important for reasons of security and natural-disaster recovery. A good percentage of USGS efforts this past year has been placed on these 133 urban areas.

A sample of the National Map Viewer for Mecklenburg County, NC, was shown. It has undergone several changes, based on tests over the past year. This does not show the ultimate appearance of the National Map, but it illustrates the ultimate goal. At present there are no agreements between USGS and Mecklenburg County to maintain these data sets, but the sample shows the direction for the National Map. The National Map will offer a wide range of viewing options. Hopefully, users will be able to drill down from a small-scale depiction, such as the National Atlas, to a large-scale view, such as the Digital Orthophotoquads, picking and choosing the layers they want to produce a graphic. Some information on the viewer may be owned and maintained by other organizations, perhaps even served by local government agencies. Users will be able to drill down to local data, such as information about local hospitals (services, number of beds, etc.), maintained by local government agencies and/or organizations outside the USGS. Ideally, local government agencies will take responsibility for maintaining their data and provide access to USGS and, ultimately, the public.

A question was asked about who would take responsibility for archiving older data, USGS or local agencies. USGS hopes that localities will archive their data in an appropriate, agreed-upon archival format, with an agreed-upon mechanism, frequency, etc. The primary concern is that digital information, which will not be printed regularly as the USGS topographic maps have been, will not be available for future use in temporal studies. There is not yet a clear understanding of what data need to be archived, especially if only a small fraction of the features have changed; perhaps only the information on the transaction will be archived.

Another question was asked about the rural areas, which may not be using GIS. The USGS will continue to be the data gatherer and provider for rural areas that are not currently using GIS or producing digital spatial data. Several approaches could be used. The National Map could simply show the existing topographic map, in the form of a digital raster graphic (a scanned topographic map). Another alternative would be to scan the map separates (roads, contours, vegetation cover, etc.) and allow that information to be accessed separately. That would represent the best available data for those areas, but would take more time and effort. Both options have been examined, but no decision has been made concerning how to show those rural areas.

Congress is enthusiastic about the National Map in some areas, such as the 133 urban areas; NIMA is the driver behind this part of the project. Because of Homeland Security needs, getting funding for those areas has been easy. Getting funding for work elsewhere is more difficult, as is getting data from local partners, much less funding from those organizations. The biggest incentive for local agencies is that by cooperating with the USGS, their data and that of their neighbors will be much more likely to be seamless and user friendly. USGS is also working to make local data more accessible, including software packages that will make the data more interchangeable.

The latest fact sheet on the National Map is titled Hazards, Disasters and the National Map. It is USGS Fact Sheet 027-03, available on the web at: http://erg.usgs.gov/isb/pubs/factsheets/fs02703.html. Several printed byproducts of the National Map, mock-ups of topographic maps, were shown as examples of future print output that can be produced quickly and cheaply. With this type of product, it is difficult to determine what to put in the collar, especially given that the data came from multiple sources and that the date may not be very meaningful, as the data could change daily and the layers may have been updated at different times. In addition, the new National Wildlife Map from the National Atlas was shown; another North American map is in process. There is also a new "Printable Maps" area on the National Atlas site, offering maps that can be printed at page size for general users. The site for this is at: http://nationalatlas.gov/printable.html.

Other Questions:
A question was asked about the source information on some of the maps from the old printed National Atlas maps, which give brief bibliographic information, with the statement “and other sources.” That request will be forwarded to the National Map office. A question was asked about funding for the National Cooperative Geologic Mapping program. No information on their funding was available.

The Middle East and Iraq maps produced by NIMA were also mentioned. Three additional maps will be available soon. GPO is trying to get copies for distribution to Depository Libraries.

Digital Data Set 62:
Four parts of DDS-62 (Central & South America, Africa, South Asia and South Pacific) were issued through the Depository Library Program. After those first four were issued, the Geologic Division ran into funding problems and could not issue the remaining sets (North America, Europe and North Eurasia). Somehow, a CRADA (Cooperative Research and Development Agreement) was established with the American Geological Institute. They are producing and issuing the remaining parts of DDS-62, and copyrighting them. The CRADA was announced in late September. What is copyrighted is the package that AGI has put together and issued, such as the ESRI software. What is not copyrighted is the raw data. That has not been a product provided by the U.S. Geological Survey. If there is enough interest in the raw data for the three remaining areas, GPO needs to be petitioned to ask for the data from USGS. The Survey could then provide the data to GPO, who could then provide it to Depository Libraries. GIS-literate librarians and library users would find the data useful.

A question was asked about whether we might be informed about potential CRADAs before they are finalized so that we could comment on them. Mr. Beck had no information on how to comment on them, but suggested two people who might be contacted about commenting on future CRADAs. Other agencies (such as the U.S. Department of Education) could and should have been contacted about providing funding support. (Minutes submitted by Linda Zellmer)

Natural Resources Conservation Service
William Effland, U.S. Department of Agriculture

William (Bill) Effland’s presentation discussed the background, uses and selected examples of various digital soil survey products produced by the USDA Natural Resources Conservation Service.

He stated that he would speak about (1) some digital soil survey information; (2) several sources of digital soil information that are available or are being developed; (3) advantages of that information; and (4) how the Agency is working to deliver that information to customers. Additionally, he mentioned future research and application directions of the Soil Survey Division by discussing some landscape analysis projects that he has worked on since transferring to the Division in January, 2003.

Dr. Effland explained that the USDA Natural Resources Conservation Service (NRCS) was known as the Soil Conservation Service until about 1994. He noted that he works in the Soil Survey Division, with background and training as a soil scientist, and is currently employed as a landscape analyst. The Agency has about 10,000 employees; about 900 of them are in the Soil Survey Division, where 45-50% of the workforce is expected to retire in the next five years. He stated that digital soil resource information provides one of the foundation layers for modern natural resource appraisal, analysis and interpretation.

National Cooperative Soil Survey (NCSS)
Dr. Effland stated that the National Cooperative Soil Survey is the key to the soil survey programs that exist throughout the United States. There are at least three components of cooperative soil surveys: the state, the county, and the federal government; these partners should be kept clearly in mind when discussing soil survey information. The NCSS has many partners (e.g., federal agencies, state agencies, county agencies, land grant universities and private entities), with USDA/NRCS designated by Congress as the lead federal agency for soil survey programs. Federal agency partners include the US Forest Service, the Bureau of Land Management and the National Park Service, including work on mapping soil resources for the national parks. Numerous state agencies are also NCSS partners. Dr. Effland stated that funding for the soil survey program varies from state to state: each state has its own structure for funding soil survey and collecting specific information, even though the broad umbrella of the NCSS provides a standardized format. Funding is obtained through the various NCSS partners. In some states, soil survey work was historically funded one-third by the federal government, one-third by the state and one-third by the counties; in other states, it was primarily funded by the county government, with smaller contributions from the federal and state agencies. He continued his discussion of NCSS partners by stating that the land grant universities are also collaborators who conduct soil science research and participate in field reviews. University cooperators help with the quality assurance of soil survey information, and these universities are an important component of research and development of technology for improving soil survey. In some areas, they helped develop the various soil landscape models that are applied as conceptual tools to identify and delineate different soils in the real world.

Other NCSS partners include the soil conservation and water conservation districts, which are legislative bodies formed at the county level; typically, a single county will have one soil conservation district. These district groups were formed to give local advice on how to help direct the soil survey program. The last group he mentioned was various private entities, noting that some industry groups also serve as partners.

Dr. Effland concluded this section by reminding the group that the National Cooperative Soil Survey is a long-standing collaborative partnership and that “this collaborative working relationship directly influenced the direction and development of soil survey throughout the United States.”

Digital Soil Survey Products

Dr. Effland then discussed digital soil survey products in general, stating that these data are inherently multi-scaled in nature. He said that the data can be displayed and studied on a world basis (global scale) down to something that is essentially within a field or sub-field level (e.g., county to field scale). He mentioned data from the World Soil Resources group led by Dr. Hari Eswaran as an example of global-scale soil information. This group works collaboratively with the US State Department, the US Agency for International Development and the UN/FAO (Food and Agriculture Organization of the United Nations) to produce and distribute generalized natural resource information that is available on a global to regional basis. He continued by citing the following two principal databases as examples of information available on a national to regional scale:

* The National Resources Inventory (NRI) - a statistically designed database of over 800,000 sampling points across the U.S. with over 1.2 million records for approximately 200 different attributes. These data were collected every 5 years (1982-1997), and a sub-sample is now collected on a yearly basis (starting in 2000). The NRI is a multi-million dollar effort. It includes spatial and temporal information and allows researchers and policy-makers to look at the status, conditions and trends of natural resources. The NRI does not inventory federal lands.

* State Soil Geographic Database (STATSGO). This data was originally released on CD in 1994 (available at 1:250,000 scale). It utilizes polygon/base mapping of large areas for regional to national scales of analysis and interpretation. The spatial data includes up to 21 different soil components for each polygon, giving the percentage of those different components within the polygon. Physical location for each individual soil component is not given but there are approximately 20,000 polygons for the U.S. STATSGO data was utilized in a GIS decision support system project completed under the North American Free Trade Agreement with Canada. Here, STATSGO data was joined across the U.S. and Canadian borders with the Soil Landscapes of Canada data, which is at a mapping scale of 1:1,000,000. In another project, STATSGO data was applied in conjunction with the Soil Landscapes of Canada for estimating soil carbon levels across North America.
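The component structure described above lends itself to simple tabular summaries. As a minimal sketch (the map-unit IDs, soil series names, and percentages below are invented for illustration, not actual STATSGO records), one common operation is picking the dominant component of each map-unit polygon:

```python
# Hypothetical STATSGO-style component table: each map-unit polygon lists
# its soil components with their areal percentages. Real STATSGO map units
# may carry up to 21 components, and component locations within a polygon
# are not recorded.
components = {
    "MU017": [("Fayette", 45), ("Dubuque", 30), ("Stonyland", 25)],
    "MU042": [("Drummer", 60), ("Flanagan", 40)],
}

def dominant_component(components):
    """Return the highest-percentage soil component for each map unit."""
    return {mu: max(parts, key=lambda p: p[1])[0]
            for mu, parts in components.items()}
```

Because component locations inside a polygon are unknown, summaries like this treat the map unit as the finest spatial resolution, which is why STATSGO suits regional rather than field-scale analysis.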

Dr. Effland concluded this section by discussing an example of data available on a county to field scale:

* The Soil Survey Geographic Database (SSURGO). SSURGO data are county-level data that are publicly available via the Internet for application in geographic information systems. The NRCS is also developing a Soil Data Viewer in ArcView 3.3, which will be incorporated into the customer toolkit at USDA field offices throughout the U.S. SSURGO data scales vary, with typical values ranging from 1:12,000 to 1:24,000.

He stated that these digital soils data correspond to the county-level soil reports that have been used for years. He reminded the group of the wealth of information available in these products, saying that “the widely varying resource questions ranging from global to field level areas resulted in five orders, or mapping levels, of detail for soil survey data.” Traditionally, the county soil surveys were published in hard-copy paper format, and some users still prefer that format.

Uses of Digital Soil Products
His talk then focused on the uses of digital soil survey products: GIS visualization of soil properties or characteristics; soil interpretations; resource conservation planning; land use management; environmental assessment; and computer simulation modeling. He stated that GIS visualization, analysis and interpretation of soil properties are a valuable use of the data; in fact, a multi-million dollar yearly effort is currently underway to update and digitize all modern soil surveys. He emphasized that there is also a wealth of soil interpretations available that allow users to look at potentials and limitations for using soils; for example, soil interpretation data allow one to examine engineering properties and limitations. He also stated that resource conservation planning, originating in the 1930s with the early work of the Soil Erosion Service, is still a primary focus for using soil survey information; a current example is nutrient management and environmental quality with respect to air and water. Examples of land use planning, environmental assessment and computer simulation modeling were given. He talked about a program called BASINS that uses a model called SWAT (Soil Water Assessment Tool), a GIS-linked computer simulation modeling tool, still in development, that allows one to estimate the total maximum daily loads (TMDLs) of various watersheds. He also mentioned a water erosion prediction project that uses a tool called GeoWEPP; this model uses digital soil survey information in conjunction with the water erosion prediction model, WEPP.

Dr. Effland then discussed the advantages of using digital soil information. One advantage is that digital data can be accessed and delivered very quickly. Another is that digital soil data allow one to think about new relationships and to develop new interpretations that were not considered in the past because the data were not easily accessible. There is, and will continue to be, increased data availability for integrated resource and management tools; in fact, SSURGO data are becoming available as part of a common computing environment in which data from different agencies are stored on a central server and shared throughout the more than 2,000 USDA field offices across the country. Access to these data by county planners or conservation planning technicians will be available through a GIS tool, the Soil Data Viewer. The last advantage he discussed was the increased capacity to develop new soil information, e.g., creating soil information for some of the National Parks or BLM lands, and to quickly update and maintain it. Such updates would include drawing new soil lines or looking within the soil polygons and trying to understand the relationships of the soils to other factors or environmental variables. He then showed several maps produced from digital soil data to illustrate various uses. Most of these maps can be found on the Internet at: http://soils.usda.gov/soil_survey/main.htm; accessed July 1, 2003.

In this section, Dr. Effland also talked about a map for the National Soil Characterization Database, which showed the location of more than 27,000 soil profiles sampled for the soil survey program. This database “provides detailed morphological, chemical and physical property data which can be linked for analysis and interpretation to spatial data such as STATSGO or the NRI”. Another map showed the status of soil survey digitizing work for the county-level soil surveys. He mentioned that currently, more than 1,450 county soil surveys can be downloaded from the Internet.
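The linkage described above amounts to a tabular join between lab records and spatial units. As a hypothetical sketch (the field names, series names, and values below are invented for illustration, not the actual NRCS schema), surface-horizon characterization data can be matched to map units by a shared soil-series key:

```python
# Invented characterization records: lab data for sampled soil profiles,
# one record per horizon.
pedons = [
    {"series": "Tama", "horizon": "A", "organic_c_pct": 2.8},
    {"series": "Tama", "horizon": "B", "organic_c_pct": 0.9},
    {"series": "Downs", "horizon": "A", "organic_c_pct": 2.1},
]

# Invented spatial map units, keyed to the same soil-series names.
map_units = [
    {"unit_id": "IA001-55B", "series": "Tama"},
    {"unit_id": "IA001-162C", "series": "Downs"},
]

def mean_surface_carbon(map_units, pedons):
    """For each map unit, average the organic carbon of matching A horizons."""
    out = {}
    for mu in map_units:
        vals = [p["organic_c_pct"] for p in pedons
                if p["series"] == mu["series"] and p["horizon"] == "A"]
        out[mu["unit_id"]] = sum(vals) / len(vals) if vals else None
    return out
```

Once joined, the per-unit values can be mapped spatially through STATSGO or NRI polygons, which is the kind of analysis and interpretation the characterization database supports.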

He commented on the digitization of the SSURGO data, stating that there are a total of 2,200 county or area soil surveys throughout the US. Currently, about 1,450 of these are archived as SSURGO data. Of the counties remaining, some are just being started, some have map compilation completed, and some are in digitization. There are several digitizing centers throughout the country, and this work is being done in cooperation with some universities.

In discussing tools that are being used to display and query SSURGO data, he named the Soil Data Viewer as the current GIS tool; the earlier Soil Explorer did not allow one to do a “true” GIS analysis. The current Soil Data Viewer uses ESRI’s ArcView GIS software and provides rapid access to numerous soil characteristics and interpretations. It thus allows one to rapidly create many interpretive thematic maps, e.g., on agriculture, building site development, sanitary facilities, and water tables. Reports, tabular or cartographic, can also be generated using the viewer. With SSURGO data, however, a map unit may have up to three soil components because of the detailed level of soil information. A web-based Soil Data Viewer for SSURGO data is also being developed. (http://www.itc.nrcs.usda.gov/soildataviewer; accessed July 1, 2003).

Lastly, there was a discussion about a research tool currently under development at the University of Wisconsin-Madison called 3dMapper. It was originally funded by NRCS as a tool for soil map visualization. He stated that it has now been commercialized and can be used to update soil maps; it allows draping digital orthophotographs over a DEM. (http://www.TerrainAnalytics.com; accessed July 1, 2003).
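Draping imagery over a DEM rests on terrain-surface calculations such as slope, aspect, and shading. As a minimal, self-contained sketch (this is the standard hillshade formula, not 3dMapper's actual algorithm; the grid values are invented), shading a gridded DEM looks like this:

```python
import math

def hillshade(dem, cellsize=1.0, azimuth_deg=315.0, altitude_deg=45.0):
    """Shade a DEM (list of rows of elevations) using the classic
    slope/aspect hillshade formula; values are clamped to 0..1."""
    az = math.radians(360.0 - azimuth_deg + 90.0)  # sun azimuth, math convention
    zen = math.radians(90.0 - altitude_deg)        # solar zenith angle
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # central differences, falling back to one-sided at the edges
            r0, r1 = max(r - 1, 0), min(r + 1, rows - 1)
            c0, c1 = max(c - 1, 0), min(c + 1, cols - 1)
            dzdx = (dem[r][c1] - dem[r][c0]) / (max(c1 - c0, 1) * cellsize)
            dzdy = (dem[r1][c] - dem[r0][c]) / (max(r1 - r0, 1) * cellsize)
            slope = math.atan(math.hypot(dzdx, dzdy))
            aspect = math.atan2(dzdy, -dzdx)
            shade = (math.cos(zen) * math.cos(slope)
                     + math.sin(zen) * math.sin(slope) * math.cos(az - aspect))
            out[r][c] = max(0.0, shade)
    return out

# For perfectly flat terrain every cell shades to cos(zenith).
flat = [[10.0] * 4 for _ in range(4)]
shaded = hillshade(flat)
```

A viewer then uses the shade value (or the slope and aspect directly) to light each orthophoto pixel as it is draped over the terrain surface.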

At the end of the discussion, the following questions were asked:

1. Have you considered printing the soil surveys? For example, doing print on demand, similar to what some small publishers are doing?
Dr. Effland stated that there has been some talk of print on demand with some of the publications. He said that they previously had a small publisher near Blacksburg, VA that would print on demand once there was enough interest in the publications. For example, they would print a thousand copies of a specific publication such as “Keys to Soil Taxonomy.” He stated that in many areas the soil resource survey information is underutilized but that it is very valuable to some people in other areas. Dr. Effland mentioned the program at the University of Maryland where they are scanning their old surveys and are making them available through a web site. This allows users to print only one map sheet, for example. He stated that NRCS is exploring various printing options such as the program at the University of Maryland. It was noted that Pennsylvania, Oregon and Missouri are doing similar work.

2. Terrain Analytics is the distributor for the 3dMapper and it’s for a fee. Is it freeware?
Dr. Effland said that there is a free version that was developed a few years back but that it is not enhanced with additional functionality and is more of a visualization tool. He stated that the current 3dMapper is more of a functional mapping tool and is fairly inexpensive.

3. One of the examples you showed from STATSGO data was the distribution of soil water tables; is that available for the public to use?
Dr. Effland stated that the data are available on the web but that the particular graphic for water table distributions is not on the web. He said that the data can be downloaded from STATSGO and are free through the website at Fort Worth. Dr. Effland was unsure if the BASINS data was still available to the general public due to Homeland Security issues. One member stated that the BASINS data are freely available by request through the EPA.

4. What is the minimum scale which determines an arbitrary boundary? For example, what is the minimum factor that you define when you try and determine an arbitrary boundary between Soil A and Soil B? Is there a specific standard or does the person viewing the boundary make the decision?
Dr. Effland stated that each of the soil surveys is mapped at one or two levels or orders. For example, an Order 1 survey would be at a research farm level with most county soil surveys at Order 2. He said that the polygon boundary determinations are standardized based on the soil landscape model and survey order but there is some subjectivity from the individual soil mappers. Dr. Effland said that one reason they are moving into using DEMs, DOQs and raster-based GIS is an effort to remove some of that subjectivity. He stated that if you look in the National Soil Survey Handbook or Soil Survey Manual, there is a table for each mapping scale indicating the minimum size delineation.

5. You talked about the sampling of soils at various locations, the Pedon Database. Is this data accessible to the public?
Dr. Effland stated that the Pedon database is in transition and will become one of the Internet map server type projects, but that currently a CD is available. He said that previously one could buy the data for $50, but the database is now in transition so that it can be updated more frequently as more soil pedon data become available. There are many land grant university cooperators with soil pedon data. He also said that, in some cases, the data may be incomplete and so were not used in the NCSS, but they are now trying to complete, update and expand the database. Dr. Effland noted several places where this work is underway, including the University of Arkansas, Pennsylvania State University and a project at the USGS related to soil carbon sequestration.

6. Will the CD-ROM version of the soil surveys be available for all areas of the U.S.? Will including the shape files of raw data become the standard for CD distribution?
Dr. Effland said that the CD-ROM data will be available on a state-by-state basis. He said that some states have more resources for presenting that kind of information, but in the long run the hardcopy soil survey report is transitioning to CD or web-based delivery. Dr. Effland also noted that some of the electronic versions of the soil survey reports are technically equivalent to the hard copy report but also contain spatial data such as shape files. (Minutes submitted by Clara McLeod)


Mike Furlough thanked Betty Jones for her work in helping CUAC to hold its annual meeting in the Government Printing Offices.

Dan Seldin adjourned the meeting.

U.S. Board on Geographic Names Roger Payne, Executive Secretary (via email)

The Secretary reported that the U.S. Board on Geographic Names (BGN) is beta testing a new version of its Geographic Names Information System (GNIS) website. Two states, Delaware and Florida, are testing the changes. After the website’s redesign, the new features will include a spatially enabled component. In the next year, the Board will release and activate the redesigned database, along with a new, enhanced Internet webpage and user interface for GNIS. The Board’s new disc product includes GNIS data almost in its entirety and can be displayed using LANDVIEW V (a product produced by a federal consortium); the disc, in DVD format, is presently marketed by the Bureau of the Census for $99.

Although there was some mention of blocking certain categories of names in GNIS after 9/11, a later analysis determined that this would not be necessary.

The upgrading of the names in GNIS (Phase II) is complete or in progress for all but four states: New York, Kentucky, Alaska, and Michigan. Phase II will be completed. Phase III, however, will likely be scrapped because it has been overtaken by events, namely support for local and state vertical data integration in support of The National Map and homeland security.

There have been no major changes in procedure or policy regarding how the Board decides on name changes. (Report taken and submitted by Christopher J.J. Thiry)

U.S. Forest Service Betsy Banas, Staff Cartographer, Geospatial Services Group

I. The Forest Service recently held its second Geospatial Conference in Colorado Springs, CO. There were over 250 attendees, including representatives of the federal government, states and counties, State Foresters, and many others. The event was co-sponsored by Colorado State University and the University of Colorado at Colorado Springs. The conference program and presentations are available by contacting David George, the Forest Service Geospatial Conference Program Chair, at dgeorge@fs.fed.us.

II. The Forest Service continues to collaborate with the US Geological Survey (USGS) in its National Map Initiative. We are pleased to report that the Forest Service is participating in building the National Map, using Forest Service data for two focus areas: Colorado Springs/San Isabel National Forest and Albuquerque/Cibola National Forest.

III. Last year the Forest Service reported on its focused effort to participate in the Federal Geographic Data Committee (FGDC). We continue to be engaged in the varied, fast-paced efforts of the Office of Management and Budget (OMB), through the FGDC, to coordinate mapping and geospatial data collection and related activities among federal agencies. There has been a lot of effort this year by the FGDC to engage participation among states, local governments, Tribes, academia and other entities. OMB and FGDC are developing a means to measure and monitor adherence to standards in order to hold agencies accountable for compliance.

IV. The President’s Council on Excellence in Government has keyed in on Electronic Government (e-Gov, i.e., the Internet) as the way to improve efficiency in doing business. Twenty-four e-government initiatives were identified, including Geospatial One-Stop. On December 17, 2002, the President signed the E-Government Act. President Bush stated that this legislation “builds upon my Administration's expanding E-Government initiative by ensuring strong leadership of the information technology activities of Federal agencies, a comprehensive framework for information security standards and programs, and uniform safeguards to protect the confidentiality of information provided by the public for statistical purposes. The Act will also assist in expanding the use of the Internet and computer resources in order to deliver Government services, consistent with the reform principles I outlined on July 10, 2002, for a citizen-centered, results-oriented, and market-based Government.”

The Forest Service has been very involved in Geospatial One-Stop, as we continue our efforts to provide standard geospatial data documented with FGDC-compliant metadata. We now have our Forest Service Geodata Clearinghouse up and on-line; it can be viewed at http://fsgeodata.fs.fed.us/. It is currently being upgraded to provide ESRI ArcIMS data with FGDC-compliant metadata. The upgrade should be complete by October 2003.

To learn more about Electronic Government and Geospatial One Stop, see http://www.whitehouse.gov/omb/egov/ and http://www.geo-one-stop.gov/ .

The Forest Service is also involved with Recreation One-Stop, another of the 24 Presidential e-Gov initiatives. The effort will provide the public with a one-stop ‘portal’ to recreational opportunities and will be supported by Internet mapping services.

V. The Forest Service continues to collaborate with the USGS in the sale of our Forest Visitor Maps and other specialty products through their on-line services and vendor network. This enables us to provide better public service. The program has been operational for two years, and our map sales have increased as a result.

VI. Since September 11, the Forest Service has focused efforts on Homeland Security.

A. The Deputy Manager from our Geospatial Service and Technology Center, Barry Napier, has accepted a 15-month detail to the Interagency Geospatial Preparedness Team, located at the Federal Emergency Management Agency. Other members of the team are from USGS and the National Imagery and Mapping Agency. We also have a representative (Susan DeLost from our Washington Office, Engineering Staff) to the FGDC Homeland Security Working Group.

B. Efforts are focused on defining geospatial data that are critical for disaster preparedness and for first response in the event of a crisis. A standard, agreed-upon Critical Infrastructure Layer for Homeland Security is being developed.

C. Forest Service experience with fire-related disaster response has been valuable.

D. Forest Service and other USDA agencies were involved in the efforts to recover debris from the Space Shuttle Columbia. Remote sensing and Global Positioning System data and technology were utilized.

VII. The Forest Service suffered an extremely severe fire season in 2002. Congress did not allocate additional funds to cover the excessive costs of fighting fires last year, so money was ‘borrowed’ from other program areas to cover costs. Our Geospatial Service and Technology Center suffered from this ‘Fire Borrowing.’ The Single Edition Quadrangle Mapping Program, in which we produce 1:24,000 topographic quadrangle maps of National Forest System lands, has suffered: we were unable to meet our production goal of 600 maps. We are trying to make up the shortfall this year, but it is not certain whether we will meet this goal. If we have another bad fire season, we may go through another round of borrowing.

VIII. Our budgets have not increased, while all of the geospatial initiatives have grown, so our dollars are spread very thin. This has also affected our production schedule.

IX. Another OMB initiative, “Competitive Sourcing,” which involves efforts to streamline operations and improve efficiency, has also had an impact. Various program areas are being studied to determine the best way to improve efficiency. Unfortunately, the task of studying programs is costly and takes time from other work. To learn more about competitive sourcing, see http://www.whitehouse.gov/omb/circulars/a076/a076sa1.html

X. Chris Thiry asked for a point of contact at the map printer who does the beautiful work on our Forest Visitor Maps and other maps. The printer is Williams & Heintz Map Corporation, 8119 Central Avenue, Capitol Heights, MD 20743. The point of contact is Mr. Mark Budd, at 1-800-338-6228. (Report taken and submitted by Christopher J.J. Thiry)

2003 minutes compiled by Mike Furlough


Fiscal Year 2002



May 3, 2002

CUAC representatives:
Janet Collins, Western Washington University (WAML)
Mike Furlough, University of Virginia (MAGERT)
Donna Koepp, University of Kansas (GODORT)
Clara P. McLeod, Washington University (GIS)
Bruce Obenhaus, Virginia Tech (SLA G&M)
Daniel T. Seldin, Indiana University (NACIS)
Paul Stout, Ball State University (NACIS)
Christopher J. J. Thiry, Colorado School of Mines (WAML)
Mark Thomas, Duke University (MAGERT)
Linda Zellmer, Indiana University (GIS)

Betsy Banas (NFS)
Dan Cavanaugh (USGS)
Howard Danley (NOAA)
John Hebert (LC)
Betty Jones (GPO)
Jim Lusby (NIMA)
John Moeller (FGDC)
Richard H. Smith (NARA)
Timothy Trainor (Census)
Doug Vandegraft (F&WS)

Susan J. DeLost (NFS)
Wil Danielson (GPO)
Mark Flood (NFS)
Robin Haun-Mohamad (GPO)
Vi Moorhouse (LC Cataloging)



Welcome and Introductions

CUAC Presentation

Preservation and Archiving Issues Roundtable Discussion, led by Donna Koepp, University of Kansas, Government Documents and Map Library

Library of Congress, John Hebert

National Archives and Records Administration, Richard Smith

US Government Printing Office, Betty Jones

Federal Geographic Data Committee, John Moeller

Forest Service, Betsy Banas

Census, Tim Trainor

US Geological Survey, Dan Cavanaugh

NIMA, Jim Lusby

NOAA National Ocean Service, Howard Danley

Fish and Wildlife Service, Doug Vandegraft

Wrap-up and Closing Remarks

Preservation and Archiving Issues Roundtable Discussion

Facilitated by Donna Koepp, University of Kansas, Government Documents and Map Library

Introduction (Donna Koepp, CUAC) Our biggest concern is the preservation of cartographic and spatial data, especially data that is born digital and never appears in paper. We are concerned about having snapshots in time of data that is constantly being updated, so that we have historical records. Libraries are not set up to preserve that data, mainly because of file size. Are the agencies preserving snapshots of their data? If not, is there some role that libraries can play, similar to what we do with paper documents? GPO does some preservation of text documents, but is not preserving maps; GPO is referring users to USGS and other agencies because the files are so large. Libraries have some capacity to work with government agencies in partnership to preserve these datasets.

John Moeller (FGDC) encouraged our participation and representation in FGDC. A specific opportunity is with the Historical Data Working Group of FGDC, chaired by Bruce Ambacher from the National Archives and Records Administration (NARA), which developed the policy and guideline statement “Managing Historical Geospatial Data Records: Guide for Federal Agencies” in 1997. Tools already in place include the metadata standard for documentation and the spatial data transfer standard; a final draft of an international metadata standard should be approved by the end of this calendar year.

Donna Koepp (CUAC) asked if John knew of any agency that was preserving all of its cartographic data.

John Moeller (FGDC) replied that he did not know of any. He knows that the Earth Resources Observation System (EROS) data center has an extensive archive of imagery and Bureau of Land Management (BLM) has a policy for preserving all information including digital information.

Donna Koepp (CUAC) mentioned the special problems with BLM’s decentralization. State and local offices are not necessarily following the same rules.

Chris Thiry (CUAC) pointed out users often want historical data. People are doing historical studies, examples include the history of land management and growth areas, and this is why we are so interested in having snapshots of the data. We may lose this history and end up with a period of time where we don’t have the documentation.

Richard Smith (NARA) hopes it is a comfort to know that federal statutes require records maintenance, control, and disposition schedules for materials of enduring or permanent value, regardless of format. Sometimes there is a snapshot provision. The Electronic Records Archive of NARA is charged with preserving many different electronic record formats, including maps and cartographic data sets, independent of software and hardware. Currently in a pilot project, the Electronic Records Archive is supposed to be up and running by 2004. The Archives has a plan for collecting and preserving digital datasets.

Donna Koepp (CUAC) mentioned the NARA definition of records management and found it comforting that their definition of records includes maps. Bruce Obenhaus (CUAC) brought up issues of when do we take snapshots and how much change is worth identifying? What is of enduring value? These are hard questions that might not have answers currently.

Richard Smith (NARA) added that the National Archives has appraisal archivists who are familiar with electronic records. They are hammering out agreements with agencies on the maintenance, use, and final disposition of these files. That’s the law and nearly the practice. Archives has schedules for USGS electronic records, as an example. Archives will likely preserve only a small percentage of the data actually collected (2-3% of paper records are now preserved, and electronic data is presumed to be similar). This is a shared responsibility between NARA and the originating agencies.

Donna Koepp (CUAC) asked what is included in NARA? Is it similar to Federal Depository Library Program (FDLP)? NARA keeps records of the agency, FDLP keeps the publications of the agencies. These are different types of material.

Richard Smith (NARA) The National Archives collects record sets from agencies. Archives holds what he presumes FDLP libraries hold, plus a great deal of manuscript material to back up the publications.

Mark Thomas (CUAC) Now there is a blurring of published materials and electronic materials. With digital spatial data, maps are made on the fly, there is no permanent published version because the user makes maps for a specific purpose. The problem lies with saving the original data.

Richard Smith (NARA) Maps or records created by an agency may not have permanent value to the agency and would not be preserved. When records are still important to an agency, the agency keeps them until use of the records dies down; at that point they are transferred to NARA. Some records are deemed so important that the agencies keep them for many decades.

Donna Koepp (CUAC) There still are concerns with items that are not getting into the GPO distribution system, including the very special projects that may be sitting on agency shelves and we don’t know exist because they have never been cataloged. This is also a problem with electronic items that never get into the system. It’s a matter of getting information out there and sharing it. It’s a matter of discovery.

Mike Furlough (CUAC) questioned to what extent NARA has already worked with cartographic data in electronic format? Currently statistical data is the bulk of the electronic data that NARA has archived.

Richard Smith (NARA) Only 4 groups of spatial files including the TIGER files are currently in NARA electronic archives, possibly 5% or less of what is out there. NARA is setting up schedules for the transfer of files but most have not been transferred to NARA because of the high rate of activity on the file. NARA may wait until files are 15-20 years old before they are deposited.

Chris Thiry (CUAC) Asked Mark Flood (NFS) – do you have data that you can no longer access for any reason?

Mark Flood (NFS) There have been problems accessing data collected 5-10 years ago because of changes in hardware and software. This is not as much of a problem with maps yet, because they have not been produced electronically for a long period of time, but the problem could be coming in the near future.

John Hebert (LC): Of concern to the Library of Congress is the ability to acquire increments of improvements in cartographic output. LC is much more global in acquisitions than NARA.

Linda Zellmer (CUAC) When federal agencies are asked about archiving their data, the answer is “it is in the metadata”. They are updating files but not including dates for updated fields in the metadata. She would like to see a temporal GIS, with dates recording when a field or feature was added.

Susan DeLost (NFS): National Forest Service is now developing feature level metadata. For each record there will be a metadata link attached to a particular record including a year when the field was added.

Tim Trainor (Census): From a producer and user perspective you will end up with more metadata than spatial data. That is something that we need to take another look at.

Donna Koepp (CUAC) thanked everyone for their participation and insights on the question of preserving and archiving cartographic data.

Library of Congress
John Hebert, Chief of the Geography and Map Division of the Library of Congress

John Hebert, Chief of the Geography and Map Division of the Library of Congress, presented the LC update again this year. His presentation focused on the areas of acquisitions, staffing, scanning projects, general projects, the Phillips Society and the special project this past summer.

Of significance is the acquisition of the only known copy of a 1507 map, compiled by cartographer Martin Waldseemüller, the first to bear the name “America” and the first to depict a separate Western Hemisphere. Congress appropriated $5 million for the purchase of the map, and fundraising is still underway to secure an additional $5 million; they have some promising leads for this money. There are several other items in the packet that came from Prince Johannes Waldburg-Wolfegg in which the Library is very interested. They received from Census 130,000 sheets of census tract materials for the 2000 Census. After September 11 there was a great deal of interest in holdings covering Southwest Asia; the Division put together a listing of what they hold and have tried to fill in gaps. LC continues to receive materials produced by the former USSR. They have completed most of the acquisitions of Soviet-produced maps at 1:200,000 scale and are now acquiring the 1:100,000 scale series worldwide. In addition they have sought nautical charts for the Arctic and Pacific coasts. LC has received what John believes will be the final acquisition of paper state road maps, about 20,000 sheets, and expects future receipts from state highway departments will be digital.

The Geography and Map Division has a total of 55 employees. In the past year they have added 5 new technicians, and currently have a posting for two new catalogers. An assistant chief of the division and two new reference librarians will be advertised in the near future. They are adding one new person in the scanning and digital lab to replace one lost last year, bringing the staff back up to four. An additional digital specialist, a GIS person, is also being added. A new GIS initiative to create an “on demand” service for Congress is underway. Two geographer positions will be added for this initiative.

Scanning Program
The Library has over 6,000 maps scanned. Cataloging is slowing the progress, with as many as one third requiring original cataloging. They hope to recover some of the cost of the scanning and cataloging from sales of printed copies of the maps. The Waldseemüller map was scanned last fall, front and back. After they complete payment on the map, the question will be what to do with the scanned copies. LC probably will look to recover some costs by selling prints from the scanned copies, and John wants it to be available online. They are currently completing the Civil War project (about 2,500 maps) and Revolutionary War period maps (another 2,000), and are working on about 3,000 sheets of British-produced maps from the Revolutionary War era. New projects include scanning an early 19th-century map of Japan which is divided into 214 sheets, each about 5 by 5 feet. LC holds 207 sheets, 160 of which are not found anywhere else in the world.

Professor Li from Beijing is coming to work at the Library this summer on the manuscript materials on China. Along with identifying and cataloging these materials they hope to scan many of them. Scanning could be problematic since many of them are scroll maps, some up to 60 feet long, and may take some creative work to complete. A continuing project is acquiring maps used in the field by soldiers, along with personal remembrances of those soldiers, from World War II, Korea, and Vietnam. The hope is to produce a historical record of how maps are used in combat. Any help locating veterans and maps would be appreciated. LC and the National Imagery and Mapping Agency (NIMA) are now in a cooperative cataloging project in which NIMA is cataloging its set maps in MARC format to the sheet level. A Lewis and Clark exhibit, largely maps, is being planned with the kickoff to be in September 2003.

Philip Lee Phillips Society
The Phillips Society is the Friends of the Geography and Map Division organization. There are currently over 200 members. This year’s meeting is a joint meeting with the Texas Map Society in Arlington, Texas in October. The Society publishes newsletters and occasional papers.

Special Project
Last year’s summer project with five participants was a great success. They are not planning one this year. Instead, this summer the Library is hosting two librarians from tribal libraries in North Dakota and Minnesota. They expect to go back to the traditional summer project next year.

Sanborn Atlases
LC currently does not have a project to scan the Sanborn atlases. Bell & Howell/ProQuest developed a digital record of the black and white film, but researchers are dissatisfied because it is black and white and because the film is not always a good copy. LC would like to scan the original color maps but lacks the resources to digitize all the maps and lacks permission from EDR Sanborn for those still under copyright.

LC is looking into the possibility of using some facilities at Fort Meade for remote storage.

National Archives and Records Administration
Richard H. Smith, Senior Archivist, Cartographic Unit, Special Media Archives Services Division

Dr. Richard Smith began by recounting the history of the Cartographic and Architectural Records Branch of the National Archives. Acquisition of maps and charts began in the 1930s. In the 1960s aerial photographs were added to the collection, and from the 1970s through the 1990s architectural and engineering plans were also added. Currently, they have just under 2.5 million maps, just over 2.5 million architectural and engineering drawings, and 16 million aerial photographs. Not all acquisitions are in paper copy; the Archives also has materials on film and aperture cards. The cartographic unit has a staff of 14 who accession, process, describe, and make records available to the public in the Public Research Room. The Research Room is open six days a week and three evenings a week in the Archives II building in College Park, Maryland. For more background on the Cartographic and Architectural Records Branch, refer to General Information Leaflet No. 26 (http://www.nara.gov/publications/leaflets/gil26.html).

Records, as defined by federal statute, include “all books, papers, maps, photographs, machine readable materials, or other documentary materials, regardless of physical form or characteristics, made or received by an agency of the United States Government under Federal law or in connection with the transaction of public business and preserved or appropriate for preservation by that agency or its legitimate successor as evidence of the organization, functions, policies, decisions, procedures, operations, or other activities of the Government or because of the informational value of data in them” (44 U.S.C. Chapter 33, Section 3301). Acquisitions are governed by records control schedules drawn up between the Archives and the originating agency. The Archives provides records lifecycle management guidance to all Federal agencies and conducts evaluations of Federal agency records management practices. Items come to the Archives after active use of the materials has diminished; the standard is about 30 years (after current administrative need for the materials is extinguished). Occasional offers of unique materials are made, but this is somewhat rare. Exceptions to the 30-year rule include receipt of a copy of most Federal agency maps at the time of printing. These record series are sometimes supplemented by annotated copies of maps and background files for published maps. Records are stored in record groups and kept in record series. The provenance of the materials is maintained. Appraisal and retention in the Archives is done on a series basis, not by individual piece. Cataloging is done at the collection, series, and record group level. Rarely is any item-level cataloging done.

Maintenance and preservation of the collections are major priorities. To minimize handling, Archives creates reference copies in photocopy, microfilm, or photographic reproduction for especially valuable items, but generally original maps or drawings are brought to the Research Room. A recent example is the color 35mm film of the 1930 Census enumeration district maps, now available to accompany the 1930 census schedules released in April; this is the first time Archives has filmed the enumeration district maps. Paper maps are stored flat in map cases in acid-free folders, with occasional items in Mylar sleeves. A scanning project, done under contract with a private company, has processed about 300 maps and 100 aerial photographs so far. We should also be aware of the Center for Electronic Records and its programs and the related Electronic Records Archive (http://www.nara.gov/nara/electronic/).

Government Printing Office
Betty Jones, Chief of the Depository Administration Branch

Betty Jones, Chief of the Depository Administration Branch, presented for the Government Printing Office (GPO). She has been in the position for less than one year.

Staffing Changes
On Friday, March 29, 2002, President Bush nominated Bruce R. James to be the Public Printer. The current Public Printer, Michael F. DiMario, has been in the position since 1993. The Public Printer is the head of the U.S. Government Printing Office. In the past year GPO has hired a chief of serials cataloging and a chief of monograph and map cataloging. They have also hired two new catalogers and made offers to two other candidates for cataloging positions. There are currently 14 catalogers, with 6 positions still to be filled. In addition they have hired three program analysts and will hire an additional librarian in the Depository Administration Branch.

Budget: fiscal year 2002 appropriations
LPS received funding from Congress to modernize the automated library system. They are on the fast track to purchase a state-of-the-art integrated library system (ILS). The current legacy systems made it through the Y2K transition, but one persistent problem is that they do not allow for the easy transfer of information from one to the other; this is a major advantage of the ILS. GPO will be hiring a consultant to help with the transition. Any help or advice librarians outside GPO can provide would be greatly appreciated.

October 12, 2001, Francis J. Buckley, Jr., Superintendent of Documents, issued the recall of USGS Open File Report 99-248: Source-Area Characteristics of Large Public Surface-Water Supplies in the Conterminous United States: An Information Resource Source-Water Assessment. Mr. Buckley explained the Policies and Procedures for Withdrawing Documents from the FDLP in the November 15 Administrative Notes, and again March 14 in a letter sent to all depository library directors and coordinators (the letter was reprinted in the April 15 Administrative Notes). Since FY 1995, the GPO has distributed 230,019 tangible product (print, microfiche, and CD-ROM) titles to depository libraries, and recalled just 20 (16 to be destroyed, 3 returned to the agency, 1 removed from shelves). GPO has not been asked to withdraw any electronic publication. Several agencies have taken electronic publications off their web sites.

Recommended Workstation Specifications
Betty presented copies of the 2002 Recommended Specifications for Public Access Workstations in Federal Depository Libraries and pointed out the “for cartographic data use” recommendations. This draft will be published in Administrative Notes and will supersede the recommended specifications dated June 2001, becoming requirements on October 1, 2003.

GPO provided cataloging for 4,200 maps and map products this past year from USGS, the Census Bureau, the Department of Agriculture, NIMA, NOAA, CIA, and other agencies, in paper, CD, DVD, and online formats. GPO will continue to disseminate maps in a tangible format whenever possible. Census tract maps for the 2000 census will not be distributed in paper because of the prohibitive cost of production and distribution; they will be available on DVD. The Interagency Agreement with USGS expires this fiscal year. GPO does not foresee any major changes or any problems in renewing the Agreement.

Federal Geographic Data Committee (FGDC)
John Moeller, Staff Director

John Moeller, Staff Director of the FGDC, presented at the meeting for the first time. He primarily discussed policy: what the FGDC is, what tasks have been assigned to it, and then generally the National Spatial Data Infrastructure (NSDI). The FGDC is an interagency, cross-sector committee at the federal level. There are currently 17 cabinet and executive level agencies represented, and additional agencies/organizations are expected to become members, e.g., GPO and GSA. The FGDC has a Steering Committee, a Coordination Group, and a FGDC Secretariat staff. FGDC is under the leadership of the Department of the Interior: the Deputy Secretary of the Department of the Interior is the chair, and the vice-chair is Mark Forman, OMB Associate Director for Technology and Electronic Government. Within the Committee, there are 27 working groups or subcommittees organized on thematic categories; for example, the U.S. Forest Service leads on vegetation, the U.S. Fish and Wildlife Service on wetlands, and Census on cultural and demographic issues. Working groups deal with issues that cut across areas, such as a NARA-led working group for historical data and a recently established working group on homeland security with NIMA and USGS serving as co-chairs. FGDC’s primary responsibility is determining, among the participating agencies, how activities for providing, collecting, and utilizing spatial information at the federal level can be better coordinated, and to provide federal leadership for the National Spatial Data Infrastructure. A component of this goal is also to involve state, local and tribal governments, the academic community and the private sector.

John said that he directs the staff that supports the daily operations of the committees. The FGDC was organized in 1990 under OMB Circular A-16, which promotes “the coordinated use, sharing, and dissemination of geospatial data on a national basis” and establishes the federal information policies for the federal government. Regarding questions about the recent removal of some government information from the Web, he stated that the government’s policy is still to make federal information available at the least cost, with the widest dissemination and the fewest restrictions possible. In spite of September 11th, that policy has not officially changed, although its limits have shifted, and there were plans to reassess OMB Circular A-130. At this time, there will probably be three categories of information: classified; open public domain; and restricted information based on some criteria, protected in perpetuity in some cases and in others opened to access after a certain amount of time. Studies have indicated about 80% of government data has a spatial component. When managing business processes and decision processes in the federal government, geography can be used to better understand the entire environment. More and more, the geospatial component of information is being perceived as fundamental, and we need to take opportunities for building the global spatial data infrastructure. There are about 50 or more countries that are either beginning to build this infrastructure or are planning to do so, and the commonalities are many. FGDC is supporting these initiatives. A new kind of infrastructure to improve the use of geospatial resources across the country is needed. Currently, this is operated at the federal level under OMB Circular A-16 and Executive Order 12906.

The components of the spatial data infrastructure are:

  • Framework: 7 layers have been identified to provide a consistent base for spatial location. The layers include imagery, elevation, cadastral, transportation, government units, geodetic and hydrographic.
  • Metadata: An explanation or textual description of the data source. The FGDC has a metadata standard and federal agencies are required to use this. The expectation is that we will see greater implementation of the standard as more and more vendors begin to put it into their tools. In addition, there is the ISO standard that is being worked on by the ISO Geospatial Technical Committee 211. It should be in place by the end of the year. The federal government is committed to building a transition from the FGDC existing metadata standards to the ISO standards. There may just be one uniform standard for North America, including Canada, United States and Mexico.
  • Clearinghouse: A metadata catalog to ensure access to data that is already available to fit a user’s needs. The catalogs are networked from country to country; for example, the United States, Canada and Australia have been networked. There are 26 or 27 countries that are now part of the global NSDI clearinghouse. The clearinghouse is expected to be at least 80-90% global in the future.
  • Standards: Data and technology. 17 standards have been endorsed through the FGDC and another 20 or so are in some form of development by the subcommittees and working groups. The goal is to have interoperable data and specifications. They focus on data content and data classification. NIMA has been a big promoter of these products. The Open GIS Consortium is the primary organization providing guidance for the interoperable geoprocessing technology specifications.
  • Geodata: Available geographic data needed for community decision-making. The hope is to use descriptors, the clearinghouse, the standards and the other tools to make all geographic data more accessible and usable. The result will be that we will have the opportunity of finding geodata, understanding what is in a dataset, using more consistent terminology and definitions of the data, and having more tools available so that we can bring them together for decision making.
  • Partnership: Relationships for collaboration, sharing and policy deliberations. These are critical as 80% of the government data has a spatial component, cadastral data is only 1-2% at the federal level while 98% is at the local level, and only 5% of the biological spatial data is at the federal level. Thus the only way to build information relationships is through partnerships and collaborations.

John emphasized that the National Spatial Data Infrastructure (NSDI) is being developed for organizations to cooperatively produce and share geographic data. He cited several examples of geospatial data products where the use of standards has added to the understanding of the importance of interagency cooperation. A goal of the Infrastructure is to reduce duplication of effort among agencies and localities as well as to improve quality, increase availability and reduce costs related to producing and accessing geographic information.

John discussed the geospatial One-Stop E-Government initiative, which resulted from the government’s desire to provide services to help other government entities, businesses and citizens to more effectively use electronic technology. A federal OMB task force was established to recommend profitable e-government initiatives and 24 initiatives were selected, one of which was the Geospatial Information One-Stop. This initiative was assigned to the Department of the Interior and the FGDC. Currently, FGDC is working with 11 federal partner agencies plus state, local and tribal governments. The vision of the Geospatial One-Stop is to spatially enable the delivery of government services and to provide a place where access to individual information and access to combined information will be possible. The future model should provide fast, low-cost, reliable access to geospatial data needed for government operations via a government-to-government portal for this information. This will also facilitate the effective alignment of roles, responsibilities and resources for government-to-government geospatial interactions needed for vertical missions such as homeland security. Another goal is to have multi-sector input for standards which will create consistency in order to promote interoperability and stimulate market development of tools. The focus of the Geospatial One-Stop is to accelerate development and implementation of NSDI technology, policies and standards that support “one-stop” access. The outcome of the initiative should be that the infrastructure is accelerated, achieving better, faster, less expensive access to reliable data for use by citizens, to improve the use of resources for data acquisitions, partnerships, and reduce duplications, and to have all E-Government initiatives spatially enabled through data and functional capability.

In summary, John stated that an important goal is to create a multi-purpose program of procedures and technology with federal, state, local, academia, private sector and tribal governments to provide access to an enhanced geospatial one-stop portal that is enabled by standards and technology interoperability tools and is not vendor specific. The data will be based on standards and will be commercially available and technology driven so that it can be used in a whole variety of applications enabling geographic information use across the nation and the world. We are encouraged to provide output and representation from our communities, to give input by reviewing the standards and to recommend candidates to work on team projects to help further the Geospatial One-Stop initiative.

National Forest Service
Betsy Banas, Staff Cartographer, Geospatial Services Group

Betsy Banas, National Forest Service (NFS), gave us an overview of the Service’s mapping history, mapping programs, and digital mapping committees.

Betsy began by noting the similarities between the mission statement of CUAC and that of the Forest Service. The Forest Service mission statement is “caring for the land and serving the people”. Gifford Pinchot was the first Forest Service chief and the mission statement then was to “provide the greatest amount of good for the greatest amount of people in the long run”. She noted the philosophical differences between Gifford Pinchot and John Muir in establishing “reserves” vs “preserves”.

The Forest Service was created in 1905 to provide quality water and timber for the Nation’s benefit. It originally had 60 forest reserves covering 56 million acres; now, it has 155 forests and grasslands covering 191 million acres. The Service is very decentralized, having 9 Regions, 1 through 10. Region 7 was absorbed into Regions 8 and 9 long ago. At the time that the Forest Service was organized, it was deliberately decentralized, as it was decided that decision makers needed to be right there, “on the ground” as they were most familiar with the public’s needs at the local level.

The Forest Service is the largest forestry research organization in the world, having 20 research and experimental forests and other special areas. It also provides technical and financial assistance to state and private forestry.

Over the years, the public has expanded the list of what they want from national forests and grasslands. Congress responded by directing the Forest Service to manage national forests for additional multiple uses and benefits as well as for the sustained yield of renewable resources such as water, forage, wildlife, wood, and recreation. Multiple use means managing resources under the best combination of uses to benefit the American people while ensuring the productivity of the land and protecting the quality of the environment.

The mapping and geospatial data programs have helped meet the Forest Service mission by aiding in fire management, forest planning, forest health protection, watershed restoration, ecosystem management and sustainability of our resources, and recreation. Initially mapping was done at the local level and it was a vital part of administering the land. The maps were made to the specifications and requirements of the particular forest. There was little standardization or consistency among Regions.

This changed during World War II. There was an effort to consolidate mapping for defense purposes. The Forest Service, at the time, had the equipment and expertise. During the War, NFS map programs worked out of Gettysburg, Pennsylvania, mapped areas of the U.S. along the Pacific Coast, and aided in making detailed maps of Japan.

Through the late 1960s, regular Forest Service mapping business continued to be decentralized and non-standardized. But mapping technology began to change: new, costly equipment such as computers required the centralizing of mapping operations. The Geospatial Service and Technology Center (GSTC), then called the Geometronics Service Center, was founded in 1975 and is located in Salt Lake City, Utah. Its intent was to bring together the skills and resources needed to build and maintain a standardized base mapping program. The Center’s program has since expanded to include production of digital data.

The Remote Sensing Application Center (RSAC) is co-located with GSTC in Salt Lake City. It provides technical support in evaluating and developing remote sensing and image processing as they relate to geospatial technologies throughout the Forest Service. It also provides project support, assistance with using remote sensing technologies, and technology transfer and training.

The Geospatial Service and Technology Center is more than maps. It provides geospatial services, data, training and awareness. These services and products support core Forest Service business needs including forest planning, watershed restoration, resources inventory, and transportation management. While NFS has a national program and a centralized geospatial service and technology center in Salt Lake City, many mapping activities continue in the Regions. The Forest Service is developing a clearinghouse that will serve as an FGDC and NSDI node, eventually providing all Forest Service geospatial data with FGDC-compliant metadata. The node is expected to be active by September of this year.

Forest Service Maps
The Primary Base Series (PBS) maps of NFS have a scale of 1:24,000. They are topographic maps, used as an administrative product. The Forest Service started production of the Single Edition Quad maps in 1992, when it entered into an agreement with USGS. The Primary Base maps are produced by the Forest Service to USGS standards. This agreement has eliminated duplicative efforts; the maps are revised sooner with partnerships than without, and show Forest Service data. USGS prints and distributes the maps for the Forest Service. The Forest Service is responsible for about 12,500 of the 55,000+ topographic sheets covering the United States, mapping at a rate of 600 per year.

The Secondary Base Series is at a scale of ½ inch to the mile (1:126,720). The cartographic work is performed at GSTC. The base map is forwarded to Region/Forest where it is enhanced with photos, transportation guides and visitor information to become the standard Forest Visitor Map.

Forest Visitor Maps are being distributed by USGS through a relatively new agreement. Previously the maps were available only at Forest Visitor Centers. The new agreement provides for the sale of Forest Visitor Maps through a USGS vendor network and gives customers one-stop shopping. The maps are available to vendors at volume discounts, and this partnership has increased customer service. The maps are still also available at Forest Visitor Centers, Forest Supervisor and District Ranger Offices, and can be ordered from the various Forest Service websites, but only USGS provides the one-stop shopping capability that vendors like, because they receive a discount and can stock a variety of maps on their shelves.

Other Forest Service maps include: wilderness area maps, wild and scenic rivers maps, “Pocket Guides,” “Guide to Your National Forest,” and other specialty products. FSWEB site: http://fsweb.r5.fs.fed.us/unit/puf/geometronics/

Other collaborative efforts include http://www.recreation.gov. This interagency initiative provides web-served recreation information to the public; it cuts across government boundaries. The Outdoors America Map is a guide to recreation opportunities on Federal lands; 11 Federal agencies are involved. The Forest Service is represented as a voting member on the U.S. Board on Geographic Names and is responsible for its areas in the updating and maintenance of the Geographic Names Information System. The Forest Service is adding information to the National Atlas of the United States. There are other exchanges with USGS including Digital Elevation Models (DEMs), Digital Orthophoto Quads (DOQs), and the National Map. The Forest Service is working in the Lake Tahoe Basin Management Unit on a pilot of the National Map.

FGDC and Geospatial Advisory Committee (GAC) Activities
The Forest Service is participating in the FGDC (Federal Geographic Data Committee). FGDC is trying to create Geospatial One-Stop and I-Teams (which have to do with data sharing at the local level). John Moeller (who also spoke at CUAC) is FGDC Secretariat Staff Director and project manager for Geospatial One-Stop. NFS has taken the lead of the FGDC Vegetation Subcommittee. Vegetation Subcommittee activity had languished: a lot of effort had initially been put into trying to develop a vegetation data standard, but no consensus on the elements of the standard could ever be reached, within NFS or among agencies on the subcommittee, so it stalled out. Alison Hill is the new chair, and the subcommittee is reinvigorated. NFS is also the co-lead of the Sustainable Forest Data Subcommittee and is active on the Homeland Security Working Group and the Imagery and Remote Sensing Task Force.

The Geospatial Advisory Committee (GAC) was formed in 1999 to address the advancement of Forest Service geospatial data technologies. The geospatial community recognized the need to direct and coordinate geospatial data activity. GAC promotes awareness of geospatial data throughout the Forest Service and advises the Geospatial Executive Board (GEB). Its roles and responsibilities are to identify, monitor, and address issues regarding the state of NFS geospatial programs and activities. It also develops and makes recommendations concerning geospatial program execution to the Geospatial Executive Board, and communicates progress to the NFS geospatial community and others. GAC emphasis areas are 1) standardized GIS data, 2) natural resource applications coordination, 3) geospatial training and awareness, 4) coordinating and sharing standardized GIS data, 5) cartographic publishing, and 6) technology architecture coordination. GAC’s goals are to ensure that NFS geospatial policy and programs are compatible and integrated, and that programs are responsive to NFS business needs.

Forest Service Contact Information and Forest Service Home Page

GSTC Home Page

Bureau of the Census
Tim Trainor, Chief, Cartographic Operations Branch

Tim Trainor began by discussing a couple of the Census Bureau’s geographic programs. The fifty State Data Centers (SDCs) participated in the Public Use Microdata Area (PUMA) Delineation Program. Tim spoke at some length about the Urbanized Area Delineation program, which culminated with a Federal Register notice on May 1, 2002 (67 FR 21961) listing the 466 areas defined as Urbanized Areas (UAs) for Census 2000 (up from 405 in 1990). The general criteria are a density of 500 people per square mile and a minimum population of 50,000. There is no grandfathering of urbanized areas: Cumberland, MD, which qualified in 1990, was dropped from the UA list. The more important detail is that the category has been expanded to include “urban clusters,” with urbanized areas and urban clusters totaling 3,638 qualifying areas, so more areas will have data available. The smaller Urban Cluster (UC) is defined for areas of sufficient density with 2,500 to 50,000 inhabitants, plus other characteristics. Detailed definitions and discussion of UAs and UCs may be found in a Federal Register announcement of March 15, 2002 (67 FR 11663). Undevelopable areas adjacent to or within UAs (e.g., floodplains along a river) are now diplomatically being called “exempt” rather than “undevelopable.” And of course, all of this information is available on the web.
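As a rough illustration of the population and density thresholds described above, the classification logic can be sketched as follows. This is a simplified, hypothetical sketch only: the actual Census 2000 criteria (67 FR 11663) involve density tests at the block and block-group level and other qualifying rules not modeled here.

```python
def classify_area(population: int, density_per_sq_mile: float) -> str:
    """Toy classifier using the headline Census 2000 thresholds:
    density of at least 500 people per square mile, with Urbanized
    Areas at 50,000+ inhabitants and Urban Clusters at 2,500-50,000."""
    if density_per_sq_mile < 500 or population < 2500:
        return "Rural"
    if population >= 50000:
        return "Urbanized Area"
    return "Urban Cluster"

# Illustrative, invented figures:
print(classify_area(75000, 1200))  # Urbanized Area
print(classify_area(8000, 900))    # Urban Cluster
print(classify_area(1200, 300))    # Rural
```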

Tim then reviewed several of the geographic products from Census. Some of these involve Zip Code Tabulation Areas (ZCTAs), in which each Census block is assigned a single Zip Code. This constructed geography will result in various special boundary files and tabulations. The TIGER 2002 files, which use 2000 geography, will be available soon on the web. There will probably be maps at some point, but specifications have not yet been finalized.

The 2002 TIGER/Line files, based on Census 2000 geography, will be available to download by the end of this week. Many redistricting activities based on Census 2000 are underway in the states.

Pre-defined maps, mostly in PDF format, are available on the Internet. These are also available on DVD (CDs are used only if the files total less than 650 megabytes) and as on-demand plotted maps. Recommended specifications for plotters are on the web site. Tim has a national map showing locations of the State Data Centers; it is used internally but possibly could be made available. It is constantly changing and shows all of the different kinds of state data centers, in terms of their classifications. Census 2000 block maps for every community in the country have been produced. They include the 130,000 map sheets John Hebert referred to as recently accessioned at the LC Geography and Map Division. Census has produced an additional 280,000 sheets that are block maps for geographic levels above census tracts, such as places and county subdivisions.

For legal governments, maps have been sent to the entity’s highest elected official and currently are available on the web. Six DVDs covering regions of states will be manufactured shortly. Unlike the 1990 county block maps, users can access a town or city of choice without having to acquire all of the maps for a county. Census tract outline maps are available on one DVD, and American Indian/Alaska Native Area and Hawaiian home land block maps are available on one CD-ROM.

Generalized boundary files are available on the web for most levels of geography in several popular ESRI formats: Arc/Info export (.e00), ArcView shapefile (.shp), and Arc/Info ASCII format. Census 2000 boundary files are available in both high-resolution and low-resolution versions. The Bureau is re-doing the 1990 files so that nested geographies share the same points.
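For readers unfamiliar with the shapefile (.shp) format mentioned above, each file begins with a fixed 100-byte header. The sketch below, a minimal stdlib-only illustration rather than a tool for real Census files, parses that header from raw bytes; the field offsets follow ESRI's published shapefile specification, and a synthetic header is constructed purely for demonstration.

```python
import struct

def read_shp_header(data: bytes) -> dict:
    """Parse the 100-byte header of an ESRI shapefile (.shp).
    The format mixes byte orders: file code and file length are
    big-endian, while version and shape type are little-endian."""
    file_code, = struct.unpack(">i", data[0:4])       # always 9994
    length_words, = struct.unpack(">i", data[24:28])  # length in 16-bit words
    version, = struct.unpack("<i", data[28:32])       # always 1000
    shape_type, = struct.unpack("<i", data[32:36])    # e.g. 5 = polygon
    return {
        "file_code": file_code,
        "file_length_bytes": length_words * 2,
        "version": version,
        "shape_type": shape_type,
    }

# Build a minimal synthetic header (shape type 5 = polygon, the type
# used for boundary files) instead of reading a real file.
header = bytearray(100)
header[0:4] = struct.pack(">i", 9994)
header[24:28] = struct.pack(">i", 50)   # 50 words = 100 bytes (header only)
header[28:32] = struct.pack("<i", 1000)
header[32:36] = struct.pack("<i", 5)
print(read_shp_header(bytes(header)))
```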

As a result of user input, more printed reports than originally planned will be generated. County outline and subdivision outline maps will be produced. Page-sized county maps by state will be done by the end of summer. Metropolitan Areas will be redefined in 2003 based on new criteria.

The Bureau is still producing thematic maps. One recent map shows the center of population for each state. Another is the famous “nighttime” map, where white “light” on a dark background indicates population distribution, which recently had the biggest press run in Census history, of 1,500,000 sheets. Five copies were sent to every school in America. They are planning a 108th Congressional District Atlas for next year and have released a Census 2000 atlas based on the first seven questions of the census questionnaire.

This is the 100th anniversary of Census as an agency.

The Bureau realizes the acute need for modernization of its Master Address File (MAF) and the entire TIGER system. TIGER is old, and technology has advanced significantly since it was developed. (Most people don’t know that Census still maintains the files in an internal format, not the ASCII format that it distributes.) Everyone knows that the boundaries are positionally inaccurate, and Census wants to move beyond relative accuracy to true positional accuracy. One reason this will be imperative is that TIGER will form the transportation layer of The National Map. Updating can’t wait: sixty-five committees are already looking at Census 2010 planning, and to maintain the geographic standards of the ongoing American Community Survey, MAF and TIGER must stay updated and be improved. The goal is to get an enumerator to a housing unit 100% of the time.

There are many partnerships with other agencies and partners. Census maintains boundaries for most local governments on an annual basis.

The MAF/TIGER modernization is focusing on three important projects. The first is to acquire existing local GIS files where they exist: of the roughly 3,000 counties, about 1,000 have GIS files, and of those a small number have really good files. Census is currently evaluating them. A second strategy is to have contractors look at commercial sources that can be used without restriction in the public domain. A third alternative, where the previous two options are not possible, is to use imagery as a means to improve and maintain the spatial data.

U.S. Geological Survey
Dan Cavanaugh, Chief, Branch of Program Development

Dan Cavanaugh, US Geological Survey (USGS), gave an update that focused on three themes: new products (especially published maps), the National Atlas, and the National Map.

New Products
USGS has released several maps that are different from those it generally produces. They include a map of Lake Tahoe showing underground structure, and a Tapestry of Time and Terrain, which depicts geology and physiography. There is also a new map of New England showing earthquakes between 1638 and 1998 (I-2737), which proved particularly timely given the recent earthquake there. Another recently published map, titled Geographic Face of the Nation – Land Cover, developed from the National Land Cover Data (NLCD), was jointly produced by USGS and the Environmental Protection Agency. A new relief map, similar to the Thelin & Pike map (late 70’s, early 80’s), will be released under the title Geographic Face of the Nation – Elevation. The new map will have fewer data artifacts than the previous edition.

USGS is continuing to forge partnerships, especially with the Forest Service. USGS Map Dealers (about 2000 of them) are now distributing Forest Service maps. Their goal is to distribute Forest Service maps for all 9 Forest Service regions. The map distributors are pleased about being able to obtain maps from one source (USGS), rather than having to deal with multiple agencies and regions. The USGS has also entered into partnerships with other agencies, such as the Library of Congress. This partnership has resulted in reproduction of an 1894 map of Colorado. It is available from USGS (see http://rockyweb.cr.usgs.gov/historicmaps/historicmapsfromlca.html for more information). USGS is working with the National Park Service to produce geologic maps of the National Parks. They also continue to distribute National Imagery and Mapping Agency (NIMA) products. About 90-95% of the NIMA products that were available before September 11 are still available.

Some of the most popular products at USGS continue to be the booklets, such as the General Interest Publications, which are available for free. Dan indicated that just prior to our meeting, the Director of the Survey announced that USGS will be getting out of retail sales (at the ESICs) by FY2004. It is uncertain whether that means the beginning or the end of FY04. Over-the-counter retail sales may cease at other USGS locations as well, probably a year or two away. A question was asked whether other ESIC offices are to be closed. Dan indicated that the Washington DC ESIC in Main Interior had closed this year due to budget cuts, and that the Spokane ESIC was closed last year due to budget cuts. Remaining ESIC offices include Reston, Menlo Park, Denver, Anchorage, Rolla, and Sioux Falls, SD.

Dan was asked about the recently published maps of Utah and Colorado that came through the FDLP. They are not a “national program”: these maps were produced from the National Elevation Dataset by the Rocky Mountain Mapping Center and are similar to the one of Pennsylvania issued several years ago. They will not be issued for the entire United States unless funding is made available. Dan was also asked whether there were plans to revise or update Maps for America; the response was no, due to lack of funding.

The National Atlas
The National Atlas continues to be one of the Geological Survey’s most popular web sites. It is a cooperative venture among 21 partners and ESRI. There are presently 420 map layers available on the National Atlas web site. People can use it to make and print their own maps. It also includes links to other web sites; for example, when a user clicks on a National Park, they are linked to sites with information on that park. The National Atlas web site receives 4.6 million hits per month and links to 1,900 other web sites. A new map is drawn every 1.5 seconds. Over 350,000 map layers have been downloaded from the site.

Through the National Atlas, USGS has been able to produce hard-copy products, such as the Federal and Indian Lands map; the elevation map of North America; the Forest Cover map (produced with data from many Federal agencies); the Presidential Elections map, which includes insets showing the results of all Presidential elections since 1789; and the General Reference map, showing roads and county boundaries. The General Reference map will be revised to show Alaska at the same scale as the lower 48, so that the land masses can be compared directly, and re-released. The National Atlas is viewed by some people as a small-scale version of the more detailed National Map.

The National Map
The National Mapping Division is now the Geography Discipline. The National Map encompasses everything that the National Mapping Division used to do. There used to be three organizations under the National Mapping Division: Map and Data Collection; Earth Science Information Management and Delivery; and Research. They are now known as Cooperative Topographic Mapping; Land Remote Sensing, dealing with Landsat; and Geographic Analysis and Monitoring, which equates to the research area.

The primary activity of the Geography Discipline is to compile the base data for the National Map. The vision of the National Map is to develop a current, continually revised, seamless, complete, consistent product that will reflect geographic reality, have positional and logical consistency, and have no cartographic offsets. It will be a temporal record, with metadata for both the data set and the features within it. The National Map will address 5 needs: to Map, Monitor, Understand, Model, and Predict. The 7.5-minute topographic map is probably USGS’ most famous product. It is the only U.S. cartographic product that is comprehensive, trans-jurisdictional, and border-to-border and coast-to-coast. Compiling it was an immense engineering feat that would cost over $2,000,000,000 to replicate today. On average, a topographic map is 23 years old. USGS is finding that it cannot keep up with currency; base data, such as aerial photographs, often show features that topographic maps do not.

Because topographic information has a variety of uses (scientific studies, planning, decision making, land and resource management, delivery of government services, economic activities, natural disaster relief, homeland defense), it will be the base of the National Map. There is presently some duplication of effort among and between geographic information sectors (federal, state and local governments and the private sector). Cooperation between these sectors (Cooperative Topographic Mapping) will provide the base information needed for the National Map. Partnerships will be built to develop the base data, which will be accessible via the web 24 hours a day. Users will be able to specify the data and area of interest and print their map on demand.

Cooperative Topographic Mapping will include activities such as acquiring, archiving, and disseminating base geographic data; maintaining and providing derivative products, including topographic maps; and conducting research to improve data collection, maintenance, access, and applications capabilities. The core data, which will include themes such as orthophotography, elevation, transportation, hydrography, structures, boundaries, geographic names and land cover, will be public domain, either collected by government agencies or made available through licensing agreements. Links to other data with higher resolution, enriched content and additional attributes will be available. These links may be to licensed data.

This means that USGS’ role will be changing from data producer to organizer responsible for awareness, availability, and utility. USGS will be the catalyst and collaborator for creating and stimulating data partnerships, a partner in standards development, and an integrator of data from other participants. When no other source of data exists, USGS will produce and own the data. There will be a temporal component or versioning, but the details have not been worked out yet.
Data will be accessible 24 hours a day and will be in the public domain.

The National Atlas is an example of a small-scale implementation of the National Map. It has been developed through partnerships. USGS has integrated the content so that it is consistent nationwide. They have also developed the metadata and provided web access. USGS offers derivative products, such as the data layers and printed National Atlas maps.

There are currently 7 National Map pilot projects underway in the United States (see http://nationalmap.usgs.gov/nmpilots.html for more information). One in Delaware is currently the most complete and went live April 18 (URL: http://www.datamil.udel.edu/nationalmappilot). The events of September 11 illustrate the urgency of geospatial data and the National Map. September 11 showed that data must exist before, during, and after an event and be readily accessible, and that partnerships among state, local, and federal agencies and the private sector are required. The events illustrated that cartographic information is a national infrastructure, just like the Interstate Highway System. As a result of September 11, there is an emphasis on compiling information, including high-resolution color imagery, high-accuracy elevation data, and critical infrastructure, for 120 major metropolitan areas in the United States. NIMA and other Federal agencies are partnering in this effort. Links with state and local agencies and “first responders” are also being developed.

National Imagery and Mapping Agency
Jim Lusby, NIMA Staff Officer, Disclosure and Release Division, Office of International & Policy

Jim Lusby represented the National Imagery and Mapping Agency (NIMA) and provided an overview of the policy on Limited Distribution (LIMDIS) products and an update on the distribution of Shuttle Radar Topography Mission data.

NIMA has authority under U.S. law, Title 10, to restrict distribution of cartographic data if it is required to do so under international agreements, if disclosure would reveal sensitive methods for obtaining the data, or if disclosure would interfere with military or intelligence operations. Officially, Limited Distribution (LIMDIS) is a caveat, not a security classification such as “Confidential” or “Secret,” but it is still enforceable under law. Roughly 35% of NIMA’s products fall under the LIMDIS category.

NIMA has 80,000 different line items; of those, 30,000 are limited distribution. 20,000 of the limited-distribution items are foreign produced, and NIMA works in cooperation with the foreign governments.

Jim has worked to arrange exceptions to LIMDIS for academics and government agencies for an expressly noted purpose, e.g., to support disaster relief operations. Unauthorized re-distribution of LIMDIS data in such situations can result in agencies or contractors losing their ability to obtain future exemptions. Most requests for exemption require the agreement of a third party, such as the foreign agency responsible for supplying the data. NIMA evaluates all requests on a case by case basis, and tries to balance benefits and risks of exemptions.

NIMA also assists foreign countries with information in times of need. Jim mentioned NIMA and USGS efforts assisting Honduras, Nicaragua, and El Salvador during Hurricane Mitch. NIMA is partnering with USGS, Census, the Forest Service, and others.

Making NIMA products available to other government agencies can be a lengthy process. Criteria for approval of release are based on the desired geographic location, the use, and the justification for needing the material.

NIMA is working to make the process smoother by spelling out conditions of release during the initial data collection process with third parties, taking internal steps to formalize LIMDIS policies and procedures, and highlighting the issue to NIMA customers in forums such as CUAC. In weighing a request, NIMA asks: Is there greater risk in giving this product to someone to satisfy them? Are there other sources that will work? If this is the only source, what kind of risk will have to be weighed? What derived product will come out of it?

There are many multinational projects underway. NIMA works with “disclosure” or “release” restrictions: disclosure means someone can look at a product and walk away, while release means the map can actually be given to them. NIMA is trying to obtain more “disclosure” than “release” situations in working together.

Limited distribution is a caveat that restricts anyone from using a product unless NIMA gives approval. “Official use only” means that the product is needed for planning and will be used only for that purpose.

Some products will become more easily available, others less so. NIMA will be working on updating its Memoranda of Understanding (MOUs). It is trying to reduce the amount of LIMDIS information, or make it classified, to get out of the gray area.

Will Danielson from GPO asked Jim about maps received at GPO for FDLP cataloging that were marked with the LIMDIS caveat. Jim said that GPO/FDLP was indeed supposed to receive such items, as they had been declassified. Jim explained that after printed materials are marked LIMDIS at the printer, a new press run cannot be done to remove the LIMDIS caveat; instead the marking is supposed to be removed or obliterated by the distributor.

Finally, Jim presented a revised schedule for release of the Shuttle Radar Topography Mission (SRTM) data products. This is the digital terrain data that librarians are hoping for, although Alaska is not well represented. The schedule fell behind after September 11, and Jim cautioned that it was subject to further change. Production of data for North and South America is expected to be complete by this summer, but distribution schedules and methods have not been determined. Through a joint agreement, USGS’s EROS Data Center will be the data holder for the public. Public release data will vary in resolution depending upon geographic area: USA data will be level 2 (30-meter resolution), while non-USA areas will be level 1 (roughly 90 meters). By 2004, everything should be completed: elevation data for the world, and all the products done. It will be much better than anything available in the past, and additional information from others is being used. 1,000-meter data is available now.
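The level 1 and level 2 resolutions mentioned correspond to 3-arc-second and 1-arc-second grid spacings, and a quick back-of-the-envelope conversion shows why those work out to roughly 90 and 30 meters. The sketch below assumes a spherical Earth of mean radius 6,371 km, so the figures are approximate.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius; a rough spherical approximation

def arcsec_to_meters(arcsec: float, latitude_deg: float = 0.0) -> float:
    """Approximate ground distance spanned by an angle in arc-seconds.
    Along a meridian this is essentially constant; along a parallel it
    shrinks by cos(latitude)."""
    radians = math.radians(arcsec / 3600.0)
    return EARTH_RADIUS_M * radians * math.cos(math.radians(latitude_deg))

# SRTM level 2 posts are 1 arc-second apart, level 1 posts 3 arc-seconds.
print(round(arcsec_to_meters(1)))  # ~31 m at the equator
print(round(arcsec_to_meters(3)))  # ~93 m at the equator
```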

National Ocean Service - NOAA
Howard Danley, Deputy Chief of the Navigation Services Division

NOAA has 1,037 paper charts for sale through the Distribution Division of the Federal Aviation Administration’s National Aeronautical Charting Office, which also prints the nautical charts. These are available through the FDLP. A private company, Maptech, sells raster images of the charts; on the web at maptech.com, thumbnails at 90 dots per inch are available using MrSID compression.

There is great interest by graduate students in shoreline movement over the years, terrain, ports, and features. For the last four to five years, a selection of historical charts, from the late 1800s to about 10 years ago, has been available on the NOAA web page. In cleaning out the warehouse, NOAA discovered historical charts and scanned them; they can be downloaded, which MrSID compression made possible. These include hydrographic surveys. One can use “MapFinder” on the website http://mapfinder.nos.noaa.gov/ to find hydrographic surveys over time.

U.S. Coast Pilot is a supplement to the nautical charts. From the early to mid-1800s this was a private publication; in the mid-1800s the Coast Survey purchased it. NOAA has contracted with a company in Beltsville, MD to scan the Coast Pilots, starting with the oldest, a 1776 publication by the British Admiralty. These images will be placed on the web, linked through the NOAA library. The online Coast Pilots will be searchable by chapter, with an index in the back. Some of the older Coast Pilots have foldouts that are causing problems with scanning, because NOAA does not want the binding affected. Funding has been provided for about one-half of the project; additional funding will be sought next year to finish it.

NOAA will continue to place electronic nautical charts on the web in a vector format. There are about 150 charts available with a browser, and they can be downloaded. They differ from the printed charts; the symbology and detail are different. Current Coast Pilots are available on the web and can be downloaded. Electronic charts and Coast Pilots are considered “provisional” because they are not updated for navigation. These images have increased sales. Distances Between Ports will go up on the web too.

Post-September 11, NOAA removed airflows, ship schedules, and names from its web site, but decided to leave nautical data, as it can be obtained elsewhere.

Questions about potential web products included the early-edition nautical charts of Alaska that had been classified because of the Distant Early Warning (DEW) sites, and the historical T-sheets. The T-sheets (topographic) date back to the mid-1800s and contain a tremendous amount of information, including land use, land ownership, and place names. The National Archives holds the T-sheet photographic negatives and the originals.

Paper charts will be around for an indefinite time, especially for the recreation community. For large vessels, there will be a requirement for a backup, in whatever form. The print-on-demand program is still alive but going slowly: 876 of the 1,037 charts are available through print on demand, the number of print-on-demand agents is now 40, and 17,000 copies of charts were sold through print on demand last year.

U.S. Fish and Wildlife Service
Doug Vandegraft, Chief Cartographer

Doug Vandegraft is the chief cartographer at the Fish and Wildlife Service (F&WS), which has seven regional offices and about 25 cartographers throughout the United States.

Over the last year, his office has worked on digitizing the boundaries of the 538 wildlife refuges; the work is about three-quarters complete. Doug noted that 85% of refuge acreage is located in Alaska.

In addition, they are working on a digital land status layer indicating F&WS land ownership, that is, which lands the Service owns within the wildlife refuges. The Service is always trying to acquire land to protect wildlife. Refuge boundaries are approved acquisition boundaries; within that boundary, the habitat has been deemed worth saving.

Refuges date back to 1903, but the F&WS was not created until 1940. The Bureau of Biological Survey was the first agency to manage wildlife refuges, and in 1936 it developed a template for what refuge maps should look like. The same format is still in use, but in 1980 the Alaska National Interest Lands Conservation Act (ANILCA) added 100 million acres in Alaska, and the format no longer worked well. The F&WS is experimenting with new ways of depicting wildlife refuge land status using digital raster graphics (DRGs) and digital orthophotoquads (DOQs). F&WS has new refuges in the South Pacific, and the agency is producing new maps of those areas. Doug indicated that they are currently working with USGS on a new refuge map to commemorate the Refuge System's centennial. Alaska will be shown at the same scale as the lower 48.

The Yukon Delta refuge includes 26 million acres. F&WS has scanned about 500 of the original land status maps dating back to the 1920s; the originals will go to the National Archives. Refuge boundaries are available on the web and may be downloaded. It is important to recognize that there may be private in-holdings within the depicted refuge boundaries.

Work continues on the Real Property Database, which provides information on tracts of land owned by F&WS, including the price paid, parcel size, name of the former owner, and other details. Some information is not available due to its sensitivity. They are currently working on linking refuge boundaries to this database, to be displayed in a web-based map-server environment. Ideally, there will be a photograph for each refuge. Doug indicated that the most important component of geographic information systems is the query capability, and he provided some demo examples of how F&WS hopes to use GIS with the Real Property Database. Doug is working on securing funding to pursue this project.


