CARTOGRAPHIC USERS ADVISORY
COUNCIL (CUAC) 2002 MEETING MINUTES
Janet Collins, Western Washington University (WAML)
Mike Furlough, University of Virginia (MAGERT)
Donna Koepp, University of Kansas (GODORT)
Clara P. McLeod, Washington University (GIS)
Bruce Obenhaus, Virginia Tech (SLA G&M)
Daniel T. Seldin, Indiana University (NACIS)
Paul Stout, Ball State University (NACIS)
Christopher J. J. Thiry, Colorado School of Mines (WAML)
Mark Thomas, Duke University (MAGERT)
Linda Zellmer, Indiana University (GIS)
Betsy Banas (NFS)
Dan Cavanaugh (USGS)
Howard Danley (NOAA)
John Hebert (LC)
Betty Jones (GPO)
Jim Lusby (NIMA)
John Moeller (FGDC)
Richard H. Smith (NARA)
Timothy Trainor (Census)
Doug Vandegraft (F&WS)
Susan J. DeLost (NFS)
Wil Danielson (GPO)
Mark Flood (NFS)
Robin Haun-Mohamad (GPO)
Vi Moorhouse (LC Cataloging)
Preservation and Archiving Issues Roundtable Discussion, led by Donna Koepp, University of Kansas, Government Documents and Map Library
Library of Congress, John Hebert
National Archives and Records Administration, Richard Smith
Government Printing Office, Betty Jones
Federal Geographic Data Committee, John Moeller
National Forest Service, Betsy Banas
U.S. Geological Survey, Dan Cavanaugh
Fish and Wildlife Service, Doug Vandegraft
Questions and Closing Remarks
Preservation and Archiving Issues Roundtable Discussion
led by Donna Koepp, University of Kansas, Government Documents and Map Library
(Donna Koepp, CUAC) Our biggest concern is the preservation of cartographic
and spatial data, especially data that is born digital and never appears in
paper. We are concerned about having snapshots in time of data that is
constantly being updated, so that we have historical records. Libraries are
not set up to preserve that data, mainly because of file size. Are the
agencies preserving snapshots of their data? If not, is there some role
that libraries can play, similar to what we do with paper documents? GPO
does some preservation of text documents but is not preserving maps;
because the files are so large, GPO refers users to USGS and other
agencies. Libraries have some capacity to work with government agencies
in partnership to preserve these datasets.
Moeller (FGDC) encouraged our participation and representation in FGDC. A
specific opportunity is with the Historical Data Working Group of FGDC,
chaired by Bruce Ambacher of the National Archives and Records
Administration (NARA). That group developed the policy and guideline statement
“Managing Historical Geospatial Data Records: Guide for Federal
Agencies” in 1997. Tools already in place include the metadata standard
for documentation (a final draft of an international metadata standard
should be approved by the end of this calendar year) and the spatial data
transfer standard.
Koepp (CUAC) asked if John knew of any agency that was preserving all of
its cartographic data.
Moeller (FGDC) replied that he did not know of any. He knows that the Earth
Resources Observation System (EROS) data center has an extensive archive of
imagery and Bureau of Land Management (BLM) has a policy for preserving all
information including digital information.
Koepp (CUAC) mentioned the special problems with BLM’s
decentralization. State and local offices are not necessarily following the policy.
Thiry (CUAC) pointed out that users often want historical data. People are doing
historical studies; examples include the history of land management and
growth areas, and this is why we are so interested in having snapshots of
the data. We may lose this history and end up with a period of time for
which we don’t have the documentation.
Smith (NARA) hopes it is a comfort to know that federal statutes require
records maintenance, control and disposition schedules for materials of
enduring or permanent value, regardless of format. Sometimes there is a
snapshot provision. The Electronic Records Archive of NARA is charged with
preserving many different electronic records formats, including maps and
cartographic data sets, independent of software and hardware. Currently in a
pilot project, the Electronic Records Archive is supposed to be up and
running by 2004. The Archives has a plan for collecting and preserving these electronic records.
Koepp (CUAC) mentioned the NARA
definition of records management and found it comforting that their
definition of records includes maps. Bruce Obenhaus (CUAC) brought up the
questions of when to take snapshots and how much change is worth
identifying. What is of enduring value? These are hard questions that might
not have answers currently.
Smith (NARA) added that the National Archives has appraisal archivists that
are familiar with electronic records. They are hammering out agreements
with agencies on the maintenance, use and final disposition of these files.
That’s the law and nearly the practice. Archives has schedules for
USGS electronic records, as an example. Archives will likely preserve only
a small percentage of the data actually collected (2-3% of paper is now
preserved, and we presume electronic data will be similar). This is a shared
responsibility between NARA
and the originating agencies.
Koepp (CUAC) asked what is included in NARA. Is it similar to the Federal
Depository Library Program (FDLP)? NARA keeps the records of the agencies;
FDLP keeps the publications of the agencies.
These are different types of material.
Smith (NARA) The National Archives collects record sets from agencies.
Archives has what he presumes FDLP libraries have and a lot of manuscripts
to back up the publications.
Thomas (CUAC) Now there is a blurring of published materials and electronic
materials. With digital spatial data, maps are made on the fly, there is no
permanent published version because the user makes maps for a specific
purpose. The problem lies with saving the original data.
Smith (NARA) Maps or records created by an agency may not have permanent
value to the agency and would not be preserved. When records are still
important to an agency, the agency keeps them until use of the records dies
down; at that point they are transferred to NARA. Some records are deemed so
important that the agencies keep them for many decades.
Koepp (CUAC) There still are concerns with items that are not getting into
the GPO distribution system, including the very special projects that may
be sitting on agency shelves and we don’t know exist because they
have never been cataloged. This is also a problem with electronic items
that never get into the system. It’s a matter of getting information
out there and sharing it. It’s a matter of discovery.
Furlough (CUAC) questioned to what extent NARA has already worked with cartographic
data in electronic format. Currently statistical data is the bulk of the
electronic data that NARA holds.
Smith (NARA) Only 4 groups of spatial files including the TIGER files are
currently in NARA’s electronic archives, possibly 5% or less of what is out
there. NARA is setting up
schedules for the transfer of files but most have not been transferred to NARA because of the
high rate of activity on the file. NARA
may wait until files are 15-20 years old before they are deposited.
Thiry (CUAC) asked Mark Flood (NFS): do you have data that you can
no longer access for any reason?
Flood (NFS) There have been problems accessing data collected 5-10 years ago
because of changes in hardware and software. This is not as much a problem
with maps yet because they have not been done electronically for a long
period of time. This problem could be coming in the near future.
Hebert (LC): Of concern to the Library of Congress is the ability to
acquire increments of improvements in cartographic output. LC is much more
global in acquisitions than NARA.
Zellmer (CUAC) When she asked federal agencies about archiving their data, the
answer was, “it is in the metadata”. They are updating files
but not including dates for updated fields in the metadata. She would like to
see a temporal GIS, with dates when a field or feature was added.
DeLost (NFS): National Forest Service is now developing feature level
metadata. For each record there will be a metadata link attached to a
particular record including a year when the field was added.
Trainor (Census): From a producer and user perspective you will end up with
more metadata than spatial data. That is something that we need to take
another look at.
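The feature-level, dated metadata discussed here lends itself to a simple temporal-snapshot model. The following is a minimal, hypothetical sketch; the class, field names, and sample data are invented for illustration and reflect no agency's actual schema:

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

# Hypothetical sketch: each feature record carries the date it was added
# and, optionally, the date it was retired, so a "snapshot" of the layer
# as of any given date can be reconstructed after the fact.

@dataclass
class Feature:
    feature_id: str
    geometry: tuple                  # simplified stand-in for real geometry
    added: date                      # when the feature entered the dataset
    retired: Optional[date] = None   # when it was superseded, if ever

def snapshot(features: List[Feature], as_of: date) -> List[Feature]:
    """Return the features that were current on the given date."""
    return [f for f in features
            if f.added <= as_of and (f.retired is None or f.retired > as_of)]

roads = [
    Feature("R1", (0.0, 0.0), date(1995, 6, 1)),
    Feature("R2", (1.0, 2.0), date(1999, 3, 15), retired=date(2001, 8, 1)),
    Feature("R3", (2.0, 1.0), date(2002, 1, 10)),
]

print([f.feature_id for f in snapshot(roads, date(2000, 1, 1))])  # ['R1', 'R2']
```

With per-feature dates recorded, any point-in-time view can be derived later, rather than depending on periodically saved copies of the whole dataset.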
Koepp (CUAC) thanked everyone for their participation and insights on the
question of preserving and archiving cartographic data.
John Hebert, Chief of the Geography and Map Division of the Library of Congress
Hebert, Chief of the Geography and Map Division of the Library of Congress,
presented the LC update again this year. His presentation focused on the
areas of acquisitions, staffing, scanning projects, general projects, the
Phillips Society and the special project this past summer.
Of significance is the acquisition of the only known copy of a 1507 map,
compiled by cartographer Martin Waldseemüller, to bear the name
“America” and the first to depict a separate Western
Hemisphere. Congress appropriated $5 million for the purchase of the map
and fund raising is still underway to secure an additional $5 million. They
have some pretty good leads for this money. There are several other items
in the packet that came from Prince Johannes Waldburg-Wolfegg in which the
library is very interested. They received from Census 130,000 sheets of
census tract materials for the 2000 Census. After September 11 there was a
great deal of interest in holdings covering Southwest
Asia. The Division put together a listing of what they hold
and have tried to fill in gaps. LC continues to receive materials produced
by the former USSR.
They have completed most of the acquisitions of Soviet produced maps at
1:200,000 scale and are now acquiring the 1:100,000 scale series
worldwide. In addition they have sought nautical charts for the Arctic and Pacific coasts. LC has received what John
believes will be the final acquisition of paper state road maps, about
20,000 sheets, and expects future receipts from state highway departments
will be digital.
The Geography and Map Division has a total of 55 employees. In the past
year they have added 5 new technicians, and currently have a posting for
two new catalogers. An assistant chief of the division and two new
reference librarians will be advertised in the near future. They are adding
one new person in the scanning and digital lab to replace one lost last
year, bringing the staff back up to four. An additional digital specialist,
a GIS person, is also being added. A new GIS initiative to create an
“on demand” service for Congress is underway. Two geographer
positions will be added for this initiative.
The Library has over 6,000 maps scanned. Cataloging is slowing the progress
with as many as one third requiring original cataloging. They hope to
recover some of the cost of the scanning and cataloging from sales of
printed copies of the maps. The Waldseemüller map was scanned last fall,
front and back. After they complete payment on the map, the question will
be what to do with the scanned copies. LC probably will look to recover
some costs by selling prints from the scanned copies and John wants it to
be available online. They are currently completing the Civil War project,
about 2,500 maps, Revolutionary War period maps, another 2000 maps, and are
working on about 3000 sheets of British produced maps from the
Revolutionary War era. New projects include scanning an early 19th century
map of Japan
which is divided into 214 sheets. Each sheet is about 5 by 5 feet. LC holds
207 sheets, 160 of which are not found anywhere else in the world.
Professor Li from Beijing
is coming to work at the Library this summer on manuscript materials.
Along with identifying and cataloging these materials they hope to scan
many of them. Scanning could be problematic since many of them are scroll
maps, some up to 60 feet long, that may take some creative work to
complete. A continuing project is acquiring maps used in the field by
soldiers and personal remembrances of those soldiers from World War II,
Vietnam, and Korea. The hope is to produce an historical record of how maps
are used in combat. Any help on locating veterans and maps would be appreciated.
LC and the National Imagery and Mapping Agency (NIMA) are now in a
cooperative cataloging project where NIMA is cataloging their set maps in
MARC format to the sheet level. A Lewis and Clark exhibit, largely maps, is
being planned with the kickoff to be in September 2003.
Philip Lee Phillips Society
The Phillips Society is the Friends of the Geography and Map Division
organization. There are currently over 200 members. This year’s
meeting is a joint meeting with the Texas Map Society in Arlington, Texas
in October. The Society publishes newsletters and occasional papers.
Last year’s summer project with five participants was a great
success. They are not planning one this year. Instead, this summer the
Library is hosting two librarians from tribal libraries in North Dakota and
Minnesota. They expect to go back to the traditional summer project next year.
LC currently does not have a project to scan the Sanborn Atlases. Bell and
Howell/Proquest developed a digital record of the black and white film but
researchers are dissatisfied because it is black and white and because the
film is not always a good copy. LC would like to scan the original color
maps but lacks the resources to digitize all the maps and lacks permission
from EDR Sanborn for those still under copyright. LC is also looking into
the possibility of using some facilities at Fort Meade.
National Archives and Records Administration
Richard H. Smith, Senior Archivist, Cartographic Unit, Special Media
Archives Services Division
Richard Smith began by recounting the history of the Cartographic and
Architectural Records Branch of the National Archives. Acquisition of
maps and charts began in the 1930’s. In the 1960’s aerial
photographs were added to the collection and in the 1970’s through
1990’s architectural and engineering plans were also added.
Currently, they have just under 2.5 million maps, just over 2.5 million
architectural and engineering drawings and 16 million aerial photographs.
Not all acquisitions are in paper copy; the Archives also has materials on
film and aperture cards. The cartographic unit has a staff of 14 who
accession, process, describe and make records available to the public in
the Public Research Room. The Research Room is open six days a week and
three evenings a week in the Archives II building in College Park,
Maryland. For more background on the Cartographic and Architectural Records
Branch refer to General Information Leaflet No. 26 (http://www.nara.gov/publications/leaflets/gil26.html).
Records, as defined by federal statute, include “all books, papers, maps,
photographs, machine readable materials, or other documentary materials,
regardless of physical form or characteristics, made or received by an
agency of the United States Government under Federal law or in connection with
the transaction of public business and preserved or appropriate for
preservation by that agency or its legitimate successor as evidence of the
organization, functions, policies, decisions, procedures, operations, or
other activities of the Government or because of the informational value of
data in them”. (44 U.S.C. Chapter 33 Section 3301). Acquisitions are
governed by records control schedules drawn up between the Archives and the
originating agency. The Archives provides records lifecycle management guidance
to all Federal agencies and conducts evaluations of Federal agency records
management practices. Items come to the Archives after active use of the
materials has diminished, the standard is about 30 years (after current
administrative need for the materials is extinguished). Occasional offers
of unique materials are made, but this is somewhat rare. Exceptions to the
30 year rule include receipt of a copy of most Federal agency maps at the
time of printing. These records series are sometimes supplemented by
annotated copies of maps and background files for published maps. Records
are stored in record groups and kept in record series. The provenance of
the materials is maintained. Appraisal and retention in the Archives is
done on a series basis, not the individual piece. Cataloging is done at the
collection, series and record group level. Rarely is any item-level
cataloging done. Storage and preservation of the collections are major priorities. To minimize
handling Archives creates reference copies in photocopy, microfilm or
photographic reproductions for especially valuable items, but generally
original maps or drawings are brought to the Research Room. A recent
example is the color 35mm film of the 1930 Census enumeration district maps
now available to accompany the 1930 census schedules released in April.
This is the first time Archives has filmed the enumeration district maps.
Paper maps are stored flat in map cases in acid-free folders, with
occasional items in Mylar sleeves. A scanning project, done
under contract with a private company, has processed about 300 maps and 100
aerial photographs so far. We should also be aware of the Center for
Electronic Records and its programs and the related Electronic Records Archive.
Government Printing Office
Betty Jones, Chief of the Depository Administration Branch
Jones, Chief of the Depository Administration Branch, presented for the
Government Printing Office (GPO). She has been in the position for less
than one year.
On Friday, March 29, 2002, President Bush nominated Bruce R. James to be
the Public Printer. Current Public Printer, Michael F. DiMario has been in
the position since 1993. The Public Printer is the head of the U.S.
Government Printing Office. In the past year GPO has hired a chief of
serials cataloging and a chief of monograph and map cataloging. They have
also hired two new catalogers and made offers to two other candidates for
cataloging positions. There are currently 14 catalogers with 6 positions
still to be filled. In addition they have hired three program analysts and
will hire an additional librarian in the Depository Administration Branch.
In the fiscal year 2002 appropriations, LPS received funding from Congress
to modernize the automated library
system. They are on the fast track to purchase a state of the art
integrated library system (ILS). The current legacy systems made it through
the Y2K transition. One persistent problem is the current systems do not
allow for the easy transfer of information from one to the other. This is a
major advantage of the ILS. GPO will be hiring a consultant to help with
the transition. Any help or advice librarians outside GPO can provide would
be greatly appreciated.
On October 12, 2001, Francis J. Buckley, Jr., Superintendent of Documents,
issued the recall of USGS Open File Report 99-248: Source-Area
Characteristics of Large Public Surface-Water Supplies in the Conterminous
United States: An Information Resource Source-Water Assessment. Mr. Buckley
explained the Policies and Procedures for Withdrawing Documents from the
FDLP in the November 15 Administrative Notes, and again March 14 in a
letter sent to all depository library directors and coordinators (the
letter was reprinted in the April 15 Administrative Notes). Since FY 1995,
the GPO has distributed 230,019 tangible product (print, microfiche, and
CD-ROM) titles to depository libraries, and recalled just 20 (16 to be
destroyed, 3 returned to the agency, 1 removed from shelves). GPO has not
been asked to withdraw any electronic publication. Several agencies have
taken electronic publications off their web sites.
Betty presented copies of the 2002 Recommended Specifications for Public
Access Workstations in Federal Depository Libraries and pointed out the
“for cartographic data use” recommendations. This draft will be
published in Administrative Notes and will supersede the recommended
specifications dated June 2001 and become requirements on October 1, 2003.
GPO provided cataloging for 4,200 maps and map products this past year from
USGS, Census Bureau, Department of Agriculture, NIMA, NOAA, CIA, and other
agencies in paper, CD, DVD, and online. GPO will continue to disseminate
maps in a tangible format whenever possible. Census tract maps for the 2000
census will not be distributed in paper because of the prohibitive cost of
production and distribution. They will be available on DVD. The Interagency
Agreement with USGS expires this fiscal year. GPO does not foresee any
major changes or any problems in renewing the Agreement.
Federal Geographic Data Committee (FGDC)
John Moeller, Staff Director
Moeller, Staff Director of the FGDC, presented at the meeting for the first
time. He primarily discussed policy; what the FGDC is, what tasks have been
assigned to it and then generally about the National Spatial Data
Infrastructure (NSDI). The FGDC is an interagency and intersectional
committee at the federal level. There are currently 17 cabinet and
executive level agencies represented, and additional agencies/organizations
are expected to become members, e.g., GPO and GSA. The FGDC has a Steering
Committee, a Coordination Group, and a FGDC Secretariat staff. FGDC is
under the leadership of the Department of the Interior. The Deputy
Secretary of the Department of the Interior is the chair and the vice-chair
is Mark Forman, OMB Associate Director for Technology and Electronic Government.
Within the Committee, there are 27 working groups or subcommittees that are
organized on thematic categories, for example, the U.S. Forest Service for
vegetation, the U.S. Fish and Wildlife Service for wetlands, and Census for
cultural and demographic issues. Working groups deal with issues that cut
across areas, such as a NARA lead working group for historical data and a
recently established working group on homeland security with NIMA and USGS
serving as co-chairs. FGDC’s primary responsibility is determining
among the participating agencies how activities for providing,
collecting, and utilizing spatial information at the federal level can be
better coordinated and to provide federal leadership for the National
Spatial Data Infrastructure. A component of this goal is also to involve
state, local and tribal governments, the academic community and the private
sector. John said that he directs the staff that supports the daily operations of the
committees. The FGDC was organized in 1990 under OMB Circular A-16, which
promotes “the coordinated use, sharing, and dissemination of
geospatial data on a national basis”. This establishes the federal
information policies for the federal government. Regarding questions about
the recent removal of some government information off the Web, he stated
that the government’s policy still is to make federal information
available at the least cost, with the widest dissemination and the fewest
restrictions possible. In spite of September 11th, that policy
has not officially changed, although its limitations have changed and
there were plans to reassess OMB Circular A-130. At this time, there will
probably be three categories of information, one being classified, another
being open public domain, and the third being restricted information based
on some criteria and protected in perpetuity in some cases and in some
cases open access after a certain amount of time. Studies have indicated
about 80% of government data has a spatial component. When managing
business processes and decision processes in the federal government,
geography can be used to better understand the entire environment. More and
more, the geospatial component to information is being perceived by people
as fundamental and we need to take opportunities for building the global
spatial data infrastructure. There are about 50 or more countries that are
either beginning to build this infrastructure or are planning to do so and
the commonalities are many. FGDC is supporting these initiatives. A new
kind of infrastructure to improve the use of geospatial resources across
the country is needed. Currently, this is operated at the federal level
under OMB Circular A-16 and Executive Order 12906.
The components of the spatial data infrastructure are:
Framework: 7 layers have been identified to provide a consistent base for
spatial location. The layers include imagery, elevation, cadastral,
transportation, government units, geodetic and hydrographic.
Metadata: an explanation or textual description of the data source. The FGDC
has a metadata standard and federal agencies are required to use this. The
expectation is that we will see greater implementation of the standard
as more and more vendors begin to put it into their tools. In
addition, there is the ISO standard that is being worked on by the ISO
Geospatial Technical Committee 211. It should be in place by the end
of the year. The federal government is committed to building a
transition from the existing FGDC metadata standards to the ISO standards.
There may eventually be one uniform standard for North America, including
Canada, the United States and Mexico.
Clearinghouse: a metadata catalog to ensure access to data that is already
available to fit a user’s needs. The catalogs are networked from country to
country; for example, the United States, Canada and Australia have been
networked. There are 26 or 27 countries that are now part of the
global NSDI clearinghouse. The clearinghouse is expected to be at
least 80-90% global in the future.
Standards and Technology: 17 standards have been endorsed through the FGDC
and another 20 or so are in some form of development by the
subcommittees and working groups. The goal is to have interoperable data
and specifications. They focus on data content and data classification.
NIMA has been a big promoter of these products. The Open GIS
Consortium is the primary organization providing guidance for the
interoperable geoprocessing technology specifications.
Geodata: the geographic data needed for community decision-making. The hope
is to use descriptors, the clearinghouse, the standards and the other tools
to make all geographic data more accessible and usable. The result
will be the opportunity of finding geodata, understanding what is in a
dataset, using more consistent terminology and definitions of the data,
and having more tools available so that we can bring them together for
decision making.
Partnerships: relationships for collaboration, sharing and policy
deliberations. These are critical: 80% of government data has a spatial
component, cadastral data is only 1-2% at the federal level while 98%
is at the local level, and only 5% of biological spatial data is
at the federal level. Thus the only way to build information
relationships is through partnerships and collaborations.
John emphasized that the National Spatial Data Infrastructure (NSDI) is being
developed for organizations to cooperatively produce and share geographic
data. He cited several examples of geospatial data products where the use
of standards has added to the understanding of the importance of
interagency cooperation. A goal of the Infrastructure is to reduce
duplication of effort among agencies and localities as well as to improve
quality, increase availability and reduce costs related to producing and
accessing geographic information.
John discussed the Geospatial One-Stop E-Government initiative, which resulted
from the government’s desire to provide services to help other
government entities, businesses and citizens to more effectively use
electronic technology. A federal OMB task force was established to
recommend profitable e-government initiatives and 24 initiatives were
selected, one of which was the Geospatial Information One-Stop. This initiative
was assigned to the Department of the Interior and the FGDC. Currently,
FGDC is working with 11 federal partner agencies plus state, local and
tribal governments. The vision of the Geospatial One-Stop is to spatially
enable the delivery of government services and to provide a place where
access to individual information and access to combined information will be
possible. The future model should provide fast, low-cost, reliable access
to geospatial data needed for government operations via a government-to-government
portal for this information. This will also facilitate the effective
alignment of roles, responsibilities and resources for
government-to-government geospatial interactions needed for vertical
missions such as homeland security. Another goal is to have multi-sector
input for standards which will create consistency in order to promote
interoperability and stimulate market development of tools. The focus of
the Geospatial One-Stop is to accelerate development and implementation of
NSDI technology, policies and standards that support “one-stop”
access. The outcome of the initiative should be an accelerated
infrastructure: better, faster, less expensive access to reliable data for
use by citizens; improved use of resources for data acquisitions and
partnerships; reduced duplication; and all E-Government initiatives
spatially enabled through data and functional services. In
summary, John stated that an important goal is to create a multi-purpose
program of procedures and technology with federal, state, local, academia,
private sector and tribal governments to provide access to an enhanced
geospatial one-stop portal that is enabled by standards and technology
interoperability tools and is not vendor specific. The data will be based
on standards and will be commercially available and technology driven so
that it can be used in a whole variety of applications enabling geographic
information use across the nation and the world. We are encouraged to
provide input and representation from our communities, to give input by
reviewing the standards and to recommend candidates to work on team
projects to help further the Geospatial One-Stop initiative.
National Forest Service
Betsy Banas, Staff Cartographer, Geospatial Services Group
Banas, National Forest Service (NFS), gave us an overview of the
Service’s mapping history, mapping programs, and digital mapping program.
Betsy began by noting the similarities between the mission statement of
CUAC and that of the Forest Service. The Forest Service mission statement
is “caring for the land and serving the people”. Gifford
Pinchot was the first Forest Service chief and the mission statement then
was to “provide the greatest amount of good for the greatest amount of
people in the long run”. She noted the philosophical differences
between Gifford Pinchot and John Muir in establishing
“reserves” vs “preserves”.
The Forest Service was created in 1905 to provide quality water and timber for
the Nation’s benefit. It originally had 60 forest reserves covering
56 million acres; now, it has 155 forests and grasslands covering 191
million acres. The Service is very decentralized, having 9 Regions, 1
through 10. Region 7 was absorbed into Regions 8 and 9 long ago. At the
time that the Forest Service was organized, it was deliberately
decentralized, as it was decided that decision makers needed to be right
there, “on the ground” as they were most familiar with the
public’s needs at the local level.
The Forest Service is the largest forestry research organization in the world,
having 20 research and experimental forests and other special areas. It
also provides technical and financial assistance to state and private
forestry. Over the years, the public has expanded the list of what they want from national
forests and grasslands. Congress responded by directing the Forest Service
to manage national forests for additional multiple uses and benefits as
well as for the sustained yield of renewable resources such as water,
forage, wildlife, wood, and recreation. Multiple use means managing
resources under the best combination of uses to benefit the American people
while ensuring the productivity of the land and protecting the quality of
the environment. The Service’s mapping and geospatial data programs have helped meet the Forest Service
mission by aiding in fire management, forest planning, forest health
protection, watershed restoration, ecosystem management and sustainability
of our resources, and recreation. Initially mapping was done at the local
level and it was a vital part of administering the land. The maps were made
to the specifications and requirements of the particular forest. There was
little standardization or consistency among Regions.
This changed during World War II. There was an effort to consolidate mapping for
defense purposes. The Forest Service, at the time, had the equipment and
expertise. During the War, NFS map programs worked out of Gettysburg,
Pennsylvania, mapped areas of the U.S. along the Pacific Coast, and aided
in making detailed maps of Japan.
Through the late 1960’s, regular Forest Service mapping business continued to
be decentralized and non-standardized. But mapping technology began to
change; new, costly equipment, computers, etc. required the centralizing of
mapping operations. The Geospatial Service and Technology Center (GSTC) was
founded in 1975 (then called Geometronics Service Center) and is located in
Salt Lake City, Utah. Its intent was to bring together the skills and
resources needed to build and maintain a standardized base mapping program.
The Center’s program has since expanded beyond base mapping. The
Remote Sensing Application Center (RSAC) is co-located with GSTC in Salt
Lake City. It provides technical support in evaluating and developing
remote sensing, image processing, and how it relates to geospatial
technologies throughout the Forest Service. It also provides project
support and assistance with using remote sensing technologies, and
technology transfer and training.
The Geospatial Service and Technology Center is more than maps. It provides
geospatial services, data, training and awareness. These services and
products support core Forest Service business needs including forest
planning, watershed restoration, resources inventory, and transportation
management. While NFS has a national program and centralized geospatial
service and tech center in Salt Lake City, many mapping activities continue
in the Regions. The Forest Service is developing a clearinghouse, which will
be an FGDC and NSDI node. This will eventually provide all Forest Service
geospatial data, with FGDC compliant metadata. Hopefully by September of
this year, that node will be active.
The Primary Base Series (PBS) maps of NFS have a scale of 1:24,000. They
are topographic maps, used as an administrative product. The Forest Service
started production in 1992 of the Single Edition Quad maps when they
entered into an agreement with USGS. The Primary Base maps are produced by
the Forest Service to USGS standards. This agreement has eliminated
duplicative efforts. The maps are revised sooner with partnerships than
without partnerships, and show Forest Service data. USGS prints and
distributes the maps for the Forest Service. The Forest Service is
responsible for about 12,500 of the 55,000+ topographic sheets produced of
the United States. They are mapping at a rate of 600 per year.
The Secondary Base Series is at a scale of ½ inch to the mile
(1:126,720). The cartographic work is performed at GSTC. The base map is
forwarded to Region/Forest where it is enhanced with photos, transportation
guides and visitor information to become the standard Forest Visitor Map.
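The ½-inch-to-the-mile figure and the 1:126,720 representative fraction are the same scale expressed two ways; a quick calculation (a sketch using only standard unit conversions, not anything from the Forest Service itself) confirms the denominator:

```python
# One mile is 5,280 feet, i.e. 63,360 inches of ground distance.
INCHES_PER_MILE = 5280 * 12  # 63,360

# Half an inch on the map represents one mile on the ground.
map_inches = 0.5
ground_inches = 1 * INCHES_PER_MILE

# A representative fraction is ground distance divided by map distance.
scale_denominator = ground_inches / map_inches
print(f"1:{scale_denominator:,.0f}")  # prints 1:126,720
```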
Visitor Maps are being distributed by USGS through a relatively new
agreement. Previously the maps were only available at Forest Visitor
Centers. The new agreement provides for the sale of Forest Visitor Maps
through a USGS vendor network, and provides customers with one stop
shopping. The maps are available to vendors at volume discounts. This
partnership has increased customer service. The maps are still also
available at Forest Visitor Centers, Forest Supervisor and District Ranger
Offices and can also be ordered from the various Forest Service websites
–but only USGS provides the one stop shopping capability that vendors
like, because they receive a discount and can stock a variety of maps on
hand. Other Forest Service maps include: wilderness area maps, wild and scenic rivers
maps, “Pocket Guides,” “Guide to Your National
Forest,” and other specialty products, which are listed on the FSWEB site. Other
collaborative efforts include www.recreation.gov.
This interagency initiative provides web-served recreation information to the
public. It cuts across government boundaries. The Outdoors America Map is a
guide to recreation opportunities on Federal Lands; 11 Federal Agencies are
involved. The Forest Service is represented as a voting member on the U.S.
Board on Geographic Names. Forest Service is responsible for their areas in
the updating and maintenance of the Geographic Names Information System.
The Forest Service is adding information to the National Atlas of the
United States. There are other exchanges with USGS including Digital
Elevation Models (DEMs), Digital Orthophoto Quads (DOQs), and the National
Map. The Forest Service is working in Lake Tahoe Basin Management Unit on a
pilot of the National Map.
Geospatial Advisory Committee (GAC) Activities
The Forest Service is participating in FGDC (Federal Geographic Data
Committee). FGDC is trying to create Geospatial One Stop and I-Teams (which
have to do with data sharing at the local level). John Moeller (who also
spoke at CUAC) is FGDC Secretariat Staff Director and Project manager for
Geospatial OneStop. NFS has taken the lead of the FGDC Vegetation
Subcommittee. Vegetation Subcommittee activity had languished –
initially a lot of effort had been put into trying to develop a vegetation
data standard. No consensus on the elements of the standard could ever be
reached, within NFS or among agencies on the subcommittee, so it stalled
out. Alison Hill is the new chair, and the Committee is reinvigorated. NFS is
the Co-Lead for Sustainable Forest Data Subcommittee, active on Homeland
Security Working Group, and Imagery and Remote Sensing Task Force.
The Geospatial Advisory Committee (GAC) was formed in 1999 to address the
advancement of Forest Service geospatial data technologies. The geospatial community
recognized the need to direct and coordinate geospatial data activity. GAC
promotes awareness of geospatial data throughout Forest Service, and
advises the Geospatial Executive Board (GEB). Its roles and
responsibilities are to identify, monitor, and address issues regarding the
state of NFS geospatial programs and activities. It also develops and makes
recommendations concerning geospatial program execution to the Geospatial
Executive Board. GAC communicates progress to NFS geospatial community and
others. GAC emphasis areas are 1) standardized GIS data, 2) natural
resource applications coordination, 3) geospatial training and awareness,
4) coordinate and share standardized GIS data, 5) cartographic publishing,
and 6) technology architecture coordination. GAC’s goals are to
ensure that NFS geospatial policies and programs are compatible and integrated,
and to ensure programs are responsive to NFS business needs.
Bureau of the Census
Tim Trainor, Chief, Cartographic Operations Branch
Trainor began by discussing a couple of the Census Bureau’s
Geographic programs. The fifty State Data Centers (SDCs) participated in
the Public Use Microdata Area (PUMA) Delineation Program. Tim spoke at some
length about the Urbanized Area Delineation program, which culminated with
a Federal Register notice on May 1, 2002 (67 FR 21961) listing the 466
areas defined as Urbanized Areas (UA) for Census 2000 (up from 405 in
1990). General criteria are that there must be a density of 500 people per
square mile and a minimum population of 50,000. There is no grandfathering
of urbanized areas: Cumberland, MD, which qualified in 1990, was dropped
from the UA list. The more important detail is that the category has been
expanded to include “urban clusters”, with urbanized areas
and urban clusters totaling 3,638 qualifying areas, so more areas will have
data available. The smaller “Urban Cluster” (UC) is defined for
areas of sufficient density from 2,500 to 50,000 inhabitants plus other
characteristics. Detailed definitions and discussion of UA’s and
UC’s may be found in a Federal Register announcement of March 15,
2002 (67 FR 11663). Undevelopable areas adjacent to or within UAs (e.g.,
floodplains along a river) are now diplomatically being
called “exempt” rather than “undevelopable.” And of
course, all of this information is available on the web.
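The population thresholds quoted above can be sketched as a simple classifier. This is only an illustration of the figures in the minutes: the actual Census 2000 criteria (67 FR 11663) involve block-level density cores, hops, jumps, and exempt territory, and the `meets_density` flag here is a hypothetical stand-in for all of that:

```python
def classify_urban(population: int, meets_density: bool) -> str:
    """Illustrative sketch of the Census 2000 urban categories.

    `meets_density` stands in for the detailed density criteria
    (e.g., 500 people per square mile); the official rules are far
    more involved than shown here.
    """
    if not meets_density or population < 2500:
        return "not urban"
    if population >= 50000:
        return "Urbanized Area (UA)"
    return "Urban Cluster (UC)"  # 2,500 up to 50,000 inhabitants

print(classify_urban(120_000, True))  # Urbanized Area (UA)
print(classify_urban(8_000, True))    # Urban Cluster (UC)
print(classify_urban(1_200, True))    # not urban
```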
Tim reviewed several of the geographic products from Census. Some of these
involve Zip Code Tabulation Areas (ZCTAs), in which each Census block is
assigned a single Zip Code. This constructed geography will result in
various special boundary files and tabulations. The TIGER 2002 files, which
use 2000 geography, will be available soon on the web. Probably, at some
point there will be maps but specifications have not yet been finalized.
TIGER/Line files, based on Census 2000 Geography will be available to
download by the end of this week. Based on Census 2000, many redistricting
activities are underway in the states.
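The ZCTA construction described above, in which each Census block receives a single Zip Code, can be illustrated as a majority vote over a block's addresses. This is a simplified sketch of the idea only; the Bureau's actual procedure has tie-breaking and unaddressed-block rules not modeled here, and the function name is hypothetical:

```python
from collections import Counter
from typing import Optional

def assign_block_zcta(address_zips) -> Optional[str]:
    """Assign a block the ZIP code most common among its addresses.

    A simplified sketch of ZCTA construction; real processing handles
    ties, water, and unaddressed blocks with rules not shown here.
    """
    if not address_zips:
        return None  # unaddressed blocks are handled separately in practice
    return Counter(address_zips).most_common(1)[0][0]

print(assign_block_zcta(["66045", "66045", "66044"]))  # 66045
```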
Census maps, mostly in PDF format, are available on the Internet. These are also
available on DVD (CDs are used only if the files total less than 650
megabytes) and as on-demand plotted maps. Recommended specifications for
plotters are on the web site. Tim has a national map showing locations of
the State Data Centers; it is used internally but possibly could be made
available. It is constantly changing and shows all of the different kinds of
state data centers, in terms of their classifications. Census 2000 block
maps for every community in the country have been produced. They include
the 130,000 map sheets John Hebert referred to as recently accessioned at
LC Geography and Map Division. Census has produced an additional 280,000
sheets that are block maps for geographic levels above census tracts, such
as places and county subdivisions.
For legal governments, maps have been sent to the entity’s highest
elected official and currently are available on the web. Six DVDs will be
manufactured shortly that include regions of states. Unlike the 1990 county
block maps, users can access a town or city of choice without having to
acquire all of the maps for a county. Census tract outline maps are
available on one DVD and American Indian/Alaskan Native Areas and Hawaiian
home land block maps are available on one CD-ROM.
boundary files are available on the web for most levels of geography in
several popular ESRI formats: Arc/Info exports (.e00), ArcView shapefile
(.shp), and Arc/Info ASCII format. Census 2000 boundary files are available
in both high resolution and low resolution versions. They are re-doing the
1990 files so that nested geographies share the same points.
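The formats listed above are published ESRI specifications. As one concrete example, a shapefile (.shp) begins with a fixed 100-byte header mixing big- and little-endian fields; the sketch below builds a synthetic header and parses it with only the standard library. Field offsets follow the published ESRI Shapefile Technical Description, but this is a minimal illustration, not a full reader:

```python
import struct

def parse_shp_header(buf: bytes) -> dict:
    """Parse the fixed 100-byte header of an ESRI shapefile (.shp)."""
    file_code, = struct.unpack_from(">i", buf, 0)     # big-endian; always 9994
    file_length, = struct.unpack_from(">i", buf, 24)  # big-endian; in 16-bit words
    version, shape_type = struct.unpack_from("<ii", buf, 28)  # little-endian
    xmin, ymin, xmax, ymax = struct.unpack_from("<4d", buf, 36)
    return {"file_code": file_code, "file_length": file_length,
            "version": version, "shape_type": shape_type,
            "bbox": (xmin, ymin, xmax, ymax)}

# Synthetic header: file code 9994, length 50 words, version 1000,
# shape type 5 (polygon), rough conterminous-U.S. bounding box.
header = (struct.pack(">7i", 9994, 0, 0, 0, 0, 0, 50)
          + struct.pack("<ii", 1000, 5)
          + struct.pack("<8d", -125.0, 24.0, -66.9, 49.4, 0, 0, 0, 0))
print(parse_shp_header(header)["shape_type"])  # 5
```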
As a result of user input, more printed reports than originally planned will be
generated. County outline and subdivision outline maps will be produced.
Page-sized county maps, by state, will be done by the end of summer.
Metropolitan Areas will be redefined in 2003 based on new criteria.
The Bureau is still producing thematic maps. One recent map shows the center of
population for each state. Another is the famous “nighttime”
map, where white “light” on a dark background indicates
population distribution, which recently had the biggest press run in Census
history, of 1,500,000 sheets. Five copies were sent to every school in America.
They are planning a 108th Congressional District Atlas for next year and
have released a Census 2000 atlas based on the first seven questions of
the census, marking the 100th anniversary of Census as an agency.
The Bureau realizes the acute need for modernization of its Master Address File
(MAF) and the entire TIGER system. TIGER is old and technology has advanced
significantly since being developed. (Most people don’t know that
Census still maintains the files in an internal format, not the ASCII
format that it distributes.) Everyone knows that the positional accuracy of
the boundaries is poor, and Census wants to move beyond relative accuracy
and to true positional accuracy. One reason this will be imperative is that
TIGER will form the transportation layer of The National Map. Updating
can’t wait: there are sixty-five committees already looking at Census
2010 planning, and to maintain the geographic standards of the ongoing
American Community Survey, MAF and TIGER must stay updated and be improved.
The goal is to get an enumerator to a housing unit 100% of the time.
There are many partnerships with other agencies and organizations. Census maintains
boundaries for most local governments on an annual basis.
MAF/TIGER modernization is focusing on three important projects. One is to
acquire existing GIS files where they exist. Of the roughly 3,000 counties,
about 1,000 have GIS files, and of those only a small number have really good GIS files.
Census is evaluating that currently. A second strategy is to have
contractors look at commercial sources that are available and can be used
without restriction in the public domain. A third alternative is to use
imagery where the previous two options are not possible as a means to
improve and maintain the spatial data.
Geological Survey
Dan Cavanaugh, Chief, Branch of Program Development
Cavanaugh, of the US Geological Survey (USGS), gave an update that focused on three
themes: new products (especially published maps), the National Atlas, and the
National Map. USGS has released several maps that are different than they generally
produce. They include a map of Lake Tahoe showing underground structure,
and a Tapestry of Time and Terrain which depicts geology and physiography.
There is also a new map of New England showing earthquakes between 1638 and
1998 (I-2737), which proved particularly timely given the recent earthquake
there. Another recently published map, titled Geographic Face of the Nation
– Land Cover, developed from the National Land Cover Data (NLCD), was
jointly produced by USGS and the Environmental Protection Agency. A new
relief map will be released similar to the Thelin & Pike map (late
70’s, early 80’s) titled Geographic Face of the Nation –
Elevation. The new map will have fewer data artifacts than the previous
one. USGS is continuing to forge partnerships, especially with the Forest Service. USGS
Map Dealers (about 2000 of them) are now distributing Forest Service maps.
Their goal is to distribute Forest Service maps for all 9 Forest Service
regions. The map distributors are pleased about being able to obtain maps
from one source (USGS), rather than having to deal with multiple agencies
and regions. The USGS has also entered into partnerships with other
agencies, such as the Library of Congress. This partnership has resulted in
reproduction of an 1894 map of Colorado. It is available from USGS (see
http://rockyweb.cr.usgs.gov/historicmaps/historicmapsfromlca.html for more
information). USGS is working with the National Park Service to produce
geologic maps of the National Parks. They also continue to distribute
National Imagery and Mapping Agency (NIMA) products. About 90-95% of the
NIMA products that were available before September 11 are still available.
Among the most popular products at USGS are the booklets, such as the
General Interest Publications, which are available for free. Dan indicated
that just prior to our meeting, the Director of the Survey announced that
the USGS will be getting out of retail sales (at the ESIC) by FY2004. It is
uncertain if that is the beginning or end of FY04. Over the counter retail
sales may cease at other USGS locations as well, and is probably a year or
two away. A question was asked if there are other ESIC offices to be
closed. Dan indicated that the Washington DC ESIC in Main Interior had
closed this year due to budget cuts, and that the Spokane ESIC was closed
last year due to budget cuts. Remaining ESIC offices include Reston, Menlo
Park, Denver, Anchorage, Rolla, and Sioux Falls, SD.
Dan was asked about the recently published maps of Utah and Colorado that came
through FDLP. They are not a “national program”. These maps
were produced from the National Elevation Dataset by the Rocky Mountain
Mapping Center, and are similar to the one of Pennsylvania that was issued
several years ago. They will not be issued for the entire United States
unless funding is made available. Dan was also asked if there were plans to
revise or update Maps for America. The response was no, due to lack of
funding. The National Atlas continues to be one of the Geological Survey’s
most popular web sites. It is a cooperative venture between 21 partners and
ESRI. There are presently 420 map layers available on the National Atlas
web site. People can use it to make and print their own map. It also
includes internal links to other web sites. For example, when a user clicks
on a National Park, they are linked to sites with information on that park.
The National Atlas web site receives 4.6 million hits per month, and links
to 1900 other web sites. A new map is drawn every 1.5 seconds. Over 350,000
map layers have been downloaded from the site.
Using the National Atlas, the USGS has been able to produce hard copy products,
such as the Federal and Indian Lands map, the elevation map of North
America, the Forest Cover map, (produced with data from many Federal
agencies), the Presidential Elections map, which includes insets showing
the results of all Presidential elections since 1789, and the General
Reference map, showing roads and county boundaries. This map will be
revised to show Alaska at the same scale as the lower 48 states (in other
words, one will be able to compare the land masses against each other) and
re-released. The National Atlas is viewed by some people as a small scale
version of the more detailed National Map.
The National Mapping Division is now the Geography Discipline. The National
Map is everything that the National Mapping Division used to do. There used
to be three organizations under the National Mapping Division. They were
Map and Data Collection, Earth Science Information Management and Delivery,
and Research. They are now known as Cooperative Topographic Mapping, Land
Remote Sensing dealing with Landsat, and Geographic Analysis and Monitoring
which equates to the research area.
The primary activity of the Geography Discipline is to compile the base
data for the National Map. The vision of the National Map is to develop a
current, continually revised, seamless, complete, consistent product that
will reflect geographic reality, have positional and logical consistency,
and have no cartographic offsets. It will be a temporal record, with
metadata for both the data set and the features within it. The National Map
will address 5 needs, to Map, Monitor, Understand, Model and Predict. The
7.5 minute topographic map is probably the USGS’ most famous product.
It is the only U.S. cartographic product that is comprehensive,
trans-jurisdictional and border-to-border and coast-to-coast. Compiling it
was an immense engineering feat that would cost over $2,000,000,000 to
replicate today. On average, the topographic map is 23 years old. USGS
is finding that it cannot keep up with currency. Base data, such as
aerial photographs, often show features that topographic maps do not.
Because topographic information has a variety of uses (scientific studies,
planning, decision making, land and resource management, delivery of
government services, economic activities, natural disaster relief, homeland
defense), it will be the base of the National Map. There is presently some
duplication of effort among and between geographic information sectors
(federal, state and local governments and the private sector). Cooperation
between these sectors (Cooperative Topographic Mapping) will provide the
base information needed for the National Map. Partnerships will be built to
develop the base data, which will be accessible via the web 24 hours a day.
Users will be able to specify the data and area of interest and print their
map on demand. Cooperative Topographic Mapping will include activities such
as acquiring, archiving, and disseminating base geographic data,
maintaining and providing derivative products, including topographic maps,
and conducting research to improve data collection, maintenance, access,
and applications capabilities. The core data, which will include themes
such as orthophotography, elevation, transportation, hydrography,
structures, boundaries, geographic names and land cover, will be public
domain, either collected by government agencies or made available through
licensing agreements. Links to other data with higher resolution, enriched
content and additional attributes will be available. These links may be to
licensed data. This means that USGS’ role will be changing from data
producer to organizer responsible for awareness, availability, and utility.
USGS will be the catalyst and collaborator for creating and stimulating
data partnerships, a partner in standards development, and an integrator of
data from other participants. When no other source of data exists, USGS
will produce and own the data. There will be a temporal component or
versioning, but the details have not been worked out yet. Data will be accessible
24 hours a day and will be in the public domain.
National Atlas is an example of a small-scale implementation of the
National Map. It has been developed through partnerships. USGS has
integrated the content so that it is consistent nationwide. They have also
developed the metadata and provided web access. USGS offers derivative
products, such as the data layers and printed National Atlas maps.
There are currently 7 National Map pilot projects underway in the United States
(see http://nationalmap.usgs.gov/nmpilots.html for more information). The one
in Delaware is currently the most complete and went live April 18 (URL: http://www.datamil.udel.edu/nationalmappilot).
The events of September 11 illustrate the urgency for geospatial data and
the National Map. September 11 has shown us that data must exist before,
during and after an event, be readily accessible, and that partnerships
among state, local, and federal agencies and the private sector are
required. The events have illustrated that cartographic information is a
national infrastructure, just like the Interstate Highway System. As a
result of September 11, there is an emphasis to compile information,
including high-resolution color imagery, high accuracy elevation data and
critical infrastructure, for 120 major metropolitan areas in the United
States. NIMA and other Federal agencies are partnering in this effort.
Links with state and local agencies and “first responders” are
also being developed.
National Imagery and Mapping Agency
Jim Lusby, NIMA Staff Officer, Disclosure and Release Division, Office of
International & Policy
Lusby represented the National Imagery and Mapping Agency (NIMA) and provided
an overview of the policy of Limited Distribution Products (LIMDIS) and an
update on the distribution of Shuttle Radar Topography Mission data. NIMA
has authority under U.S. law, Title 10, to restrict distribution of
cartographic data if it is required to do so under international
agreements, if disclosure would reveal sensitive methods for obtaining the
data, or if disclosure would interfere with military or intelligence
operations. Officially, Limited Distribution (LIMDIS) is a caveat, not a
security classification such as “Classified” or
“Secret.” It is still enforceable under law. Roughly 35% of
NIMA’s products fall under the LIMDIS category.
NIMA has 80,000 different line items, and of those, 30,000 are limited distribution.
20,000 are foreign produced, and NIMA works in cooperation with the foreign
producers. NIMA has worked to arrange exceptions to LIMDIS for academics and government
agencies for an expressly noted purpose, e.g., to support disaster relief
operations. Unauthorized re-distribution of LIMDIS data in such situations
can result in agencies or contractors losing their ability to obtain future
exemptions. Most requests for exemption require the agreement of a third
party, such as the foreign agency responsible for supplying the data. NIMA
evaluates all requests on a case by case basis, and tries to balance
benefits and risks of exemptions.
NIMA also assists foreign countries with information in times of need. Jim
mentioned NIMA and USGS efforts in assisting Honduras, Nicaragua, and El
Salvador during “Hurricane Mitch”. They are partnering with
USGS, Census, Forest Service, and others.
Making NIMA products available to other government agencies can be a lengthy
process. Criteria for approval of release are based on the desired geographic
location, the intended use, and the justification for needing the material. NIMA is
working to make the process smoother by spelling out conditions of release
during the initial data collection process with third parties, taking some
internal steps to formalize LIMDIS policies and procedures, and by
highlighting the issue to NIMA customers in forums such as CUAC. Is there
greater risk in giving this product to someone to satisfy their need? Are
there other sources that will work? If this is the only source, what kind
of risk will have to be weighed? What is the derived product coming out of
it?
There are many multinational projects underway. NIMA works with
“disclosure” or “release” restrictions. Disclosure
means someone can look at a product and walk away; release means the
product can actually be given to them. NIMA is trying to obtain more
“disclosure” than “release” situations in working
with partners. Limited distribution is a caveat that restricts anyone from
using a product unless NIMA gives approval. Official use only means that
you need that product for planning and you will use it only for that purpose.
Some products will become more easily available, others less so. NIMA will
be working on updating their Memoranda of Understanding
(MOUs). They are trying to reduce the amount of LIMDIS information
or make it classified, and to get out of the gray area.
Wil Danielson from GPO asked Jim about maps received at GPO for FDLP cataloging
that were marked with the LIMDIS caveat. Jim said that GPO/FDLP were indeed
supposed to receive such items as they had been declassified. Jim explained
that after printed materials are marked LIMDIS at the printer, a new press
run cannot be done to remove the LIMDIS caveat. Instead, that marking is
supposed to be removed or obliterated by the distributor.
Jim presented a revised schedule for release of the Shuttle Radar
Topography Mission (SRTM) data products. This is the digital terrain data
that librarians are hoping for. Alaska is not well represented. The
schedule had fallen behind after September 11, and Jim cautioned that it
was subject to further change. Production of data for North and South America
is expected to be complete by summer 2002, but distribution schedules and
methods have not been determined. Under a joint agreement, USGS, through
the EROS Data Center, will be the data holder for the public. Public release data
will vary in resolution, depending upon geographic area. USA data will be
level 2 (30 meter resolution), non-USA areas will be level 1 (roughly 90
meters). By 2004, everything should be completed: elevation data for the
world, with all the products done. It will be much better than anything they
have had in the past, and they are using additional information from others.
1,000-meter resolution data is available now.
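The 30- and 90-meter figures are the nominal ground spacing of SRTM's 1- and 3-arc-second postings (the "level 2" and "level 1" naming follows the minutes). A back-of-the-envelope conversion, sketched here assuming a spherical Earth of mean radius 6,371 km, recovers them:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean radius; a common spherical approximation

def arcsec_to_meters(arcsec: float, latitude_deg: float = 0.0) -> float:
    """Approximate ground distance spanned by `arcsec` arc-seconds of
    longitude at a given latitude (the spacing shrinks toward the poles)."""
    circumference = 2 * math.pi * EARTH_RADIUS_M * math.cos(math.radians(latitude_deg))
    return circumference * arcsec / (360 * 3600)

print(round(arcsec_to_meters(1)))  # ~31 m at the equator ("level 2", 30 m nominal)
print(round(arcsec_to_meters(3)))  # ~93 m ("level 1", roughly 90 m)
```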
National Ocean Service - NOAA
Howard Danley, Deputy Chief of the Navigation Services Division
NOAA has 1,037 paper charts for sale through the Distribution Division of the Federal
Aviation Administration’s National Aeronautical Charting Office. The
National Aeronautical Charting Office also does the printing of the
nautical charts. These are available through the FDLP. A private company,
Maptech, sells raster images of the charts. On the web at maptech.com,
thumbnails at 90 dots to the inch are available using MrSID compression.
There is great interest by graduate students in shoreline movement over the years,
terrain, ports, and features. For the last four to five years, a selection
of historical charts from the late 1800s to about 10 years in the past has
been available on the NOAA web page. In cleaning out the warehouse, they
discovered historical charts and scanned them. They can be downloaded.
MrSID made this possible. These include hydrographic surveys. One can use
“mapfinder” on the website: http://mapfinder.nos.noaa.gov/ to find
hydrographic surveys over time.
The Coast Pilot is a supplement to the nautical charts. From the early to
mid-1800s, this was a private publication. In the mid-1800s, the Coast Survey
purchased the publication. NOAA has contracted with a company in
Beltsville, MD to scan the Coast Pilots starting with the oldest, a 1776
publication by the British Admiralty. These images will be placed on the
Web, linked through the NOAA library. These online Coast Pilots will be
searchable by chapter with an index in the back. Some of the older Coast
Pilots have foldouts that are causing problems with scanning, because NOAA
does not want the bindings affected. Funding has been provided for about
one-half of the project. Additional funding will be sought next year to
finish it. NOAA will be continuing to place electronic nautical charts on the Web in a
vector format. There are about 150 charts with a browser available. They
can be downloaded. They will be different from the printed charts; the
symbology and detail are different. Current coast pilots are available on
the web and can be downloaded. Electronic charts and Coast Pilots are
considered “provisional” because they are not updated for
navigation. These images have increased sales. Distances between Ports will
go up on the web too.
Since September 11, NOAA has taken airflows, ship schedules and names from its
web site, but decided to leave nautical data, as it can be obtained
elsewhere. Questions about potential web products included: the early edition nautical charts of
Alaska that had been classified because of the Distant Early Warning (DEW)
sites; and the historical t-sheets. The T-sheets (topographic) date back to
the mid-1800’s and contain a tremendous amount of information
including land use, land ownership, and place names. National Archives
holds the t-sheet photographic negatives and the originals.
Paper charts will be around for an indefinite time, especially for the recreation
community. For large vessels, there will be a requirement for backup, in
whatever form. The print on demand program is still alive but going slowly.
There are 876 charts of the 1,037 available through print on demand. The
number of print on demand agents is now 40. 17,000 copies of charts were
sold through print on demand last year.
Fish and Wildlife Service
Doug Vandegraft, Chief Cartographer
Doug Vandegraft is the chief cartographer at Fish and Wildlife. The Fish and
Wildlife Service (F&WS) has seven regional offices and about 25
cartographers throughout the United States.
Over the last year, his office has worked on digitizing the boundaries of the 538
wildlife refuges. They are three-quarters completed. Doug noted that 85% of
refuge acreage is located in the state of Alaska.
In addition, they are working on a digital land status layer indicating
F&WS land ownership. In other words, what lands they own within the
wildlife refuges. They are always trying to acquire land to protect
critters. Refuge boundaries are approved acquisition boundaries and within
that boundary, they have decided that the habitat is worth saving.
Wildlife refuges date back to 1903, but the F&WS was not created until 1940. The Bureau
of Biological Surveys was the first agency to manage wildlife refuges and
in 1936, developed a template of what refuge maps should look like. They
are still using the same format, but in 1980 ANILCA added 100 million acres
in Alaska, and the format no longer worked well. The F&WS is experimenting with
new ways of depicting wildlife refuge land status using the digital raster
graphics (DRG’s) and digital orthophotoquads (DOQ’s). F&WS
has new refuges in the South Pacific and the agency is producing new maps
of those areas. Doug indicated that they are currently working with USGS on
a new refuge map to commemorate their Centennial. Alaska will be at the
same scale as the lower 48.
The Yukon Delta refuge alone includes 26 million acres. F&WS has scanned about
500 of the original land status maps dating back to the 1920’s.
Originals will go to National Archives. Refuge boundaries are available on
the web and they may be downloaded. It is important to recognize that there
may be private in-holdings within the refuge boundaries depicted.
Work continues on the Real Property Database. The database provides information
on tracts of lands owned by F&WS including price paid, parcel size,
name of former owner, and additional information. Some information is not
available due to its sensitivity. They are currently working on linking
refuge boundaries to this database, which will be displayed in a web-based
map-server environment. Ideally, there will be a photograph for each
refuge. Doug indicated that the most important component of geographic
information systems is the query capability. He provided some demo examples
of how F&WS is hoping to use GIS with the Real Property Database. Doug
is working on securing funding to pursue this project.