Submission for OMB Review; Comment Request, 59719-59720 [E6-16728]
[Federal Register Volume 71, Number 196 (Wednesday, October 11, 2006)]
[Notices]
[Pages 59719-59720]
From the Federal Register Online via the Government Printing Office [www.gpo.gov]
[FR Doc No: E6-16728]
=======================================================================
-----------------------------------------------------------------------
DEPARTMENT OF COMMERCE
Submission for OMB Review; Comment Request
The Department of Commerce (DOC) will submit to the Office of Management and Budget (OMB) for
clearance the following proposal for collection of information under
the provisions of the Paperwork Reduction Act (44 U.S.C. chapter 35).
Agency: U.S. Census Bureau.
Title: American Community Survey, 2007 Methods Panel.
Form Number(s): ACS-1(2005), ACS-1(X)Seq, ACS-1(X)Pro.
Agency Approval Number: None.
Type of Request: New collection.
Burden: 46,000 hours.
Number of Respondents: Postage Test--20,000; Grid vs. Sequential
Test--40,000; Degree Test Reinterview--32,000.
Avg. Hours per Response: Questionnaires--38 minutes; Reinterview--
15 minutes.
Needs and Uses: The U.S. Census Bureau requests authorization from
the Office of Management and Budget (OMB) to conduct the American
Community Survey 2007 Methods Panel tests.
Given the rapid demographic changes experienced in recent years and
the strong expectation that such changes will continue and accelerate,
the once-a-decade data collection approach of a census is no longer
acceptable as a source for the housing and socio-economic data
collected on the census long-form. To meet the needs and expectations
of the country, the Census Bureau developed the American Community
Survey (ACS). This survey collects detailed socio-economic data every
month and provides tabulations of these data on a yearly basis. The ACS
allows the Census Bureau to provide more timely and relevant housing
and socio-economic data while also reducing operational risks in the
census by eliminating the long-form historically given to one in every
six addresses.
    Full implementation of the ACS includes an annual sample of approximately
three million residential addresses in the 50 states and the District of
Columbia, plus another 36,000 addresses in Puerto Rico. A sample this large
allows for annual production and release of single-year estimates for areas
with a population of 65,000 or more. Lower levels of geography require
aggregates of three or five years' worth of data in order to produce estimates
of reliability comparable to the census long-form. However, an ongoing data
collection effort with an annual sample of this magnitude requires that the
ACS continue to research possible methods for maintaining, if not reducing,
data collection costs. If costs increase, the ACS would have to consider
reductions in sample size, thereby reducing the reliability of the data
relative to the census long-form, especially at lower levels of geography.
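    To make the reliability trade-off concrete, the sketch below shows how
pooling three or five years of sample roughly multiplies the effective sample
size for a small area and shrinks the standard error of an estimated
proportion by about the square root of the number of years pooled. This is a
hypothetical illustration only; the interview count and proportion used are
assumptions, not ACS design values.

```python
import math

# Hypothetical illustration: pooling k years of interviews multiplies the
# effective sample size for a small area by roughly k, so the standard error
# of an estimated proportion shrinks by about sqrt(k). The figures below are
# assumptions for illustration only, not ACS design values.

ANNUAL_INTERVIEWS_IN_AREA = 40   # assumed completed interviews per year in one small area
P = 0.30                         # assumed population proportion being estimated

def standard_error(p: float, n: float) -> float:
    """Simple-random-sampling standard error of a proportion (ignores design effects)."""
    return math.sqrt(p * (1.0 - p) / n)

for years in (1, 3, 5):
    n = ANNUAL_INTERVIEWS_IN_AREA * years
    print(f"{years}-year aggregate: n = {n:3d}, SE = {standard_error(P, n):.3f}")
```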
    One of the tests included in the 2007 Methods Panel addresses a method for
potentially reducing data collection costs. In this test, we will implement
the same mailing strategy as ACS production: each sampled address will receive
a prenotice letter, an initial questionnaire (ACS-1(2005)) packet, and a
reminder postcard, and those who have not responded by a certain date will
receive a second questionnaire packet. For this test, however, we will send
the prenotice letter using standard postage; current ACS production procedures
send all mail pieces at the first-class postage rate. Using standard postage
rather than first-class postage for this mail piece could save the ACS
approximately $230,000 in data collection costs each year. The test will
evaluate whether using standard postage for the prenotice letter affects mail
response rates.
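    One way such a comparison of mail response rates might be read is sketched
below with a two-proportion z-test. The response counts, group sizes, and
significance threshold are hypothetical placeholders for illustration, not
figures from this notice or the test design.

```python
import math

# Hypothetical analysis sketch for the postage test: compare mail response
# rates between a standard-postage group and a first-class control with a
# two-proportion z-test. All counts below are made-up placeholders.

def two_proportion_z(responses_a: int, n_a: int, responses_b: int, n_b: int) -> float:
    """Return the z statistic for the difference between two response proportions."""
    p_a, p_b = responses_a / n_a, responses_b / n_b
    pooled = (responses_a + responses_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1.0 - pooled) * (1.0 / n_a + 1.0 / n_b))
    return (p_a - p_b) / se

# Assumed outcome: 10,000 addresses per arm.
z = two_proportion_z(responses_a=5_450, n_a=10_000,   # standard postage
                     responses_b=5_600, n_b=10_000)   # first-class postage
print(f"z = {z:.2f}; |z| > 1.96 would indicate a difference at the 5% level")
```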
A second test included in the 2007 Methods Panel addresses another
aspect of ACS data collection relative to the census. Both the ACS and
the census collect a core set of basic demographic questions (age and
date of birth, gender, relationship, Hispanic origin and race).
However, the 2010 Census will use a different format from the ACS mail
questionnaire (one similar to the 2000 Census format) for collecting this
information. The census format, referred to as the sequential person design,
creates a column for each person that includes each question and its
associated response categories. The ACS format, referred to as the grid
design, lists the names of all persons down the left side of the form and the
questions across the top of the page, with the response categories falling in
the 'cells' created by crossing person names with questions.
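    To illustrate the distinction, the sketch below organizes the same answers
under each layout: the sequential person design repeats every question within
each person's block, while the grid design crosses persons with questions. The
two-person household, question names, and answers are made-up placeholders,
not content from the ACS or census forms.

```python
# Hypothetical illustration of the two layouts with a made-up two-person
# household. Question names and answers are placeholders.

QUESTIONS = ["age", "sex", "relationship", "hispanic_origin", "race"]
ANSWERS = {
    "Person 1": {"age": 42, "sex": "F", "relationship": "Householder",
                 "hispanic_origin": "No", "race": "White"},
    "Person 2": {"age": 40, "sex": "M", "relationship": "Spouse",
                 "hispanic_origin": "No", "race": "White"},
}

# Sequential person design (census style): one block per person, with every
# question and its response categories repeated inside it.
sequential = [{"person": name, **answers} for name, answers in ANSWERS.items()]

# Grid design (ACS style): persons down the side, questions across the top,
# responses in the cells where a person row crosses a question column.
grid = {name: [answers[q] for q in QUESTIONS] for name, answers in ANSWERS.items()}

print(sequential)
print(QUESTIONS)
print(grid)
```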
    This second test will compare the sequential person (ACS-1(X)Seq) and grid
(ACS-1(X)Pro) formats for collecting the basic demographic information and
measure the impact on data quality, specifically unit and item non-response
rates, response distributions, and within-household coverage. The outcome of
the test will determine whether the different formats might contribute to
differences in the estimates for the basic demographic questions. If the
format does influence how people respond to these questions, the Census Bureau
will decide whether the ACS should alter its format for collecting these data
items to more closely reflect the census-style format before the 2010 Census.
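    One of the stated data-quality measures, item non-response, can be
summarized as sketched below. The per-item blank counts and person totals for
the two form versions are hypothetical placeholders, not test results.

```python
# Hypothetical summary of item non-response for the two form designs.
# Blank counts and person totals are placeholders, not test results.

ITEMS = ["age", "sex", "relationship", "hispanic_origin", "race"]

blanks = {
    "sequential (ACS-1(X)Seq)": {"age": 310, "sex": 120, "relationship": 450,
                                 "hispanic_origin": 900, "race": 610},
    "grid (ACS-1(X)Pro)":       {"age": 280, "sex": 105, "relationship": 520,
                                 "hispanic_origin": 830, "race": 640},
}
persons_per_form = 20_000  # assumed number of persons reported on each version

for form, counts in blanks.items():
    print(form)
    for item in ITEMS:
        rate = counts[item] / persons_per_form
        print(f"  {item:16s} item non-response rate: {rate:.1%}")
```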
    The 2007 Methods Panel may also include a third test, contingent on the
funding allocations in the President's budget for 2007. This third test will
measure and compare data quality between two versions of new content proposed
by the National Science Foundation for inclusion on the ACS. The proposed
content asks about the major field in which a person received his or her
bachelor's degree. In this test, half of the sample will answer an open-ended
question reporting the field of the degree received; the other half of the
sample will provide their field-of-degree information by answering a series of
yes/no questions. The test will assess which version, if either, results in
data of sufficient quality for inclusion on the ACS.
    Because the ACS collects data every day of the year in every county in the
United States and in every municipio in Puerto Rico, it provides an
opportunity to produce data not available from any other source or survey at
the same low levels of geography. The Census Bureau, in conjunction with the
Office of Management and Budget, has a
policy for determining whether new content or questions will be added
to the ACS. As part of the content determination process, the Census
Bureau must test the proposed content to determine whether the ACS can
produce data of sufficiently high quality for the proposed topic. In
all likelihood, this test will fold into the grid versus sequential
form design test noted above in an effort to reduce cost and burden.
The test would, however, include a Content Follow-Up Reinterview of
approximately 80 percent of the sample. The Census Bureau and OMB will
consider these results in deciding whether to include the new content,
per the Census Bureau's Policy on New Content for the ACS.
    In order to provide data of reliability comparable to the census long-form
at low levels of geography (e.g., the census tract level) or for
characteristics of special, small populations, the ACS must collect data on a
continual basis and aggregate three to five years' worth of data. Essentially,
the ACS collects data every day of the year, by mail, telephone interview, or
personal-visit interview, in order to accumulate enough interviews to achieve
estimates of reliability comparable to the census long-form at low levels of
geography. Federal agencies use ACS data to determine appropriate funding for
state and local governments through block grants. State and local governments
use ACS data for program planning, administration, and evaluation. Thus, the
reliability and quality of the data must remain high in order for users to
rely on them for funding decisions.
    Similarly, the federal government and state and local governments use the
core, basic demographics collected as part of the census for funding and
programmatic decisions. With full implementation of the ACS, those same data
are available every year. From a data user's perspective, large differences
between the ACS and census estimates for those core data items can be
problematic for funding and program decisions. Because the ACS is a sample
survey rather than a census, we expect some differences in results between the
two. However, many other factors can also contribute to different results,
such as differences in the interviewing staff, the social relevance of the
census versus a current survey, and even form design.
    Thus, the 2007 Methods Panel will investigate ways to reduce, or at least
maintain, data collection costs so the Census Bureau can continue to provide
data of reliability comparable to the census long-form. Additionally, the 2007
Methods Panel will test whether differences in form design between the census
and the ACS may contribute to differences in results for the basic demographic
items used by federal, state, and local governments for funding and
programmatic decisions. Lastly, funding permitting, the Methods Panel will
test proposed content on the major field of study for a person's bachelor's
degree in order to provide the National Science Foundation and the National
Center for Education Statistics with current estimates of the fields in which
people receive bachelor's degrees.
Affected Public: Individuals or households.
Frequency: One time.
Respondent's Obligation: Mandatory.
Legal Authority: Title 13, United States Code, Sections 141, 193,
and 221.
OMB Desk Officer: Brian Harris-Kojetin, (202) 395-7314.
Copies of the above information collection proposal can be obtained
by calling or writing Diana Hynek, Departmental Paperwork Clearance
Officer, (202) 482-0266, Department of Commerce, room 6625, 14th and
Constitution Avenue, NW., Washington, DC 20230 (or via the Internet at
dhynek@doc.gov).
    Written comments and recommendations for the proposed information
collection should be sent within 30 days of publication of this notice to
Brian Harris-Kojetin, OMB Desk Officer, either by fax (202-395-7245) or e-mail
(bharrisk@omb.eop.gov).
Dated: October 3, 2006.
Madeleine Clayton,
Management Analyst, Office of the Chief Information Officer.
[FR Doc. E6-16728 Filed 10-10-06; 8:45 am]
BILLING CODE 3510-07-P