Agency Information Collection Activities; Submission to the Office of Management and Budget (OMB) for Review and Approval; Comment Request; American Community Survey Methods Panel Tests, 84526-84529 [2024-24529]
[Federal Register Volume 89, Number 205 (Wednesday, October 23, 2024)]
[Notices]
[Pages 84526-84529]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 2024-24529]
-----------------------------------------------------------------------
DEPARTMENT OF COMMERCE
Census Bureau
Agency Information Collection Activities; Submission to the
Office of Management and Budget (OMB) for Review and Approval; Comment
Request; American Community Survey Methods Panel Tests
AGENCY: Census Bureau, Commerce.
ACTION: Notice of information collection, request for comment.
-----------------------------------------------------------------------
SUMMARY: The Department of Commerce, in accordance with the Paperwork
Reduction Act (PRA) of 1995, invites the general public and other
Federal agencies to comment on proposed and continuing information
collections; this helps us assess the impact of our information
collection requirements and minimize the public's reporting burden. The
purpose of this notice is to allow for 60 days of public comment on the
proposed revision of the American Community Survey Methods Panel Tests,
prior to the submission of the information collection request (ICR) to
OMB for approval.
DATES: To ensure consideration, comments regarding this proposed
information collection must be received on or before December 23, 2024.
ADDRESSES: Interested persons are invited to submit written comments by
email to acso.pra@census.gov. Please reference American Community Survey
Methods Panel Tests in the subject line of your comments. You may also
submit comments, identified by Docket Number USBC-2024-0027, to the
Federal e-Rulemaking Portal: https://www.regulations.gov. Click the
``Comment Now!'' icon, complete the required fields, and enter or
attach your comments. All comments received are part of the public
record. No comments will be posted to https://www.regulations.gov for
public viewing until after the comment period has closed. Comments will
generally be posted without change. All Personally Identifiable
Information (for example, name and address) voluntarily submitted by
the commenter may be publicly accessible. Do not submit Confidential
Business Information or otherwise sensitive or protected information.
You may submit attachments to electronic comments in Microsoft Word,
Excel, or Adobe PDF file formats.
FOR FURTHER INFORMATION CONTACT: Requests for additional information or
specific questions related to collection activities should be directed
to G. Brian Wilson, U.S. Census Bureau, American Community Survey
Office, 301-763-2819, George.Brian.Wilson@census.gov.
SUPPLEMENTARY INFORMATION:
I. Abstract
The American Community Survey (ACS) is an ongoing monthly survey
that collects detailed social, economic, housing and demographic data
from about 3.5 million addresses in the United States and about 36,000
addresses in Puerto Rico each year (where it is called the Puerto Rico
Community Survey). The ACS also collects detailed data from about
150,000 residents living in group quarters (GQ) facilities in the
United States and Puerto Rico. Resulting tabulations from this data
collection are provided on a yearly basis. The ACS allows the Census
Bureau to provide timely and relevant social, economic, housing, and
demographic statistics, even for low levels of geography.
An ongoing data collection effort with an annual sample of this
magnitude requires that the Census Bureau continue research, tests, and
evaluations aimed at improving data quality, reducing data collection
costs, and improving the ACS questionnaire content and related data
collection materials. The ACS Methods Panel is a research program at
the Census Bureau designed to address and respond to survey issues and
needs of the ACS. As part of the Decennial Census Program, the ACS also
provides an opportunity to research and test elements of survey data
collection that relate to the decennial census. As such, the ACS
Methods Panel can serve as a testbed for the decennial census. From
2025 to 2028, the ACS Methods Panel may test ACS and decennial census
methods for reducing survey cost, addressing respondent burden, and
improving survey response, data quality, and survey efficiencies for
housing units and group quarters. The ACS Methods Panel may also
address other emerging needs of the program.
At this time, proposals are in place for several tests related to
self-response. Tests may also be conducted for nonresponse follow-up
data collection and other ACS operations. Because the ACS Methods Panel
is designed to address emerging issues, we may propose additional
testing as needed. Any testing would focus on methods for reducing data
collection costs, improving data quality, improving the respondent
experience, revising content, or testing new questions for the
Decennial Census Program. The proposed tests are outlined below.
Questionnaire Timing Test: In an effort to boost self-response
rates and decrease survey costs, the Questionnaire Timing Test will
test whether changing the timing of when the ACS paper questionnaire is
sent to sampled addresses can increase self-response (overall and by
data collection mode) and/or reduce data collection costs. The test
will also evaluate the impact of including a Quick Response (QR) code
directing respondents to the internet data collection instrument. If
successful, adopting these changes could decrease data collection costs
associated with the paper questionnaire and the Computer-
[[Page 84527]]
Assisted Personal Interviewing (CAPI) nonresponse follow-up operation.
Internet Instrument Response Option and Error Message Design Test:
This test will provide information to aid the development of web design
standards for household and group quarters data collection instruments
used throughout the Census Bureau. This test will focus on design
standards related to response options and error messages to increase
data quality and improve the response experience. The test for the response
options will compare the use of standard radio buttons (the current
design) to the use of response buttons, which have a border around the
radio button and response option wording. The response buttons will
highlight when hovered over and change to green once selected. This
test will determine if these changes decrease response time, change
response distributions, or affect item nonresponse. An additional
change is a modification to error message design to explore how
respondents react to a different display. Current error messages
display at the top of the page within a box and use an exclamation mark
and color to draw attention. For missing write-in fields, an arrow
shows where the error occurred. This experiment will test a change in
colors used to draw attention to the error. Instead of an arrow showing
where there is a missing write-in, a change in the write-in border will
be used.
Additional Internet Instrument Testing: In 2013, the ACS
incorporated the use of an internet instrument to collect survey
responses. The design of the instrument reflected the research and
standards of survey data collection at that time. With a growing
population using the internet to respond to the ACS, as well as the
increased use of smartphones and other electronic devices with smaller
screens, an evaluation of the internet instrument is needed. Design
elements will be developed and tested based on input from experts in
survey methodology and web survey design. Testing may include revisions
focused on improving login procedures and screen navigation, improving
the user interface design, as well as methods to decrease respondent
burden. Multiple tests may be conducted.
Self-Response Mail Messaging and Contact Strategies Testing: In
response to declining ACS response rates and increasing data collection
costs, the Census Bureau plans to study methods to increase self-
response to the survey, as this mode of data collection is the least
expensive. The Census Bureau currently sends up to five mailings to a
sampled address to inform the occupants that their address has been
selected to participate in the ACS and to encourage them to self-
respond to the survey. The proposed tests would evaluate changes to the
mailings, including the use of additional plain language to improve
communication, redesigning the visual appearance of the mail materials,
improving messaging to motivate response, and adding or removing
materials included in the mailings. Changes to the contact method, the
number of contacts, and the timing of the contacts may also be tested.
Multiple tests may be conducted.
Content Testing: Working through the Office of Management and
Budget Interagency Committee for the ACS, the Census Bureau will
solicit proposals from other Federal agencies to change existing
questions or add new questions to the ACS. The objective of content
testing is to determine the impact of changing question wording and
response categories, as well as redefining underlying constructs, on
the quality of the data collected. The Census Bureau evaluates changes
to current questions by comparing the revised questions to the current
ACS questions. For new questions, the Census Bureau proposes comparing
the performance of two versions of any new questions and benchmark
results with other well-known sources of such information. The
questions would be tested using all modes of data collection. Response
bias or variance may also be measured to evaluate the questions by
conducting a follow-up interview with respondents. Multiple tests may
be conducted.
Nonresponse Follow-up Data Collection Testing: The Census Bureau is
proposing to test modifications to nonresponse follow-up data
collection operations to increase response to the survey. The proposed
tests would evaluate changes to the materials used by ACS field
representatives (FRs), including changes to the messaging to motivate
response or changes to the types of materials used. Testing may also
include evaluation of modifications to operational approaches and data
collection procedures, such as contact methods and timing. Multiple
tests may be conducted.
II. Method of Collection
The American Community Survey is collected via the following modes:
internet, paper questionnaire, telephone interview, and in-person
interview (CAPI). The Census Bureau sends up to five mailings to
eligible housing units to encourage self-response. Respondents may
receive help by utilizing an Interactive Voice Response (IVR) system
(though survey response cannot be provided by IVR). Respondents can
also call our Telephone Questionnaire Assistance (TQA) help line for
help or to respond. FRs may visit a housing unit or sampled GQ facility
to conduct an interview in person or may conduct the interview by
phone. Administrative records are also used to replace, supplement, and
support data collection. The ACS Methods Panel Tests use all of these
modes of data collection or a subset of the modes, depending on the
purpose of the test. Specific modes for the tests are noted below.
Questionnaire Timing Test: This test will evaluate mailout
materials, number of mailings, and the timing of mailouts that solicit
self-response using paper questionnaire responses. The test will
include housing units only.
Internet Instrument Response Option and Error Message Design Test:
This test will assess modifications to the internet instrument
conducted via a split-sample experiment. Only the internet mode of the
self-response phase of data collection is included in the testing.
Additional Internet Instrument Testing: This testing will assess
modifications to the internet instrument conducted via split-sample
experiments. Only the internet mode of the self-response phase of data
collection is included in the testing.
Self-Response Mail Messaging and Contact Strategies Testing: This
testing will evaluate mailout materials that solicit self-response
using internet, paper questionnaire, and telephone responses. Tests
will be done as a split sample and will include housing units only.
Content Testing: This testing is for item-level changes and will be
conducted as a split-sample experiment, with half of the sampled
addresses receiving one version of the questions and the other half
receiving a different version of the questions. All modes of ACS data
collection are included in the test. Additionally, a follow-up
reinterview may be conducted with all households that respond to
measure response bias or response variance.
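
As a rough illustration of the split-sample assignment described above, the sketch below randomly divides a list of sampled addresses into two equal half-samples, one per question version. It is a simplified sketch only: the address identifiers, the function name, and the use of a simple random split are assumptions for illustration and do not reflect the Census Bureau's actual sample design.

    import random

    def split_sample(addresses, seed=2025):
        """Randomly assign sampled addresses to two half-samples.

        Half of the addresses receive version A of the test questions and
        the other half receive version B, mirroring the split-sample design
        described for Content Testing. Illustrative sketch only.
        """
        rng = random.Random(seed)
        shuffled = list(addresses)
        rng.shuffle(shuffled)
        midpoint = len(shuffled) // 2
        return shuffled[:midpoint], shuffled[midpoint:]

    # Illustrative only: 80,000 hypothetical address IDs split 40,000/40,000.
    addresses = [f"ADDR-{i:06d}" for i in range(80_000)]
    version_a, version_b = split_sample(addresses)
    print(len(version_a), len(version_b))  # 40000 40000

Random assignment of the half-samples is what allows differences in response distributions between the two versions to be attributed to the question wording rather than to the composition of the sample.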
Nonresponse Follow-up Data Collection Testing: This testing will be
done as a split sample focusing on in-person and telephone interviews
conducted by FRs. As part of their interaction with respondents, FRs
also encourage response online and provide materials to respondents.
Respondents may also mail back a paper questionnaire they received
during the self-response phase of the ACS.
III. Data
OMB Control Number: 0607-0936.
[[Page 84528]]
Form Number(s): ACS-1, ACS-1(GQ), ACS-1(PR)SP, ACS CAPI(HU), and
ACS RI(HU).
Type of Review: Regular submission, Request for a Revision of a
Currently Approved Collection.
Affected Public: Individuals or households.
Estimated Number of Respondents:
------------------------------------------------------------------------
Test Estimated number of respondents
------------------------------------------------------------------------
Questionnaire Timing Test.............. 288,000.
Response Option and Error Message       288,000.
Design Test.
Additional Internet Instrument Testing. Test A--60,000, Test B--60,000.
Self-Response Mail Messaging and Test A--60,000, Test B--60,000,
Contact Strategies Testing. Test C--60,000.
Content Testing........................ Test A--40,000, Test B--40,000.
Content Testing Follow-up Interview.... Test A--40,000, Test B--40,000.
Nonresponse Follow-up Data Collection 100,000.
Testing.
------------------------------------------------------------------------
Estimated Time per Response:
------------------------------------------------------------------------
                                                     Estimated time per
                      Test                                response
                                                       (in minutes)
------------------------------------------------------------------------
Questionnaire Timing Test........................... 40
Response Option and Error Message Design Test....... 40
Additional Internet Instrument Testing.............. 40
Self-Response Mail Messaging and Contact Strategies 40
Testing............................................
Content Testing..................................... 40
Content Testing Follow-up Interview................. 20
Nonresponse Follow-up Data Collection Testing....... 40
------------------------------------------------------------------------
Estimated Total Annual Burden Hours:
----------------------------------------------------------------------------------------------------------------
Estimated time
Test Estimated number of respondents per response Total burden
(in minutes) hours
----------------------------------------------------------------------------------------------------------------
Questionnaire Timing Test................. 288,000........................... 40 192,000
Response Option and Error Message Design   288,000...........................              40         192,000
Test.
Additional Internet Instrument Testing.... Test A--60,000.................... 40 40,000
Test B--60,000.................... 40,000
Self-Response Mail Messaging and Contact Test A--60,000.................... 40 40,000
Strategies Testing. Test B--60,000.................... 40,000
Test C--60,000.................... 40,000
Content Testing........................... Test A--40,000.................... 40 26,667
Test B--40,000.................... 26,667
Content Testing Follow-up Interview....... Test A--40,000.................... 20 13,333
Test B--40,000.................... 13,333
Nonresponse Follow-up Data Collection 100,000........................... 40 66,667
Testing.
---------------------------------------------------------------------
Total (over 3 years) *................ 1,136,000......................... ................ 730,667
---------------------------------------------------------------------
Annual Burden Hours................... 378,667........................... ................ 243,556
----------------------------------------------------------------------------------------------------------------
* Note: This is the maximum burden requested for these tests. Every effort is made to use the existing production
  sample for testing when the tests do not involve content changes.
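
For readers checking the figures, the burden-hour entries above are each test's respondent count multiplied by its estimated minutes per response and converted to hours, with annual figures obtained by dividing the three-year totals by three. A minimal sketch of that arithmetic (the counts and times are copied from the tables above; the variable names are illustrative only):

    # Respondent counts and estimated minutes per response from the tables above.
    tests = {
        "Questionnaire Timing Test": (288_000, 40),
        "Response Option and Error Message Design Test": (288_000, 40),
        "Additional Internet Instrument Testing": (120_000, 40),           # Tests A + B
        "Self-Response Mail Messaging and Contact Strategies": (180_000, 40),  # Tests A + B + C
        "Content Testing": (80_000, 40),                                   # Tests A + B
        "Content Testing Follow-up Interview": (80_000, 20),               # Tests A + B
        "Nonresponse Follow-up Data Collection Testing": (100_000, 40),
    }

    total_respondents = sum(count for count, _ in tests.values())
    total_burden_hours = sum(count * minutes / 60 for count, minutes in tests.values())

    print(f"{total_respondents:,}")             # 1,136,000 respondents over 3 years
    print(f"{round(total_burden_hours):,}")     # 730,667 total burden hours
    print(f"{round(total_respondents / 3):,}")  # 378,667 respondents per year
    print(f"{round(total_burden_hours / 3):,}") # 243,556 annual burden hours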
Estimated Total Annual Cost to Public: $0 (This is not the cost of
respondents' time, but the indirect costs respondents may incur for
such things as purchases of specialized software or hardware needed to
report, or expenditures for accounting or records maintenance services
required specifically by the collection.)
Respondent's Obligation: Mandatory.
Legal Authority: Title 13 U.S.C. 141, 193, and 221.
IV. Request for Comments
We are soliciting public comments to permit the Department/Bureau
to: (a) Evaluate whether the proposed information collection is
necessary for the proper functions of the Department, including whether
the information will have practical utility; (b) Evaluate the accuracy
of our estimate of the time and cost burden for this proposed
collection, including the validity of the methodology and assumptions
used; (c) Evaluate ways to enhance the quality, utility, and clarity of
the information to be collected; and (d) Minimize the reporting burden
on those who are to respond, including the use of automated collection
techniques or other forms of information technology.
Comments that you submit in response to this notice are a matter of
public record. We will include, or summarize, each comment in our
request to OMB to approve this ICR. Before including your address,
phone number, email address, or other personal identifying information
in your comment, you should be aware that your entire comment--
including your personal identifying information--may be made publicly
available at any time. While you may ask us in your comment to withhold
your personal identifying information from public review, we
[[Page 84529]]
cannot guarantee that we will be able to do so.
Sheleen Dumas,
Departmental PRA Clearance Officer, Office of the Under Secretary for
Economic Affairs, Commerce Department.
[FR Doc. 2024-24529 Filed 10-22-24; 8:45 am]
BILLING CODE 3510-07-P