Submission for OMB Review; Comment Request, 29609-29614 [2015-12140]
Federal Register / Vol. 80, No. 99 / Friday, May 22, 2015 / Notices
3. Reporting
a. Performance reporting. All
recipients of DLT financial assistance
must provide annual performance
activity reports to RUS until the project
is complete and the funds are expended.
A final performance report is also
required; the final report may serve as
the last annual report. The final report
must include an evaluation of the
success of the project in meeting DLT
Program objectives. See 7 CFR 1703.107
for additional information on these
reporting requirements.
b. Financial reporting. All recipients
of DLT financial assistance must
provide an annual audit, beginning with
the first year in which a portion of the
financial assistance is expended. Audits
are governed by United States
Department of Agriculture audit
regulations. Please see 7 CFR 1703.108
and Subpart F (Audit Requirements) of
2 CFR part 200 for a description of the
financial reporting requirements of all
recipients of DLT financial assistance.
c. Recipient and Subrecipient
Reporting. The applicant must have the
necessary processes and systems in
place to comply with the reporting
requirements for first-tier sub-awards
and executive compensation under the
Federal Funding Accountability and
Transparency Act of 2006 in the event
the applicant receives funding unless
such applicant is exempt from such
reporting requirements pursuant to 2
CFR part 170, § 170.110(b). The
reporting requirements under the
Transparency Act pursuant to 2 CFR
part 170 are as follows:
i. First Tier Sub-Awards of $25,000 or
more (unless they are exempt under 2
CFR part 170) must be reported by the
Recipient to https://www.fsrs.gov no later
than the end of the month following the
month the obligation was made. Please
note that a consolidation of eight
federal procurement systems, including
the Sub-award Reporting System
(FSRS), into one system, the System for
Award Management (SAM), is currently
underway. As a result, the FSRS will
soon be consolidated into, and accessed
through, https://www.sam.gov/
portal/public/SAM/.
ii. The Total Compensation of the
Recipient’s Executives (5 most highly
compensated executives) must be
reported by the Recipient (if the
Recipient meets the criteria under 2 CFR
part 170) to https://www.sam.gov/
portal/public/SAM/ by the end of the
month following the month in which
the award was made.
iii. The Total Compensation of the
Subrecipient’s Executives (5 most
highly compensated executives) must be
reported by the Subrecipient (if the
Subrecipient meets the criteria under 2
CFR part 170) to the Recipient by the
end of the month following the month
in which the subaward was made.
d. Record Keeping and Accounting.
The grant contract will contain
provisions relating to record keeping
and accounting requirements.
G. Federal Awarding Agency Contacts
1. Web site: https://www.rd.usda.gov/
programs-services/distance-learning-telemedicine-grants. The DLT Web site
maintains up-to-date resources and
contact information for DLT programs.
2. Telephone: 202–720–0800.
3. Fax: 202–205–2921.
4. Email: dltinfo@wdc.usda.gov.
5. Main point of contact: Shawn
Arner, Deputy Assistant Administrator,
Loan Origination and Approval
Division, Rural Utilities Service.
H. Other Information

1. USDA Non-Discrimination Statement

USDA prohibits discrimination
against its customers, employees, and
applicants for employment on the bases
of race, color, national origin, age,
disability, sex, gender identity, religion,
reprisal, and, where applicable, political
beliefs, marital status, familial or
parental status, sexual orientation,
whether all or part of an individual’s
income is derived from any public
assistance program, or protected genetic
information, in employment or in any
program or activity conducted or funded
by USDA. (Not all prohibited bases will
apply to all programs and/or
employment activities.)

2. How To File a Complaint

If you wish to file an employment
complaint, you must contact your
agency’s EEO Counselor within 45 days
of the date of the alleged discriminatory
act, event, or personnel action.
Additional information can be found
online at https://www.ascr.usda.gov/
complaint_filing_file.html.

If you wish to file a Civil Rights
program complaint of discrimination,
complete the USDA Program
Discrimination Complaint Form (PDF),
found online at https://
www.ascr.usda.gov/complaint_filing_
cust.html or at any USDA office, or call
(866) 632–9992 to request the form. You
may also write a letter containing all of
the information requested in the form.
Send your completed complaint form or
letter to us by mail at U.S. Department
of Agriculture, Director, Office of
Adjudication, 1400 Independence
Avenue SW., Washington, DC 20250–
9410; by fax at (202) 690–7442; or by
email at program.intake@usda.gov.

3. Persons With Disabilities

Individuals who are deaf, hard of
hearing, or have speech disabilities and
who wish to file either an EEO or
program complaint may contact USDA
through the Federal Relay Service at
(800) 877–8339 or (800) 845–6136 (in
Spanish).

Persons with disabilities who wish to
file a program complaint should see the
information above on how to contact us
directly by mail or by email. If you
require alternative means of
communication for program information
(e.g., Braille, large print, audiotape, etc.),
please contact USDA’s TARGET Center
at (202) 720–2600 (voice and TDD).

Dated: May 14, 2015.
Brandon McBride,
Administrator, Rural Utilities Service.
[FR Doc. 2015–12222 Filed 5–21–15; 8:45 am]
BILLING CODE P

DEPARTMENT OF COMMERCE

Submission for OMB Review;
Comment Request

The Department of Commerce will
submit to the Office of Management and
Budget (OMB) for clearance the
following proposal for collection of
information under the provisions of the
Paperwork Reduction Act (44 U.S.C.
chapter 35).

Agency: U.S. Census Bureau.
Title: 2015 National Content Test.
OMB Control Number: 0607–XXXX.
Form Number(s):

Questionnaire
DE–1A(E/S)
DE–1C(E/S)
DE–1D(E/S)
DE–1D2(E/S)
DE–1G(E/S)
DE–1H(E/S)
DE–1I(E/S)
DE–1W(E/S)
DE–1C(E/S)PR
DE–1I(E/S)PR

Instruction Card
DE–33
DE–33 PR

Questionnaire Cover Letters
DE–16(L1)
DE–16(L1)(F/B)
DE–16(L1)(E/S)
DE–16(L1)(E/S)PR
DE–16(L2)
DE–16(L2)(F/B)
DE–16(L2)(E/S)
DE–16(L2)(E/S)PR
DE–17(L1)
DE–17(L1)(F/B)
DE–17(L1)(E/S)
DE–17(L2)
DE–17(L2)(F/B)
DE–17(L2)(E/S)
DE–17(L3)
DE–17(L3)(F/B)
DE–17(L3)(E/S)
DE–17(L4)
DE–17(L4)(F/B)
DE–17(L4)(E/S)
DE–17(L4)(E/S)PR
DE–17(L5)
DE–17(L5)(F/B)
DE–17(L5)(E/S)

Postcards
DE–9
DE–9(E/S)PR
DE–9(I)
DE–9(v2)
DE–9(v3)
DE–9(E/S)(PR)
DE–9(v3)(E/S)(PR)
DE–9(2A)
DE–9(2A)(E/S)PR
DE–9(2B)
DE–9(2B)(E/S)PR
DE–9(2C)
DE–9(2D)

Envelopes
DE–6A(IN)
DE–6A(IN)(E/S)
DE–6A(1)(IN)
DE–6A(1)(IN)(E/S)
DE–8A
DE–8A(E/S)

Internet Instrument Spec
Reinterview Instrument Spec (Coverage)
Reinterview Instrument Spec (Race)
Wording for Emails and Text Messages

Type of Request: New Collection.
Number of Respondents: 1.3 million households.
Average Hours per Response: 0.2.
Burden Hours: 216,667.

ESTIMATED BURDEN HOURS FOR 2015 NATIONAL CONTENT TEST

                           Total number      Estimated response      Estimated
                           of respondents    time (minutes)          burden hours
Initial Response              1,200,000             10                  200,000
Telephone Reinterview           100,000             10                   16,667
    Total                     1,300,000       ..............            216,667
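The burden figures in the table above follow directly from respondents multiplied by response time. A minimal arithmetic sketch (illustrative only, not part of the notice) reproduces them:

```python
# Cross-check of the burden-hour arithmetic in the notice.
# Each row: (description, respondents, response time in minutes).
rows = [
    ("Initial Response", 1_200_000, 10),
    ("Telephone Reinterview", 100_000, 10),
]

# Burden hours = respondents * minutes / 60, rounded to whole hours.
burden = {name: round(n * minutes / 60) for name, n, minutes in rows}

total_respondents = sum(n for _, n, _ in rows)
total_hours = sum(burden.values())

print(burden)             # {'Initial Response': 200000, 'Telephone Reinterview': 16667}
print(total_respondents)  # 1300000
print(total_hours)        # 216667
```

Note that 100,000 respondents at 10 minutes each is 16,666.7 hours, so the notice's 16,667 reflects rounding to the nearest whole hour.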
Needs and Uses: During the years
preceding the 2020 Census, the Census
Bureau will pursue its commitment to
reducing the cost of conducting the next
decennial census while maintaining the
highest data quality possible. A primary
decennial census cost driver is the
collection of data from members of the
public for which the Census Bureau
received no reply via initially offered
response options. We refer to these
cases as nonrespondents. Increasing the
number of people who take advantage of
self-response options (such as
completing a paper questionnaire and
mailing it back to the Census Bureau, or
responding via telephone or Internet
alternatives) can contribute to a less
costly census.
The 2015 National Content Test (NCT)
is part of the research and development
cycle leading up to the 2020 Census.
The first objective of this test is to
evaluate and compare different versions
of questions about such things as race
and Hispanic origin, relationship, and
within-household coverage. The 2015
NCT is the primary mid-decade
opportunity to compare different
versions of questions prior to making
final decisions for the 2020 Census. The
test will include a reinterview to further
assess the accuracy and reliability of the
question alternatives about race and
origin and within-household coverage.
For the decennial census, the Census
Bureau adheres to the U.S. Office of
Management and Budget’s (OMB)
October 30, 1997 ‘‘Revisions to the
Standards for the Classification of
Federal Data on Race and Ethnicity’’
(see www.whitehouse.gov/omb/fedreg_1997standards)
for classifying racial and
ethnic responses. There are five broad
categories for data on race: ‘‘White,’’
‘‘Black or African American,’’
‘‘American Indian or Alaska Native,’’
‘‘Asian,’’ and ‘‘Native Hawaiian or Other
Pacific Islander.’’ There are two broad
categories for data on ethnicity:
‘‘Hispanic or Latino’’ and ‘‘Not Hispanic
or Latino.’’ The OMB standards advise
that respondents shall be offered the
option of selecting one or more racial
designations. The OMB standards also
advise that race and ethnicity are two
distinct concepts; therefore, Hispanics
or Latinos may be of any race.
Additionally, the 1997 OMB
standards permit the collection of more
detailed information on population
groups, provided that any additional
groups can be aggregated into the
standard broad set of categories.
Currently, the Census Bureau collects
additional detailed information on
Hispanic or Latino groups, American
Indian and Alaska Native tribes, Asian
groups, and Native Hawaiian and Other
Pacific Islander groups.
For example, responses to the race
question such as Navajo Nation, Nome
Eskimo Community, and Mayan are
collected and tabulated separately in
Census Bureau censuses and surveys,
but also are aggregated and tabulated
into the total American Indian or Alaska
Native population. Similarly, responses
to the race question such as Chinese,
Asian Indian, and Vietnamese are
collected and tabulated separately, but
also aggregated and tabulated into the
total Asian population, while responses
such as Native Hawaiian, Chamorro, or
Fijian are collected and tabulated
separately, but also tabulated and
aggregated into the total Native
Hawaiian or Other Pacific Islander
population. Responses to the ethnicity
question such as Mexican, Puerto Rican,
and Cuban are collected and tabulated
separately in Census Bureau censuses
and surveys, but also are tabulated and
aggregated into the total Hispanic or
Latino population.
The 2015 NCT will test ways to
collect and tabulate information for the
detailed groups, not just for the broad
groups identified
above. Detailed data for specific White
population groups, such as German,
Irish, and Polish, and Black population
groups, such as African American,
Jamaican, and Nigerian, will be
collected and tabulated, and may be
aggregated into the total ‘‘White’’ or
‘‘Black or African American’’
populations respectively.
The 2015 NCT also includes testing of
a separate ‘‘Middle Eastern or North
African’’ (MENA) category and the
collection of data on detailed MENA
groups, such as Lebanese, Egyptian, and
Iranian. Currently, following the 1997
OMB standards, Middle Eastern and
North African responses are classified
under the White racial category, per
OMB’s definition of ‘‘White.’’
The second objective of the NCT is to
test different contact strategies for
optimizing self-response. The Census
Bureau has committed to using the
Internet as a primary response option in
the 2020 Census. The 2015 NCT
includes nine different approaches to
encouraging households to respond and,
specifically, to respond using the less
costly and more efficient Internet
response option. These approaches
include altering the timing of the first
reminder, use of email as a reminder,
altering the timing for sending the mail
questionnaire, use of a third reminder,
and sending a reminder letter in place
of a paper questionnaire to nonrespondents.
One benefit of the Internet response
mode is that it allows for more
functionality and greater flexibility in
designing questions compared to paper,
which is constrained by space
availability. The 2015 NCT will utilize
web-based technology, such as the
Internet, smartphones, and tablets, to
improve question designs and to
optimize reporting of detailed racial and
ethnic groups (e.g., Samoan, Iranian,
Blackfeet Tribe, Filipino, Jamaican,
Puerto Rican, Irish, etc.).
Web-based designs also provide much
more utility and flexibility for using
detailed checkboxes and write-in spaces
to elicit and collect data for detailed
groups than traditional paper
questionnaires, and will help collect
data both for the broader OMB
categories and for more detailed
responses across all groups.
Race and Origin Content

The 2015 NCT builds on extensive
research previously conducted by the
Census Bureau as part of the 2010
Census. One major study was the 2010
Census Race and Hispanic Origin
Alternative Questionnaire Experiment
(AQE) (for details, see www.census.gov/
2010census/news/press-kits/aqe/
aqe.html). The 2010 AQE examined
alternative strategies for improving the
collection of data on race and
Hispanic origin, with four goals in
mind:
1. Increasing reporting in the standard
race and ethnic categories as defined by
the U.S. Office of Management and
Budget;
2. Decreasing item non-response for
these questions;
3. Increasing the accuracy and
reliability of the results for these
questions; and
4. Eliciting detailed responses for all
racial and ethnic communities (e.g.,
Chinese, Mexican, Jamaican, etc.).

Some of the findings from this
research include:
• Combining race and ethnicity into
one question did not change the
proportion of people who reported as
Hispanics, Blacks, Asians, American
Indians and Alaska Natives, or Native
Hawaiians and Other Pacific Islanders.
• The combined question yielded
higher item response rates, compared
with separate question approaches.
• The combined question increased
reporting of detailed responses for most
groups, but decreased reporting for
others.

The successful strategies from the
AQE research have been employed in
the design of the Census Bureau’s 2020
Census research. Four key dimensions
of the questions on race and Hispanic
origin are being tested in the 2015 NCT:
question format, response categories,
wording of the instructions, and
question terminology.

Question Format

The 2015 NCT will evaluate two
alternative question approaches for
collecting detailed data on race and
ethnicity. One approach uses two
separate questions: the first about
Hispanic origin, and the second about
race. The other approach combines the
two items into one question about race
and origin. The 2015 NCT research will
test both approaches with new data
collection methods, including Internet,
telephone, and in-person response. Each
approach is described below, with its
associated data collection modes.

Components of the Test

1. Separate Race and Origin Questions
(Paper and Internet)

This is a modified version of the race
and Hispanic origin approach used in
the 2010 Census. Updates since the
2010 Census include added write-in
spaces and examples for the White
response category and the Black or
African American response category,
removal of the term ‘‘Negro,’’ and the
addition of an instruction to allow for
multiple responses in the Hispanic
origin question.

2. Combined Question With Checkboxes
and Write-Ins Visible at Same Time
(Paper)

This is a modified version of the
combined question approaches found to
be successful in the 2010 AQE research.
Checkboxes are provided for the U.S.
Office of Management and Budget
(OMB) broad categories (per the 1997
Standards for the Classification of
Federal Data on Race and Ethnicity),
with a corresponding write-in space for
a detailed response to each checkbox
category. In this version, all checkboxes
and write-in spaces are visible at all
times. Each response category contains
six example groups, which represent the
diversity of the geographic definition of
the OMB category. For instance, the
Asian category examples of Chinese,
Filipino, Asian Indian, Vietnamese,
Korean, and Japanese represent the six
largest detailed Asian groups in the
United States, reflecting OMB’s
definition of Asian (‘‘A person having
origins in any of the original peoples of
the Far East, Southeast Asia, and the
Indian subcontinent.’’). Respondents do
not have to select an OMB checkbox,
but may enter a detailed response in the
write-in space without checking a
category.

3. Combined Question With Major
Checkboxes, Detailed Checkboxes, and
Write-Ins (Paper)

This is a modified version of the
combined question approaches found to
be successful in the 2010 AQE.
Checkboxes are provided for the OMB
categories, along with a series of
detailed checkboxes under each major
category, and a corresponding write-in
space and examples to elicit and collect
all other detailed responses within the
major category. In this version, all
checkboxes and write-in spaces are
visible at all times. Again, the detailed
response categories represent the
diversity of the geographic definition of
the OMB category.

For instance, under the Asian
category (and major checkbox), a series
of detailed checkboxes is presented for
Chinese, Filipino, Asian Indian,
Vietnamese, Korean, and Japanese,
which represent the six largest detailed
Asian groups in the United States. Then,
instructions to enter additional detailed
groups (with the examples of ‘‘Pakistani,
Thai, Hmong, etc.’’) precede a dedicated
write-in area to collect other detailed
responses. Again, these detailed groups
reflect OMB’s definition of Asian (‘‘A
person having origins in any of the
original peoples of the Far East,
Southeast Asia, and the Indian
subcontinent.’’). Respondents do not
have to select an OMB checkbox, but
may enter a detailed response in the
write-in space without checking a
category.

4. Combined Question With Major
Checkboxes and Write-Ins on Separate
Screens (Internet)

In this version, the detailed origin
groups are solicited on subsequent
screens after the OMB response
categories have been selected. On the
first screen, the OMB checkbox
categories are shown along with their
six representative example groups. Once
the OMB categories have been selected,
one at a time, subsequent screens solicit
further detail for each category that was
chosen (e.g., Asian), using a write-in
space, with examples, to collect the
detailed groups (e.g., Korean and
Japanese). The intent is to separate
mouse-click tasks (checkbox categories)
and typing tasks (write-ins) in an
attempt to elicit responses that are more
detailed. This approach was used as one
of three race and origin Internet panels
in the 2014 Census Test.

5. Combined Question Branching With
Detailed Checkbox Screens (Internet)

This version is an alternative method
of soliciting detailed origin groups using
separate screens, detailed checkboxes,
and write-in spaces. On the first screen,
the OMB checkbox categories are shown
along with their six representative
example groups. Once the OMB
categories have been selected, one at a
time, subsequent screens solicit further
detail for each category, this time using
a series of additional checkboxes for the
six largest detailed groups (e.g., Chinese,
Filipino, Asian Indian, Vietnamese,
Korean, and Japanese), with a write-in
space also provided to collect additional
groups.

Race Response Categories

The 2015 NCT will also evaluate the
use of a ‘‘Middle Eastern or North
African’’ (‘‘MENA’’) response category.
There will be two treatments for testing
this dimension:

1. Use of a MENA category: This
treatment tests the addition of a MENA
checkbox category to the race question.
The MENA category is placed within
the current category lineup, based on
estimates of population size, between
the category for American Indians and
Alaska Natives and the category for
Native Hawaiians and Other Pacific
Islanders. With the addition of this new
category, the ‘‘White’’ example groups
are revised: the Middle Eastern and
North African examples of Lebanese and
Egyptian are replaced with the
European examples of Polish and
French. The MENA checkbox category
will have the examples of Lebanese,
Iranian, Egyptian, Syrian, Moroccan,
and Algerian. All other checkbox
categories and write-in spaces remain
the same.

2. No separate MENA category: This
treatment tests approaches without a
separate MENA checkbox category, and
represents the current OMB definition
of White (‘‘A person having origins in
any of the original peoples of Europe,
the Middle East, or North Africa.’’). Here
the category will provide examples of
Middle Eastern and North African
origins (e.g., Lebanese; Egyptian) along
with examples of European origins (e.g.,
German; Irish) as part of the ‘‘White’’
racial category.

Wording of the Instructions

The 2015 NCT will evaluate different
approaches for wording the
instructions used to collect data on race
and ethnicity. The 2010 AQE research
found that respondents frequently
overlook the instruction to ‘‘Mark [X]
one or more boxes’’ and have difficulty
understanding the instructions. From
the 2010 AQE qualitative research we
learned that some respondents stop
reading the instruction after noticing the
visual cue [X] and proceed directly to
do just that (mark a box), overlooking
the remainder of the instruction. The
new instruction being tested in the 2015
NCT (‘‘Mark all boxes that apply’’) is an
attempt to improve the clarity of the
question and make it more apparent that
more than one group may be selected.
The following options will be tested in
the 2015 NCT:

1. ‘‘Mark [X] one or more’’: One
version (old instructions) will advise
respondents to, ‘‘Mark [X] one or more
boxes AND print [origins/ethnicities/
details].’’

2. ‘‘Mark all that apply’’: An
alternative version (new instructions)
will advise respondents to, ‘‘Mark all
boxes that apply AND print [origins/
ethnicities/details] in the spaces below.
Note, you may report more than one
group.’’
Instructions for American Indian and
Alaska Native (AIAN) Write-In Area
The 2015 NCT will also examine
different instructions to optimize
detailed reporting within the AIAN
write-in area. From the 2010 AQE
research and recent 2014 qualitative
research that the Census Bureau
conducted with American Indians,
Alaska Natives, and Central and South
American Indian respondents, we know
the instruction to ‘‘Print enrolled or
principal tribe’’ causes confusion for
many AIAN respondents and means
different things to different people. The
research found that AIAN respondents
were confused by the use of different
terms and concepts (e.g., ‘‘enrolled’’,
‘‘affiliated,’’ ‘‘villages,’’ ‘‘race,’’ ‘‘origin,’’
‘‘tribe,’’ etc.) and there was
disagreement among focus group
participants as to what ‘‘affiliated tribe’’
or ‘‘enrolled’’ or ‘‘villages’’ meant.
The overwhelming sentiment from
2014 AIAN focus group participants was
that they want to be treated equally with
other race/ethnic groups, and this was
accomplished by not using different
terminology (i.e., enrolled, affiliated,
villages, etc.). Asking ‘‘What is your race
or origin?’’ in conjunction with ‘‘Print,
for example, . . .’’ (along with AIAN
example groups) allowed the
respondents to understand what the
question asked them to report (their race
or origin) and did not limit their write-in response by confounding the
instructions with terms that mean
different things to different people (e.g.,
tribes, villages, etc.). Therefore, the
instruction to, ‘‘Print, for example, . . .’’
presented a viable alternative for further
exploration in 2015 NCT research.
Based on the findings and
recommendations from this research,
the 2015 NCT will test variations of the
instructions for the AIAN write-in area.
We plan to test the instruction, ‘‘Print
enrolled or principal tribe, for example
. . .’’ on control versions, and the
instruction, ‘‘Print, for example . . .’’ on
experimental versions, to see how they
perform.
Question Terms
The 2015 NCT will evaluate the use
of different conceptual terms (e.g.,
origin, ethnicity, or no terms) in the
wording of questions for collecting data
on race and ethnicity. Recent qualitative
focus groups and qualitative research
(e.g., 2010 AQE research; 2013 Census
Test research; cognitive pre-testing for
the 2016 American Community Survey
(ACS) Content Test) found that the
terms ‘‘race,’’ ‘‘ethnicity,’’ and ‘‘origin’’
are confusing or misleading to many
respondents, and mean different things
to different people. The 2010 AQE
research tested the removal of the term
‘‘race’’ from the question, and showed
no evidence that removal of the term
had any effect on either unit or item
response rates. Recent cognitive
research for the 2016 ACS Content Test
tested an open-ended instruction
(‘‘Which categories describe you?’’) and
found that respondents did not have
issues with understanding what the
question was asking. The following
options will be tested in the 2015 NCT.
1. ‘‘Origin’’ term: The current version
of the race and Hispanic origin
questions, and the combined question,
use the terms ‘‘race’’ and/or ‘‘origin’’ to
describe the concepts and groups in the
question stem and/or instructions. For
instance, in the combined race and
Hispanic origin approach, the question
stem is ‘‘What is Person 1’s race or
origin?’’
2. ‘‘Ethnicity’’ term: One alternative
option being explored tests the use of
both the terms ‘‘ethnicity’’ and
‘‘race’’ in the question stem and/or
instructions (e.g., ‘‘What is Person 1’s
race or ethnicity?’’).
3. NO terms: A second alternative
option being explored tests the removal
of the terms ‘‘race,’’ ‘‘origin,’’ and
‘‘ethnicity’’ from the question stem and
instructions. Instead, a general approach
asks, ‘‘Which categories describe Person
1?’’
Relationship Content
Two versions of the relationship
question will be tested. Both versions
are the same as those used in a split sample in the 2014 Census Test, with no
changes. These relationship categories
were previously tested in other Census
Bureau surveys including the American
Housing Survey, American Community
Survey, and the Survey of Income and
Program Participation (currently used in
production). Although research to date
has been informative, leading to the
development of the revised relationship
question, additional quantitative testing
is needed. Because the incidence of
some household relationships—such as
same-sex couples—is relatively low in
the general population, the revised
question needs to be tested with large,
nationally representative samples prior
to a final decision to include them in
the 2020 Census questionnaire.
The first version uses the 2010 Census
relationship question response options,
but in a new order, starting with
‘‘husband or wife’’ and then the
‘‘unmarried partner’’ category. This
version also re-introduces the foster
child category, which was removed
from the 2010 Census form due to space
issues.
The second version includes the same
basic response options as the 2010
Census version, but modifies/expands
the ‘‘husband or wife’’ and ‘‘unmarried
partner’’ categories to distinguish
between same-sex and opposite-sex
relationships.
Coverage Content (Internet Only)
The 2012 NCT experimented with
several methods to improve within-household coverage for Internet
respondents. One benefit of the online
response mode is that it allows for more
functionality and greater flexibility in
PO 00000
Frm 00012
Fmt 4703
Sfmt 4703
29613
designing questions compared to paper,
which is constrained by space
availability. The 2012 NCT included a
coverage follow-up reinterview to
evaluate the different Internet design
options, but some results were
inconclusive. In the 2015 NCT, two
designs will be tested to compare
different approaches for helping
respondents provide a more accurate
roster of household residents.
The first approach is the ‘‘Rules-Based’’ approach, and will allow us to
see whether the presence of a question
asking the number of people in the
household, along with the residence
rule instructions, helps respondents
create an accurate roster. This is similar
to the approach used across all modes
in Census 2000 and the 2010 Census,
where the respondent was expected to
understand various applications of our
residence rules and apply them to their
household. The roster creation is
followed by a household-level question
that probes to determine if any
additional people not listed originally
should be included for consideration as
residents of the household (several
types of people and living situations are
shown in a list).
The ‘‘Question-Based’’ approach
allows us to ask guided questions to
help improve resident responses.
Respondents are not shown the
residence rule instructions and are only
asked to create an initial roster of people
they consider to be living or staying at
their address on Census Day. This is
followed by several short household-level questions about types of people
and living situations that might apply to
people in the household that were not
listed originally.
The materials mailed to the
respondents will inform them that the
survey is mandatory in accordance with
Title 13, United States Code, Sections
141 and 193. This information also will
be available via a hyperlink from within
the Internet Instrument.
The results of the 2015 NCT will help
guide the design of additional 2020
Census testing later this decade. The
2015 NCT will be the only opportunity
to test content with a nationally
representative sample prior to the 2020
Census. Testing in 2015 is necessary to
establish recommendations for contact
strategies, response options, and content
options that can be further refined and
tested in later tests. At this point in the
decade, the Census Bureau needs
evidence showing whether the
strategies being tested can reduce the
cost per housing unit during a decennial
census while maintaining the quality
and accuracy of the census data. The
nationally-representative sample is
designed to ensure that the unbiased
estimates from this test accurately
reflect the nation as a whole, across a
variety of demographic characteristics.
Along with other results, the response
rates to paper and Internet collection
will be used to help inform 2020
Decennial program planning and cost
estimation metrics values. In addition,
several demographic questions and
coverage probes are included in this test
to achieve improved coverage by future
decennial censuses and surveys.
Information quality is an integral part
of the pre-dissemination review of the
information disseminated by the Census
Bureau (fully described in the Census
Bureau’s Information Quality
Guidelines). Information quality is also
integral to the information collections
conducted by the Census Bureau and is
incorporated into the clearance process
required by the Paperwork Reduction
Act.
Affected Public: Individuals or
Households.
Frequency: One Time.
Respondent’s Obligation: Mandatory.
Legal Authority: Title 13 U.S.C. 141
and 193.
This information collection request
may be viewed at www.reginfo.gov.
Follow the instructions to view
Department of Commerce collections
currently under review by OMB.
Written comments and
recommendations for the proposed
information collection should be sent
within 30 days of publication of this
notice to OIRA_Submission@
omb.eop.gov or fax to (202) 395–5806.
Dated: May 14, 2015.
Glenna Mickelson,
Management Analyst, Office of the Chief
Information Officer.
[FR Doc. 2015–12140 Filed 5–21–15; 8:45 am]
BILLING CODE 3510–07–P
Agencies
[Federal Register Volume 80, Number 99 (Friday, May 22, 2015)]
[Notices]
[Pages 29609-29614]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 2015-12140]
=======================================================================
-----------------------------------------------------------------------
DEPARTMENT OF COMMERCE
Submission for OMB Review; Comment Request
The Department of Commerce will submit to the Office of Management
and Budget (OMB) for clearance the following proposal for collection of
information under the provisions of the Paperwork Reduction Act (44
U.S.C. chapter 35).
Agency: U.S. Census Bureau.
Title: 2015 National Content Test.
OMB Control Number: 0607-XXXX.
Form Number(s):
Questionnaire
DE-1A(E/S)
DE-1C(E/S)
DE-1D(E/S)
DE-1D2(E/S)
DE-1G(E/S)
DE-1H(E/S)
DE-1I(E/S)
DE-1W(E/S)
DE-1C(E/S)PR
DE-1I(E/S) PR
Instruction Card
DE-33
DE-33 PR
Questionnaire Cover Letters
DE-16(L1)
DE-16(L1)(FB)
DE-16(L1)(E/S)
DE-16(L1)(E/S)PR
DE-16(L2)
DE-16(L2)(F/B)
DE-16(L2)(E/S)
DE-16(L2)(E/S)PR
DE-17(L1)
[[Page 29610]]
DE-17(L1)(F/B)
DE-17(L1)(E/S)
DE-17(L2)
DE-17(L2)(F/B)
DE-17(L2)(E/S)
DE-17(L3)
DE-17(L3)(F/B)
DE-17(L3)(E/S)
DE-17(L4)
DE-17(L4)(F/B)
DE-17(L4)(E/S)
DE-17(L4)(E/S)PR
DE-17(L5)
DE-17(L5)(F/B)
DE-17(L5)(E/S)
Postcards
DE-9
DE-9(E/S)PR
DE-9(I)
DE-9(v2)
DE-9(v3)
DE-9(ES)(PR)
DE-9(v3)(E/S)(PR)
DE-9(2A)
DE-9(2A)(E/S)PR
DE-9(2B)
DE-9(2B)(E/S)PR
DE-9(2C)
DE-9(2D)
Envelopes
DE-6A(IN)
DE-6A(IN)(E/S)
DE-6A(1)(IN)
DE-6A(1)(IN)(E/S)
DE-8A
DE-8A(E/S)
Internet Instrument Spec
Reinterview Instrument Spec (Coverage)
Reinterview Instrument Spec (Race)
Wording for Emails and Text Messages
Type of Request: New Collection.
Number of Respondents: 1.3 million households.
Average Hours per Response: 0.2.
Burden Hours: 216,667.
Estimated Burden Hours for 2015 National Content Test
----------------------------------------------------------------------------------------------------------------
                                                                   Total number      Estimated       Estimated
                                                                  of respondents   response time    burden hours
                                                                                     (minutes)
----------------------------------------------------------------------------------------------------------------
Initial Response................................................       1,200,000              10         200,000
Telephone Reinterview...........................................         100,000              10          16,667
                                                                 -----------------------------------------------
    Total.......................................................       1,300,000  ..............         216,667
----------------------------------------------------------------------------------------------------------------
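The burden figures above are simple arithmetic (respondents multiplied by minutes per response, converted to hours). As a hedged illustration only, not part of the notice, the table's totals can be reproduced as follows:

```python
# Illustrative sketch: reproduce the burden-hour arithmetic from the
# table above. Burden hours = respondents x minutes / 60, rounded to
# the nearest whole hour, which matches the figures in the notice.
rows = {
    "Initial Response": (1_200_000, 10),
    "Telephone Reinterview": (100_000, 10),
}

def burden_hours(respondents: int, minutes: int) -> int:
    """Convert per-response minutes into total burden hours."""
    return round(respondents * minutes / 60)

totals = {name: burden_hours(n, m) for name, (n, m) in rows.items()}
print(totals)                # {'Initial Response': 200000, 'Telephone Reinterview': 16667}
print(sum(totals.values()))  # 216667
```

The grand total of 216,667 hours matches the Burden Hours figure stated earlier in the notice.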
Needs and Uses: During the years preceding the 2020 Census, the
Census Bureau will pursue its commitment to reducing the cost of
conducting the next decennial census while maintaining the highest data
quality possible. A primary decennial census cost driver is the
collection of data from members of the public from whom the Census
Bureau received no reply via the initially offered response options. We
refer to these cases as nonrespondents. Increasing the number of people
who take advantage of self response options (such as completing a paper
questionnaire and mailing it back to the Census Bureau, or responding
via telephone or Internet alternatives) can contribute to a less costly
census.
The 2015 National Content Test (NCT) is part of the research and
development cycle leading up to the 2020 Census.
The first objective of this test is to evaluate and compare
different versions of questions about such things as race and Hispanic
origin, relationship, and within-household coverage. The 2015 NCT is
the primary mid-decade opportunity to compare different versions of
questions prior to making final decisions for the 2020 Census. The test
will include a reinterview to further assess the accuracy and
reliability of the question alternatives about race and origin and
within-household coverage.
For the decennial census, the Census Bureau adheres to the U.S.
Office of Management and Budget's (OMB) October 30, 1997 ``Revisions to
the Standards for the Classification of Federal Data on Race and
Ethnicity'' (see www.whitehouse.gov/omb/fedreg_1997standards) for
classifying racial and ethnic responses. There are five broad
categories for data on race: ``White,'' ``Black or African American,''
``American Indian or Alaska Native,'' ``Asian,'' and ``Native Hawaiian
or Other Pacific Islander.'' There are two broad categories for data on
ethnicity: ``Hispanic or Latino'' and ``Not Hispanic or Latino.'' The
OMB standards advise that respondents shall be offered the option of
selecting one or more racial designations. The OMB standards also
advise that race and ethnicity are two distinct concepts; therefore,
Hispanics or Latinos may be of any race.
Additionally, the 1997 OMB standards permit the collection of more
detailed information on population groups, provided that any additional
groups can be aggregated into the standard broad set of categories.
Currently, the Census Bureau collects additional detailed information
on Hispanic or Latino groups, American Indian and Alaska Native tribes,
Asian groups, and Native Hawaiian and Other Pacific Islander groups.
For example, responses to the race question such as Navajo Nation,
Nome Eskimo Community, and Mayan are collected and tabulated separately
in Census Bureau censuses and surveys, but also are aggregated and
tabulated into the total American Indian or Alaska Native population.
Similarly, responses to the race question such as Chinese, Asian
Indian, and Vietnamese are collected and tabulated separately, but also
aggregated and tabulated into the total Asian population, while
responses such as Native Hawaiian, Chamorro, or Fijian are collected
and tabulated separately, but also tabulated and aggregated into the
total Native Hawaiian or Other Pacific Islander population. Responses
to the ethnicity question such as Mexican, Puerto Rican, and Cuban are
collected and tabulated separately in Census Bureau censuses and
surveys, but also are tabulated and aggregated into the total Hispanic
or Latino population.
The 2015 NCT will test ways to collect and tabulate detailed
information for the detailed groups, not just for the broad groups
identified above. Detailed data for specific White population groups,
such as German, Irish, and Polish, and Black population groups, such as
African American, Jamaican, and Nigerian, will be collected and
tabulated, and may be aggregated into the total ``White'' or ``Black or
African American'' populations respectively.
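The tabulation scheme described above, counting each detailed group separately while also rolling it up into its broad 1997 OMB category, can be sketched as follows. This is a hypothetical illustration, not the Census Bureau's actual coding system, and the mapping below is a small illustrative subset of groups named in this notice:

```python
from collections import Counter

# Illustrative subset only (hypothetical, not the official code list):
# each detailed write-in group maps to one broad 1997 OMB category.
DETAILED_TO_BROAD = {
    "German": "White", "Irish": "White", "Polish": "White",
    "African American": "Black or African American",
    "Jamaican": "Black or African American",
    "Nigerian": "Black or African American",
    "Chinese": "Asian", "Asian Indian": "Asian", "Vietnamese": "Asian",
    "Navajo Nation": "American Indian or Alaska Native",
    "Native Hawaiian": "Native Hawaiian or Other Pacific Islander",
}

def tabulate(responses):
    """Count detailed groups and their broad-category roll-ups."""
    detailed = Counter(responses)
    broad = Counter(DETAILED_TO_BROAD[r] for r in responses)
    return detailed, broad

detailed, broad = tabulate(["Irish", "Jamaican", "Chinese", "Irish"])
print(detailed["Irish"], broad["White"])  # 2 2
```

The key property, as the notice describes, is that every detailed response is counted twice: once for its own group and once in the aggregated broad-category total.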
The 2015 NCT also includes testing of a separate ``Middle Eastern
or North African'' (MENA) category and the collection of data on
detailed MENA groups, such as Lebanese, Egyptian, and Iranian.
Currently, following the 1997 OMB standards, Middle Eastern and North
African responses are classified under the White racial category, per
OMB's definition of ``White.''
[[Page 29611]]
The second objective of the NCT is to test different contact
strategies for optimizing self-response. The Census Bureau has
committed to using the Internet as a primary response option in the
2020 Census. The 2015 NCT includes nine different approaches to
encouraging households to respond and, specifically, to respond using
the less costly and more efficient Internet response option. These
approaches include altering the timing of the first reminder, use of
email as a reminder, altering the timing for sending the mail
questionnaire, use of a third reminder, and sending a reminder letter
in place of a paper questionnaire to non-respondents.
One benefit of the Internet response mode is that it allows for
more functionality and greater flexibility in designing questions
compared to paper, which is constrained by space availability. The 2015
NCT will utilize web-based technology, such as the Internet, smart
phones, and tablets to improve question designs, and to optimize
reporting of detailed racial and ethnic groups (e.g., Samoan, Iranian,
Blackfeet Tribe, Filipino, Jamaican, Puerto Rican, Irish, etc.).
Web-based designs also provide much more utility and flexibility
for using detailed checkboxes and write-in spaces to elicit and collect
data for detailed groups than traditional paper questionnaires, and
will help collect data for both the broader OMB categories, as well as
more detailed responses across all groups.
Components of the Test
Race and Origin Content
The 2015 NCT builds on extensive research previously conducted by
the Census Bureau as part of the 2010 Census. One major study was the
2010 Census Race and Hispanic Origin Alternative Questionnaire
Experiment (AQE) (for details, see www.census.gov/2010census/news/press-kits/aqe/aqe.html). The 2010 AQE examined alternative strategies
for improving the collection of data on race and Hispanic origin,
with four goals in mind:
1. Increasing reporting in the standard race and ethnic categories
as defined by the U.S. Office of Management and Budget;
2. Decreasing item non-response for these questions;
3. Increasing the accuracy and reliability of the results for this
question; and
4. Eliciting detailed responses for all racial and ethnic
communities (e.g., Chinese, Mexican, Jamaican, etc.).
Some of the findings from this research include:
    • Combining race and ethnicity into one question did not change the
proportion of people who reported as Hispanics, Blacks, Asians,
American Indians and Alaska Natives, or Native Hawaiians and Other
Pacific Islanders.
    • The combined question yielded higher item response rates,
compared with separate question approaches.
    • The combined question increased reporting of detailed responses
for most groups, but decreased reporting for others.
The successful strategies from the AQE research have been employed
in the design of the Census Bureau's 2020 Census research. Four key
dimensions of the questions on race and Hispanic origin are being
tested in the 2015 NCT. These include question format, response
categories, wording of the instructions, and question terminology.
Question Format
The 2015 NCT will evaluate the use of two alternative question
approaches for collecting detailed data on race and ethnicity. One
approach uses two separate questions: The first about Hispanic origin,
and the second about race. The other approach combines the two items
into one question about race and origin. The 2015 NCT research will
test both approaches with new data collection methods, including
Internet, telephone, and in-person response. Each approach is described
below, with its associated data collection modes.
1. Separate Race and Origin Questions (Paper and Internet)
This is a modified version of the race and Hispanic origin approach
used in the 2010 Census. Updates since the 2010 Census include added
write-in spaces and examples for the White response category and the
Black or African American response category, removal of the term
``Negro,'' and the addition of an instruction to allow for multiple
responses in the Hispanic origin question.
2. Combined Question With Checkboxes and Write-Ins Visible at Same Time
(Paper)
This is a modified version of the combined question approaches
found to be successful in the 2010 AQE research. Checkboxes are
provided for the U.S. Office of Management and Budget (OMB) broad
categories (per the 1997 Standards for the Classification of Federal
Data on Race and Ethnicity), with a corresponding write-in space for
detailed response to each checkbox category. In this version, all
checkboxes and write-in spaces are visible at all times. Each response
category contains six example groups, which represent the diversity of
the geographic definitions of the OMB category. For instance, the Asian
category examples of Chinese, Filipino, Asian Indian, Vietnamese,
Korean, and Japanese represent the six largest detailed Asian groups in
the United States, reflecting OMB's definition of Asian (``A person
having origins in any of the original peoples of the Far East,
Southeast Asia, and the Indian subcontinent.''). Respondents do not
have to select an OMB checkbox, but may enter a detailed response in
the write-in space without checking a category.
3. Combined Question With Major Checkboxes, Detailed Checkboxes, and
Write-Ins (Paper)
This is a modified version of the combined question approaches
found to be successful in the 2010 AQE. Checkboxes are provided for the
OMB categories, along with a series of detailed checkboxes under each
major category, and a corresponding write-in space and examples to
elicit and collect all other detailed responses within the major
category. In this version, all checkboxes and write-in spaces are
visible at all times. Again, the detailed response categories represent
the diversity of the geographic definitions of the OMB category.
For instance, under the Asian category (and major checkbox), a
series of detailed checkboxes is presented for Chinese, Filipino, Asian
Indian, Vietnamese, Korean, and Japanese, which represent the six
largest detailed Asian groups in the United States. Then, instructions
to enter additional detailed groups (with the examples of ``Pakistani,
Thai, Hmong, etc.'') precede a dedicated write-in area to collect other
detailed responses. Again, these detailed groups reflect OMB's
definition of Asian (``A person having origins in any of the original
peoples of the Far East, Southeast Asia, and the Indian
subcontinent.''). Respondents do not have to select an OMB checkbox,
but may enter a detailed response in the write-in space without
checking a category.
4. Combined Question With Major Checkboxes and Write-Ins on Separate
Screens (Internet)
In this version, the detailed origin groups are solicited on
subsequent screens after the OMB response categories have been
selected. On the first screen, the OMB checkbox categories are shown
along with their six representative example groups. Once
[[Page 29612]]
the OMB categories have been selected, one at a time, subsequent
screens solicit further detail for each category that was chosen (e.g.,
Asian), using a write-in space, with examples, to collect the detailed
groups (e.g., Korean and Japanese). The intent is to separate mouse
click tasks (checkbox categories) and typing tasks (write-ins) in an
attempt to elicit responses that are more detailed. This approach was
used as one of three race and origin Internet panels in the 2014 Census
Test.
5. Combined Question Branching With Detailed Checkbox Screens
(Internet)
This version is an alternative method of soliciting detailed origin
groups using separate screens, detailed checkboxes, and write-in
spaces. On the first screen, the OMB checkbox categories are shown
along with their six representative example groups. Once the OMB
categories have been selected, one at a time, subsequent screens
solicit further detail for each category, this time using a series of
additional checkboxes for the six largest detailed groups (e.g.,
Chinese, Filipino, Asian Indian, Vietnamese, Korean, and Japanese)
with a write-in space also provided to collect additional groups.
[GRAPHIC] [TIFF OMITTED] TN22MY15.007
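The two Internet designs (formats 4 and 5 above) share the same branching structure: broad OMB checkboxes on the first screen, then one follow-up screen per selected category to solicit detailed groups. As a hedged sketch of that flow only, not the actual Internet instrument, the branching can be modeled as:

```python
# Hypothetical sketch of the two-screen branching flow described in
# formats 4 and 5. Screen 1 collects broad OMB categories; a follow-up
# screen is then shown, one at a time, for each category selected.
DETAILED_CHECKBOXES = {
    "Asian": ["Chinese", "Filipino", "Asian Indian",
              "Vietnamese", "Korean", "Japanese"],
    # ...one entry per broad OMB category...
}

def branching_flow(selected_broad, answers_per_screen):
    """Return detailed responses gathered across follow-up screens.

    selected_broad: broad categories checked on screen 1.
    answers_per_screen: detailed checkboxes/write-ins the respondent
    supplies on each category's follow-up screen.
    """
    detail = {}
    for category in selected_broad:  # one follow-up screen per category
        detail[category] = answers_per_screen.get(category, [])
    return detail

result = branching_flow(["Asian"], {"Asian": ["Korean", "Hmong"]})
print(result)  # {'Asian': ['Korean', 'Hmong']}
```

The design intent stated in format 4, separating the clicking task from the typing task, corresponds here to collecting `selected_broad` and `answers_per_screen` in distinct steps rather than on one screen.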
Race Response Categories
The 2015 NCT will also evaluate the use of a ``Middle Eastern or
North African'' (``MENA'') response category. There will be two
treatments for testing this dimension:
1. Use of a MENA category: This treatment tests the addition of a
MENA checkbox category to the race question. The MENA category is
placed within the current category lineup, based on estimates of
population size, between the category for American Indians and Alaska
Natives and the category for Native Hawaiians and Other Pacific
Islanders. With the addition of this new category, the ``White''
example groups are revised. The Middle Eastern and North African
examples of Lebanese and Egyptian are replaced with the European
examples of Polish and French. The MENA checkbox category will have the
examples of Lebanese, Iranian, Egyptian, Syrian, Moroccan, and
Algerian. All other checkbox categories and write-in spaces remain the
same.
2. No separate MENA category: This treatment tests approaches
without a separate MENA checkbox category, and represents the current
OMB definition of White (``A person having origins in any of the
original peoples of Europe, the Middle East, or North Africa.''). Here
the category will provide examples of Middle Eastern and North African
origins (e.g., Lebanese; Egyptian) along with examples of European
origins (e.g., German; Irish) as part of the ``White'' racial category.
Wording of the Instructions
The 2015 NCT will evaluate the use of different approaches for
wording the instructions used to collect data on race and ethnicity.
The 2010 AQE research found that respondents frequently overlook the
instruction to ``Mark [X] one or more boxes'' and have difficulty
understanding the instructions. From the 2010 AQE qualitative research
we learned that some respondents stop reading the instruction after
noticing the visual cue [X] and proceed directly to do just that--mark
a box--overlooking the remainder of the instruction. The new
instruction being tested in the 2015 NCT (``Mark all boxes that
apply'') is an attempt to improve the clarity of the question and make
it more apparent that more than one group may be selected. The
following options will be tested in the 2015 NCT.
1. ``Mark [X] one or more'': One version (old instructions) will
advise respondents to, ``Mark [X] one or more boxes AND print [origins/
ethnicities/details].''
2. ``Mark all that apply'': An alternative version (new
instructions), will advise respondents to, ``Mark all boxes that apply
AND print [origins/ethnicities/details] in the spaces below. Note, you
may report more than one group.''
Instructions for American Indian and Alaska Native (AIAN) Write-In Area
The 2015 NCT will also examine different instructions to optimize
detailed reporting within the AIAN write-in area. From the 2010 AQE
research and recent 2014 qualitative research that the Census Bureau
conducted with American Indians, Alaska Natives, and Central and South
American Indian respondents, we know the instruction to ``Print
enrolled or principal tribe'' causes confusion for many AIAN
respondents and means
[[Page 29613]]
different things to different people. The research found that AIAN
respondents were confused by the use of different terms and concepts
(e.g., ``enrolled'', ``affiliated,'' ``villages,'' ``race,''
``origin,'' ``tribe,'' etc.) and there was disagreement among focus
group participants as to what ``affiliated tribe'' or ``enrolled'' or
``villages'' meant.
The overwhelming sentiment from 2014 AIAN focus group participants
was that they want to be treated equally with other race/ethnic groups,
and this was accomplished by not using different terminology (i.e.,
enrolled, affiliated, villages, etc.). Asking ``What is your race or
origin?'' in conjunction with ``Print, for example, . . .'' (along with
AIAN example groups) allowed the respondents to understand what the
question asked them to report (their race or origin) and did not limit
their write-in response by confounding the instructions with terms that
mean different things to different people (e.g., tribes, villages,
etc.). Therefore, the instruction to, ``Print, for example, . . .''
presented a viable alternative for further exploration in 2015 NCT
research.
Based on the findings and recommendations from this research, the
2015 NCT will test variations of the instructions for the AIAN write-in
area. We plan to test the instruction, ``Print enrolled or principal
tribe, for example . . .'' on control versions, and the instruction,
``Print, for example . . .'' on experimental versions, to see how they
perform.
Question Terms
The 2015 NCT will evaluate the use of different conceptual terms
(e.g., origin, ethnicity, or no terms) in the wording of questions for
collecting data on race and ethnicity. Recent qualitative focus groups
and qualitative research (e.g., 2010 AQE research; 2013 Census Test
research; cognitive pre-testing for the 2016 American Community Survey
(ACS) Content Test) found that the terms ``race,'' ``ethnicity,'' and
``origin'' are confusing or misleading to many respondents, and mean
different things to different people. The 2010 AQE research tested the
removal of the term ``race'' from the question, and showed no evidence
that removal of the term had any effect on either unit or item response
rates. Recent cognitive research for the 2016 ACS Content Test tested
an open-ended instruction (``Which categories describe you?'') and
found that respondents did not have issues with understanding what the
question was asking. The following options will be tested in the 2015
NCT.
1. ``Origin'' term: The current version of the race and Hispanic
origin questions, and the combined question, use the terms ``race''
and/or ``origin'' to describe the concepts and groups in the question
stem and/or instructions. For instance, in the combined race and
Hispanic origin approach, the question stem is ``What is Person 1's
race or origin?''
2. ``Ethnicity'' term: One alternative option being explored tests
the use of both the terms ``ethnicity'' along with ``race'' in the
question stem and/or instructions (e.g., ``What is Person 1's race or
ethnicity?'').
3. NO terms: A second alternative option being explored tests the
removal of the terms ``race,'' ``origin,'' and ``ethnicity'' from the
question stem and instructions. Instead, a general approach asks,
``Which categories describe Person 1?''
Relationship Content
Two versions of the relationship question will be tested. Both
versions are the same as those used in a split-sample in the 2014
Census Test, with no changes. These relationship categories were
previously tested in other Census Bureau surveys including the American
Housing Survey, American Community Survey, and the Survey of Income and
Program Participation (currently used in production). Although research
to date has been informative, leading to the development of the revised
relationship question, additional quantitative testing is needed.
Because the incidence of some household relationships--such as same-sex
couples--is relatively low in the general population, the revised
question needs to be tested with large, nationally representative
samples prior to a final decision to include them in the 2020 Census
questionnaire.
The first version uses the 2010 Census relationship question
response options, but in a new order, starting with ``husband or wife''
and then the ``unmarried partner'' category. This version also re-
introduces the foster child category, which was removed from the 2010
Census form due to space issues.
The second version includes the same basic response options as the
2010 Census version, but modifies/expands the ``husband or wife'' and
``unmarried partner'' categories to distinguish between same-sex and
opposite-sex relationships.
Coverage Content (Internet Only)
The 2012 NCT experimented with several methods to improve within-
household coverage for Internet respondents. One benefit of the online
response mode is that it allows for more functionality and greater
flexibility in designing questions compared to paper, which is
constrained by space availability. The 2012 NCT included a coverage
follow-up reinterview to evaluate the different Internet design
options, but some results were inconclusive. In the 2015 NCT, two
designs will be tested to compare different approaches for helping
respondents provide a more accurate roster of household residents.
The first approach is the ``Rules-Based'' approach, and will allow
us to see whether the presence of a question asking the number of
people in the household, along with the residence rule instructions,
helps respondents create an accurate roster. This is similar to the
approach used across all modes in Census 2000 and the 2010 Census,
where the respondent was expected to understand various applications of
our residence rules and apply them to their household. The roster
creation is followed by a household-level question that probes to
determine if any additional people not listed originally should be
included for consideration as residents of the household (several types
of people and living situations are shown in a list).
The ``Question-Based'' approach allows us to ask guided questions
to help improve resident responses. Respondents are not shown the
residence rule instructions and are only asked to create an initial
roster of people they consider to be living or staying at their address
on Census Day. This is followed by several short household-level
questions about types of people and living situations that might apply
to people in the household that were not listed originally.
The materials mailed to the respondents will inform them that the
survey is mandatory in accordance with Title 13, United States Code,
Sections 141 and 193. This information also will be available via a
hyperlink from within the Internet Instrument.
The results of the 2015 NCT will help guide the design of
additional 2020 Census testing later this decade. The 2015 NCT will be
the only opportunity to test content with a nationally representative
sample prior to the 2020 Census. Testing in 2015 is necessary to
establish recommendations for contact strategies, response options, and
content options that can be further refined and tested in later tests.
At this point in the decade, the Census Bureau needs to acquire
evidence showing whether the strategies being tested can reduce the
cost per housing unit during a decennial census, while providing high
quality and accuracy of the census data. The nationally representative
sample is
[[Page 29614]]
designed to ensure that the unbiased estimates from this test
accurately reflect the nation as a whole, across a variety of
demographic characteristics.
Along with other results, the response rates to paper and Internet
collection will be used to help inform 2020 Decennial program planning
and cost estimation metrics values. In addition, several demographic
questions and coverage probes are included in this test to achieve
improved coverage by future decennial censuses and surveys.
Information quality is an integral part of the pre-dissemination
review of the information disseminated by the Census Bureau (fully
described in the Census Bureau's Information Quality Guidelines).
Information quality is also integral to the information collections
conducted by the Census Bureau and is incorporated into the clearance
process required by the Paperwork Reduction Act.
Affected Public: Individuals or Households.
Frequency: One Time.
Respondent's Obligation: Mandatory.
Legal Authority: Title 13 U.S.C. 141 and 193.
This information collection request may be viewed at
www.reginfo.gov. Follow the instructions to view Department of Commerce
collections currently under review by OMB.
Written comments and recommendations for the proposed information
collection should be sent within 30 days of publication of this notice
to OIRA_Submission@omb.eop.gov or fax to (202) 395-5806.
Dated: May 14, 2015.
Glenna Mickelson,
Management Analyst, Office of the Chief Information Officer.
[FR Doc. 2015-12140 Filed 5-21-15; 8:45 am]
BILLING CODE 3510-07-P