Submission for OMB Review; Comment Request, 69188-69193 [2015-28416]

69188    Federal Register / Vol. 80, No. 216 / Monday, November 9, 2015 / Notices

This section of the FEDERAL REGISTER contains documents other than rules or proposed rules that are applicable to the public. Notices of hearings and investigations, committee meetings, agency decisions and rulings, delegations of authority, filing of petitions and applications and agency statements of organization and functions are examples of documents appearing in this section.

DEPARTMENT OF AGRICULTURE

Rural Business-Cooperative Service

Notice of Request for Extension of a Currently Approved Information Collection

AGENCY: Rural Business-Cooperative Service, USDA.

ACTION: Proposed collection; comments requested.

SUMMARY: In accordance with the Paperwork Reduction Act of 1995, this notice announces the Rural Business-Cooperative Service’s (RBS) intention to request an extension for a currently approved information collection in support of the program for 7 CFR part 4284, subpart K, Agriculture Innovation Demonstration Centers.

DATES: Comments on this notice must be received by January 8, 2016 to be considered.

FOR FURTHER INFORMATION CONTACT: Chad Parker, Deputy Administrator, Cooperative Programs, Rural Development, U.S. Department of Agriculture, STOP 3250, Room 5813–South, 1400 Independence Avenue SW., Washington, DC 20250–3250. Telephone: (202) 720–7558. Email: chad.parker@wdc.usda.gov.

SUPPLEMENTARY INFORMATION:

Title: Agriculture Innovation Centers.
OMB Number: 0570–0045.
Expiration Date of Approval: March 31, 2016.
Type of Request: Extension of currently approved information collection.

Abstract: Agriculture Innovation Center applicants must provide required information to demonstrate eligibility for the program and compliance with applicable laws and regulations. Grantees are required to provide progress reports for the duration of the grant agreement to ensure continued compliance and to measure the success of the program.
Estimate of Burden: Public reporting burden for this collection is estimated to average 4.38 hours per response.

Estimated Number of Respondents: 1.
Estimated Number of Responses per Respondent: 13.
Estimated Number of Responses: 13.
Estimated Total Annual Burden on Respondents: 57 hours.

Copies of this information collection can be obtained from Jeanne Jacobs, Regulations and Paperwork Management Branch, (202) 692–0040.

Comments are invited on: (a) Whether the proposed collection of information is necessary for the proper performance of the functions of RBS, including whether the information will have practical utility; (b) the accuracy of the Agency’s estimate of the burden to collect the required information, including the validity of the strategy used; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology.

Comments on the paperwork burden may be sent to Jeanne Jacobs, Regulations and Paperwork Management Branch, Rural Development, U.S. Department of Agriculture, STOP 0742, 1400 Independence Avenue SW., Washington, DC 20250–0742. All responses to this notice will be summarized and included in the request for the Office of Management and Budget’s approval. All comments will become a matter of public record.

Dated: October 23, 2015.

Samuel H. Rikkers,
Acting Administrator, Rural Business-Cooperative Service.

[FR Doc.
2015–28443 Filed 11–6–15; 8:45 am]

BILLING CODE 3410–XY–P

DEPARTMENT OF COMMERCE

Submission for OMB Review; Comment Request

The Department of Commerce will submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection of information under the provisions of the Paperwork Reduction Act (44 U.S.C. chapter 35).

Agency: National Institute of Standards and Technology (NIST).
Title: Baldrige Executive Fellows Program.
OMB Control Number: None.
Form Number(s): None.
Type of Request: New collection.
Number of Respondents: 12.
Average Hours per Response: 1 hour.
Burden Hours: 12.

Needs and Uses: This collection is needed to obtain information to select applicants for the Baldrige Executive Fellows Program.

Affected Public: Business, health care, education, or other for-profit organizations; health care, education, and other nonprofit organizations; and individuals.
Frequency: Annual.
Respondent's Obligation: Voluntary.

This information collection request may be viewed at reginfo.gov. Follow the instructions to view Department of Commerce collections currently under review by OMB.

Written comments and recommendations for the proposed information collection should be sent within 30 days of publication of this notice to OIRA_Submission@omb.eop.gov or fax to (202) 395–5806.

Dated: November 4, 2015.

Glenna Mickelson,
Management Analyst, Office of the Chief Information Officer.

[FR Doc. 2015–28410 Filed 11–6–15; 8:45 am]

BILLING CODE 3510–13–P

DEPARTMENT OF COMMERCE

Submission for OMB Review; Comment Request

The Department of Commerce will submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection of information under the provisions of the Paperwork Reduction Act (44 U.S.C. chapter 35).

Agency: U.S. Census Bureau.
Title: 2016 Census Test.
OMB Control Number: 0607–XXXX.
Form Number(s):

Questionnaire
DF–1(ES)
DF–1(EC)
DF–1(EK)
DF–9(2B)(ES)
DF–9(2B)(EC)
DF–9(2B)(EK)
DF–9C(ES)
DF–9C(EC)
DF–9C(EK)
DF–9(2C)(ES)
DF–9(2C)(EC)
DF–9(2C)(EK)
DF–9(AR)(1)

Instruction Card
DF–33(ES)
DF–33(EC)
DF–33(EK)

Questionnaire Cover Letters
DF–16(L2)(ES)
DF–16(L2)(EC)
DF–16(L2)(EK)
DF–16(L4)(ES)
DF–16(L4)(EC)
DF–16(L4)(EK)
DF–17(L2)(ES)
DF–17(L2)(EC)
DF–17(L2)(EK)
DF–6U(IN)
DF–6U(1)(IN)
DF–8A(ES)
DF–8A(EC)
DF–8A(EK)
DF–5(ES)

Field Materials
DF–26B
DF–28(ES)
DF–28(EC)
DF–28(EK)

Language Brochures
DF–12
DF–14

Internet Instrument Spec

Information Insert

Postcards/Reminder Letter
DF–9L(ES)
DF–9B(ES)
DF–9B(EC)
DF–9B(EK)

COMPASS (NRFU/QA RI) Spec
DF–17(TQA)
DF–17I(ES)
DF–17I(EC)
DF–17I(EK)

Reinterview Instrument Spec (Coverage)

Envelopes
DF–6A(1)(IN)(ES)
DF–6A(IN)(ES)

Type of Request: New Collection.
Number of Respondents: 412,348.
Average Hours per Response: 0.2.
Burden Hours: 68,954.

Estimated burden hours for the 2016 Census Test:

Type of respondent/operation                 Estimated number    Estimated time per    Estimated total annual
                                             of respondents      response (minutes)    burden hours
Self Response ............................          250,000              10                   41,667
NRFU .....................................          120,000              10                   20,000
NRFU Quality Control Reinterview .........           12,000              10                    2,000
Non-ID Manual Processing—phone followup ..              400               5                       33
Coverage Reinterview .....................           24,500              10                    4,084
Non-ID Response Validation ...............            5,000              10                      834
Focus Group Selection Contacts ...........              288               3                       15
Focus Group Participants .................              160             120                      320
Totals ...................................          412,348      ..................           68,954

Needs and Uses: During the years preceding the 2020 Census, the Census Bureau is pursuing its commitment to reduce the cost of conducting the census while maintaining the quality of the results. A primary decennial census cost driver is the collection of data in person from addresses for which the Census Bureau received no reply via the initially offered response options. We refer to these as nonresponse cases, and to the efforts we make to collect data from them as the Nonresponse Followup, or NRFU, operation. The 2016 Census Test will allow the Census Bureau to build upon past tests and to refine our plans and methods associated with the reengineered field operations for the NRFU operation of the census. Namely, this test will allow us to:

• Test refinements to the ratios of field enumerators to field supervisors.
• Test refinements to our enhanced operational control system, including the way we assign work to field staff and how those assignments are routed.
• Test alternatives to government furnished equipment for data collection, such as enumerator use of personally owned devices (sometimes known as Bring Your Own Device, or BYOD), or devices provided by a private company as part of a contract for wireless service (sometimes known as Device as a Service).
• Test refinements to our use of administrative records to reduce the NRFU workload.
• Test new methods of conducting NRFU quality control reinterviews.
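The burden figures in the table above follow directly from respondents × minutes per response ÷ 60. A minimal sketch of that arithmetic, using the counts published in the table (the rounding convention varies slightly by row in the published figures, so computed hours can differ from the printed values by less than one hour):

```python
# Recomputing the 2016 Census Test burden table from the notice.
# (respondents, minutes per response, published burden hours) per row;
# the figures are taken directly from the table, only the arithmetic is ours.
ROWS = {
    "Self Response":                    (250_000, 10, 41_667),
    "NRFU":                             (120_000, 10, 20_000),
    "NRFU Quality Control Reinterview": (12_000, 10, 2_000),
    "Non-ID Manual Processing":         (400, 5, 33),
    "Coverage Reinterview":             (24_500, 10, 4_084),
    "Non-ID Response Validation":       (5_000, 10, 834),
    "Focus Group Selection Contacts":   (288, 3, 15),
    "Focus Group Participants":         (160, 120, 320),
}

def burden_hours(respondents: int, minutes: float) -> float:
    """Total annual burden in hours: respondents x minutes / 60."""
    return respondents * minutes / 60

total_respondents = sum(r for r, _, _ in ROWS.values())
print(total_respondents)  # 412348, matching the table's Totals row
```

Each computed row is within one hour of the published figure; only the row-level rounding direction differs.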
Increasing the number of people who take advantage of self response options (such as responding online, completing a paper questionnaire and mailing it back to the Census Bureau, or responding via telephone) can contribute to a less costly census. The Census Bureau has committed to using the Internet as a primary response option in the 2020 Census, and we are studying ways to offer and promote this option to respondents. In addition to increasing and optimizing self response through the Internet, the Census Bureau plans to test the impact of providing additional materials to respondents as part of their first mailing along with a letter invitation. One example of additional material is an insert to be included for traditionally hard-to-count populations. We will also test a tailored envelope treatment to determine whether it represents an effective way to encourage and support self response for respondents who speak languages other than English. We also will continue to study the option of allowing people to respond on the Internet without having or using a unique identification code previously supplied by the Census Bureau. Each of these will be discussed in more detail in subsequent sections of this supporting statement.

2016 Census Test—Los Angeles County (Part), California and Harris County (Part), Texas

The areas within Los Angeles County (part), California and Harris County (part), Texas were chosen based on a variety of characteristics, including language diversity, demographic diversity, varying levels of Internet usage, large metropolitan areas, and high vacancy rates. These characteristics can help the Census Bureau refine its operational plans for the 2020 Census by testing operational procedures on traditionally hard-to-count populations.
The tests will allow for our continued development of additional ways for the population to respond to the once-a-decade census, as well as more cost-effective ways for census takers to follow up with households that fail to respond.

Los Angeles County (part), California, places and census designated places (CDP):
Alhambra city
Los Angeles city
Montebello city
Monterey Park city
Pasadena city
Rosemead city
San Gabriel city
San Marino city
South El Monte city
South Pasadena city
Temple City city
East Los Angeles CDP
East Pasadena CDP
East San Gabriel CDP
San Pasqual CDP
South San Gabriel CDP

Harris County (part), Texas, places:
Bunker Hill Village city
Hedwig Village city
Hilshire Village city
Houston city
Hunters Creek Village city
Jersey Village city
Piney Point Village city
Spring Valley Village city

To increase Internet self response rates, the Census Bureau will improve contact and notification strategies that were studied in prior testing. The core of our contact strategy is an Internet-push strategy, which was previously tested in the 2012 National Census Test, the 2014 Census Test, and the 2015 Optimizing Self Response and Census Tests, and is now being further refined. We also introduced a supplemental contact strategy in the 2015 National Content Test, the Internet Choice panel, which we will continue to study in the 2016 Census Test. In the 2016 Census Test, improvements to this approach will be tested by modifying the content of our messages, including materials in the mailing packages. We also will continue our efforts to make it easier for respondents by allowing them to respond without providing a pre-assigned identification (ID) number associated with their address. This response option, referred to as ‘‘Non-ID,’’ was successfully implemented on the Internet in the 2014 and 2015 Census Tests.
In this test, we will continue to develop the infrastructure to deploy real-time processing of Non-ID responses. Specifically, we will implement automated processing of Non-ID responses in a cloud-based environment instead of using Census Bureau hardware. This work will help us prepare for conducting Non-ID Processing at the scale we anticipate for 2020. In addition, we will be conducting a manual matching and geocoding operation for Non-ID responses that could not be matched to a record in the Census address list, or assigned to a census block, during automated processing. Some of this processing will require Census staff to call respondents to obtain further information, such as missing address items that could help us obtain a match to a record in the Census address list. In some cases, we may also ask for the respondent’s assistance in accurately locating their living quarters on a map so that we can associate the response with the correct census block, which is required for data tabulation.

The 2016 Census Test will consist of four phases: Self Response, NRFU (with a reinterview component), Coverage Reinterview, and focus groups.

Self Response

We will implement an ‘‘Internet Push’’ contact strategy, which involves first sending a letter inviting people to respond via the Internet; then sending up to two postcard reminders to nonresponding addresses; and ultimately sending a paper questionnaire to addresses that still have not responded. The Census Bureau will directly contact up to 250,000 addresses in each site to request self response via one of the available response modes (Internet, telephone, paper). Materials included in the mailing explain the test and provide information on how to respond. The impact of message content on self response will be tested by varying the content of the mailing packages across the different ‘‘Internet Push’’ panels.
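The ‘‘Internet Push’’ escalation described above (an invitation letter, then up to two postcard reminders to nonresponding addresses, then a paper questionnaire) can be sketched as a simple per-address schedule. This is an illustrative model only; the step names are ours, not the Bureau's production mailing schedule:

```python
# Illustrative sketch of the "Internet Push" contact sequence described
# in the notice. The step labels are assumptions for illustration.
PUSH_SEQUENCE = [
    "invitation letter",
    "first reminder postcard",
    "second reminder postcard",
    "paper questionnaire",
]

def next_mailing(contacts_so_far: int, has_responded: bool):
    """Return the next mailing for an address, or None once it has
    responded or the sequence is exhausted."""
    if has_responded or contacts_so_far >= len(PUSH_SEQUENCE):
        return None  # a response at any point stops further mailings
    return PUSH_SEQUENCE[contacts_so_far]

print(next_mailing(0, False))  # invitation letter
print(next_mailing(1, True))   # None: address already responded
```

The ‘‘Internet Choice’’ panel described below differs only in that a questionnaire accompanies the first mailing; the escalation logic is the same shape.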
Specifically, we will test language that addresses how participation in the census benefits respondents’ communities and cites the mandatory nature of the census. Mail panels targeting limited English proficiency (LEP) households will include a language insert as part of the contact strategy. LEP households represent a subsample of housing units in each test location. We also plan to include the Census Internet Uniform Resource Locator (URL) on envelopes with messaging in multiple languages for a panel of housing units. This is intended to serve as a prompt for LEP respondents to access the Census URL without needing to read a letter written in a language in which they are not fluent. An ‘‘Internet Choice’’ panel will also be tested, which involves first sending a questionnaire with a letter inviting people to respond via the Internet or by using the questionnaire; then sending up to two postcard reminders to non-responding addresses; and ultimately sending a second paper questionnaire to addresses that still have not responded. The design of the mail panels is fully described in Supporting Statement B.

In addition to supporting Non-ID self response and conducting manual processing of Non-ID returns when required, we will take steps to identify duplicate or potentially fraudulent Non-ID responses. For all Non-ID responses, we will compare response data to information contained in commercial lists and Federal administrative records maintained within the Census Bureau. This will help validate respondent-provided data as well as examine the gaps in coverage we might have in currently available administrative records datasets. Last, in order to confirm the results from the records linkage, we will conduct a Response Validation operation to recollect the response data for an estimated sample of 5,000 of the Non-ID returns.
This will likely be performed as a combination of telephone interviews and in-person visits, but the proportions of each of these are still to be determined. Telephone questionnaire assistance will be available to all respondents. In addition, on-line respondents will be provided with pre-defined ‘‘Help’’ screens or ‘‘Frequently Asked Questions’’ accessible through the Internet instrument. People who prefer not to respond via a paper form or on the Internet can also call the telephone questionnaire assistance number and speak to an agent, who will complete the questionnaire for their household.

Content Test Objectives in Self Response and Nonresponse Followup Data Collection

The 2016 Census Test questionnaire will include questions on housing tenure, household roster, age, sex/gender, date of birth, race and Hispanic origin, and relationship. Based on results from the 2010 Race and Hispanic Origin Alternative Questionnaire Experiment (Compton, et al. 2012 [1]), the 2016 Census Test will include a combined race and Hispanic origin question intended to build on what is being tested in the 2015 National Content Test. This combined question provides examples and write-in areas for each major response category, including a response category for Middle Eastern and North African ethnicities. With this combined question format, no separate ‘‘Hispanic origin’’ question is used. Rather, Hispanic ethnicity or origin is measured within the single item. Respondents are asked to self-identify by selecting one or more checkboxes, and to write in a specific origin for each checkbox selected.

[1] Compton, E., Bentley, M., Ennis, S., Rastogi, S., (2012), ‘‘2010 Census Race and Hispanic Origin Alternative Questionnaire Experiment,’’ DSSD 2010 CPEX Memorandum Series #B–05–R2, U.S. Census Bureau.
The 2016 Census Test allows us to test responses to these questions in geographic areas with different race and Hispanic origin concentrations from the prior test areas. The inclusion of the combined question will also allow the Census Bureau to conduct imputation research using this combined format in a setting where there are self responses, administrative records, and NRFU enumerator responses. This will allow the Census Bureau to understand the imputation approaches needed for a combined question.

We also plan to test variation in terminology by comparing ‘‘Am.’’ with ‘‘American’’ in the response category ‘‘Black or African Am.’’ on the Internet instrument. This research is being undertaken to assess the impact of different wording for the racial category that collects and tabulates data for the African American, African, and Afro-Caribbean populations. This test will provide insights into how respondents identify with the race category, depending on the wording used to describe the category itself (‘‘Black or African Am.’’ vs. ‘‘Black or African American’’).

For the relationship question, we plan to include variations in question wording associated with ‘‘nonrelatives.’’ We will compare responses to a relationship question with, and without, the response categories ‘‘roomer or boarder’’ and ‘‘housemate or roommate.’’ Cognitive testing has repeatedly shown that respondents do not know what the Census Bureau sees as the differences between these categories. The 2016 Census Test will continue to include the response categories recommended by the OMB Interagency Working Group (see section 11 of this document, Justification for Sensitive Questions) for opposite-sex and same-sex husband/wife/spouse households, and for the category of unmarried partner.
The 2016 Census Test will include a question on the Internet instrument that will allow respondents to report that a housing unit they own is vacant as of Census Day, and to provide the reason for the vacancy status (e.g., a seasonal or rental unit). Collecting these data from respondents may allow the Census Bureau to identify some vacant housing units during self response so they can be removed from NRFU operations.

The Census Bureau’s research on how best to present and explain the residence rule (who to count) in specific situations will continue. The Internet data collection instrument will include various ways to ask about and confirm the number of persons residing at an address. Respondents will see one of three screens about the enumeration of people in their household: one that displays the Census Bureau’s basic residence rule and then asks for the number of people in the household based on that rule; one that asks for the number of people who live in the household but provides our residence rule definition in the help text; and one that asks if any other people live at the household, with the residence rule in the help text. After the names of the roster members are collected, the respondent will then see one of three series of undercount detection questions: one series asks for additional people on two separate screens, another series asks for additional people on only one screen, and a third presents no undercount questions at all. After the demographic items are collected, the respondent will then see overcount detection questions or, if the case had not received undercount questions, no overcount detection questions.

The materials mailed to the respondents will inform them that the survey is mandatory in accordance with title 13, United States Code, sections 141 and 193. This information also will be available via a hyperlink from within the Internet instrument.
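The residence-rule experiment described above amounts to a small factorial assignment: each Internet case receives one of three residence-rule screens and one of three undercount-question series, and overcount questions are shown only to cases that received undercount questions. A hypothetical assignment sketch; the variant labels, the use of random assignment, and the data shape are our assumptions for illustration:

```python
import random

# Hypothetical assignment of an Internet case to the experimental
# variants described in the notice. Labels are ours, not the Bureau's.
RESIDENCE_SCREENS = [
    "basic rule shown, then household count requested",
    "household count requested, rule in help text",
    "any-other-people question, rule in help text",
]
UNDERCOUNT_SERIES = ["two screens", "one screen", "none"]

def assign_case(rng: random.Random) -> dict:
    """Assign one case to a residence screen and an undercount series."""
    undercount = rng.choice(UNDERCOUNT_SERIES)
    return {
        "residence_screen": rng.choice(RESIDENCE_SCREENS),
        "undercount_series": undercount,
        # Overcount detection questions appear only for cases that
        # received undercount questions.
        "overcount_questions": undercount != "none",
    }

case = assign_case(random.Random(0))
print(case["overcount_questions"] == (case["undercount_series"] != "none"))  # True
```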
Nonresponse Followup (NRFU) Operation Testing

The 2016 Census Test will help determine our 2020 Census methods for conducting NRFU operations that will increase efficiency and reduce costs. Based on previous tests, the Census Bureau will refine its contact strategies and methods for field data collection, case assignment management, and field staff administrative functions. This will include further testing of how administrative records can be used to reduce the NRFU workload.

As part of the 2016 Census Test, we will collect housing unit status and enumerate the occupants of households that do not respond to the self response phase of the census, using automated enumeration software on standard (iOS and Android operating system) smartphone devices. The test will enable our continued study of alternatives to using government furnished equipment. This includes options for an enumerator to use their own smartphone for enumeration, often known as ‘‘Bring Your Own Device’’ (BYOD), and options to use a ‘‘Device as a Service’’ contract, under which the Census Bureau will not own the smartphone devices outright but instead will pay a vendor for their use, including any initialization and setup processes required. This has the potential to mitigate risks to the operation, such as unpredictable increases in costs associated with device initialization and hardware support. We will also continue to operationally test the field data collection application we use on these devices.
The devices will use a modified version of the software used in the 2015 Census Test, with updated capabilities for handling special non-interview cases (such as demolished homes and non-existent addresses), better handling of addresses with multiple units (like apartment buildings), a clearer path for enumerators to take when attempting to collect data from a householder’s neighbor or another knowledgeable source, new screens related to detecting potential ‘‘overcount’’ in a household (scenarios where current household residents also lived at another location, like student housing), and numerous other minor incremental user interface and performance updates.

The Census Bureau also plans to test a newly redesigned portion of our quality assurance activities, the NRFU Reinterview program (NRFU-RI). We plan to test:

• New methodologies for selecting cases to be reinterviewed, including the potential use of operational control system data (paradata) and administrative records to detect potential falsification by enumerators;
• Using our automated field data collection instrument for conducting these reinterviews;
• Using our recently re-designed operational control system to optimize the routing and assignment of reinterview cases; and
• Using the same field staff to conduct both NRFU interviews and associated reinterviews, with an explicit rule within the instrument that an enumerator is not allowed to reinterview their own work.

All of these changes have the potential to lead to a more cost-effective, streamlined, and higher quality NRFU operation for the 2020 Census. We will continue to test our newly re-engineered field infrastructure, allowing us to refine our requirements for staffing ratios and position duties for 2020 Census operations.
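One NRFU-RI rule above is worth making concrete: reinterviewers are drawn from the same field staff as interviewers, but the instrument must never assign an enumerator to reinterview a case they completed themselves. A minimal sketch of that constraint, with an assumed representation of staff and cases:

```python
# Sketch of the NRFU Reinterview assignment rule described above:
# the same field staff conduct interviews and reinterviews, but an
# enumerator may never be assigned their own completed case.
# The string-ID representation is an assumption for illustration.
def eligible_reinterviewers(case_enumerator: str, staff: list[str]) -> list[str]:
    """All field staff except the enumerator who completed the case."""
    return [e for e in staff if e != case_enumerator]

staff = ["E01", "E02", "E03"]
print(eligible_reinterviewers("E02", staff))  # ['E01', 'E03']
```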
We will also continue to test our enhanced operational control system, using lessons learned from the 2015 Census Test to make further improvements to how assignments are made and routed. We will continue to test improvements to our use of systematic alerts that will quickly notify field supervisors of potential problem enumerators, detect possible falsification, and improve both quality and efficiency for the NRFU operation. Additionally, we will continue to test our implementation of an ‘‘adaptive design’’ contact strategy: using a varied number of personal visit attempts by geographic area based on criteria associated with people who are harder to count. We also will study the optimal point at which to discontinue attempts to collect information from each nonresponding household and instead move to attempting to collect information from a householder’s neighbor or another knowledgeable source.

Finally, we will build upon work from the 2013, 2014, and 2015 Census Tests in a continued attempt to refine and evaluate our use of administrative records (including government and third-party data sources) to reduce the NRFU workload. Cases will be removed from the NRFU operation based on our administrative records modeling as follows:

• Any case that is given a status of vacant by our administrative records modeling will be immediately removed from the NRFU workload; and
• Any case that is given a status of occupied by our administrative records modeling will be removed from the NRFU workload after one unsuccessful attempt at field enumeration is made (as long as good administrative records exist for that case).

Unlike in previous tests, for all cases removed from the NRFU workload in this way, we will test mailing these addresses a supplemental letter to prompt a self response. If these cases do not self-respond, we will enumerate the unit based on the results of our administrative records modeling.
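The two removal rules above can be written out directly. A sketch, assuming a simple per-case record; the statuses and the "good records" condition mirror the notice's wording, while the function shape is ours:

```python
# Sketch of the administrative-records workload-reduction rules stated above:
#  - a case modeled as vacant is removed from the NRFU workload immediately;
#  - a case modeled as occupied is removed after one unsuccessful field
#    attempt, provided good administrative records exist for it.
# Removed cases are mailed a supplemental letter; if they still do not
# self-respond, they are enumerated from the administrative-records model.
def remove_from_nrfu(modeled_status: str,
                     failed_attempts: int,
                     has_good_records: bool) -> bool:
    """Return True if the case can be dropped from the NRFU workload."""
    if modeled_status == "vacant":
        return True
    if modeled_status == "occupied":
        return has_good_records and failed_attempts >= 1
    return False  # no usable model: the case stays in the NRFU workload

print(remove_from_nrfu("vacant", 0, False))    # True
print(remove_from_nrfu("occupied", 0, True))   # False: no attempt made yet
print(remove_from_nrfu("occupied", 1, True))   # True
```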
For a sample of the cases that would be removed under these criteria, we will continue to perform the field followup activities. This will allow us to compare the outcomes of those that get a completed interview with our modeled status of the household, and to determine the quality of our administrative records modeling.

Coverage Reinterview

As described previously, the 2016 Census Test Internet instrument contains embedded coverage experiments, and a reinterview is needed to quantify the effects of each particular version on the roster provided by the Internet respondent. The quality of the final household roster created from the panels with experimentally applied questions will be evaluated by a coverage reinterview conducted by telephone. Note that these panels are used to evaluate the different residence rule approaches used in the different questionnaire panels. The reinterview will contain extensive questions about potentially missed roster members and other places where any household members sometimes stay. Specifically, the reinterview will re-contact responders to determine whether any people may have been left off the roster, or erroneously included on it, during the initial response. If there are indications during the reinterview that some people may have been left off the roster, then we will ask for demographic information about the missed people. If there are indications during the reinterview that some people may have been erroneously included, then we will ask for information about stay durations in order to resolve residency situations. The reinterview will be a Computer Assisted Telephone Interviewing (CATI) operation conducted in the Census Bureau’s call centers. In addition to contacting Internet responders, a small portion of people who responded by paper or as a part of NRFU will be selected for the Coverage Reinterview.
The inclusion of such cases will allow us to quantify the quality of household rosters collected in these two other modes.

Focus Groups

Following the end of data collection, the Census Bureau will conduct focus groups with 2016 Census Test participants to ask about their experience. Topics will include their opinions on the use of administrative records by the Census Bureau. Participants also will be asked about their general concerns with government data collection and the government’s ability to protect confidential data. The specific information collection materials for those activities will be submitted separately as non-substantive changes.

Testing in 2016 is necessary to build on the findings from prior testing and to establish recommendations for contact strategies, response options, and field operation efficiencies that can be further refined and deployed again in subsequent operational and system development activities. At this point in the decade, the Census Bureau needs to solidify evidence showing whether the strategies being tested can reduce the cost per housing unit during a decennial census while still providing high quality and accuracy of the census data. The results of the 2016 Census Test from both sites will inform decisions that the Census Bureau will make about refining the detailed operational plan for the 2020 Census and will help guide the evaluation of additional 2020 Census test results later this decade. Along with other results related to content, the response rates to paper and Internet collection will be used to help inform 2020 Census program planning and cost estimates. Several versions of some of the demographic questions and versions of the coverage questions are included in this test in order to further determine the best questions and procedures for collecting data from hard-to-count populations and to achieve optimal within-household person coverage within the decennial census.
Testing enhancements to Non-ID processing will inform final planning for the 2020 Census design, as well as the infrastructure required to support large scale, real-time processing of electronic Non-ID response data submitted via the Internet. Building upon previous Census Tests, the NRFU portion of the 2016 Census Test will inform the following important decisions for conducting the 2020 Census: • We will continue to research the cost and quality impact of reducing the NRFU caseload through the use of administrative records information, to inform our final strategy for the use of administrative records. This test will also allow us to further define our core set of administrative records that will be used for the 2020 Census, and our strategies for acquiring and using those records. This research will help us achieve our goal of a more cost-effective 2020 Census, while maintaining quality of the results. • We will continue to research the cost and quality impacts of new NRFU contact strategies that make use of adaptive design and a re-engineered management structure employing automated payroll, automated training, and minimal face-to-face contact between enumerators and supervisors. Enumerators are asked to provide worktime availability in advance, and the system then will assign the optimal number of cases to attempt each day, as well as the optimal route to follow that E:\FR\FM\09NON1.SGM 09NON1 Federal Register / Vol. 80, No. 216 / Monday, November 9, 2015 / Notices day. Again, this operational research will help us towards our goal of a more cost-effective 2020 Census, while maintaining quality of the results. • We will be able to determine at what rate field staff are willing to use their own personally owned devices to conduct Census enumeration, and continue to develop our technical processes to enable this to be done in a secure and cost-effective manner. 
We will also be able to make quality and cost determinations about a ‘Device as a Service’ option, and be able to develop more mature cost models to inform our decisions related to the device provision strategies for the 2020 Census NRFU operation. • We will be able to determine the cost and quality impacts of our newly re-engineered NRFU Reinterview quality assurance program. This data will inform our decision on an integrated and re-designed approach to quality assurance for the 2020 Census. Affected Public: Individuals or Households. Frequency: One time. Respondent's Obligation: Mandatory. Legal Authority: Title 13, United States Code, sections 141 and 193. This information collection request may be viewed at www.reginfo.gov. Follow the instructions to view Department of Commerce collections currently under review by OMB. Written comments and recommendations for the proposed information collection should be sent within 30 days of publication of this notice to OIRA_Submission@omb.eop.gov or fax to (202) 395–5806. Dated: November 4, 2015. Glenna Mickelson, Management Analyst, Office of the Chief Information Officer. [FR Doc. 2015–28416 Filed 11–6–15; 8:45 am] BILLING CODE 3510–07–P DEPARTMENT OF COMMERCE Foreign-Trade Zones Board [S–117–2015] srobinson on DSK5SPTVN1PROD with NOTICES Approval of Subzone Status; Springsteen Logistics, LLC; Rock Hill and Fort Lawn, South Carolina On August 11, 2015, the Executive Secretary of the Foreign-Trade Zones (FTZ) Board docketed an application submitted by the South Carolina State Ports Authority, grantee of FTZ 38, requesting subzone status subject to the existing activation limit of FTZ 38, on behalf of Springsteen Logistics, LLC in VerDate Sep<11>2014 19:52 Nov 06, 2015 Jkt 238001 69193 Rock Hill and Fort Lawn, South Carolina. The application was processed in accordance with the FTZ Act and Regulations, including notice in the Federal Register inviting public comment (80 FR 49201, August 17, 2015). 


[Federal Register Volume 80, Number 216 (Monday, November 9, 2015)]
[Notices]
[Pages 69188-69193]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 2015-28416]


-----------------------------------------------------------------------

DEPARTMENT OF COMMERCE


Submission for OMB Review; Comment Request

    The Department of Commerce will submit to the Office of Management 
and Budget (OMB) for clearance the following proposal for collection of 
information under the provisions of the Paperwork Reduction Act (44 
U.S.C. chapter 35).
    Agency: U.S. Census Bureau.
    Title: 2016 Census Test.
    OMB Control Number: 0607-XXXX.
    Form Number(s):

Questionnaire

DF-1(ES)

[[Page 69189]]

DF-1(EC)
DF-1(EK)

Instruction Card

DF-33(ES)
DF-33(EC)
DF-33(EK)

Questionnaire Cover Letters

DF-16(L2)(ES)
DF-16(L2)(EC)
DF-16(L2)(EK)
DF-16(L4)(ES)
DF-16(L4)(EC)
DF-16(L4)(EK)
DF-17(L2)(ES)
DF-17(L2)(EC)
DF-17(L2)(EK)

Postcards/Reminder Letter

DF-9L(ES)
DF-9B(ES)
DF-9B(EC)
DF-9B(EK)
DF-9(2B)(ES)
DF-9(2B)(EC)
DF-9(2B)(EK)
DF-9C(ES)
DF-9C(EC)
DF-9C(EK)
DF-9(2C)(ES)
DF-9(2C)(EC)
DF-9(2C)(EK)
DF-9(AR)(1)

Languages Brochures

DF-12
DF-14

Information Insert

DF-17(TQA)
DF-17I(ES)
DF-17I(EC)
DF-17I(EK)

Envelopes

DF-6A(1)(IN)(ES)
DF-6A(IN)(ES)
DF-6U(IN)
DF-6U(1)(IN)
DF-8A(ES)
DF-8A(EC)
DF-8A(EK)
DF-5(ES)

Field Materials

DF-26B
DF-28(ES)
DF-28(EC)
DF-28(EK)

Internet Instrument Spec

COMPASS (NRFU/QA RI) Spec
Reinterview Instrument Spec (Coverage)
    Type of Request: New Collection.
    Number of Respondents: 412,348.
    Average Hours per Response: 0.2.
    Burden Hours: 68,954.
    Estimated burden hours for 2016 Census Test:

----------------------------------------------------------------------------------------------------------------
                                                                     Estimated    Estimated time     Estimated
                  Type of respondent/operation                       number of     per response    total annual
                                                                    respondents      (minutes)     burden hours
----------------------------------------------------------------------------------------------------------------
Self Response...................................................         250,000              10          41,667
NRFU............................................................         120,000              10          20,000
NRFU Quality Control Reinterview................................          12,000              10           2,000
Non-ID Manual Processing--phone followup........................             400               5              33
Coverage Reinterview............................................          24,500              10           4,084
Non-ID Response Validation......................................           5,000              10             834
Focus Group Selection Contacts..................................             288               3              15
Focus Group Participants........................................             160             120             320
                                                                 -----------------------------------------------
    Totals......................................................         412,348  ..............          68,954
----------------------------------------------------------------------------------------------------------------
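As an illustration only, the burden-hour arithmetic behind the table above (respondents x minutes per response / 60) can be sketched as follows. The row figures come from the table; the round-up rule is an assumption inferred from the stated total of 68,954 hours (note the published phone-followup row shows 33 hours, where rounding up would give 34).

```python
import math

# Sketch only: reproduce the burden-hour arithmetic from the table above.
# (operation, estimated respondents, minutes per response) -- from the notice.
rows = [
    ("Self Response",                    250_000,  10),
    ("NRFU",                             120_000,  10),
    ("NRFU Quality Control Reinterview",  12_000,  10),
    ("Non-ID Manual Processing--phone",      400,   5),
    ("Coverage Reinterview",              24_500,  10),
    ("Non-ID Response Validation",         5_000,  10),
    ("Focus Group Selection Contacts",       288,   3),
    ("Focus Group Participants",             160, 120),
]

# Assumption: fractional hours are rounded up to the next whole hour.
total_respondents = sum(n for _, n, _ in rows)
total_hours = sum(math.ceil(n * minutes / 60) for _, n, minutes in rows)
print(total_respondents, total_hours)  # 412348 68954
```

Under this rounding assumption the computed totals match the Number of Respondents (412,348) and Burden Hours (68,954) stated above.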

    Needs and Uses: During the years preceding the 2020 Census, the 
Census Bureau is pursuing its commitment to reduce the cost of 
conducting the census while maintaining the quality of the results. A 
primary decennial census cost driver is the collection of data in 
person from addresses for which the Census Bureau received no reply via 
initially offered response options. We refer to these as nonresponse 
cases, and the efforts we make to collect data from these cases as the 
Nonresponse Followup, or NRFU, operation.
    The 2016 Census Test will allow the Census Bureau to build upon 
past tests, to refine our plans and methods associated with the 
reengineered field operations for the NRFU operation of the Census. 
Namely, this test will allow us to:
    • Test refinements to the ratios of field enumerators to field 
supervisors.
    • Test refinements to our enhanced operational control system, 
including the way we assign work to field staff, and how those 
assignments are routed.
    • Test alternatives to government furnished equipment for data 
collection, such as enumerator use of personally owned devices 
(sometimes known as Bring Your Own Device, or BYOD), or devices 
provided by a private company as part of a contract for wireless 
service (sometimes known as Device as a Service).
    • Test refinements to our use of administrative records to reduce 
the NRFU workload.
    • Test new methods of conducting NRFU quality control 
reinterviews.
    Increasing the number of people who take advantage of self response 
options (such as responding online, completing a paper questionnaire 
and mailing it back to the Census Bureau, or responding via telephone) 
can contribute to a less costly census. The Census Bureau has committed 
to using the Internet as a primary response option in the 2020 Census, 
and we are studying ways to offer and promote this option to 
respondents. In addition to increasing and optimizing self response 
through the Internet, the Census Bureau plans to test the impacts of 
providing additional materials to respondents as part of their first 
mailing along with a letter invitation. One example of additional 
material is an insert to be included for traditionally hard-to-count 
populations. We will also test a tailored envelope treatment to 
determine whether this represents an effective way to encourage and 
support self response for respondents who speak languages other than 
English. We also will continue to study the option of allowing people 
to respond on the Internet without having or using a unique 
identification code previously supplied by the Census Bureau. Each of 
these will be discussed in more detail in subsequent sections of this 
supporting statement.

2016 Census Test--Los Angeles County (Part), California and Harris 
County (Part), Texas

    The areas within Los Angeles County (part), California and Harris 
County (part), Texas were chosen based on a variety of 
characteristics--including language diversity, demographic diversity, 
varying levels of Internet usage, large metropolitan areas and high 
vacancy rates. These characteristics can

[[Page 69190]]

help the Census Bureau refine its operational plans for the 2020 Census 
by testing operational procedures on traditionally hard-to-count 
populations. The tests will allow us to continue developing additional 
ways for the population to respond to the once-a-decade census, as 
well as more cost-effective ways for census takers to follow up with 
households that fail to respond.

------------------------------------------------------------------------
   Los Angeles County (part), California, places and census designated
                              places (CDP)
-------------------------------------------------------------------------
Alhambra city
Los Angeles city
Montebello city
Monterey Park city
Pasadena city
Rosemead city
San Gabriel city
San Marino city
South El Monte city
South Pasadena city
Temple City city
East Los Angeles CDP
East Pasadena CDP
East San Gabriel CDP
San Pasqual CDP
South San Gabriel CDP
------------------------------------------------------------------------
                   Harris County (part), Texas, places
------------------------------------------------------------------------
Bunker Hill Village city
Hedwig Village city
Hilshire Village city
Houston city
Hunters Creek Village city
Jersey Village city
Piney Point Village city
Spring Valley Village city
------------------------------------------------------------------------

    To increase Internet self response rates, the Census Bureau will 
improve contact and notification strategies that were studied in prior 
testing. The core of our contact strategy is an Internet-push strategy, 
which was previously tested in the 2012 National Census Test, 2014 
Census Test and the 2015 Optimizing Self Response and Census Tests and 
is now being further refined. We also introduced a supplemental contact 
strategy in the 2015 National Content Test, the Internet Choice panel, 
which we will continue to study in the 2016 Census Test. In the 2016 
Census Test, improvements to this approach will be tested by modifying 
the content of our messages, including materials in the mailing 
packages.
    We also will continue our efforts to make it easier for respondents 
by allowing them to respond without providing a pre-assigned 
identification (ID) number associated with their address. This response 
option, referred to as ``Non-ID,'' was successfully implemented on the 
Internet in the 2014 and 2015 Census Tests. In this test, we will 
continue to develop the infrastructure to deploy real-time processing 
of Non-ID responses. Specifically, we will implement automated 
processing of Non-ID responses in a cloud-based environment instead of 
using Census Bureau hardware. This work will help us prepare for 
conducting Non-ID Processing at the scale we anticipate for 2020. In 
addition, we will be conducting a manual matching and geocoding 
operation for Non-ID responses that could not be matched to a record in 
the Census address list, or assigned to a census block during automated 
processing. Some of this processing will require Census staff to call 
respondents to obtain further information, such as missing address 
items that could help us obtain a match to a record in the Census 
address list. In some cases, we may also ask for the respondent's 
assistance in accurately locating their living quarters on a map so 
that we can associate the response to the correct census block, which 
is required for data tabulation.
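The Non-ID triage just described (automated matching and geocoding against the census address list, with manual followup when no match is found) can be sketched roughly as below. The record layout, function names, and match logic are hypothetical; this notice does not describe the Census Bureau's actual matching systems.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class NonIDResponse:
    address: str                        # respondent-provided address
    census_block: Optional[str] = None  # filled in when geocoding succeeds

def triage(resp: NonIDResponse, address_list: Dict[str, str]) -> str:
    """Route a Non-ID response: automated match first, manual followup second.
    Hypothetical sketch; here a match simply keys the exact address string."""
    block = address_list.get(resp.address)
    if block is not None:
        resp.census_block = block  # matched to the address list and geocoded
        return "matched"
    # No automated match: refer to manual matching/geocoding; staff may call
    # the respondent for missing address items or map-based location help.
    return "manual-followup"
```

A real matching operation would of course tolerate formatting differences between the respondent's address and the address list; exact-string lookup is used here only to keep the sketch short.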
    The 2016 Census Test will comprise four phases: Self Response, 
NRFU (with a reinterview component), Coverage Reinterview, and focus 
groups.

Self Response

    We will implement an ``Internet Push'' contact strategy, which 
involves first sending a letter inviting people to respond via the 
Internet; then sending up to two postcard reminders to non-responding 
addresses; and ultimately sending a paper questionnaire to addresses 
that still have not responded. The Census Bureau will directly contact 
up to 250,000 addresses in each site to request self response via one 
of the available response modes (Internet, telephone, paper). Materials 
included in the mailing explain the test and provide information on how 
to respond. The impact of message content on self response will be 
tested by varying the content of the mailing packages in the ``Internet 
Push'' for different panels. Specifically, we will test language that 
addresses how participation in the Census benefits respondents' 
communities and cites the mandatory nature of the census. Mail panels 
targeting limited English proficiency (LEP) households will include a 
language insert as part of the contact strategy. LEP households 
represent a subsample of housing units in each test location. We also 
plan to include the Census Internet Uniform Resource Locator (URL) on 
envelopes with messaging in multiple languages for a panel of housing 
units. This is intended to serve as a prompt for LEP respondents to 
access the Census URL without needing to read a letter written in a 
language in which they are not fluent. An ``Internet Choice'' panel 
will also be tested, which involves first sending a questionnaire with 
a letter inviting people to respond via the Internet or by using the 
questionnaire; then sending up to two postcard reminders to non-
responding addresses; and ultimately sending a second paper 
questionnaire to addresses that still have not responded. The design of 
the mail panels is fully described in Supporting Statement B.
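The "Internet Push" and "Internet Choice" contact sequences described above can be summarized as an ordered list of mailings that stops once an address responds. The sketch below is illustrative only; the mailing labels are invented, and the actual panel designs are in Supporting Statement B.

```python
from typing import List, Optional

# Invented labels for the Internet Push mailing sequence described above:
# invitation letter, up to two reminder postcards, then a paper questionnaire.
PUSH_SEQUENCE = ["invitation-letter", "reminder-postcard-1",
                 "reminder-postcard-2", "paper-questionnaire"]

def mailings_received(responded_after: Optional[int]) -> List[str]:
    """Mailings an address receives: all four if it never responds,
    otherwise only those sent before it responded."""
    if responded_after is None:
        return list(PUSH_SEQUENCE)      # non-responding address gets all four
    return PUSH_SEQUENCE[:responded_after]
```

An address that responds after the first letter would receive only that letter; one that never responds receives all four mailings and then falls into the NRFU workload.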
    In addition to supporting Non-ID self response and conducting 
manual processing of Non-ID returns when required, we will take steps 
to identify duplicate or potentially fraudulent Non-ID responses. For 
all Non-ID responses, we will compare response data to information 
contained in commercial lists and Federal administrative records 
maintained within the Census Bureau. This will help validate 
respondent-provided data as well as examine the gaps in coverage we 
might have in currently available administrative records datasets. 
Last, in order to confirm the results from the records linkage, we will 
conduct a Response Validation operation to recollect the response data 
for an estimated sample of 5,000 of the Non-ID returns. This will 
likely be performed as a combination of telephone interviews and in-
person visits, but the proportions of each of these are still to be 
determined.
    Telephone questionnaire assistance will be available to all 
respondents. In addition, on-line respondents will be provided with 
pre-defined ``Help'' screens or ``Frequently Asked Questions'' 
accessible through the Internet instrument. People who prefer not to 
respond via a paper form or on the Internet can also call the telephone 
questionnaire assistance number and speak to an agent to complete the 
questionnaire for their household.

Content Test Objectives in Self Response and Nonresponse Followup Data 
Collection

    The 2016 Census Test questionnaire will include questions on 
housing tenure, household roster, age, sex/gender, date of birth, race 
and Hispanic origin, and relationship. Based on results from the 2010 
Race and Hispanic Origin Alternative Questionnaire Experiment (Compton, 
et al. 2012 \1\), the 2016 Census Test will include a

[[Page 69191]]

combined race and Hispanic origin question intended to build on what is 
being tested in the 2015 National Content Test. This combined question 
provides examples and write-in areas for each major response category, 
including a response category for Middle Eastern and North African 
ethnicities. With this combined question format no separate ``Hispanic 
origin'' question is used. Rather, Hispanic ethnicity or origin is 
measured within the single item. Respondents are asked to self-identify 
by selecting one or more checkboxes, and to write in a specific origin 
for each checkbox selected. The 2016 Census Test allows us to test 
responses to these questions in geographic areas with different race 
and Hispanic Origin concentrations from the prior test areas.
---------------------------------------------------------------------------

    \1\ Compton, E., Bentley, M., Ennis, S., Rastogi, S. (2012), 
``2010 Census Race and Hispanic Origin Alternative Questionnaire 
Experiment,'' DSSD 2010 CPEX Memorandum Series #B-05-R2, U.S. Census 
Bureau.
---------------------------------------------------------------------------

    The inclusion of the combined question will also allow the Census 
Bureau to conduct imputation research using this combined format in a 
setting when there are self responses, administrative records and NRFU 
enumerator responses. This will allow the Census Bureau to understand 
imputation approaches needed for a combined question.
    We also plan to test variation in terminology by comparing ``Am.'' 
with ``American'' in the response category ``Black or African Am.'' on 
the Internet instrument. This research is being undertaken to assess 
the impact of different wording for the racial category that collects 
and tabulates data for the African American, African, and Afro-
Caribbean populations. This test will provide insights to how 
respondents identify with the race category, depending on the wording 
used to describe the category itself (``Black or African Am.'' vs. 
``Black or African American'').
    For the relationship question, we plan to include variations in 
question wording associated with ``non-relatives.'' We will compare 
responses to a relationship question with, and without, the response 
categories ``roomer or boarder'' and ``housemate or roommate.'' 
Cognitive testing has repeatedly shown that respondents do not know 
what the Census Bureau sees as the differences between these 
categories.
    The 2016 Census Test will continue to include the response 
categories recommended by the OMB Interagency Working Group (see 
section 11 of this document--Justification for Sensitive Questions) for 
opposite-sex and same-sex husband/wife/spouse households, and for the 
category for unmarried partner.
    The 2016 Census Test will include a question on the Internet 
instrument that will allow respondents to report that a housing unit 
they own is vacant as of Census Day, and to provide the reason for the 
vacancy status (e.g., a seasonal or rental unit). Collecting these data 
from respondents may allow the Census Bureau to identify some vacant 
housing units during self response so they can be removed from NRFU 
operations.
    The Census Bureau's research on how best to present and explain the 
residence rule (who to count) in specific situations will continue. The 
Internet data collection instrument will include various ways to ask 
about and confirm the number of persons residing at an address. 
Respondents will see one of three screens about the enumeration of 
people in their household: one that displays the Census Bureau's basic 
residence rule, and then asks for the number of people in the household 
based on that rule; one that asks for the number of people who live in 
the household but provides our residence rule definition in the help 
text; and one that asks if any other people live at the household, with 
the residence rule in the help text. After the names of the roster 
members are collected, the respondent will then see one of three series 
of undercount detection questions: one series asks for additional 
people on two separate screens, another asks for additional people on 
only one screen, and a third contains no undercount questions at all. 
After the 
demographic items are collected, the respondent will then see overcount 
detection questions or, if the case had not received undercount 
questions, no overcount detection questions.
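One way to read the branching just described is as a three-by-three experimental design, with overcount questions conditioned on whether undercount questions were asked. The sketch below uses invented panel labels; the notice itself assigns no identifiers.

```python
import itertools

# Invented labels for the three residence-rule screens and the three
# undercount-question series described above.
RESIDENCE_SCREENS = ("rule-on-screen", "rule-in-help-count",
                     "rule-in-help-anyone-else")
UNDERCOUNT_SERIES = ("two-screens", "one-screen", "none")

def sees_overcount(undercount: str) -> bool:
    # Per the notice, a case that received no undercount questions
    # receives no overcount detection questions either.
    return undercount != "none"

# All nine residence-rule / undercount combinations, with the implied
# overcount treatment attached to each.
panels = [(r, u, sees_overcount(u))
          for r, u in itertools.product(RESIDENCE_SCREENS, UNDERCOUNT_SERIES)]
```

Whether every case that received undercount questions also receives overcount questions, or only some, is not fully specified in the notice; the function above reflects the stricter reading.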
    The materials mailed to the respondents will inform them that the 
survey is mandatory in accordance with title 13, United States Code, 
sections 141 and 193. This information also will be available via a 
hyperlink from within the Internet instrument.

Nonresponse Followup (NRFU) Operation Testing

    The 2016 Census Test will determine our 2020 Census methods for 
conducting NRFU operations that will increase efficiency and reduce 
costs. Based on previous tests, the Census Bureau will refine its 
contact strategies and methods for field data collection, case 
assignment management, and field staff administrative functions. This 
will include further testing of how administrative records can be used 
to reduce the NRFU workload.
    As part of the 2016 Census Test, we will collect housing unit 
status and enumerate the occupants of households that do not respond to 
the self response phase of the census using automated enumeration 
software on standard (iOS and Android operating system) smartphone 
devices. The test will enable our continued study of options for 
alternatives to using government furnished equipment. This includes 
options for an enumerator to use their own smartphone for enumeration, 
often known as ``Bring Your Own Device (BYOD)'', and options to use a 
`Device as a Service' contract, where the Census Bureau will not own 
the smartphone devices outright, but instead will pay a vendor for 
their use, including any initialization and setup processes required. 
This has the potential to mitigate operational risks, such as 
unpredictable increases in costs associated with device initialization 
and hardware support. We will also continue to operationally test the 
field data collection application we use on these devices. The devices 
will use a modified version of the software used in the 2015 Census 
Test, with updated capabilities for handling special non-interview 
cases (such as demolished homes and non-existent addresses), better 
handling of addresses with multiple units (like apartment buildings), a 
clearer path for enumerators to take when attempting to collect data 
from a householder's neighbor or another knowledgeable source, new 
screens related to detecting potential ``overcount'' in a household 
(scenarios where current household residents also lived at another 
location, like student housing), and numerous other minor incremental 
user interface and performance updates.
    The Census Bureau also plans to test a newly redesigned portion of 
our quality assurance activities--the NRFU Reinterview program (NRFU-
RI). We plan to test:
    • New methodologies for selecting cases to be reinterviewed, 
including the potential use of operational control system data 
(paradata) and administrative records to detect potential falsification 
by enumerators;
    • Using our automated field data collection instrument for 
conducting these reinterviews;
    • Using our recently re-designed operational control system to 
optimize the routing and assignment of reinterview cases; and
    • Using the same field staff to conduct both NRFU interviews and 
associated reinterviews, with an explicit rule within the instrument 
that an enumerator is not allowed to reinterview their own work.
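The last point above amounts to a single assignment constraint: any field staff member may be assigned a reinterview except the enumerator who completed the original interview. A minimal sketch, with invented identifiers:

```python
from typing import List

def eligible_reinterviewers(case_enumerator: str, staff: List[str]) -> List[str]:
    """Staff who may be assigned a reinterview of this case: everyone
    except the enumerator who completed the original NRFU interview.
    Identifiers are hypothetical; the notice names no ID scheme."""
    return [s for s in staff if s != case_enumerator]
```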
    All of these changes have the potential to lead to a more cost-
effective,

[[Page 69192]]

streamlined, and higher quality NRFU operation for the 2020 Census. We 
will continue to test our newly re-engineered field infrastructure, 
allowing us to refine our requirements for staffing ratios and position 
duties for 2020 Census operations. We will also continue to test our 
enhanced operational control system, using lessons learned from the 
2015 Census Test to make further improvements to how assignments are 
made and routed. We will continue to test improvements to our use of 
systematic alerts that will quickly notify field supervisors of 
potential problem enumerators, detect possible falsification, and 
improve both quality and efficiency for the NRFU operation.
    Additionally, we will continue to test our implementation of an 
`adaptive design' contact strategy: using a varied number of personal 
visit attempts by geographic area, based on criteria associated with 
people who are harder to count. We also will study the optimal point at 
which to discontinue attempts to collect information from each non-
responding household and instead move to collecting information from a 
householder's neighbor or another knowledgeable source.
    Finally, we will build upon work from the 2013, 2014, and 2015 
Census Tests in a continued attempt to refine and evaluate our use of 
administrative records (including government and third-party data 
sources) to reduce the NRFU workload. Cases will be removed from the 
NRFU operation based on our administrative records modeling as follows:
    • Any case that is given a status of vacant from our 
administrative records modeling will be immediately removed from the 
NRFU workload; and
    • Any case that is given a status of occupied from our 
administrative records modeling will be removed from the NRFU workload 
after one unsuccessful attempt at field enumeration is made (as long as 
good administrative records exist for that case).
    Unlike previous tests, for all cases removed from the NRFU workload 
in this way, we will test mailing these addresses a supplemental letter 
to prompt a self response. If these cases do not self-respond, we will 
enumerate the unit based on the results of our administrative records 
modeling.
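Taken together, the two removal rules and the supplemental-letter fallback above describe a simple case disposition. The sketch below is illustrative only; the status values and field names are invented, not the Bureau's.

```python
def nrfu_disposition(model_status: str, unsuccessful_attempts: int,
                     good_records: bool) -> str:
    """Disposition of a NRFU case under the administrative-records
    rules sketched above (hypothetical labels)."""
    if model_status == "vacant":
        # Modeled-vacant cases leave the workload immediately.
        return "remove-immediately"
    if (model_status == "occupied" and unsuccessful_attempts >= 1
            and good_records):
        # Modeled-occupied cases leave after one unsuccessful field
        # attempt, provided good administrative records exist; a
        # supplemental letter then prompts self response before any
        # records-based enumeration.
        return "remove-after-one-attempt"
    return "keep-in-workload"
```

Removed cases are not simply dropped: per the paragraph above, they are mailed a supplemental letter and, absent a self response, enumerated from the administrative records model.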
    For a sample of the cases that would be removed via these criteria, 
we will continue to perform the field followup activities. This will 
allow us to compare the outcomes of those that get a completed 
interview with our modeled status of the household, and determine the 
quality of our administrative record modeling.
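
    One way to read this quality check: for the held-out sample, the 
completed field-interview status can be compared against the modeled 
status to compute an agreement rate. This is a hypothetical sketch; 
the data layout is an assumption for illustration:

```python
def records_agreement_rate(cases):
    """Share of held-out NRFU cases whose completed field-interview
    status matches the administrative-records modeled status.
    `cases` is a list of (field_status, modeled_status) pairs;
    field_status is None when no interview was completed."""
    completed = [(f, m) for f, m in cases if f is not None]
    if not completed:
        return 0.0
    matches = sum(1 for f, m in completed if f == m)
    return matches / len(completed)
```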

Coverage Reinterview

    As described previously, the 2016 Census Test Internet instrument 
contains embedded coverage experiments, and a reinterview is needed to 
quantify the effects of each particular version on the roster provided 
by the Internet respondent. The quality of the final household roster 
created from the panels with experimentally applied questions will be 
evaluated by a coverage reinterview conducted by telephone; these 
panels are used to evaluate the residence rule approaches taken in the 
different questionnaires. The reinterview 
will contain extensive questions about potentially missed roster 
members and other places that any household members sometimes stay. 
Specifically, the reinterview will re-contact responders to determine 
if any people may have been left off the roster or erroneously included 
on the roster during the initial response. If there are indications 
during the reinterview that some people may have been left off the 
roster, then we will ask for demographic information about the missed 
people. If there are indications during the reinterview that some 
people may have been erroneously included, then we will ask for 
information about stay durations in order to resolve residency 
situations. The reinterview will be a Computer Assisted Telephone 
Interviewing (CATI) operation conducted in the Census Bureau's call 
centers.
    In addition to contacting Internet responders, a small portion of 
people who responded by paper or as a part of NRFU will be selected for 
the Coverage Reinterview. The inclusion of such cases will allow us to 
quantify the quality of household rosters collected in these two other 
modes.

Focus Groups

    Following the end of data collection, the Census Bureau will 
conduct focus groups with 2016 Census Test participants to ask about 
their experience. Topics will include their opinions on the use of 
administrative records by the Census Bureau. Participants also will be 
asked about their general concerns with government data collection and 
the government's ability to protect confidential data. The specific 
information collection materials for those activities will be submitted 
separately as non-substantive changes.
    Testing in 2016 is necessary to build on the findings from prior 
testing and to establish recommendations for contact strategies, 
response options, and field operation efficiencies that can be further 
refined and deployed again in subsequent operational and system 
development activities. At this point in the decade, the Census Bureau 
needs to solidify evidence showing whether the strategies being tested 
can reduce the cost per housing unit during a decennial census, while 
still maintaining the quality and accuracy of the census data. The 
results of the 2016 Census Test from both sites will inform decisions 
that the Census Bureau will make about refining the detailed 
operational plan for the 2020 Census and will help guide the evaluation 
of additional 2020 Census test results later this decade.
    Along with other results related to content, the response rates to 
paper and Internet collection will be used to help inform 2020 Census 
program planning and cost estimates. Several versions of some of the 
demographic questions and versions of coverage questions are included 
in this test in order to further determine the best questions and 
procedures for collecting data from hard-to-count populations and to 
achieve optimal within-household person coverage within the decennial 
census.
    Testing enhancements to Non-ID processing will inform final 
planning for the 2020 Census design, as well as the infrastructure 
required to support large scale, real-time processing of electronic 
Non-ID response data submitted via the Internet. Building upon previous 
Census Tests, the NRFU portion of the 2016 Census Test will inform the 
following important decisions for conducting the 2020 Census:
    • We will continue to research the cost and quality impact 
of reducing the NRFU caseload through the use of administrative records 
information, to inform our final strategy for the use of administrative 
records. This test will also allow us to further define our core set of 
administrative records that will be used for the 2020 Census, and our 
strategies for acquiring and using those records. This research will 
help us achieve our goal of a more cost-effective 2020 Census, while 
maintaining quality of the results.
    • We will continue to research the cost and quality impacts 
of new NRFU contact strategies that make use of adaptive design and a 
re-engineered management structure employing automated payroll, 
automated training, and minimal face-to-face contact between 
enumerators and supervisors. Enumerators are asked to provide work-time 
availability in advance, and the system then will assign the optimal 
number of cases to attempt each day, as well as the optimal route to 
follow that
day. Again, this operational research will help us towards our goal of 
a more cost-effective 2020 Census, while maintaining quality of the 
results.
    • We will be able to determine at what rate field staff are 
willing to use their own personally owned devices to conduct Census 
enumeration, and continue to develop our technical processes to enable 
this to be done in a secure and cost-effective manner. We will also be 
able to make quality and cost determinations about a `Device as a 
Service' option, and be able to develop more mature cost models to 
inform our decisions related to the device provision strategies for the 
2020 Census NRFU operation.
    • We will be able to determine the cost and quality impacts 
of our newly re-engineered NRFU Reinterview quality assurance program. 
These data will inform our decision on an integrated and re-designed 
approach to quality assurance for the 2020 Census.

    Affected Public: Individuals or Households.
    Frequency: One time.
    Respondent's Obligation: Mandatory.
    Legal Authority: Title 13, United States Code, sections 141 and 
193.
    This information collection request may be viewed at 
www.reginfo.gov. Follow the instructions to view Department of Commerce 
collections currently under review by OMB.
    Written comments and recommendations for the proposed information 
collection should be sent within 30 days of publication of this notice 
to OIRA_Submission@omb.eop.gov or fax to (202) 395-5806.

    Dated: November 4, 2015.
Glenna Mickelson,
Management Analyst, Office of the Chief Information Officer.
[FR Doc. 2015-28416 Filed 11-6-15; 8:45 am]
BILLING CODE 3510-07-P