
[Federal Register Volume 81, Number 191 (Monday, October 3, 2016)]
[Notices]
[Pages 67957-67959]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 2016-23821]


=======================================================================
-----------------------------------------------------------------------

DEPARTMENT OF COMMERCE


Submission for OMB Review; Comment Request

    The Department of Commerce will submit to the Office of Management 
and Budget (OMB) for clearance the following proposal for collection of 
information under the provisions of the Paperwork Reduction Act (44 
U.S.C. chapter 35).
    Agency: U.S. Census Bureau.
    Title: American Community Survey (ACS) Methods Panel, Online 
Communications Improving Survey Response Campaign.
    OMB Control Number: 0607-0936.
    Form Number(s): ACS-1, ACS-1 (Spanish), ACS CATI, ACS CAPI, ACS 
Internet.
    Type of Request: Nonsubstantive Change Request.
    Number of Respondents: None.
    Average Hours per Response: None.
    Burden Hours: No additional burden hours are requested under this 
nonsubstantive change request.
    Needs and Uses: The American Community Survey collects detailed 
socioeconomic data from about 3.5 million households in the United 
States and 36,000 in Puerto Rico each year. The ACS also collects 
detailed socioeconomic data from about 195,000 residents living in 
Group Quarter (GQ) facilities. An ongoing data collection effort with 
an annual sample of this magnitude requires that the ACS continue 
research, testing, and evaluations aimed at improving data quality, 
achieving survey cost efficiencies, and improving ACS questionnaire 
content and related data collection materials. The ACS Methods

[[Page 67958]]

Panel is a research program that is designed to address and respond to 
issues and survey needs. In line with the Census Bureau's goal to 
increase survey response rates through communications, the Census 
Bureau seeks to launch a pilot of a targeted digital advertising 
campaign. During the 2000 and 2010 decennial enumerations, the Census 
Bureau saw an uptick in ACS response rates.\1\ A year-over-year 
increase of 6.4 percentage points was observed in the Savannah, GA 
media market during the 2015 Census Site Test.\2\
---------------------------------------------------------------------------

    \1\ Chesnut, J. & M. Davis. (2011). ``Evaluation of the ACS Mail 
Materials and Mailing Strategy during the 2010 Census.'' American 
Community Survey Research and Evaluation Program. U.S. Census 
Bureau.
    \2\ Walejko, G. et al. (2015). ``Modeling the Effect of Diverse 
Communication Strategies on Decennial Census Test Response Rates.'' 
Presentation. 2015 Federal Committee on Statistical Methodology 
Research Conference. December 2nd, 2015. Washington, DC.
---------------------------------------------------------------------------

    Outside of decennial years, traditional broad-based advertising 
methods are cost-prohibitive because of the relatively small sample 
size for most Census Bureau surveys compared to the general population. 
With the advent of digital advertising tactics, however, the Census 
Bureau now has the opportunity to cost-effectively deliver 
promotional messages to individual households within a survey sample. 
The ACS offers a large enough national sample to field a test of such 
tactics and determine whether they lift response rates. If digital 
advertisements encourage recipients to respond to a survey early in the 
process of data collection, including responding online, then the 
Census Bureau will save money on costly follow-up efforts to collect 
data from nonrespondents, including sending Census Bureau interviewers 
to respondents' households in person. Offsetting data-collection costs 
in this way would ultimately save taxpayers money. Findings from this 
pilot campaign will have applications across the range of the Census 
Bureau's collection efforts, as the advertisements will not be survey-
specific and will focus on the value of the Census Bureau's work in 
general.
    We propose to execute the pilot campaign using the 
January and February 2017 ACS production samples. We will deliver 
targeted digital advertisements to a panel of in-sample residents that 
can be linked by household address to digital profiles (including 
cookies and/or device ID) by a third-party data vendor. This technique 
is an emerging standard in online advertising, in line with the 
advertising households receive from companies and organizations every 
day. We will place video, display banners, and paid social media 
advertisements. Linked households will be served ads shortly before 
they receive a mailed survey questionnaire and during the ACS data 
collection process. Ads will not directly call on recipients to 
complete the ACS or any particular survey, nor will they mention any 
survey by name. Rather, they will be designed to create positive 
associations with the Census Bureau's work generally and make the case 
for the importance of completing a Census Bureau questionnaire if 
selected. When an advertisement is clicked, the user will be directed 
to a Census.gov web landing page featuring general information about 
the value of the Census Bureau's work and a link to the ``Are You in a 
Survey?'' page.\3\
---------------------------------------------------------------------------

    \3\ See https://www.census.gov/programs-surveys/are-you-in-a-survey.html.
---------------------------------------------------------------------------

    The purpose of this test is to study the impact of these changes on 
self-response behavior and assess any potential savings overall or with 
subgroups. The advertisements will include a mix of online video, 
banner display ads, and paid social media content on both desktop and 
mobile devices. They will be displayed around the web on various Web 
sites targeted to linked households in the treatment groups. Ad serving 
will be optimized based on audience reach and user engagement with the 
ads (measured in terms of video and click metrics). The optimal media 
mix will be applied evenly across both treatments. We will prioritize 
rich media placements including video and social video over standard 
placements such as banner display, with the goal of maximizing video 
advertising that tells a compelling story and raises awareness of the 
Census Bureau's work.
    This pilot will include two experimental treatments (a high-spend 
group and a low-spend group) as well as a control group. Households in 
the high-spend group will receive roughly twice the number of 
advertisement exposures as households in the low-spend treatment group, 
though the channel mix and content of the advertisements will remain 
the same between the two groups. The control group will not receive any 
advertisements.
    To field this test, we plan to use ACS production (clearance 
number: 0607-0810, expires 06/30/2018). Thus, there is no increase in 
burden from this test since the treatment will result in approximately 
the same burden estimate per interview (40 minutes). The ACS sample 
design consists of randomly assigning each monthly sample panel into 24 
groups of approximately 12,000 addresses each. Each group, called a 
methods panel group, within a monthly sample is representative of the 
full monthly sample. Each monthly sample is a representative subsample 
of the entire annual sample and is representative of the sampling 
frame.
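
    To make this sampling structure concrete, the random partition of a 
monthly sample into methods panel groups can be sketched as follows. This 
is an illustrative sketch only, not Census Bureau code; the function name 
and placeholder addresses are hypothetical, while the group count (24) and 
group size (about 12,000 addresses) follow the figures above.

import random

def assign_methods_panel_groups(addresses, n_groups=24, seed=2017):
    # Randomly shuffle one monthly ACS sample and deal the addresses into
    # n_groups methods panel groups of roughly equal size, so that each
    # group is representative of the full monthly sample.
    rng = random.Random(seed)
    shuffled = list(addresses)
    rng.shuffle(shuffled)
    return [shuffled[i::n_groups] for i in range(n_groups)]

# Synthetic monthly sample of 24 x 12,000 = 288,000 placeholder addresses.
monthly_sample = ["ADDR%06d" % i for i in range(24 * 12_000)]
groups = assign_methods_panel_groups(monthly_sample)
print(len(groups), len(groups[0]))   # 24 groups, 12,000 addresses each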
    The test will include two months of production sample (aiming for 
January and February 2017). We will choose eight randomly selected 
methods panel groups per month for each of the two experimental 
treatments; the remaining eight methods panel groups will be the 
control. Over the two production months, each treatment will use 16 
methods panel groups, or a mailout sample of roughly 192,000 
addresses, which will be used for linking to establish eligibility for 
micro-targeted digital advertising. We estimate that approximately 31 
percent of the mailable addresses will be eligible for digital 
advertising, which is approximately 30,000 addresses for each of the 
two experimental treatments per month.
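
    The address counts above follow from simple arithmetic on the stated 
group sizes and eligibility estimate; the quick check below (an 
illustrative sketch using the approximate figures in this notice) 
reproduces them.

groups_per_treatment_per_month = 8
group_size = 12_000        # approximate addresses per methods panel group
months = 2
eligibility_rate = 0.31    # estimated share of mailable addresses linkable
                           # to digital profiles

mailout_per_treatment = groups_per_treatment_per_month * group_size * months
print(mailout_per_treatment)        # 192,000 addresses over the two months

eligible_per_month = groups_per_treatment_per_month * group_size * eligibility_rate
print(round(eligible_per_month))    # 29,760, i.e., roughly 30,000 per
                                    # treatment per month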
    We will compare the Internet return rates at the cut date for the 
replacement mailing, the Internet, mail, and self-response return rates 
before the start of Computer Assisted Telephone Interviewing (CATI), 
and the Internet, mail, self-response, and CATI return rates prior to 
the start of Computer Assisted Personal Interviewing (CAPI). We will 
compare the self-response and CAPI return rates as well as the overall 
response rates when all data collection activities end. Additionally, 
the overall response rate will be calculated for all sample addresses. 
For each comparison, we will use α = 0.1 and a two-tailed test so 
that we can measure the impact on the evaluation measure in either 
direction with 80 percent power. Based on the previous year's data for 
the January and February panels, we calculated effective sample sizes. We 
assumed an Undeliverable as Addressed (UAA) rate of 18.0 percent (these 
addresses may be advertised to, but will be removed from self-response 
analysis because they do not have an opportunity to respond), a self-
response rate of 57.5 percent for all three groups, a CATI response 
rate of 25 percent, and a CAPI response rate of 85 percent. We expect 
to be able to detect a self-response difference of 0.8 percentage points 
between the high- and low-spend treatment panels, and a difference on 
the order of 0.8 percentage points between a treatment panel and the 
control. 
Additional metrics of interest include overall costs and response rates 
by subgroups.
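
    As a rough check on these figures, the sample size needed to detect a 
0.8 percentage-point difference under the stated assumptions can be 
computed from the standard two-proportion formula. The sketch below is 
illustrative only and is not the Census Bureau's effective-sample-size 
calculation, which also accounts for the UAA rate, CATI and CAPI 
workloads, and design factors.

from scipy.stats import norm

alpha, power = 0.10, 0.80
p1 = 0.575                # assumed self-response rate for all groups
delta = 0.008             # 0.8 percentage-point difference to detect
p2 = p1 + delta

z_alpha = norm.ppf(1 - alpha / 2)   # two-tailed critical value
z_beta = norm.ppf(power)

# Classic per-group sample size for comparing two independent proportions.
n_per_group = (z_alpha + z_beta) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / delta ** 2
print(round(n_per_group))   # roughly 47,000 addresses per comparison group

With roughly 192,000 mailout addresses per treatment over the two months 
(about 157,000 after removing the assumed 18.0 percent UAA rate), the 
panels comfortably exceed this back-of-the-envelope threshold.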
    Affected Public: Individuals or households.

[[Page 67959]]

    Frequency: One-time test as part of the monthly American Community 
Survey.
    Respondent's Obligation: Mandatory.
    Legal Authority: Title 13, United States Code, Sections 141, 193, 
and 221.
    This information collection request may be viewed at 
www.reginfo.gov. Follow the instructions to view Department of Commerce 
collections currently under review by OMB.
    Written comments and recommendations for the proposed information 
collection should be sent within 30 days of publication of this notice 
to OIRA_Submission@omb.eop.gov or fax to (202) 395-5806.

Sheleen Dumas,
PRA Departmental Lead, Office of the Chief Information Officer.
[FR Doc. 2016-23821 Filed 9-30-16; 8:45 am]
 BILLING CODE 3510-07-P