
[Federal Register Volume 86, Number 100 (Wednesday, May 26, 2021)]
[Notices]
[Pages 28357-28359]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 2021-11147]


-----------------------------------------------------------------------

DEPARTMENT OF HEALTH AND HUMAN SERVICES

Centers for Disease Control and Prevention

[60Day-21-0222; Docket No. CDC-2021-0051]


Proposed Data Collection Submitted for Public Comment and 
Recommendations

AGENCY: Centers for Disease Control and Prevention (CDC), Department of 
Health and Human Services (HHS).

ACTION: Notice with comment period.

-----------------------------------------------------------------------

SUMMARY: The Centers for Disease Control and Prevention (CDC), as part 
of its continuing effort to reduce public burden and maximize the 
utility of government information, invites the general public and other 
Federal agencies to comment on a proposed and/or continuing information 
collection, as required by the Paperwork Reduction Act of 1995. This 
notice invites comment on a proposed information collection project 
titled the Collaborating Center for Questionnaire Design and Evaluation 
Research (CCQDER). This generic clearance request seeks approval for a 
collection of information that encompasses general questionnaire 
development, pre-testing, and measurement-error reduction activities to 
be carried out in 2021-2024.

DATES: CDC must receive written comments on or before July 26, 2021.

ADDRESSES: You may submit comments, identified by Docket No. CDC-2021-
0051, by any of the following methods:
     • Federal eRulemaking Portal: Regulations.gov. Follow the 
instructions for submitting comments.
     • Mail: Jeffrey M. Zirger, Information Collection Review 
Office, Centers for Disease Control and Prevention, 1600 Clifton Road 
NE, MS-D74, Atlanta, Georgia 30329.
    Instructions: All submissions received must include the agency name 
and Docket Number. CDC will post, without change, all relevant comments 
to Regulations.gov.
    Please note: Submit all comments through the Federal eRulemaking 
portal (regulations.gov) or by U.S. mail to the address listed above.

FOR FURTHER INFORMATION CONTACT: To request more information on the 
proposed project or to obtain a copy of the information collection plan 
and instruments, contact Jeffrey M. Zirger, Information Collection 
Review Office, Centers for Disease Control and Prevention, 1600 Clifton 
Road NE, MS-D74, Atlanta, Georgia 30329; phone: 404-639-7118; Email: 
omb@cdc.gov.

SUPPLEMENTARY INFORMATION: Under the Paperwork Reduction Act of 1995 
(PRA) (44 U.S.C. 3501-3520), Federal agencies must obtain approval from 
the Office of Management and Budget (OMB) for each collection of 
information they conduct or sponsor. The PRA also requires Federal 
agencies to provide a 60-day notice in the Federal Register concerning 
each proposed collection of information, including each new proposed 
collection, each proposed extension of an existing collection, and each 
reinstatement of a previously approved collection, before submitting 
the collection to OMB for approval. To 
comply with this requirement, we are publishing this notice of a 
proposed data collection as described below.
    The OMB is particularly interested in comments that will help:
    1. Evaluate whether the proposed collection of information is 
necessary for the proper performance of the functions of the agency, 
including whether the information will have practical utility;
    2. Evaluate the accuracy of the agency's estimate of the burden of 
the proposed collection of information, including the validity of the 
methodology and assumptions used;
    3. Enhance the quality, utility, and clarity of the information to 
be collected;
    4. Minimize the burden of the collection of information on those 
who are to respond, including through the use of appropriate automated, 
electronic, mechanical, or other technological collection techniques or 
other forms of information technology, e.g., permitting electronic 
submissions of responses; and
    5. Assess information collection costs.

Proposed Project

    The Collaborating Center for Questionnaire Design and Evaluation 
Research (CCQDER) (OMB Control No. 0920-0222, Exp. 08/31/2021)--
Revision--National Center for Health Statistics (NCHS), Centers for 
Disease Control and Prevention (CDC).

Background and Brief Description

    Section 306 of the Public Health Service (PHS) Act (42 U.S.C. 
242k), as amended, authorizes the Secretary of Health and Human 
Services (DHHS), acting through NCHS, to undertake and support (by 
grant or contract) research, demonstrations, and evaluations respecting 
new or improved methods for obtaining current data to support 
statistical and epidemiological activities for the purpose of improving 
the effectiveness, efficiency, and quality of health services in the 
United States.
    The Collaborating Center for Questionnaire Design and Evaluation 
Research (CCQDER) is the focal point within NCHS for questionnaire and 
survey development, pre-testing, and evaluation activities for CDC 
surveys such as the National Survey of Family Growth (NSFG) (OMB 
Control No. 0920-0314), the Research and Development Survey (RANDS), 
including RANDS COVID (OMB Control No. 0920-1298), and other federally 
sponsored surveys. The CCQDER is requesting three additional years of 
OMB clearance for this generic submission.
    The CCQDER and other NCHS programs conduct cognitive interviews, 
focus groups, in-depth or ethnographic interviews, usability tests, 
field tests/pilot interviews, and experimental research in laboratory 
and field settings, both for applied questionnaire development and 
evaluation and for

[[Page 28358]]

more basic research on measurement errors and survey response.
    Various techniques are used to evaluate interviewer-administered, 
self-administered, telephone, Computer Assisted Personal Interviewing 
(CAPI), Computer Assisted Self-Interviewing (CASI), Audio Computer-
Assisted Self-Interviewing (ACASI), and web-based questionnaires.
    The most common questionnaire evaluation method is the cognitive 
interview. These evaluations are conducted by the CCQDER. The interview 
structure consists of respondents first answering a draft survey 
question and then providing textual information to reveal the processes 
involved in answering the test question. Specifically, cognitive 
interview respondents are asked to describe how and why they answered 
the question as they did. Through the interviewing process, various 
types of question-response problems that would not normally be 
identified in a traditional survey interview, such as interpretive 
errors and recall inaccuracies, are uncovered. By conducting a comparative 
analysis of cognitive interviews, it is also possible to determine 
whether particular interpretive patterns occur within particular sub-
groups of the population. Interviews are generally conducted in small 
rounds totaling 40-100 interviews; ideally, the questionnaire is re-
worked between rounds, and revisions are tested iteratively until 
interviews yield relatively few new insights.
    Cognitive interviewing is inexpensive and provides useful data on 
questionnaire performance while minimizing respondent burden. Cognitive 
interviewing offers a detailed depiction of meanings and processes used 
by respondents to answer questions--processes that ultimately produce 
the survey data. As such, the method offers an insight that can 
transform understanding of question validity and response error. 
Documented findings from these studies represent tangible evidence of 
how the question performs. Such documentation also serves CDC data 
users, allowing them to approach and apply the data critically.
    In addition to cognitive interviewing, a number of other 
qualitative and quantitative methods are used to investigate 
measurement errors and the survey response process. These 
methods include conducting focus groups, usability tests, in-depth or 
ethnographic interviews, and the administration and analysis of 
questions in both representative and non-representative field tests. 
Focus groups are group interviews whose primary purpose is to elicit 
the basic sociocultural understandings and terminology that form the 
basis of questionnaire design. Each group typically consists of one 
moderator and four to ten participants, depending on the research 
question. In-depth or ethnographic interviews are one-on-one interviews 
designed to elicit the understandings or terminology that are necessary 
for question design, as well as to gather detailed information that can 
contribute to the analysis of both qualitative and quantitative data. 
Usability tests are typically one-on-one interviews that are used to 
determine how a given survey or information collection tool functions 
in the field, and how the mode and layout of the instrument itself may 
contribute to survey response error and the survey response process.
    In addition to these qualitative methods, NCHS also uses various 
tools to obtain quantitative data, which can be analyzed alone or 
alongside qualitative data to give a much fuller accounting of 
the survey response process. For instance, phone, internet, mail, and 
in-person follow-up interviews of previous NCHS survey respondents may 
be used to test the validity of survey questions and questionnaires, 
and to obtain more detailed information that cannot be gathered on the 
original survey. Additionally, field or pilot tests may be conducted on 
both representative and non-representative samples, including those 
obtained from commercial survey and web panel vendors. Beyond looking 
at traditional measures of survey error (such as item missingness, 
nonresponse, and don't-know rates), these pilot tests can be used 
to run experimental designs in order to capture how different questions 
function in a field setting. Similar methodology has been adopted by 
other federal agencies, as well as by academic and commercial survey 
organizations.
    In 2021-2024, NCHS/CCQDER staff plan to continue research on 
methods evaluation and general questionnaire design. We 
envision that over the next three years, NCHS/CCQDER will work 
collaboratively with survey researchers from universities and other 
Federal agencies to define and examine several research areas, 
including, but not limited to: (1) Differences between face-to-face, 
telephone, and virtual/video-over internet cognitive interviewing, (2) 
effectiveness of different approaches to cognitive interviewing, such 
as concurrent and retrospective probing, (3) reactions of both survey 
respondents and survey interviewers to the use of Computer Assisted 
Personal Interviewing (CAPI), Audio Computer-Assisted Self-Interviewing 
(ACASI), and video-over-internet/virtual interviewing, (4) social, 
cultural, and 
linguistic factors in the question response process, and (5) 
recruitment and respondent participation at varying levels of 
incentive, in an effort to establish empirical evidence regarding 
remuneration and coercion. Procedures for each of these studies will be 
similar to those applied in the usual testing of survey questions. For 
example, questionnaires that are of current interest (such as RANDS and 
NIOSH) may be evaluated using several of the techniques described 
above. Alternatively, different versions of a survey question may be 
developed, and the variants then administered to separate groups of 
respondents in 
order to study the cognitive processes that account for the differences 
in responses obtained across different versions.
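    The split-sample comparisons described above can be illustrated 
with a brief sketch (a hypothetical illustration only; the respondent 
IDs and variant labels below are not drawn from any NCHS instrument): 
respondents are randomly assigned to one question version, so that 
differences in the responses obtained can be attributed to the wording 
rather than to the composition of the groups.

# Illustrative sketch only: random assignment of recruited respondents
# to question-wording variants for a split-sample experiment. Respondent
# IDs and variant labels are hypothetical.
import random

def assign_variants(respondent_ids, variants, seed=2021):
    """Randomly assign each respondent to one question variant,
    keeping group sizes as even as possible."""
    rng = random.Random(seed)
    shuffled = list(respondent_ids)
    rng.shuffle(shuffled)
    # Cycle through the variants so each group receives a near-equal share.
    return {rid: variants[i % len(variants)] for i, rid in enumerate(shuffled)}

# Example: twelve hypothetical respondents split across two wordings.
groups = assign_variants([f"R{n:03d}" for n in range(1, 13)],
                         ["version_A", "version_B"])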
    These studies will be conducted by CCQDER staff, DHHS staff, 
or NCHS contractors who are trained in cognitive interviewing 
techniques. The results of these studies will be applied to our 
specific questionnaire development activities in order to improve the 
methods that we use to conduct questionnaire testing, and to guide 
questionnaire design in general.
    We are requesting 9,455 annualized burden hours, totaling 28,365 
hours over three years. This is an increase of 1,672 hours per year, or 
5,016 hours 
over three years. The difference is due to an anticipated increase in 
the number and size of projects being undertaken. There is no cost to 
respondents other than their time to participate.

                                        Estimated Annualized Burden Table
----------------------------------------------------------------------------------------------------------------
                                                                     Number of    Average burden
     Types of respondents           Form name        Number of     responses per   per response    Total burden
                                                    respondents     respondent      (in hours)         hours
----------------------------------------------------------------------------------------------------------------
Individuals or households.....  Eligibility                4,400               1            5/60             367
                                 Screeners.

[[Page 28359]]

 
Individuals or households.....  Developmental              8,750               1           55/60           8,021
                                 Questionnaires.
Individuals or households.....  Respondent Data            8,750               1            5/60             729
                                 Collection
                                 Sheet.
Individuals or households.....  Focus Group                  225               1           90/60             338
                                 Documents.
                                                 ---------------------------------------------------------------
    Total.....................  ................  ..............  ..............  ..............           9,455
----------------------------------------------------------------------------------------------------------------
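    The annualized totals in the table above follow the standard burden 
arithmetic: number of respondents, multiplied by responses per 
respondent, multiplied by average hours per response, rounded to the 
nearest whole hour and summed across rows. A minimal check of those 
figures (the per-row rounding convention is an assumption):

# Reproduces the annualized burden figures in the table above, assuming
# each row is rounded to the nearest whole hour before summing.
from fractions import Fraction

rows = [
    # (form name, respondents, responses per respondent, hours per response)
    ("Eligibility Screeners",            4_400, 1, Fraction(5, 60)),
    ("Developmental Questionnaires",     8_750, 1, Fraction(55, 60)),
    ("Respondent Data Collection Sheet", 8_750, 1, Fraction(5, 60)),
    ("Focus Group Documents",              225, 1, Fraction(90, 60)),
]

total = 0
for name, respondents, responses, hours in rows:
    burden = round(respondents * responses * hours)  # per-row annual burden
    total += burden
    print(f"{name}: {burden} hours")

print(f"Annualized total: {total} hours")      # 9,455
print(f"Three-year total: {3 * total} hours")  # 28,365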


Jeffrey M. Zirger,
Lead, Information Collection Review Office, Office of Scientific 
Integrity, Office of Science, Centers for Disease Control and 
Prevention.
[FR Doc. 2021-11147 Filed 5-25-21; 8:45 am]
BILLING CODE 4163-18-P

