[Federal Register Volume 87, Number 237 (Monday, December 12, 2022)]
[Notices]
[Pages 76049-76051]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 2022-26887]


-----------------------------------------------------------------------

DEPARTMENT OF HEALTH AND HUMAN SERVICES

Centers for Disease Control and Prevention

[30Day-23-0222]


Agency Forms Undergoing Paperwork Reduction Act Review

    In accordance with the Paperwork Reduction Act of 1995, the Centers 
for Disease Control and Prevention (CDC) has submitted the information 
collection request titled ``Collaborating Center for Questionnaire 
Design and Evaluation for the National Center for Health Statistics'' 
to the Office of Management and Budget (OMB) for review and approval. 
CDC previously published a ``Proposed Data Collection Submitted for 
Public Comment and Recommendations'' notice on September 30, 2022, to 
obtain comments from the public and affected agencies. CDC did not 
receive comments related to the previous notice. This notice serves to 
allow an additional 30 days for public and affected agency comments.
    CDC will accept all comments for this proposed information 
collection project. The Office of Management and Budget is particularly 
interested in comments that:
    (a) Evaluate whether the proposed collection of information is 
necessary for the proper performance of the functions of the agency, 
including whether the information will have practical utility;
    (b) Evaluate the accuracy of the agency's estimate of the burden of 
the proposed collection of information, including the validity of the 
methodology and assumptions used;
    (c) Enhance the quality, utility, and clarity of the information to 
be collected;
    (d) Minimize the burden of the collection of information on those 
who are to respond, including, through the use of appropriate 
automated, electronic, mechanical, or other technological collection 
techniques or other forms of information technology, e.g., permitting 
electronic submission of responses; and
    (e) Assess information collection costs.
    To request additional information on the proposed project or to 
obtain a copy of the information collection plan and instruments, call 
(404) 639-7570. Comments and recommendations for the proposed 
information collection should be sent within 30 days of publication of 
this notice to www.reginfo.gov/public/do/PRAMain. Find this particular 
information collection by selecting ``Currently under 30-day Review--
Open for Public Comments'' or by using the search function. Direct 
written comments and/or suggestions regarding the items contained in 
this notice to the Attention: CDC Desk Officer, Office of Management 
and Budget, 725 17th Street NW, Washington, DC 20503 or by fax to (202) 
395-5806. Provide written comments within 30 days of notice 
publication.

Proposed Project

    The Collaborating Center for Questionnaire Design and Evaluation 
Research (CCQDER) (OMB Control No. 0920-0222, Exp. 09/30/2024)--
Revision--National Center for Health Statistics (NCHS), Centers for 
Disease Control and Prevention (CDC).

Background and Brief Description

    Section 306 of the Public Health Service (PHS) Act (42 U.S.C. 
242k), as amended, authorizes that the Secretary of Health and Human 
Services (DHHS), acting through NCHS, shall undertake and support (by 
grant or contract) research, demonstrations, and evaluations respecting 
new or improved methods for obtaining current data to support 
statistical and epidemiological activities for the purpose of improving 
the effectiveness, efficiency, and quality of health services in the 
United States.
    The Collaborating Center for Questionnaire Design and Evaluation 
Research (CCQDER) is the focal point within NCHS for questionnaire and 
survey development, pre-testing, and evaluation activities for CDC 
surveys such as the National Survey of Family Growth (NSFG), the 
Research and Development Survey (RANDS) (including RANDS COVID), and 
other federally sponsored surveys. The CCQDER is requesting three years 
of OMB Clearance for this Generic submission.
    The CCQDER and other NCHS programs conduct cognitive interviews, 
focus groups, in-depth or ethnographic interviews, usability tests, 
field tests/pilot interviews, and experimental research in laboratory 
and field settings, both for applied questionnaire development and 
evaluation as well as more basic research on measurement errors and 
survey response. Various techniques to evaluate interviewer 
administered, self-administered, telephone, Computer Assisted Personal 
Interviewing (CAPI), Computer Assisted Self-Interviewing (CASI), Audio 
Computer-Assisted Self-Interviewing (ACASI), and web-based 
questionnaires are used.
    The most common questionnaire evaluation method is the cognitive 
interview. These evaluations are conducted by the CCQDER. The interview 
structure consists of respondents first answering a draft survey 
question and then providing textual information to reveal the processes 
involved in answering the test

[[Page 76050]]

question. Specifically, cognitive interview respondents are asked to 
describe how and why they answered the question as they did. Through 
the interviewing process, various types of question-response problems 
that would not normally be identified in a traditional survey 
interview, such as interpretive errors and recall accuracy, are 
uncovered. By conducting a comparative analysis of cognitive 
interviews, it is also possible to determine whether particular 
interpretive patterns occur within particular sub-groups of the 
population. Interviews are generally conducted in small rounds totaling 
40-100 interviews; ideally, the questionnaire is re-worked between 
rounds, and revisions are tested iteratively until interviews yield 
relatively few new insights.
    Cognitive interviewing is inexpensive and provides useful data on 
questionnaire performance while minimizing respondent burden. Cognitive 
interviewing offers a detailed depiction of meanings and processes used 
by respondents to answer questions--processes that ultimately produce 
the survey data. As such, the method offers an insight that can 
transform understanding of question validity and response error. 
Documented findings from these studies represent tangible evidence of 
how the question performs. Such documentation also serves CDC data 
users, allowing them to be critical in their approach to and 
application of the data.
    In addition to cognitive interviewing, a number of other 
qualitative and quantitative methods are used to investigate and 
research measurement errors and the survey response process. These 
methods include conducting focus groups, usability tests, in-depth or 
ethnographic interviews, and the administration and analysis of 
questions in both representative and non-representative field tests. 
Focus groups are conducted by the CCQDER. They are group interviews 
whose primary purpose is to elicit the basic socio-cultural 
understandings and terminology that form the basis of questionnaire 
design. Each group typically consists of one moderator and four to 10 
participants, depending on the research question. In-depth or 
ethnographic interviews are one-on-one interviews designed to elicit 
the understandings or terminology that are necessary for question 
design, as well as to gather detailed information that can contribute 
to the analysis of both qualitative and quantitative data. Usability 
tests are typically one-on-one interviews that are used to determine 
how a given survey or information collection tool functions in the 
field, and how the mode and layout of the instrument itself may 
contribute to survey response error and the survey response process.
    In addition to these qualitative methods, NCHS also uses various 
tools to obtain quantitative data, which can be analyzed alone or 
analyzed alongside qualitative data to give a much fuller accounting of 
the survey response process. For instance, phone, internet, mail, and 
in-person follow-up interviews of previous NCHS survey respondents may 
be used to test the validity of survey questions and questionnaires and 
to obtain more detailed information that cannot be gathered on the 
original survey. Additionally, field or pilot tests may be conducted on 
both representative and non-representative samples, including those 
obtained from commercial survey and web panel vendors. Beyond looking 
at traditional measures of survey errors (such as item missing rates, 
non-response rates, and don't-know rates), these pilot tests can be used 
to run experimental designs in order to capture how different questions 
function in a field setting. Similar methodology has been adopted by 
other federal agencies, as well as by academic and commercial survey 
organizations.
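    To make the error measures just described concrete, the short Python 
sketch below is purely illustrative (the variant labels, response codes, 
and data are hypothetical and are not drawn from this notice); it computes 
item missing and don't-know rates for two versions of a question in a 
split-sample pilot test.

    # Hypothetical illustration of the pilot-test error measures noted
    # above: item missing and "don't know" rates compared across two
    # question variants in a split-sample (experimental) design.
    # All response codes and data below are invented for the example.
    MISSING, DONT_KNOW = None, "don't know"

    pilot_responses = {
        "variant_A": ["yes", "no", MISSING, "yes", DONT_KNOW, "no"],
        "variant_B": ["yes", MISSING, MISSING, "no", "yes", DONT_KNOW],
    }

    for variant, answers in pilot_responses.items():
        n = len(answers)
        missing_rate = sum(a is MISSING for a in answers) / n
        dont_know_rate = sum(a == DONT_KNOW for a in answers) / n
        print(f"{variant}: item missing {missing_rate:.0%}, "
              f"don't know {dont_know_rate:.0%}")
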
    In 2022-2025, NCHS/CCQDER staff plans to continue research on 
methods evaluation and general questionnaire design. We 
envision that over the next three years, NCHS/CCQDER will work 
collaboratively with survey researchers from universities and other 
federal agencies to define and examine several research areas, 
including, but not limited to: (1) differences between face-to-face, 
telephone, and virtual/video-over internet cognitive interviewing; (2) 
effectiveness of different approaches to cognitive interviewing, such 
as concurrent and retrospective probing; (3) reactions of both survey 
respondents and survey interviewers to the use of Computer Assisted 
Personal Interviewing (CAPI), Audio Computer-Assisted Self-Interviewing 
(ACASI), and virtual/video-over-internet interviewing; (4) social, 
cultural, and linguistic factors in the question response process; and (5) 
recruitment and respondent participation at varying levels of incentive 
in an effort to establish empirical evidence regarding remuneration and 
coercion. Procedures for each of these studies will be similar to those 
applied in the usual testing of survey questions. For example, 
questionnaires that are of current interest (such as RANDS and NIOSH) 
may be evaluated using several of the techniques described above, or 
different versions of a survey question will be developed, and the 
variants then administered to separate groups of respondents in order 
to study the cognitive processes that account for the differences in 
responses obtained across different versions.
    These studies will be conducted either by CCQDER staff, DHHS staff, 
or NCHS contractors who are trained in cognitive interviewing 
techniques. The results of these studies will be applied to our 
specific questionnaire development activities in order to improve the 
methods that we use to conduct questionnaire testing, and to guide 
questionnaire design in general.
    CDC requests OMB approval for an estimated 21,905 annualized burden 
hours. This is an increase of 12,450 hours per year due to the addition 
of RANDS Methodological Surveys. There is no cost to respondents other 
than their time to participate.

Estimated Annualized Burden Table

----------------------------------------------------------------------------------------------------------------
                                                                                     Number of     Average hours
          Type of respondents                   Form name            Number of     responses per   per response
                                                                    respondents     respondent      (in hours)
----------------------------------------------------------------------------------------------------------------
Individuals or households.............  Eligibility Screeners...           4,400               1            5/60
Individuals or households.............  Developmental                      8,750               1           55/60
                                         Questionnaires.
Individuals or households.............  Respondent Data                    8,750               1            5/60
                                         Collection Sheet.
Individuals or households.............  Focus Group Documents...             225               1           90/60
Individuals or households.............  RANDS Methodological              49,800               1           15/60
                                         Surveys.
----------------------------------------------------------------------------------------------------------------
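
    As a cross-check on the arithmetic, the requested total of 21,905 
annualized burden hours (and the 12,450-hour increase attributable to the 
RANDS Methodological Surveys) can be reproduced from the table above. The 
short Python sketch below is illustrative only; the row figures are keyed 
in from the table, and rounding each row to whole hours before summing is 
an assumption.

    # Illustrative cross-check of the estimated annualized burden hours.
    # Row figures are transcribed from the table above; rounding each
    # row to whole hours before summing is an assumption.
    rows = [
        # (form name, respondents, responses per respondent, hours per response)
        ("Eligibility Screeners",            4_400, 1,  5 / 60),
        ("Developmental Questionnaires",     8_750, 1, 55 / 60),
        ("Respondent Data Collection Sheet", 8_750, 1,  5 / 60),
        ("Focus Group Documents",              225, 1, 90 / 60),
        ("RANDS Methodological Surveys",    49_800, 1, 15 / 60),
    ]

    row_hours = [round(n * k * h) for _, n, k, h in rows]
    print(f"Total annualized burden: {sum(row_hours):,} hours")          # 21,905
    print(f"RANDS Methodological Surveys row: {row_hours[-1]:,} hours")  # 12,450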



[[Page 76051]]

Jeffrey M. Zirger,
Lead, Information Collection Review Office, Office of Scientific 
Integrity, Office of Science, Centers for Disease Control and 
Prevention.
[FR Doc. 2022-26887 Filed 12-9-22; 8:45 am]
BILLING CODE 4163-18-P