[Federal Register Volume 89, Number 183 (Friday, September 20, 2024)]
[Notices]
[Pages 77153-77155]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 2024-21564]
=======================================================================
-----------------------------------------------------------------------
DEPARTMENT OF HEALTH AND HUMAN SERVICES
Agency for Healthcare Research and Quality
Agency Information Collection Activities: Proposed Collection;
Comment Request
AGENCY: Agency for Healthcare Research and Quality, HHS.
ACTION: Information collection notice.
-----------------------------------------------------------------------
SUMMARY: In compliance with the Paperwork Reduction Act of 1995, this
notice announces the intention of the Agency for Healthcare Research
and Quality (AHRQ) to request that the Office of Management and Budget
(OMB) approve the reinstatement without change of the information
collection project, Evaluating the Implementation of PCOR to Increase
Referral, Enrollment, and Retention through Automatic Referral to
Cardiac Rehabilitation (CR) with Care Coordinator (OMB No. 0935-0252),
for which approval has expired. Reinstatement of this previously
approved PRA collection is required in order to discontinue the
collection.
DATES: Comments on this notice must be received by November 19, 2024.
ADDRESSES: Written comments should be submitted to: Doris Lefkowitz,
Reports Clearance Officer, AHRQ, by email at
REPORTSCLEARANCEOFFICER@ahrq.hhs.gov.
Copies of the proposed collection plans, data collection
instruments, and specific details on the estimated burden can be
obtained from the AHRQ Reports Clearance Officer.
FOR FURTHER INFORMATION CONTACT: Doris Lefkowitz, AHRQ Reports
Clearance Officer, (301) 427-1477, or by email at
REPORTSCLEARANCEOFFICER@ahrq.hhs.gov.
SUPPLEMENTARY INFORMATION:
Title of Information Collection: Evaluating the Implementation of
PCOR to Increase Referral, Enrollment, and Retention through Automatic
Referral to Cardiac Rehabilitation (CR) with Care Coordinator.
OMB No.: 0935-0252.
Type of Request: Reinstatement without change to discontinue the
collection.
The aim of this project, known as TAKEheart, was to (a) raise
awareness about the benefits of cardiac rehabilitation (CR) after
myocardial infarction or coronary revascularization, (b) spread
knowledge about best practices to increase referrals to CR, and (c)
increase CR uptake.
AHRQ evaluated TAKEheart to assess:
• the extent and effectiveness of the dissemination and implementation
efforts;
• the uptake and usage of Automatic Referral with Care Coordination;
and
• levels of referral to CR at the end of the intervention.
Evaluation results were used to improve the intervention and to
provide guidance for future AHRQ dissemination and implementation
projects. Two cohorts of ``Partner Hospitals,'' up to 125 hospitals in
total, engaged in efforts to implement Automatic Referral with Care
Coordination over twelve-month periods. The evaluation ascertained the
diversity of hospitals engaged in the activities, what contributed to
(or hindered) their efforts, and the types of support they reported
having been most (and least) useful. This information was used to
improve recruitment, technical assistance, and tools for the second
cohort.
In addition, hospitals--including those involved in the
implementation--were invited to attend Affinity Group virtual meetings
organized around specific topics of interest not intrinsic to Automatic
Referral with Care Coordination. Hospital staff engaged in Affinity
Groups created a vibrant Learning Community. The evaluation determined
which Affinity Groups engaged the most Learning Community participants
and which resources participants found most useful. This information
was used to develop resources made available on a new, permanent
website dedicated to improving CR.
This study was conducted by AHRQ through its contractor, Abt
Associates Inc., pursuant to AHRQ's statutory authority to disseminate
government-funded research relevant to comparative clinical
effectiveness research. 42 U.S.C. 299b-37(a).
Method of Collection
To collect data on the many facets of the intervention, the
collection implemented multiple data collection tools, each of which
had a specific purpose and set of respondents.
1. Partner Hospital Champion Survey. Each Partner Hospital
designated a ``Champion,'' who coordinated activities associated with
implementing Automatic Referral with Care Coordination at the hospital,
and provided the Champion's name and email address. Champions could have
had any role in the hospital, although they were expected to be in
relevant positions, such as cardiologists or quality improvement
managers. We conducted online surveys of 125 Champions (one Champion
per hospital). We used the email addresses to send the Champion a
survey at two points: seven months after the start of implementation
and at the end of the 12-month implementation period. The first survey
focused on four constructs. First, it captured data about the hospital
context, such as whether it had prior experience customizing an EMR or
was a safety net hospital. Second,
[[Page 77154]]
it addressed the hospital's decision to participate in TAKEheart.
Third, it captured data on the CR programs the hospital referred to,
whether the number or type had changed, and why. Fourth, it collected
feedback on the training and technical assistance received. The second
survey focused on three constructs. The first construct collected
feedback on the TAKEheart components, including training, technical
assistance, and use of the website. The second construct asked about
the hospitals' response to participating in TAKEheart, such as changes
to referral workflow or CR programs. The third construct asked those
Partner Hospitals that had not completed the process of implementing
Automatic Referral with Care Coordination whether they anticipated
continuing to work towards that goal and their confidence in
succeeding.
2. Partner Hospital Interviews.
a. Interviews with Partner Hospital Champions. We selected, from
each cohort, eight Partner Hospitals which demonstrated a strong
interest in addressing underserved populations or reducing disparities
in participation in cardiac rehabilitation. We conducted a key
informant interview with the Champion of each selected Partner Hospital
to delve into how they were addressing the needs of underserved
populations by implementing Automatic Referral with Care Coordination.
b. Interviews with Partner Hospital cardiologists. We selected,
from each cohort, eight hospitals based on criteria developed in
conversation with AHRQ, such as hospitals that serve specific
populations or that share the same EMR, which informed their experience
customizing the EMR. We conducted semi-structured interviews with one
cardiologist at each of the selected hospitals twice. In the second
month of the cohort implementation, we asked about their needs,
concerns, and expectations of the program. In the 11th month of the
cohort implementation, we determined whether their concerns were
addressed appropriately and adequately.
c. Interviews with Partner Hospitals that withdraw. We expected
that a small number of Partner Hospitals would withdraw from the
cohort. We identified these hospitals by their lack of participation in
training and technical assistance events; Technical Assistance (TA)
Providers confirmed their withdrawal. We interviewed up to nine
withdrawing hospitals to better understand the reason for withdrawal
(e.g., a merger resulted in a loss of support for the intervention,
Champion left), as well as facilitators and barriers of each hospital's
approach to implementing Automatic Referral with Care Coordination. If
more than nine hospitals withdrew, we ceased interviewing.
3. Learning Community Participant Survey. We conducted online
surveys of 250 currently active Learning Community participants at two
points in time, in months 18 and 31 of the project. We administered the
survey by sending a link to an online survey to email addresses entered
by virtual meeting participants during registration. The email
described the purpose of the survey.
4. Learning Community Follow-up Survey. We conducted a brief online
survey with up to 15 Learning Community participants following the
final virtual meeting for each of 10 Affinity Groups, to ascertain
whether the hospitals were able to act on what they learned during the
session. The total sample was 150 Learning Community participants.
Estimated Annual Respondent Burden
Exhibit 1 presents estimates of the reporting burden hours for the
data collection efforts. Time estimates were based on prior experiences
and what could reasonably be requested of participating health care
organizations. The number of respondents listed in column A of Exhibit
1 reflects a projected 90% response rate for data collection effort 1,
and an 80% response rate for efforts 3 and 4 below.
1. Partner Hospital Champion Survey. We assumed 113 hospital
champions would complete the survey based on a 90% response rate. Each
of the two survey administrations was expected to take up to 45 minutes
to complete, for a total of 169.5 hours.
2. Partner Hospital Interviews. In-depth interviews occurred with
select Partner Hospital staff.
a. Interviews with Partner Hospital Champions. We held a single,
90-minute interview with eight Partner Hospital Champions in each
cohort, drawn from Partner Hospitals that have a common characteristic
of particular interest, for a total of 24 hours.
b. Interviews with Partner Hospital cardiologists. We held
individual interviews of up to 30 minutes with eight cardiologists,
twice in each cohort, for a total of 16 hours.
c. Interviews with Partner Hospitals that withdraw. We interviewed
up to nine withdrawing hospitals for no more than 20 minutes to better
understand the reason for withdrawal as well as facilitators and
barriers, for a total of 2.7 hours.
3. Learning Community Participant Survey. We assumed 200 Learning
Community participants would complete the survey based on an 80%
response rate. It was expected to take up to 15 minutes to complete
each survey for a total of 100 hours.
4. Learning Community Follow-up Survey. We conducted a brief online
survey, up to 10 minutes long, of participants in each of ten selected
Affinity Groups two months after the virtual meeting. We assumed 120
Learning Community participants would complete the survey based on an
80% response rate. Each survey was expected to take up to 10 minutes to
complete, for a total of 20.4 hours.
Exhibit 1--Estimated Annualized Burden Hours
----------------------------------------------------------------------------------------------------------------
B. Number of
Data collection method or project activity A. Number of responses per C. Hours per D. Total
respondents respondent response burden hours
----------------------------------------------------------------------------------------------------------------
1. Partner Hospital Champion Survey *........... 113 2 0.75 169.5
2a. Interviews with Partner Hospital Champions.. 16 1 1.5 24.0
2b. Interviews with Partner Hospital 16 2 0.5 16.0
Cardiologists..................................
2c. Interviews with Partner Hospitals that 9 1 0.3 2.7
withdraw.......................................
3. Learning Community Survey **................. 200 2 0.25 100.0
4. Learning Community Follow-up Survey **....... 120 1 0.17 20.4
---------------------------------------------------------------
Total....................................... 474 .............. .............. 332.6
----------------------------------------------------------------------------------------------------------------
* Number of respondents (Column A) reflects a sample size assuming a 90% response rate for this data collection
effort.
** Number of respondents (Column A) reflects a sample size assuming an 80% response rate for this data
collection effort.
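Each figure in column D of Exhibit 1 is the product of columns A, B,
and C (respondents x responses per respondent x hours per response).
For readers who wish to verify the totals, the short Python sketch
below reproduces that arithmetic using only the figures printed in the
exhibit; the script and its variable names are illustrative and are not
part of the information collection.

```python
# Reproduce the Exhibit 1 burden-hour arithmetic from the printed figures.
rows = [
    # (activity, respondents, responses per respondent, hours per response)
    ("1. Partner Hospital Champion Survey", 113, 2, 0.75),
    ("2a. Interviews with Partner Hospital Champions", 16, 1, 1.5),
    ("2b. Interviews with Partner Hospital Cardiologists", 16, 2, 0.5),
    ("2c. Interviews with Partner Hospitals that withdraw", 9, 1, 0.3),
    ("3. Learning Community Survey", 200, 2, 0.25),
    ("4. Learning Community Follow-up Survey", 120, 1, 0.17),
]

total_hours = 0.0
for activity, respondents, responses, hours in rows:
    burden = respondents * responses * hours  # column D: total burden hours
    total_hours += burden
    print(f"{activity}: {burden:.1f} hours")

print(f"Total respondents: {sum(r[1] for r in rows)}")  # 474
print(f"Total burden hours: {total_hours:.1f}")         # 332.6
```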
[[Page 77155]]
Exhibit 2, below, presents the estimated annualized cost burden
associated with the respondents' time to participate in this research.
The total cost burden was estimated to be about $21,497.
Exhibit 2--Estimated Annualized Cost Burden
----------------------------------------------------------------------------------------------------------------
A. Number of B. Total Average hourly Total cost
Data collection method or project activity respondents burden hours wage rate burden
----------------------------------------------------------------------------------------------------------------
1. Partner Hospital Champion Survey *........... 113 169.5 $72.27 $12,250
2a. Interviews with Partner Hospital Champions.. 16 24.0 72.27 1,734
2b. Interviews with Partner Hospital 16 16.0 96.58 1,545
Cardiologists..................................
2c. Interviews with Partner Hospitals that 9 2.7 72.27 195
withdraw.......................................
3. Learning Community Survey **................. 200 100.0 47.95 4,795
4. Learning Community Follow-up Survey **....... 120 20.4 47.95 978
---------------------------------------------------------------
Total....................................... 474 332.6 .............. 21,497
----------------------------------------------------------------------------------------------------------------
* Number of respondents (Column A) reflects a sample size assuming a 90% response rate for this data collection
effort.
** Number of respondents (Column A) reflects a sample size assuming an 80% response rate for this data
collection effort.
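Similarly, each entry in the last column of Exhibit 2 is the total
burden hours for the activity multiplied by the average hourly wage
rate, rounded to the nearest dollar, and the entries sum to the roughly
$21,497 total cited above. A minimal sketch of that calculation, again
using only the printed figures, follows.

```python
# Reproduce the Exhibit 2 cost-burden arithmetic from the printed figures.
cost_rows = [
    # (activity, total burden hours, average hourly wage rate in dollars)
    ("1. Partner Hospital Champion Survey", 169.5, 72.27),
    ("2a. Interviews with Partner Hospital Champions", 24.0, 72.27),
    ("2b. Interviews with Partner Hospital Cardiologists", 16.0, 96.58),
    ("2c. Interviews with Partner Hospitals that withdraw", 2.7, 72.27),
    ("3. Learning Community Survey", 100.0, 47.95),
    ("4. Learning Community Follow-up Survey", 20.4, 47.95),
]

total_cost = 0
for activity, hours, wage in cost_rows:
    cost = round(hours * wage)  # total cost burden per activity, nearest dollar
    total_cost += cost
    print(f"{activity}: ${cost:,}")

print(f"Total cost burden: ${total_cost:,}")  # 21,497
```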
Request for Comments
In accordance with the Paperwork Reduction Act, 44 U.S.C. 3501-
3520, comments on AHRQ's information collection are requested with
regard to any of the following: (a) whether the proposed collection of
information is necessary for the proper performance of AHRQ's health
care research and health care information dissemination functions,
including whether the information will have practical utility; (b) the
accuracy of AHRQ's estimate of burden (including hours and costs) of
the proposed collection(s) of information; (c) ways to enhance the
quality, utility and clarity of the information to be collected; and
(d) ways to minimize the burden of the collection of information upon
the respondents, including the use of automated collection techniques
or other forms of information technology.
Comments submitted in response to this notice will be summarized
and included in the Agency's subsequent request for OMB approval of the
proposed information collection. All comments will become a matter of
public record.
Dated: September 17, 2024.
Marquita Cullom,
Associate Director.
[FR Doc. 2024-21564 Filed 9-19-24; 8:45 am]
BILLING CODE 4160-90-P