Evaluation Policy; Cooperative Research or Demonstration Projects, 51574-51575 [2014-20616]
[Federal Register Volume 79, Number 168 (Friday, August 29, 2014)]
[Notices]
[Pages 51574-51575]
From the Federal Register Online via the Government Printing Office [www.gpo.gov]
[FR Doc No: 2014-20616]
-----------------------------------------------------------------------
DEPARTMENT OF HEALTH AND HUMAN SERVICES
Administration for Children and Families
Evaluation Policy; Cooperative Research or Demonstration Projects
AGENCY: Administration for Children and Families, HHS.
ACTION: Notice.
-----------------------------------------------------------------------
SUMMARY: The Administration for Children and Families (ACF) is
announcing its evaluation policy for research or demonstration projects,
as authorized by 42 U.S.C. 1310.
SUPPLEMENTARY INFORMATION: This evaluation policy builds on ACF's
strong history of evaluation by outlining key principles to govern our
planning, conduct, and use of evaluation. The evaluation policy
reconfirms our commitment to conducting rigorous, relevant evaluations
and to using evidence from evaluations to inform policy and practice.
ACF seeks to promote rigor, relevance, transparency, independence, and
ethics in the conduct of evaluations. This policy addresses each of
these principles.
The mission of ACF is to foster health and well-being by providing
Federal leadership, partnership, and resources for the compassionate
and effective delivery of human services. Our vision is children,
youth, families, individuals, and communities who are resilient, safe,
healthy, and economically secure. The importance of these goals demands
that we continually innovate and improve, and that we evaluate our
activities and those of our partners. Through evaluation, ACF and our
partners can learn systematically so that we can make our services as
effective as possible.
Evaluation produces one type of evidence. A learning organization
with a culture of continual improvement requires many types of
evidence, including not only evaluation but also descriptive research
studies, performance measures, financial and cost data, survey
statistics, and program administrative data. Further, continual
improvement requires systematic approaches to using information, such
as regular data-driven reviews of performance and progress. Although
this policy focuses on evaluation, the principles and many of the
specifics apply to the development and use of other types of
information as well.
This policy applies to all ACF-sponsored evaluations. While much of
ACF's evaluation activity is overseen by the Office of Planning,
Research and Evaluation (OPRE), ACF program offices also
sponsor evaluations through dedicated contracts or as part of their
grant-making. In order to promote quality, coordination, and usefulness
in ACF's evaluation activities, ACF program offices will consult with
OPRE in developing evaluation activities. Program offices will discuss
evaluation projects with OPRE in early stages to clarify evaluation
questions and methodological options for addressing them, and as
activities progress, OPRE will review designs, plans, and reports.
Program offices may also ask OPRE to design and oversee evaluation
projects on their behalf or in collaboration with program office staff.
Rigor: ACF is committed to using the most rigorous methods that are
appropriate to the evaluation questions and feasible within budget and
other constraints. Rigor is not restricted to impact evaluations, but
is also necessary in implementation or process evaluations, descriptive
studies, outcome evaluations, and formative evaluations; and in both
qualitative and quantitative approaches. Rigor requires ensuring that
inferences about cause and effect are well founded (internal validity);
requires clarity about the populations, settings, or circumstances to
which results can be generalized (external validity); and requires the
use of measures that accurately capture the intended information
(measurement reliability and validity).
In assessing the effects of programs or services, ACF evaluations
will use methods that isolate to the greatest extent possible the
impacts of the programs or services from other influences such as
trends over time, geographic variation, or pre-existing differences
between participants and non-participants. For such causal questions,
experimental approaches are preferred. When experimental approaches are
not feasible, high-quality quasi-experiments offer an alternative.
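As a minimal illustration of this distinction, the sketch below uses
hypothetical data, invented effect sizes, and the widely used Python
statsmodels library (none of these names come from the notice itself) to
contrast a difference-in-means estimate from a randomized design with a
difference-in-differences estimate, a common quasi-experimental fallback
when pre-existing differences and time trends cannot be removed by
random assignment.

    # Illustrative sketch only: hypothetical data, true impact set to 2.0.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 2000

    # Experimental design: random assignment isolates the program's impact,
    # so a simple regression of the outcome on treatment recovers it.
    treated = rng.integers(0, 2, n)
    outcome = 10 + 2.0 * treated + rng.normal(0, 3, n)
    exp_df = pd.DataFrame({"treated": treated, "outcome": outcome})
    ate = smf.ols("outcome ~ treated", data=exp_df).fit()
    print("Experimental estimate:", round(ate.params["treated"], 2))

    # Quasi-experimental fallback: difference-in-differences. A naive
    # comparison would be biased by the pre-existing group difference and
    # the common time trend; differencing twice removes both, assuming the
    # groups would otherwise have followed parallel trends.
    group = rng.integers(0, 2, n)      # participants vs. non-participants
    post = rng.integers(0, 2, n)       # before vs. after the program
    outcome2 = (10 + 1.5 * group       # pre-existing difference
                + 1.0 * post           # trend over time
                + 2.0 * group * post   # true program impact
                + rng.normal(0, 3, n))
    qe_df = pd.DataFrame({"group": group, "post": post, "outcome": outcome2})
    did = smf.ols("outcome ~ group * post", data=qe_df).fit()
    print("Difference-in-differences estimate:",
          round(did.params["group:post"], 2))

Both estimates converge on the simulated impact of 2.0; the point is
that the quasi-experimental estimate does so only under the stated
parallel-trends assumption, which is why experimental approaches are
preferred when feasible.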
ACF will recruit and maintain an evaluation workforce with training
and experience appropriate for planning and overseeing a rigorous
evaluation portfolio. To accomplish this, ACF will recruit staff with
advanced degrees and experience in a range of relevant disciplines,
such as program evaluation, policy analysis, economics, sociology, and
child development. ACF will provide professional development
opportunities so that staff can keep their skills current.
ACF will ensure that contractors and grantees conducting
evaluations have appropriate expertise by emphasizing the capacity for
rigor in requests for proposals and funding opportunity
announcements. This emphasis entails specifying expectations in
criteria for the selection of grantees and contractors, and engaging
reviewers with evaluation expertise. It also requires allocating
sufficient resources for evaluation activities. ACF will generally
require evaluation contractors to consult with external advisors who
are leaders in relevant fields through the formation of technical work
groups or other means.
Relevance: Evaluation priorities should take into account
legislative requirements and Congressional interests and should reflect
the interests and needs of ACF, HHS, and Administration leadership;
program office staff and leadership; ACF partners such as states,
territories, tribes, and local grantees; the populations served;
researchers; and other stakeholders. Evaluations should be designed to
represent the diverse populations that ACF programs serve, and ACF
should encourage diversity among those carrying out the work, through
building awareness of opportunities and building evaluation capacity
among under-represented groups.
There must be strong partnerships among evaluation staff, program
staff, policy-makers, and service providers. Policy-makers and
practitioners should have the opportunity to influence evaluation
priorities to meet their interests and needs. Further, for new
initiatives and demonstrations in particular, evaluations will be more
feasible and useful when planned in concert with the planning of the
initiative or demonstration, rather than as an afterthought. Given
Federal requirements related to procurement and information collection,
it can take many months to award a grant or contract and begin
collecting data. Thus, it is critical that planning for research and
evaluation be integrated with planning for new initiatives.
It is important for evaluators to disseminate findings in ways that
are accessible and useful to policy-makers and practitioners. OPRE and
program offices will work in partnership to inform potential
applicants, program providers, administrators, policy-makers, and
funders by disseminating evidence from ACF-sponsored and other
good-quality evaluations.
[[Page 51575]]
It is ACF's policy to integrate both use of existing evidence and
opportunities for further learning into all of our activities. Where an
evidence base is lacking, we will build evidence through strong
evaluations. Where evidence exists, we will use it. Discretionary
funding opportunity announcements will require that successful
applicants cooperate with any Federal evaluations if selected to
participate. As legally allowed, programs with waiver authorities
should require rigorous evaluations as a condition of waivers. As
appropriate, ACF will encourage, incentivize, or require grantees to
use existing evidence of effective strategies in designing or selecting
service approaches. The emphasis on evidence is meant to support, not
inhibit, innovation, improvement, and learning.
Transparency: ACF will make information about planned and ongoing
evaluations easily accessible, typically through posting on the web
information about the contractor or grantee conducting the work and
descriptions of the evaluation questions, methods to be used, and
expected timeline for reporting results. ACF will present information
about study designs, implementation, and findings at professional
conferences.
Study plans will be published in advance. ACF will release
evaluation results regardless of the findings. Evaluation reports will
describe the methods used, including strengths and weaknesses, and
discuss the generalizability of the findings. Evaluation reports will
present comprehensive results, including favorable, unfavorable, and
null findings. ACF will release evaluation results promptly, usually
within 2 months of a report's completion.
ACF will archive evaluation data for secondary use by interested
researchers, typically by building requirements into contracts to
prepare data sets for that purpose.
Independence: Independence and objectivity are core principles of
evaluation.\1\ Agency and program leadership, program staff, service
providers, and others should participate actively in setting evaluation
priorities, identifying evaluation questions, and assessing the
implications of findings. However, it is important to insulate
evaluation functions from undue influence and from both the appearance
and the reality of bias. To promote objectivity, ACF protects
independence in the design, conduct, and analysis of evaluations. To
this end:
---------------------------------------------------------------------------
\1\ American Evaluation Association, ``An Evaluation Roadmap for a
More Effective Government'', November 2013, https://www.eval.org/d/do/472,
accessed 16 December 2013; and Government Accountability Office,
``Employment and Training Administration: Increased Authority and
Accountability Could Improve Research Program'', GAO-10-243, January
2010, https://www.gao.gov/products/GAO-10-243, accessed 18 June 2012.
---------------------------------------------------------------------------
    • ACF will conduct evaluations through the competitive award of
grants and contracts to external experts who are free from conflicts of
interest.
    • The director of OPRE reports directly to the Assistant Secretary
for Children and Families; has authority to approve the design of
evaluation projects and analysis plans; and has authority to approve,
release, and disseminate evaluation reports.
Ethics: ACF-sponsored evaluations will be conducted in an ethical
manner and will safeguard the dignity, rights, safety, and privacy of
participants. ACF-sponsored evaluations will comply with both the
spirit and the letter of relevant requirements such as regulations
governing research involving human subjects.
Mark H. Greenberg,
Acting Assistant Secretary for Children and Families.
[FR Doc. 2014-20616 Filed 8-28-14; 8:45 am]
BILLING CODE 4184-79-P