Notice of Final Priorities, Requirements, Definitions, and Selection Criteria, 21986-21997 [2011-9479]
Federal Register / Vol. 76, No. 75 / Tuesday, April 19, 2011 / Notices
DEPARTMENT OF EDUCATION
Notice of Final Priorities,
Requirements, Definitions, and
Selection Criteria
Enhanced Assessment Instruments;
Catalog of Federal Domestic Assistance
(CFDA) Number: 84.368.
AGENCY: Office of Elementary and
Secondary Education, Department of
Education.
ACTION: Notice of final priorities,
requirements, definitions, and selection
criteria.
SUMMARY: The Assistant Secretary for
Elementary and Secondary Education
announces priorities, requirements,
definitions, and selection criteria under
the Enhanced Assessment Instruments
Grant program, also called the Enhanced
Assessment Grants (EAG) program. The
Assistant Secretary may use one or more
of these priorities, requirements,
definitions, and selection criteria for
competitions in fiscal year (FY) 2011
and later years. We take these actions to
focus Federal financial assistance on the
pressing need to improve the
assessment instruments and systems
used by States to accurately measure
student academic achievement and
growth under the Elementary and
Secondary Education Act of 1965, as
amended (ESEA).
DATES: Effective Date: These priorities,
requirements, definitions, and selection
criteria are effective May 19, 2011.
FOR FURTHER INFORMATION CONTACT:
Collette Roney, U.S. Department of
Education, 400 Maryland Avenue, SW.,
Room 3W210, Washington, DC 20202.
Telephone: (202) 401–5245. E-mail:
Collette.Roney@ed.gov.
If you use a telecommunications
device for the deaf (TDD), call the
Federal Relay Service (FRS), toll free, at
1–800–877–8339.
SUPPLEMENTARY INFORMATION:
Purpose of Program: The purpose of
the EAG program is to enhance the
quality of assessment instruments and
systems used by States for measuring
the academic achievement of
elementary and secondary school
students.
Program Authority: 20 U.S.C. 7301a.
Public Comment: We published a
notice of proposed priorities,
requirements, definitions, and selection
criteria for this program in the Federal
Register on January 7, 2011 (76 FR
1138). That notice contained
background information and our reasons
for proposing the particular priorities,
requirements, definitions, and selection
criteria. In response to comments we
received on the notice, we have made
revisions to Priority 1—English
Language Proficiency Assessment
System (ELP Priority), Priority 2—
Collaborative Efforts Among States
(Collaborative Efforts Priority), and the
requirements, definitions, and selection
criteria.
Public Comment: In response to our
invitation in the notice of proposed
priorities, requirements, definitions, and
selection criteria, 15 parties submitted
comments. We group major issues
according to subject. Generally, we do
not address technical and other minor
changes.
Analysis of Comments and Changes:
An analysis of the comments and of any
changes in the priorities, requirements,
definitions, and selection criteria since
publication of the notice of proposed
priorities, requirements, definitions, and
selection criteria follows.
Priority 1—English Language
Proficiency (ELP) Assessment System
Comment: Many commenters
expressed support for the ELP Priority
and its broad objective of promoting the
development of high-quality ELP
assessment systems. Commenters stated
that the priority addresses assessment
needs unique to English learners and
that improvements in assessments used
to measure English learners’ progress in
and attainment of English proficiency
will support improvements in
curriculum and instruction for English
learners, help raise their educational
achievement, and help close
achievement gaps between English
learners and their English proficient peers. Commenters also stated that the
priority promotes innovative, high-quality assessments that are aligned
with common college- and career-ready
standards, which will help prepare
English learners for higher education
and careers and ensure that English
learners have access to the same
rigorous academic content as all
students. Another commenter stated
that the use of multiple measures of
both academic and English proficiency
will provide more ongoing feedback to
educators as well as students and their
families and offers the promise of
greater validity and reliability in
assessments for the diverse population
of English learners.
Discussion: We agree with the
commenters that the development of
high-quality ELP assessments aligned
with ELP standards that in turn
correspond to a common set of college- and career-ready standards in English
language arts and mathematics is likely
to contribute to improved teaching and
learning for English learners. We
appreciate the commenters’ recognition
that we designed the ELP Priority to
support the development of high-quality
diagnostic and summative assessments
that measure students’ abilities in each
of the four language domains (reading,
writing, speaking, and listening), in
order to meet the significant need for
ELP assessments that correspond to
college- and career-ready standards held
in common by multiple States.
Changes: None.
Comment: Some commenters
recommended that the ELP assessment
system outlined in the ELP Priority be
defined more explicitly and suggested
that the priority explicitly support the
development of benchmark and
formative assessments as well as
diagnostic and summative assessments.
The commenters expressed concern that
formative assessments may be underemphasized in the resulting ELP
assessment systems if they are not
explicitly included in the priority, and
stated that many educators prefer an
ELP assessment system that includes
benchmark and formative assessments.
One commenter stressed that the focus
on assessments developed under this
priority should be on measuring
students’ progress towards English
proficiency. Another commenter
recommended that the limited amount
of funds for the EAG program be
focused on the development of
summative assessments only.
Discussion: We believe that two types
of assessments are particularly
important for English learners: (1)
Diagnostic assessments (e.g., screener or
placement tests), which can be used to
determine whether a student should be
classified as an English learner, and (2)
summative assessments, which can be
used to determine whether an English
learner has made progress toward and
achieved grade-level English proficiency
and should no longer be classified as an
English learner. The ELP Priority does
not preclude an applicant from
including benchmark or formative
assessments in the ELP assessment
system it proposes to develop. However,
because of the importance of diagnostic
and summative assessments to the
implementation of Federal education
programs such as Title III of the ESEA,
and given the limited resources
available, we decline to expand the ELP
Priority to require more than the
development of diagnostic and
summative ELP assessments.
We agree that clarification of the
components for an ELP assessment
system developed under the ELP
Priority would be helpful and have
added a definition of English language
proficiency (ELP) assessment system, for
purposes of the ELP Priority.
Changes: We have added a definition
of ELP assessment system to the final
definitions. The definition specifies
that, for purposes of the ELP Priority,
ELP assessment system means a system
of assessments that includes, at a
minimum, diagnostic (e.g., screener or
placement) and summative assessments
at each grade level from kindergarten
through grade 12 that cover the four
language domains of reading, writing,
speaking, and listening, as required by
section 3113(b)(2) of the ESEA, and that
meets all other requirements of the
priority. Consistent with this change, we
also have revised paragraphs (a)(2) and
(a)(3) of the ELP Priority to include both
screener and placement assessments as
examples of diagnostic assessments.
Comment: One commenter noted that
schools implementing the ELP
assessment systems developed under
the ELP Priority will need time to
transition to the new assessments and
stated that the proposed priorities,
requirements, definitions, and selection
criteria did not address how an
applicant would need to approach such
a transition.
Discussion: Given the four-year
project period we are planning for
grants under this program, we anticipate
that some of the actions needed to
support the transition to new ELP
assessment systems may take place after
the end of the project period, while
other actions (e.g., developing
professional capacity and outreach as
described in the selection criteria) will
occur during the project period. Because
operational administration of the
assessments is not required during the
project period, we are not requiring a
complete transition plan. Transition
issues will be addressed by applicants,
as necessary, in response to selection
criterion (e), the professional capacity
and outreach selection criterion, and we
decline to add any additional
requirements relating to transition, as
some of these activities may occur
outside the grant period. We note, in
addition, that the Department routinely
provides guidance to the field on
current implementation issues and will
continue to do so in the future.
Changes: None.
Comment: Some commenters
expressed concern that the ELP Priority
does not adequately address
coordination between the grants to be
awarded under the EAG program and
grants already awarded under the Race to the Top Assessment (RTTA)
program. The commenters
recommended that the ELP Priority
require more specific coordination
between EAG and RTTA grants. They
also suggested that we ensure that ELP
assessments developed under the EAG
program be embedded in work on
assessments under the RTTA program,
particularly because of the academic
language that students likely will need
in order to access the assessments to be
developed under the RTTA grants.
Discussion: We understand the
importance of ensuring that projects
funded under the EAG program and
other Department programs related to
assessments coordinate efforts where
appropriate. We plan to facilitate
coordination and technical assistance,
as needed, across newly awarded EAG
projects and the RTTA grants. EAG and
RTTA grantees will be required to
participate in such technical assistance
and other activities conducted or
facilitated by the Department or its
designees. We are clarifying this
expectation for coordination by adding
language to requirement (b) that will
require EAG grantees to coordinate with
the RTTA program.
Changes: We have revised
requirement (b) by adding a phrase that
requires EAG grantees to coordinate
with the RTTA program in the
development of assessments under the
EAG program.
Comment: One commenter stated that
States will need guidance and technical
support from the Department on such
implementation issues as
accountability, timeframes, and
benchmarks for English learners’
linguistic and academic progress once
States have developed their ELP
assessments. The commenter also
asserted that, if the reauthorization of
the ESEA occurs prior to the
development and implementation of
ELP assessment systems funded under
the EAG program, a reauthorized ESEA
should not constrain such work.
Discussion: We recognize that the
Department will need to work with
grantees and provide technical
assistance on implementing new ELP
assessment systems. If a reauthorized
ESEA requires changes to the projects
awarded under the EAG program, we
will work with grantees to make the
necessary changes and provide guidance
to the field, as appropriate.
Changes: None.
Comment: A few commenters
expressed concern with the examples of
linguistic components of language
included in paragraph (a)(7) of the
proposed ELP Priority. One commenter
suggested adding ‘‘semantics and
graphophonemic’’ to the list of
examples. Another commenter
suggested removing the list of examples.
One commenter stated that the
linguistic components should be
embedded within ELP standards and
that determinations of students’ English
proficiency should not be limited to the
sum of students’ abilities on any group
of specific linguistic components of
language. This commenter expressed
concern that the ELP Priority could be
interpreted as requiring the ELP
assessment systems to provide
subscores on discrete linguistic
components. Another commenter stated
that the ELP Priority should specify that
measurement of any linguistic
component should be driven by the
functions of comprehension and
expression. This commenter suggested
revising the priority to require the
assessments to reflect the linguistic
components of language or demonstrate
students’ control over linguistic
components of language.
Discussion: Based on consideration of
the comments and our further review of
this issue prompted by the comments,
we revised the list of the examples of
linguistic components of language by
removing ‘‘vocabulary’’ from the list. Use
of the abbreviation ‘‘e.g.’’ in the
parenthetical indicates that the list is
not exhaustive or definitive.
While a valid and reliable ELP
assessment system should consider
students’ control over the linguistic
components of language, we do not
intend to require that the ELP
assessment systems generate subscores
for the linguistic components of
language. However, we do intend to
require that the ELP assessment systems
generate a valid and reliable measure of
students’ abilities in each of the four
language domains and are revising the
priority accordingly.
Changes: We have revised paragraph
(a)(7) of the ELP Priority to indicate that
the ELP assessment systems must
ensure that the measures of students’
English proficiency consider students’
control over the linguistic components
of language (e.g., phonology, syntax,
morphology). We also have revised
paragraph (c)(2) of the ELP Priority to
state that ELP assessment systems
developed under the priority must
provide a valid and reliable measure of
students’ abilities in each of the four
language domains and a comprehensive
ELP score based on all four domains,
with each language domain score
making a significant contribution to the
comprehensive ELP score, at each
proficiency level. To be consistent with
revisions to paragraph (c)(2) of the ELP
Priority, we have revised paragraph
(a)(9) of the ELP Priority to list the four
language domains.
Comment: Some commenters
expressed concern about references to
the uses of data from the ELP
assessment systems for evaluations of
teacher and principal effectiveness. A
few commenters outlined several
concerns that may limit the usefulness
of ELP assessments in evaluating
teacher and principal effectiveness, for
example: Limitations of current testing
instruments; difficulty in isolating the
effects of a teacher or principal on an
individual student’s scores, especially
when multiple teachers are involved in
a student’s instruction; a limited
knowledge base about growth in English
learners’ acquisition of English and how
to use measures of growth; and the
complexities of using longitudinal data,
especially for English learners who tend
to have high mobility rates. One
commenter noted that States could
misinterpret the ELP Priority as
requiring student learning on an ELP
assessment to be the only measure of
teacher effectiveness.
A few commenters suggested revising
the ELP Priority to require the use of
multiple measures for evaluations of
teacher and principal effectiveness, as
opposed to using ELP assessments as
the sole measure to evaluate teacher and
principal effectiveness. One commenter
suggested removing the provisions of
the ELP Priority that refer to the use of
ELP assessment data for informing
evaluations of teacher and principal
effectiveness. A few commenters stated
that ELP assessments should be used for
evaluations of teacher and principal
effectiveness only after a research base
has been established to support the use
of the assessments for such purposes.
Discussion: The ELP Priority does not
require that States or other entities use
data from ELP assessment systems
developed under the priority as the
single measure of teacher and principal
effectiveness. The ELP Priority, in
combination with the assessment design
selection criterion, is intended to signal
that ELP assessment systems should be
developed so that, as appropriate, the
data that they provide can be used as
one of multiple measures for teacher
and principal evaluation. We have
revised the language in the ELP Priority
and the assessment design selection
criterion to more clearly reflect that
intent.
Changes: We have revised paragraph
(c) of the ELP Priority to distinguish
those circumstances in which ELP
assessment data can be used as a single
measure (paragraph (c)(3)) and those
circumstances in which ELP assessment
data can be one measure along with
other appropriate measures (paragraph
(c)(4)). We have included evaluations of
principal and teacher effectiveness in
paragraph (c)(4). We have also revised
the assessment design selection
criterion in paragraph (b)(6)(i) to
indicate that data from the assessments
developed under the EAG program
should be used only as appropriate as
one of multiple measures for
determinations of individual principal
and teacher effectiveness.
Comment: Many commenters raised
questions about the references in the
ELP Priority to a ‘‘common definition of
‘English learner’.’’ One commenter
expressed support for the general
approach of requiring a common
definition, noting that a common
definition would ensure that the data
States provide on the total number of
English learners being served would be
more accurate and consistent across the
nation, thereby allowing parents,
educators, and other stakeholders to
make comparisons across States and the
nation. Multiple commenters requested
that the Department clarify the meaning
of the term ‘‘common’’ and had diverging
views on whether ‘‘common’’ should be
defined as ‘‘identical’’ or ‘‘similar’’ (e.g.,
comparable and consistent).
Commenters also asked for clarification
as to whether the reference to a
‘‘common definition’’ applies to home
language surveys, screening
instruments, procedures for identifying
and classifying English learners,
definitions of language proficiency
levels, and criteria for determining the
English proficiency of students and
student exit from English learner status.
Several commenters provided specific
suggestions for how the term ‘‘common’’
should be interpreted when used in the
phrase ‘‘common definition of English
learner.’’ One commenter recommended
that the common definition of English
learner, including classification and exit
criteria, be based solely on the ELP
assessment system and not on academic
performance. The commenter noted that
excluding academic performance
measures would avoid problems of
construct validity and avoid confusing
the ‘‘English learner’’ classification with
non-language-related criteria. Another
commenter recommended that an
assessment of students’ proficiency in
their first language be considered in the
common definition of English learner.
Another commenter asked how
subgroups of English learners would fit
within a common definition and how
data on these subgroups would be
collected, disaggregated, reported, and
used.
One commenter stated that requiring
multiple States to change their
definition of English learner to a
common definition would be an
unreasonable Federal administrative
requirement that goes beyond the intent
of the ESEA. This commenter
recommended removing paragraph
(a)(2) from the ELP Priority, which calls
for States to adopt a common definition
of English learner.
Discussion: The term ‘‘common,’’ as
used in a ‘‘common definition of English
learner’’ in paragraph (a)(2) of the ELP
Priority, means an identical definition
of English learner with respect to certain
criteria, specifically: The diagnostic
assessments and associated achievement
standards used to classify students as
English learners, as well as the
summative assessments and associated
achievement standards used to exit
students from English learner status.
This definition is the same for all
subgroups of English learners, with the
exception of English learners with the
most significant cognitive disabilities
who are eligible to participate in
alternate assessments based on alternate
academic achievement standards in
accordance with 34 CFR 200.6(a)(2).
Assessment of students’ proficiency in
their first language is beyond the scope
of the ELP Priority.
The use of a common definition of
‘‘English learner’’ and common criteria
for exiting a student from English
learner status will help ensure
consistency in identifying English
learners across the States in a
consortium. However, the term
‘‘common’’ for purposes of the ELP
Priority does not apply to other areas
such as home language surveys,
program placement and instruction for
students, and the duration of program
and support services for students. To
clarify the scope of the ELP Priority, we
have added language to paragraph (a)(2)
to indicate that ‘‘common’’ means
identical for purposes of the diagnostic
assessments and associated achievement
standards used to classify students as
English learners as well as the
summative assessments and associated
achievement standards used to exit
students from English learner status. To
provide further clarity, we also
substituted the word ‘‘common’’ for the
word ‘‘uniform’’ in the definition of
English learner.
We agree with the commenter that a
common definition of English learner
should be based on the ELP assessments
to be developed under the priority, as
reflected in paragraph (c)(3) of the ELP
Priority. We also agree that the priority
should specifically reference subgroups
of English learners and, therefore, are
adding language to paragraph (c)(1) of
the ELP Priority to require that the ELP
assessment system provide data that can
be disaggregated by key English learner
subgroups.
Because participation in a grant under
the EAG program is voluntary and no
entity is required to participate and
adopt a common definition of English
learner, we do not believe the
requirement in the ELP Priority
regarding a common definition of
English learner represents an
unreasonable Federal administrative
requirement and therefore decline to
remove this provision from the priority.
Changes: We have revised paragraph
(a)(2) of the ELP Priority to indicate that
‘‘common’’ means identical for purposes
of the diagnostic and summative
assessments and the associated
achievement standards used to classify
students as English learners and exit
students from English learner status. We
also substituted the word ‘‘common’’ for
the word ‘‘uniform’’ in the definition of
English learner. We have revised
paragraph (c)(1) of the ELP Priority to
require that the ELP assessment system
provide data that can be disaggregated
by key English learner subgroups and to
provide examples of those subgroups.
Comment: Several commenters
expressed concern that the proposed
ELP Priority did not adequately address
the development of ELP standards with
which assessments developed under the
priority must be aligned. The
commenters recommended that the
Department revise the priority to
provide that the ELP standards be of
high quality. One commenter also stated
that the language of the proposed ELP
Priority was unclear regarding whether
EAG applicants would be required to
develop the ELP standards to which
assessments under the priority must be
aligned as an activity under a grant. A
few commenters specifically
recommended that we require grantees
to submit a detailed plan for developing
and implementing the ELP standards on
which they would base their ELP
assessments. Another commenter
recommended including in the ELP
Priority a provision requiring the
development of ELP standards or a
requirement that all members of a
consortium agree to the adoption and
implementation of common ELP
standards as a requirement for joining a
consortium. These commenters stated
that it would be impossible for a
consortium to successfully develop
common ELP assessments if each State
in the consortium had its own ELP
standards.
One commenter noted that linguistic
components of language embedded
within ELP standards may be necessary,
but are not sufficient, to measure the
extent to which English learners can
process and use language for specified
purposes or situations. This commenter
stated that it is the discourse level of
language that carries the ‘‘semantic load’’
supportive of communication that is
needed for college- and career-readiness.
Another commenter stated that the
ELP assessments developed under the
ELP Priority should be aligned with ELP
standards that correspond to content
standards not only in English language
arts but also in other subject areas.
Another commenter noted the
importance of effectively implementing
ELP standards, stating that, in an
aligned assessment system, standards
are the reference point for designing
proficiency measures, interpreting and
communicating assessment results, and
using assessment results to improve
teaching and learning.
Discussion: We agree that high-quality
ELP standards and their implementation
are a crucial foundation for the ELP
assessment systems to be developed
under the ELP Priority. Section 6112 of
the ESEA, which authorizes the EAG
program, does not authorize EAG funds
to be used for developing standards.
Therefore, the Department can make
awards under the EAG program only to
develop assessments. We are adding a
program requirement clarifying this
limitation.
We expect that the assessments
developed under the ELP Priority will
be aligned with high-quality ELP
standards, and are revising the ELP
Priority to more specifically define the
characteristics of high-quality ELP
standards to which the ELP assessments
should align.
Grants under the RTTA program,
which the ELP Priority is designed to
complement, are focused on
assessments that are aligned with
college- and career-ready standards in
English language arts and mathematics
that are held in common by multiple
States. Hence, we are providing that the
assessments developed under the ELP
Priority must be aligned with ELP
standards that correspond to common,
college- and career-ready standards in
English language arts and mathematics.
The ELP Priority does not preclude an
applicant from proposing to align the
ELP assessments with ELP standards
that include the academic language
necessary for college- and career-readiness in subjects in addition to
English language arts and mathematics.
We also expect that rigorous ELP
standards that correspond to a set of
college- and career-ready standards in
English language arts and mathematics
that are held in common by multiple
States and that are developed with
broad stakeholder involvement will
attend not only to the linguistic
components of language but also to the
discourse level of language.
Changes: We have revised paragraph
(a)(5) of the ELP Priority to more
specifically define the characteristics of
the ELP standards to which the ELP
assessments developed under the
program must align. Specifically, we
have indicated that those standards
must correspond to a common set of
college- and career-ready standards in
English language arts and mathematics,
and be rigorous, developed with broad
stakeholder involvement, and vetted
with experts and practitioners. The
standards also must be standards for
which external evaluations have
documented rigor and correspondence
to a common set of college- and career-ready standards in English language arts
and mathematics.
We removed the reference to States
adopting or utilizing any standards
developed under a proposed project
from paragraph (d) of the Collaborative
Efforts Priority in order to clarify that
EAG program funds may not be used to
develop standards. We also have added
a new requirement (e), which requires
grantees to ensure that EAG funds are
not used to support the development of
standards, such as under the ELP
Priority or any other priority. The
subsequent requirements have been renumbered accordingly.
Comment: Two commenters
expressed support for our approach to
ELP assessments for students with the
most significant cognitive disabilities.
One commenter suggested removing
paragraph (e) of the ELP Priority, which
requires applicants to include in their
applications the strategies the applicant
State or, if the applicant is part of a
consortium, all States in the consortium,
would use to assess the English
proficiency of English learners with the
most significant cognitive disabilities.
The commenter suggested replacing this
provision with a requirement that
grantees under the EAG program
coordinate with existing grantees
funded under the Individuals with
Disabilities Education Act (IDEA),
including the General Supervision
Enhancement Grant (GSEG) program, to
address the needs of English learners
with the most significant cognitive
disabilities. One commenter suggested
that we require applicants to indicate
how they would coordinate work under
an EAG grant awarded under the ELP
Priority with grants awarded under the
GSEG program.
Discussion: Recent awards under the
GSEG program are supporting the
development of alternate assessments
based on alternate achievement
standards that measure student
knowledge and skills against academic
content standards in English language
arts and mathematics held in common
by multiple States; these grants are not
supporting the development of alternate
ELP assessments for students with the
most significant cognitive disabilities.
We acknowledge the importance of
developing alternate ELP assessments
for English learners with the most
significant cognitive disabilities but,
due to limited resources, are not
including them in the ELP Priority.
There will be limited overlap in the
focus of the projects awarded under the
ELP Priority and the projects awarded
under the GSEG program because the
EAG grants will not be supporting the
development of alternate assessments
and because the GSEG awards, which
focus only on alternate assessments, are
not supporting the development of ELP
assessments. Accordingly, we decline to
require that EAG grantees coordinate
with GSEG grantees.
To clarify the reference to English
learners with the most significant
cognitive disabilities who are eligible to
participate in alternate assessments
based on alternate academic
achievement standards, we added the
relevant regulatory citation to
paragraphs (a)(10) and (a)(11) of the ELP
Priority.
Changes: We have added the relevant
regulatory citation, 34 CFR 200.6(a)(2),
to paragraphs (a)(10) and (a)(11) of the
ELP Priority.
Comment: One commenter
recommended that we consider adding
a priority to support the development of
assessments to measure proficiency in a
second language other than English for
States that support bilingual education
and bi-literacy.
Discussion: We recognize that
measuring student proficiency in a
second language other than English can
provide useful data to educators of such
students. States already have the
flexibility to develop such assessments,
which under certain circumstances may
be supported by ESEA funds in
accordance with section 6111 of the
ESEA.
We decline to make the suggested
change because we believe that
developing new ELP assessments is a
more pressing need than developing
assessments that measure student
proficiency in a second language other
than English. The Department has
provided funding under the RTTA
program to consortia that together
include 44 States and the District of
Columbia to develop new assessment
systems that measure student
knowledge and skills against a common
set of college- and career-ready
standards in English language arts and
mathematics. ELP assessments
corresponding to such common
standards will be needed when the
RTTA assessments are implemented,
and such assessments have not been
developed.
Changes: None.
Comment: One commenter noted that
addressing issues such as the
assessment of students whose education
has been interrupted might be more
appropriately addressed by the GSEG
program.
Discussion: The ELP Priority requires
that ELP assessment systems developed
under the priority accurately assess
English learners with limited or no
formal education, including students
whose education has been interrupted.
Data on the English proficiency of these
students can support efforts to improve
their instruction. The GSEG program focuses on assessments for students with disabilities, only some of whom are English learners and not necessarily English learners with interrupted education. We decline to make a change in response to this comment because it is beyond the scope of this program to make changes to other programs, such as the GSEG program.
Changes: None.
Priority 2—Collaborative Efforts Among
States
Comment: We received a variety of
comments on the paragraph in the
Collaborative Efforts Priority that
requires a consortium to include a
minimum of 15 States. One commenter
stated that providing grants to sizeable
consortia of States would maximize the
impact of program funds. Another
commenter suggested that the
Department establish an eligibility
restriction under which only consortia
would be eligible to apply and require
that a consortium include a minimum of
15 States that represent at least 30% of
the nation’s English learners. Another
commenter expressed concern that the
approach to consortia may result in
grants that do not include all States,
including some States with sizable
English learner populations. Two
additional commenters recommended
removing the proposed minimum
number of States in a consortium,
suggesting that a minimum of 15 States
would impose an unfair obstacle to
States and that improvement in
assessment quality will be achieved
through competition in the marketplace.
Discussion: States have indicated to
the Department their interest in working
together in consortia to develop
assessments aligned with common
standards. Because of the complexity of
developing and implementing
assessments and assessment-related
instruments, collaborative efforts
between and among States can yield
approaches that build on each State’s
expertise and experience, as well as
approaches that generate substantial
efficiencies in development,
administration, costs, and uses of
results. We believe that larger consortia
will make more effective use of EAG
funds by drawing on the expertise and
experience of more States, increasing
the potential impact across States, and
increasing the degree to which common
assessment tools are available to States
nationwide. However, we do not want
to limit States’ flexibility in forming
consortia by adding requirements in the
Collaborative Efforts Priority, such as a
requirement that a certain percentage of
English learners be represented by the
population of consortium member
States. We do not have the authority to
require all States to participate, and we
decline to prohibit individual States
from applying for an award under the
EAG program; as a result, we decline to
make the suggested changes in these
areas.
Changes: None.
Comment: While expressing general
support for the Collaborative Efforts
Priority, one commenter expressed
concern regarding the requirement to
have States sign a binding memorandum
of understanding to use assessments not
yet developed. The commenter
suggested that requiring a strong and
exclusive letter of support for one
consortium proposal would be a more
reasonable requirement.
Discussion: Under Department
regulations, all members of a
consortium applying for a grant must
enter into an agreement that (1) details
the activities that each member of the
consortium plans to perform; and (2)
binds each member of the consortium to
every statement and assurance made by
the applicant in its application. (34 CFR
75.128). In response to the commenters’
concerns that States may decide to leave
a consortium after receiving the grant,
we are revising paragraph (c)(3) of the
Collaborative Efforts Priority to require
applicants to include in their
applications protocols for member
States to leave a consortium and for new
member States to join a consortium. A
consortium of States applying for a grant
would have flexibility in determining
the roles that member States may play.
In addition, a State could enter or leave
a consortium according to the protocols
the consortium has established for this
purpose. In light of the Department’s
regulations and the changes being made
to provide flexibility to States, we
decline to require a strong and exclusive
letter of support rather than a binding
memorandum of understanding.
Changes: We have revised paragraph
(c)(3) of the Collaborative Efforts
Priority to require that applications from
consortia include protocols for member
States to leave the consortium and for
new member States to join the
consortium. We also revised paragraph
(d) of the Collaborative Efforts Priority
to indicate that, to remain in the
consortium, a State must adopt or use
any instrument, including to the extent
applicable, assessments, developed
under the proposed project no later than
the end of the project period.
Selection Criteria
Comment: One commenter,
expressing support for the selection
criteria, observed that the criteria
include all the essential principles
needed to govern the development and
implementation of high-quality,
rigorous, research-based assessment
practices.
Discussion: We agree that the
selection criteria should address the key
aspects of developing high-quality
assessments and that the selection
criteria, as designed, address those key
aspects.
Changes: None.
Comment: One commenter suggested
revising paragraph (5) of the assessment
design selection criterion, which
specifies the types of data that must be
provided by the assessments. The
commenter suggested adding the
following categories of data: types of
English learner program services, length
of time in the English learner program,
and level of English proficiency.
Discussion: Students’ levels of English
proficiency are already included among
the data the ELP assessments developed
under the ELP Priority must provide.
However, because the selection criteria
in this notice may be used in future
competitions, which may or may not
include the ELP Priority, we decline to
revise the selection criteria in a manner
that relates specifically to the ELP
Priority. For this same reason, we
decline to include in the selection
criteria the other types of data the
commenter suggested (i.e., English
learner program services, length of time
in the English learner program). In
addition, data regarding services
provided by English learner programs
and the length of time students are in
such programs are data that help assess
program effectiveness; they are not data
that ELP assessments provide.
Changes: None.
Comment: One commenter suggested
that we revise paragraph (b)(10) of the
assessment design selection criterion,
which addresses methods of scoring, to
allow for self-scoring of student
performance on assessments in order to
shorten the turnaround time for scoring.
Discussion: The selection criteria do
not specify the scoring methods that
grantees must use. Applicants may
propose to use a self-scoring approach,
as the commenter suggests, so long as
the approach is consistent with the
technical quality requirements for the
assessments.
Changes: None.
Comment: One commenter
recommended that paragraph (11) of the
assessment design selection criterion,
which addresses reports to be produced
based on the assessments, be revised to
include the provision of reports in a
language and format that parents can
understand.
Discussion: We agree with the
commenter that reports of assessment
data should be provided to parents in an
understandable and uniform format and,
to the extent practicable, in a language
that parents can understand, and have
revised this paragraph accordingly.
Changes: We have revised paragraph
(11) of the assessment design selection
criterion to provide for the
consideration of the extent to which reports produced based on the assessments will be presented in an understandable and uniform format and, to the extent practicable, in a language that parents can understand.
Comment: One commenter noted that
paragraph (1)(ii) of the proposed
assessment development plan selection
criterion, which described the types of
personnel to be involved in each
assessment development phase and
provided some examples of such
personnel, did not include references to
advocates for English learners or parents
of English learners. The commenter
suggested that the Department revise
this paragraph to include such
stakeholders in the examples provided.
Discussion: We agree that the list of
examples should include a reference to
other key stakeholders and have revised
the selection criterion accordingly.
However, because the selection criteria
may be used in future competitions,
which may or may not include the ELP
Priority, we decline to revise the
selection criteria in a manner that
relates specifically to the ELP Priority,
such as listing stakeholder groups
specific to English learners.
Changes: We have revised paragraph
(1)(ii) of the assessment development
plan selection criterion to include ‘‘other
key stakeholders’’ in the list of examples
provided.
Comment: One commenter expressed
support for the Department’s reference
to the use of representative sampling for
field testing in paragraph (5) of the
assessment development plan selection
criterion. This commenter suggested
that we revise this paragraph to specify
certain subgroups of English learners
that may be considered in a
representative sample.
Discussion: We agree with the
suggestion that the student populations
that should be considered for
representative sampling include high- and low-performing students, different
types of English learners, and students
with disabilities, and that it would be
helpful for applicants to have examples
of subgroups of English learners that
may be considered. We have revised
this paragraph to provide examples of
the subgroups of English learners that
may be considered in a representative
sample.
Changes: We have revised paragraph
(5) of the assessment development plan
selection criterion to include the
following examples of subgroups of
English learners that may be considered
in a representative sample: recently
arrived English learners, former English
learners, migratory English learners, and
English learners with disabilities.
Comment: One commenter expressed
support for the emphasis on research
and evaluation in the selection criteria.
Discussion: The Department agrees
that the selection criteria should include
a research and evaluation component
and believes that the selection criteria,
as designed, adequately consider
whether an applicant’s research and
evaluation plan will ensure that the
assessments developed are valid,
reliable, and fair for their intended
purposes.
Changes: None.
Comment: One commenter, while
expressing support for the emphasis on
professional capacity and outreach in
the selection criteria, stated that
mainstream and content-area teachers,
as well as English-as-a-second-language
and bilingual program educators and
administrators, should be included in
professional capacity and outreach
plans. The commenter also suggested
that such plans should address
additional factors relating to ELP
assessments, including the definition of
English learners, language proficiency
levels, exit criteria for programs and
services, and professional development
on the use of the assessments and
assessment results to inform and
improve instruction.
Discussion: The activities suggested
by the commenter are allowable under
the requirements for this program.
However, because the selection criteria
may be used in future competitions that
may or may not involve the ELP
Priority, we decline to make the
recommended changes to the selection
criterion.
Changes: None.
Requirements
Comment: One commenter
recommended that the requirement
related to evaluation be revised to
mandate that evidence from evaluation
activities be posted on a specific Web
site used by professionals who
specialize in issues related to English
learners in order to improve
dissemination of findings.
Discussion: The EAG requirements do
not preclude grantees from posting
information related to grant activities on
Web sites (provided that the appropriate
disclaimers are included). However, we
believe that specifying the manner in
which grantees make information
available would be unnecessarily
prescriptive. Therefore, we decline to
make the suggested change in order to
provide grantees with flexibility in how
they meet the requirement to make
information related to grant activities
available to the public.
Changes: None.
Comment: A couple of commenters
expressed concern regarding the
requirement that grantees develop a
strategy to make student-level data that
result from any assessments or other
assessment-related instruments
developed under the ELP Priority
available on an ongoing basis for
research, including for prospective
linking, validity, and program
improvement studies. One commenter
recommended that the requirements
affirmatively address the applicable
privacy safeguards under the ESEA and
the Family Educational Rights and
Privacy Act (FERPA) to ensure that
disaggregated data used to report
achievement results for subgroups
cannot be traced back to an identifiable
student. Another commenter suggested
removing the requirement due to
concerns about privacy issues and a
concern that limited funds for the grants
might be diverted to research or other
entities that have separate access to
governmental and non-governmental
funding sources. The commenter also
stated that the proposed requirements
included all necessary considerations
for validity, reliability, and fairness,
thereby making the need for further
research duplicative and superfluous.
Discussion: Eligible applicants
awarded a grant under the EAG program
must comply with FERPA and 34 CFR
Part 99, as well as State and local
requirements regarding privacy; we are
adding a footnote to the notice
reminding applicants that they must
comply with these requirements. With
regard to the concern that limited funds
for the grants might be diverted to
research, we note that the requirement
states that grant recipients must make
data available for further research, and
that grant recipients may only use grant
funds on research and evaluation
activities that fall within the scope of
the activities proposed in their
approved applications. In order to allow
for additional research that may prove
useful, we decline to remove the
requirement.
Changes: We have added a footnote to
requirement (c) (making student-level
data available for further research)
reminding applicants that they must
comply with FERPA and State and local
privacy requirements should they
receive an award under this program.
Comment: Two commenters
expressed concern regarding the
requirement that grantees, unless
otherwise protected by law or agreement
as proprietary information, make any
assessment content and other
assessment-related instruments
developed with EAG funds freely
available to States, technology platform
providers, and others that request it for
purposes of administering assessments,
provided that those parties receiving
assessment content comply with
consortium or State requirements for
test or item security. One commenter
reiterated that all instruments
developed with EAG funding must be
open-source and available to any State
requesting the use of the tools and
instruments. The other commenter
requested that we clarify that
assessments would be freely available to
States and others, including local
educational agencies. This commenter
recommended removing the phrase
‘‘unless otherwise protected by law or
agreement as proprietary information’’
from the requirement, and adding a
reference to making the information
available to local educational agencies.
Discussion: We cannot make a change
to protections of proprietary information
guaranteed by existing laws. In addition,
for work funded by the EAG program
and other Department-funded
discretionary grant programs, the
Department reserves a royalty-free,
nonexclusive, and irrevocable license to
reproduce, publish, or otherwise use,
and to authorize others to use, for
Federal Government purposes: The
copyright in any work developed under
a grant from the EAG program; and any
rights of copyright to which a grantee or
a contractor purchases ownership with
grant support. (34 CFR 80.34). At this
time we do not intend to exercise this
license with respect to any products
produced with EAG funds. If a grantee
develops a product but fails to make it
reasonably available to interested
entities, however, we may exercise our
license if doing so would further the
interests of the Federal Government. We
believe the requirement as originally stated, coupled with our license with respect to any products produced with EAG funds, will serve to make products produced with EAG funds adequately available. Additionally, we note
that this requirement is consistent with
requirements of the RTTA program (see
program requirement 6 ‘‘Making Work
Available,’’ in the RTTA program notice
inviting applications, 75 FR 18175
(April 9, 2010)). As a result, we decline
to make the suggested changes.
Changes: None.
Definitions
Comment: With regard to the
definition of a common set of college- and career-ready standards, one
commenter suggested revising the
definition to specify what constitutes a
‘‘significant number of States.’’
Discussion: In using the term
significant, we intended to indicate
multiple States rather than to refer to a
specific number of States. We agree that
the ELP Priority should be more specific
and have replaced the phrase
‘‘significant number of’’ with the term
‘‘multiple.’’
Changes: In the definition of common
set of college- and career-ready
standards, we have replaced the phrase
‘‘significant number of’’ with the term
‘‘multiple.’’
Funding
Comment: Some commenters
expressed concern about the amount of
funds anticipated to be available for
awards under a competition for EAG
funds involving the ELP Priority. Two
commenters stated that the information
they had from interviews and press
reports suggested that funding for the
development of ELP assessment systems
under the ELP Priority would be
limited, especially when compared to
funds available for recent Department
grants awarded under the RTTA and
GSEG programs. Another commenter
expressed concern that the amount of
funding that would be available for an
EAG competition involving the ELP
Priority would be too small, especially
in comparison with the RTTA and
GSEG programs that the new priorities
for the EAG program are designed to
complement. Some commenters
recommended that the Department
consider making additional funds
available to support the development of
ELP assessment systems under the EAG
program. Another commenter noted
that, based on its experience in
developing assessments, accomplishing the scope and scale of
work proposed in the notice of proposed
priorities, requirements, definitions, and
selection criteria would require more
than the $10.7 million appropriated for
the EAG in FY 2010 to be awarded in
2011. The commenter encouraged the
Department to provide funding for
grants under the EAG program
comparable to the amounts awarded
under the RTTA and GSEG programs.
Another commenter stated that $10.7
million would be inadequate to address
the needs of English learners through
the EAG program. Another commenter
recommended that the Department
provide awards of $30 million, and
suggested decreasing the estimated
number of awards if necessary to fund
grantees at this amount. None of the
commenters outlined specific
anticipated costs for the various
components of developing an ELP
assessment system, and only one
commenter suggested a specific amount
for awards.
Discussion: We cannot alter the
amount of funding that Congress
appropriated for the EAG program in the
FY 2010 budget. In developing our
estimates for the average size and range
of awards included in the FY 2011
notice inviting applications for new
awards for FY 2010 funds, published
elsewhere in this issue of the Federal
Register, we considered the costs of
efforts to develop ELP assessment
systems that the Department has
previously funded, the cost estimates for
activities under programs with similar
goals, and other information available
for estimating the costs of developing
assessment systems.
Changes: None.
Final Priorities:
English Language Proficiency
Assessment System. The Department
establishes a priority under the EAG
program for an English language
proficiency assessment system. To meet
this priority, an applicant must propose
a comprehensive plan to develop an
English language proficiency assessment
system that is valid, reliable, and fair for
its intended purpose. Such a plan must
include the following features:
(a) Design. The assessment system
must—
(1) Be designed for implementation in
multiple States;
(2) Be based on a common definition
of English learner adopted by the
applicant State and, if the applicant
applies as part of a consortium, adopted
and held in common by all States in the
consortium, where common with
respect to the definition of ‘‘English
learner’’ means identical for purposes of
the diagnostic (e.g., screener or
placement) assessments and associated
achievement standards used to classify
students as English learners as well as
the summative assessments and
associated achievement standards used
to exit students from English learner
status;
(3) At a minimum, include diagnostic
(e.g., screener or placement) and
summative assessments;
(4) Measure students’ English
proficiency against a set of English
language proficiency standards held by
the applicant State and, if the applicant
applies as part of a consortium, held in
common by all States in the consortium;
(5) Measure students’ English
proficiency against a set of English
language proficiency standards that
correspond to a common set of college- and career-ready standards (as defined
in this notice) in English language arts
and mathematics, are rigorous, are
developed with broad stakeholder
involvement, are vetted with experts
and practitioners, and for which
external evaluations have documented
rigor and correspondence with a
common set of college- and career-ready
standards in English language arts and
mathematics;
(6) Cover the full range of the English
language proficiency standards across
the four language domains of reading,
writing, speaking, and listening, as
required by section 3113(b)(2) of the
ESEA;
(7) Ensure that the measures of
students’ English proficiency consider
the students’ control over the linguistic
components of language (e.g.,
phonology, syntax, morphology);
(8) Produce results that indicate
whether individual students have
attained the English proficiency
necessary to participate fully in
academic instruction in English and
meet or exceed college- and career-ready
standards;
(9) Provide at least an annual measure
of English proficiency and student
progress in learning English for English
learners in kindergarten through grade
12 in each of the four language domains
of reading, writing, speaking, and
listening;
(10) Assess all English learners,
including English learners who are also
students with disabilities and students
with limited or no formal education,
except for English learners with the
most significant cognitive disabilities
who are eligible to participate in
alternate assessments based on alternate
academic achievement standards in
accordance with 34 CFR 200.6(a)(2); and
(11) Be accessible to all English
learners, including by providing
appropriate accommodations for English
learners with disabilities, except for
English learners with the most
significant cognitive disabilities who are
eligible to participate in alternate
assessments based on alternate
academic achievement standards in
accordance with 34 CFR 200.6(a)(2).
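For illustration only, the ‘‘common definition’’ requirement in paragraph (a)(2) can be read as a single set of classification and exit parameters held verbatim by every State in a consortium. The following sketch, in Python, is a hypothetical encoding of such a definition; the assessment names, proficiency scale, and cut scores are invented for the example and are not prescribed by this notice.

    # Hypothetical sketch only: a consortium-wide common definition of
    # "English learner." Every member State would hold these values
    # identically; the names and cut scores are invented for illustration.
    COMMON_EL_DEFINITION = {
        "diagnostic": {                               # screener/placement
            "assessment": "Consortium ELP Screener",  # hypothetical name
            "classify_as_el_below_level": 4,          # hypothetical cut, 1-6 scale
        },
        "summative": {                                # used to exit EL status
            "assessment": "Consortium ELP Summative", # hypothetical name
            "exit_at_or_above_level": 5,              # hypothetical cut, 1-6 scale
        },
    }

    def classify_as_english_learner(diagnostic_level):
        # True if the student is classified as an English learner.
        return diagnostic_level < COMMON_EL_DEFINITION["diagnostic"]["classify_as_el_below_level"]

    def eligible_to_exit(summative_level):
        # True if the student meets the common exit criterion.
        return summative_level >= COMMON_EL_DEFINITION["summative"]["exit_at_or_above_level"]

    print(classify_as_english_learner(2))  # True
    print(eligible_to_exit(5))             # True

Because both functions read from the one shared structure, the diagnostic and summative criteria are, by construction, identical across the consortium.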
(b) Technical quality. The assessment
system must measure students’ English
proficiency in ways that—
(1) Are consistent with nationally
recognized professional and technical
standards; and
(2) As appropriate, elicit complex
student demonstrations of
comprehension and production of
academic English (e.g., performance
tasks, selected responses, brief or
extended constructed responses).
(c) Data. The assessment system must
produce data that—
(1) Include student attainment of
English proficiency and student
progress in learning English (including
data disaggregated by English learner
subgroups such as English learners by
years in a language instruction
educational program; English learners
whose formal education has been
interrupted; students who were formerly
English learners by years out of the
language instruction educational
program; English learners by level of
English proficiency, such as those who
initially scored proficient on the English
language proficiency assessment;
English learners by disability status; and
English learners by native language);
(2) Provide a valid and reliable
measure of students’ abilities in each of
the four language domains (reading,
writing, speaking, and listening) and a
comprehensive English proficiency
score based on all four domains, with
each language domain score making a
significant contribution to the
comprehensive ELP score, at each
proficiency level; and
(3) Can be used for the—
(i) Identification of students as
English learners;
(ii) Decisions about whether a student
should exit from English language
instruction educational programs;
(iii) Determinations of school, local
educational agency, and State
effectiveness for the purposes of
accountability under Title I and Title III
of the ESEA;
(4) Can be used, as appropriate, as one
of multiple measures, to inform—
(i) Evaluations of individual
principals and teachers in order to
determine their effectiveness;
(ii) Determinations of principal and
teacher professional development and
support needs; and
(iii) Strategies to improve teaching,
learning, and language instruction
education programs.
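As an illustration of the data described in paragraphs (c)(1) and (c)(2), the sketch below computes a comprehensive English language proficiency score from the four domain scores and groups results by an English learner subgroup. The equal domain weights, the 100-point scale, and the record layout are assumptions made for the example; an operational system would set weights and scales through its own psychometric design.

    # Illustrative sketch only: a comprehensive ELP score built from the four
    # language domains, plus a simple disaggregation by subgroup. The equal
    # weights and 100-point scale are assumptions, not requirements of this
    # notice.
    from collections import defaultdict
    from statistics import mean

    DOMAINS = ("reading", "writing", "speaking", "listening")
    WEIGHTS = {domain: 0.25 for domain in DOMAINS}  # each domain contributes

    def comprehensive_score(domain_scores):
        # Weighted composite of the four domain scores.
        return sum(WEIGHTS[d] * domain_scores[d] for d in DOMAINS)

    students = [  # invented records for illustration
        {"id": "A", "subgroup": "English learner with a disability",
         "scores": {"reading": 62, "writing": 55, "speaking": 70, "listening": 68}},
        {"id": "B", "subgroup": "English learner with interrupted formal education",
         "scores": {"reading": 48, "writing": 40, "speaking": 66, "listening": 60}},
    ]

    by_subgroup = defaultdict(list)
    for student in students:
        by_subgroup[student["subgroup"]].append(comprehensive_score(student["scores"]))

    for subgroup, scores in sorted(by_subgroup.items()):
        print(subgroup, round(mean(scores), 1))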
(d) Compatibility. The assessment
system must use compatible approaches
to technology, assessment
administration, scoring, reporting, and
other factors that facilitate the coherent
inclusion of the assessments within
States’ student assessment systems.
(e) Students with the most significant
cognitive disabilities. The
comprehensive plan to develop an
English language proficiency assessment
system must include the strategies the
applicant State and, if the applicant is
part of a consortium, all States in the
consortium, plan to use to assess the
English proficiency of English learners
with the most significant cognitive
disabilities who are eligible to
participate in alternate assessments
based on alternate academic
achievement standards in accordance
with 34 CFR 200.6(a)(2) in lieu of
including those students in the
operational administration of the
assessments developed for other English
learners under a grant from this
competition.
Collaborative Efforts Among States.
The Department establishes a priority
under the EAG program for
collaborative efforts among States. To
meet this priority, an applicant must—
(a) Include a minimum of 15 States in
the consortium;
(b) Identify in its application a
proposed project management partner
and provide an assurance that the
proposed project management partner is
not partnered with any other eligible
applicant applying for an award under
this competition;[1]
(c) Provide a description of the
consortium’s structure and operation.
The description must include—
(1) The organizational structure of the
consortium (e.g., differentiated roles
that a member State may hold);
(2) The consortium’s method and
process (e.g., consensus, majority) for
making different types of decisions (e.g.,
policy, operational);
(3) The protocols by which the consortium will operate, including protocols for member States to change roles in the consortium, for member States to leave the consortium, and for new member States to join the consortium;
[1] In selecting a proposed project management partner, an eligible applicant must comply with the requirements for procurement in 34 CFR 80.36.
(4) The consortium’s plan, including
the process and timeline, for setting key
policies and definitions for
implementing the proposed project,
including, for any assessments
developed through a project funded by
this grant, the common set of standards
upon which to base the assessments, a
common set of performance-level
descriptors, a common set of
achievement standards, common
assessment administration procedures,
common item-release and test-security
policies, and a common set of policies
and procedures for accommodations
and student participation; and
(5) The consortium’s plan for
managing grant funds received under
this competition; and
(d) Provide a memorandum of
understanding or other binding
agreement executed by each State in the
consortium that includes an assurance
that, to remain in the consortium, the
State will adopt or use any instrument,
including to the extent applicable,
assessments, developed under the
proposed project no later than the end
of the project period.
Types of Priorities:
When inviting applications for a
competition using one or more
priorities, we designate the type of each
priority as absolute, competitive
preference, or invitational through a
notice in the Federal Register. The
effect of each type of priority follows:
Absolute priority: Under an absolute
priority, we consider only applications
that meet the priority (34 CFR
75.105(c)(3)).
Competitive preference priority:
Under a competitive preference priority,
we give competitive preference to an
application by (1) awarding additional
points, depending on the extent to
which the application meets the priority
(34 CFR 75.105(c)(2)(i)); or (2) selecting
an application that meets the priority
over an application of comparable merit
that does not meet the priority (34 CFR
75.105(c)(2)(ii)).
Invitational priority: Under an
invitational priority, we are particularly
interested in applications that meet the
priority. However, we do not give an
application that meets the priority a
preference over other applications (34
CFR 75.105(c)(1)).
Final Requirements:
The Department establishes the
following requirements for the
Enhanced Assessment Grants program.
We may apply one or more of these
requirements in any year in which a
competition for program funds is held.
An eligible applicant awarded a grant
under this program must:
(a) Evaluate the validity, reliability,
and fairness of any assessments or other
assessment-related instruments
developed under a grant from this
competition, and make available
documentation of evaluations of
technical quality through formal
mechanisms (e.g., peer-reviewed
journals) and informal mechanisms
(e.g., newsletters), both in print and
electronically;
(b) Actively participate in any
applicable technical assistance activities
conducted or facilitated by the
Department or its designees, coordinate
with the RTTA program in the
development of assessments under this
program, and participate in other
activities as determined by the
Department;
(c) Develop a strategy to make
student-level data that result from any
assessments or other assessment-related
instruments developed under a grant
from this competition available on an
ongoing basis for research, including for
prospective linking, validity, and
program improvement studies;[2]
(d) Ensure that any assessments or
other assessment-related instruments
developed under a grant from this
competition will be operational (ready
for large-scale administration) at the end
of the project period;
(e) Ensure that funds awarded under
the EAG program are not used to
support the development of standards,
such as under the English language
proficiency assessment system priority
or any other priority;
(f) Maximize the interoperability of any assessments and other assessment-related instruments developed with funds from this competition across technology platforms, and the ability for States to move their assessments from one technology platform to another, by doing the following, as applicable, for any such assessments—
(1) Developing all assessment items in
accordance with an industry-recognized
open-licensed interoperability standard
that is approved by the Department
during the grant period, without non-standard extensions or additions; and
(2) Producing all student-level data in a manner consistent with an industry-recognized open-licensed interoperability standard that is approved by the Department during the grant period;
[2] Eligible applicants awarded a grant under this program must comply with the Family Educational Rights and Privacy Act (FERPA) and 34 CFR Part 99, as well as State and local requirements regarding privacy.
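Requirement (f) leaves the choice of interoperability standard to the Department's later approval. Purely to illustrate the idea of producing student-level results in a platform-neutral form, the sketch below serializes one result record to JSON; the field names are invented for the example and do not represent any particular approved standard.

    # Illustration only: emitting a student-level result in a platform-neutral
    # serialization (JSON). The field names are invented; an actual grantee
    # would follow whichever open-licensed interoperability standard the
    # Department approves during the grant period.
    import json

    def to_portable_record(student_id, grade, domain_scores, composite_score):
        # Package one student's ELP results as a plain, portable dictionary.
        return {
            "studentId": student_id,
            "grade": grade,
            "domainScores": domain_scores,    # reading/writing/speaking/listening
            "compositeScore": composite_score,
            "formatVersion": "example-0.1",   # placeholder, not a real standard
        }

    record = to_portable_record(
        "A", 5, {"reading": 62, "writing": 55, "speaking": 70, "listening": 68}, 63.75)
    print(json.dumps(record, indent=2))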
(g) Unless otherwise protected by law
or agreement as proprietary information,
make any assessment content (i.e.,
assessments and assessment items) and
other assessment-related instruments
developed with funds from this
competition freely available to States,
technology platform providers, and
others that request it for purposes of
administering assessments, provided
that those parties receiving assessment
content comply with consortium or
State requirements for test or item
security; and
(h) For any assessments and other
assessment-related instruments
developed with funds from this
competition, use technology to the
maximum extent appropriate to
develop, administer, and score the
assessments and report results.
Final Definitions:
The Department establishes the
following definitions for the Enhanced
Assessment Grants program. We may
apply one or more of these definitions
in any year in which a competition for
program funds is held.
Common set of college- and career-ready standards means a set of
academic content standards for grades
K–12 held in common by multiple
States, that (a) define what a student
must know and be able to do at each
grade level; (b) if mastered, would
ensure that the student is college- and
career-ready by the time of high school
graduation; and (c) for any consortium
of States applying under the EAG
program, are substantially identical
across all States in the consortium.
A State in a consortium may
supplement the common set of college- and career-ready standards with
additional content standards, provided
that the additional standards do not
comprise more than 15 percent of the
State’s total standards for that content
area.
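The 15 percent ceiling on supplemental standards is a simple proportion of the State's total standards for the content area. A minimal check, using invented counts, might look like the following.

    # Minimal sketch of the 15 percent ceiling on supplemental standards.
    # The counts are invented for illustration.
    def supplement_within_limit(common_count, supplemental_count):
        # True if supplemental standards are no more than 15 percent of the
        # State's total standards for the content area.
        total = common_count + supplemental_count
        return supplemental_count <= 0.15 * total

    print(supplement_within_limit(common_count=90, supplemental_count=10))  # True: 10 of 100
    print(supplement_within_limit(common_count=80, supplemental_count=20))  # False: 20 of 100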
English language proficiency
assessment system, for purposes of the
English language proficiency assessment
system priority, means a system of
assessments that includes, at a
minimum, diagnostic (e.g., screener or
placement) and summative assessments
at each grade level from kindergarten
through grade 12 that cover the four
language domains of reading, writing,
speaking, and listening, as required by
section 3113(b)(2) of the ESEA, and that
meets all other requirements of the
priority.
English learner means a student who
is an English learner as defined by the
applicant consistent with the definition
of a student who is ‘‘limited English
proficient’’ as that term is defined in
section 9101(25) of the ESEA. If the
applicant submits an application on
behalf of a consortium, member States
must develop and adopt a common
definition of the term during the period
of the grant.
Student with a disability means a
student who has been identified as a
child with a disability under the
Individuals with Disabilities Education
Act, as amended.
Final Selection Criteria:
The Department establishes the
following selection criteria for the
Enhanced Assessment Grants program.
We may apply one or more of these
selection criteria in any year in which
a competition for program funds is held.
(a) Theory of action. The Secretary
reviews each application to determine
the extent to which the eligible
applicant’s theory of action is logical,
coherent, and credible, and will result
in improved student outcomes. In
determining the extent to which the
theory of action has these attributes, we
will consider the description of, and
rationale for—
(1) How the assessment results will be
used (e.g., at the State, local educational
agency, school, classroom, and student
levels);
(2) How the assessments and
assessment results will be incorporated
into coherent educational systems (i.e.,
systems that include standards,
assessments, curriculum, instruction,
and professional development) of the
State(s) participating in the grant; and
(3) How those educational systems as
a whole will improve student
achievement.
(b) Assessment design. The Secretary
reviews each application to determine
the extent to which the design of the
eligible applicant’s proposed
assessments is innovative, feasible, and
consistent with the theory of action. In
determining the extent to which the
design has these attributes, we will
consider—
(1) The number and types of
assessments, as appropriate (e.g.,
diagnostic assessments, summative
assessments);
(2) How the assessments will measure
student knowledge and skills against the
full range of the relevant standards,
including the standards against which
student achievement has traditionally
been difficult to measure, provide an
accurate measure of student proficiency
on those standards, including for
students who are high- and low-performing in academic areas, and
provide an accurate measure of student
progress in the relevant area over a full
academic year;
(3) How the assessments will produce
the required student performance data,
as described in the priority;
(4) How and when during the
academic year different types of student
data will be available to inform and
guide instruction, interventions, and
professional development;
(5) The types of data that will be
produced by the assessments, which
must include student achievement data
and other data specified in the relevant
priority;
(6) The uses of the data that will be
produced by the assessments, including
(but not limited to)—
(i) Determining individual student
achievement and student progress;
determining, as appropriate and as one
of multiple measures, individual
principal and teacher effectiveness, if
applicable; and professional
development and support needs;
(ii) Informing teaching, learning, and
program improvement; and
(7) The frequency and timing of
administration of the assessments, and
the rationale for these;
(8) The number and types of items
(e.g., performance tasks, selected
responses, observational rating, brief or
extended constructed responses) and
the distribution of item types within the
assessments, including the extent to
which the items will be varied and elicit
complex student demonstrations or
applications of knowledge, skills, and
approaches to learning, as appropriate
(descriptions should include a concrete
example of each item type proposed);
and the rationale for using these item
types and their distributions;
(9) The assessments’ administration
mode (e.g., paper-and-pencil, teacher
rating, computer-based, or other
electronic device), and the rationale for
the mode;
(10) The methods for scoring student
performance on the assessments, the
estimated turnaround times for scoring,
and the rationale for these; and
(11) The reports that will be produced
based on the assessments, and for each
report: The key data it will present; its
intended use; target audience (e.g.,
students, parents, teachers,
administrators, policymakers); and its
presentation in an understandable and
uniform format and, to the extent
practicable, in a language that parents
can understand.
(c) Assessment development plan.
The Secretary reviews each application
to determine the extent to which the
eligible applicant’s plan for developing
the proposed assessments will ensure
that the assessments are ready by the
end of the grant period for wide-scale
administration in a manner that is
timely, cost-effective, and consistent
with the proposed design and
incorporates a process for ongoing
feedback and improvement. In
determining the extent to which the
assessment development plan has these
attributes, we will consider—
(1)(i) The approaches for developing
assessment items (e.g., evidence-centered design, universal design) and
the rationale for using those approaches;
and the development phases and
processes to be implemented consistent
with the approaches; and
(ii) The types of personnel (e.g.,
practitioners, content experts,
assessment experts, experts in assessing
English learners, linguists, experts in
second language acquisition, experts in
assessing students with disabilities,
psychometricians, cognitive scientists,
institution of higher education
representatives, experts on career
readiness standards, and other key
stakeholders) involved in each
development phase and process;
(2) The approach and strategy for
designing and developing
accommodations, accommodation
policies, and methods for standardizing
the use of those accommodations for
students with disabilities;
(3) The approach and strategy for
ensuring scalable, accurate, and
consistent scoring of items, including
the approach and moderation system for
any human-scored items and the extent
to which teachers are trained and
involved in the administration and
scoring of assessments;
(4) The approach and strategy for
developing the reporting system; and
(5) The overall approach to quality
control and the strategy for field-testing
assessment items, accommodations,
scoring systems, and reporting systems,
including, with respect to assessment
items and accommodations, the use of
representative sampling of all types of
student populations, taking into
particular account high- and low-performing students, different types of
English learners (e.g., recently arrived
English learners, former English
learners, migratory English learners, and
English learners with disabilities), and
students with disabilities.
(d) Research and evaluation. The
Secretary reviews each application to
determine the extent to which the
eligible applicant’s research and
evaluation plan will ensure that the
assessments developed are valid,
reliable, and fair for their intended
purposes. In determining the extent to
which the research and evaluation plan
has these attributes, we will consider—
(1) The plan for identifying and
employing psychometric techniques
suitable for verifying, as appropriate to
each assessment, its construct,
consequential, and predictive validity;
external validity; reliability; fairness;
precision across the full performance
continuum; and comparability within
and across grade levels; and
(2) The plan for determining whether
the assessments are being implemented
as designed and the theory of action is
being realized, including whether the
intended effects on individuals and
institutions are being achieved.
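Selection criterion (d)(1) asks for a plan covering, among other attributes, reliability. As one familiar example of a psychometric technique such a plan might employ, and not as a required method, the sketch below computes Cronbach's alpha from a small, invented matrix of item scores.

    # One familiar reliability statistic, Cronbach's alpha, computed from an
    # invented matrix of item scores (rows are students, columns are items).
    # Offered only as an example of a technique an evaluation plan might use.
    import numpy as np

    def cronbach_alpha(item_scores):
        # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
        k = item_scores.shape[1]
        item_variances = item_scores.var(axis=0, ddof=1)
        total_variance = item_scores.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    scores = np.array([
        [2, 3, 3, 2],
        [4, 4, 5, 4],
        [1, 2, 2, 1],
        [3, 3, 4, 3],
        [5, 5, 5, 4],
    ])
    print(round(cronbach_alpha(scores), 3))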
(e) Professional capacity and
outreach. The Secretary reviews each
application to determine the extent to
which the eligible applicant’s plan for
implementing the proposed assessments
is feasible, cost-effective, and consistent
with the theory of action. In
determining the extent to which the
implementation plan has these
attributes, we will consider—
(1) The plan for supporting teachers
and administrators in implementing the
assessments and for developing, in an
ongoing manner, their professional
capacity to use the assessments and
results to inform and improve
instructional practice; and
(2) The strategy and plan for
informing the public and key
stakeholders (including teachers,
administrators, families, legislators, and
policymakers) in each State or in each
member State within a consortium
about the assessments and for building
support from the public and those
stakeholders.
(f) Technology approach. The
Secretary reviews each application to
determine the extent to which the
eligible applicant would use technology
effectively to improve the quality,
accessibility, cost-effectiveness, and
efficiency of the proposed assessments.
In determining the extent to which the
eligible applicant is using technology
effectively, we will consider—
(1) The description of, and rationale
for, the ways in which technology will
be used in assessment design,
development, administration, scoring,
and reporting; the types of technology to
be used (including whether the
technology is existing and commercially
available or is being newly developed);
and how other States or organizations
can re-use in a cost-effective manner
any technology platforms and
technology components developed
under this grant; and
(2) How technology-related
implementation or deployment barriers
will be addressed (e.g., issues relating to
local access to internet-based
assessments).
(g) Project management. The
Secretary reviews each application to
determine the extent to which the
eligible applicant’s project management
plan will result in implementation of
the proposed assessments on time,
within budget, and in a manner that is
financially sustainable over time. In
determining the extent to which the
project management plan has these
attributes, we will consider—
(1) The project workplan and
timeline, including, for each key
deliverable (e.g., necessary
procurements and any needed approvals
for human subjects research,
assessment, scoring and moderation
system, professional development
activities), the major milestones,
deadlines, and entities responsible for
execution;
(2) The approach to identifying,
managing, and mitigating risks
associated with the project;
(3) The extent to which the eligible
applicant’s budget is adequate to
support the development of assessments
that meet the requirements of the
priority and includes costs that are
reasonable in relation to the objectives,
design, and significance of the proposed
project and the number of students to be
served;
(4) For each applicant State or for
each member State within a consortium,
the estimated costs for the ongoing
administration, maintenance, and
enhancement of the operational
assessments after the end of the project
period for the grant and a plan for how
the State will fund the assessments over
time (including by allocating to the
assessments funds for existing State or
local assessments that will be replaced
by the new assessments); and
(5) The quality and commitment of
the personnel who will carry out the
proposed project, including the
qualifications, relevant training, and
experience of the project director and
other key project personnel, and the
extent to which the time commitments
of the project director and other key
project personnel are appropriate and
adequate to meet the objectives of the
proposed project.
This notice does not preclude us from
proposing additional priorities,
requirements, definitions, or selection
criteria, subject to meeting applicable
rulemaking requirements.
Note: This notice does not solicit
applications. In any year in which we choose
to use these priorities, requirements,
definitions, and selection criteria, we invite
applications through a notice in the Federal
Register.
Executive Order 12866: This notice
has been reviewed in accordance with
Executive Order 12866. Under the terms
of the order, we have assessed the
potential costs and benefits of this final
regulatory action.
The potential costs associated with
this final regulatory action are those
resulting from statutory requirements
and those we have determined as
necessary for administering this
program effectively and efficiently.
In assessing the potential costs and
benefits—both quantitative and
qualitative—of this final regulatory
action, we have determined that the
benefits of the final priorities,
requirements, definitions, and selection
criteria justify the costs.
We have determined, also, that this
final regulatory action does not unduly
interfere with State, local, and tribal
governments in the exercise of their
governmental functions.
We fully discussed the costs and
benefits of this regulatory action in the
notice of proposed priorities,
requirements, definitions, and selection
criteria. Elsewhere in this notice we
discuss the potential costs and benefits,
both quantitative and qualitative, of the
final priorities, requirements,
definitions, and selection criteria.
Intergovernmental Review: This
program is subject to Executive Order
12372 and the regulations in 34 CFR
part 79. One of the objectives of the
Executive order is to foster an
intergovernmental partnership and a
strengthened federalism. The Executive
order relies on processes developed by
State and local governments for
coordination and review of proposed
Federal financial assistance.
This document provides early
notification of our specific plans and
actions for this program.
Accessible Format: Individuals with
disabilities can obtain this document in
an accessible format (e.g., braille, large
print, audiotape, or computer diskette)
on request to the program contact
person listed under FOR FURTHER
INFORMATION CONTACT.
Electronic Access to This Document:
The official version of this document is
the document published in the Federal
Register. Free Internet access to the
official edition of the Federal Register
and the Code of Federal Regulations is
available via the Federal Digital System
at: https://www.gpo.gov/fdsys. At this
site you can view this document, as well
as all other documents of this
Department published in the Federal
Register, in text or Adobe Portable
Document Format (PDF). To use PDF
you must have Adobe Acrobat Reader,
which is available free at the site.
Dated: April 14, 2011.
Thelma Meléndez de Santa Ana,
Assistant Secretary for Elementary and
Secondary Education.
[FR Doc. 2011–9479 Filed 4–18–11; 8:45 am]
BILLING CODE 4000–01–P
Agencies
[Federal Register Volume 76, Number 75 (Tuesday, April 19, 2011)]
[Notices]
[Pages 21986-21997]
From the Federal Register Online via the Government Printing Office [www.gpo.gov]
[FR Doc No: 2011-9479]
[[Page 21985]]
Vol. 76
Tuesday,
No. 75
April 19, 2011
Part IV
Department of Education
-----------------------------------------------------------------------
Notice of Final Priorities, Requirements, Definitions, and Selection
Criteria; Notice
Federal Register / Vol. 76 , No. 75 / Tuesday, April 19, 2011 /
Notices
[[Page 21986]]
-----------------------------------------------------------------------
DEPARTMENT OF EDUCATION
Notice of Final Priorities, Requirements, Definitions, and
Selection Criteria
Enhanced Assessment Instruments; Catalog of Federal Domestic
Assistance (CFDA) Number: 84.368.
EAGENCY: Office of Elementary and Secondary Education, Department of
Education.
ACTION: Notice of final priorities, requirements, definitions, and
selection criteria.
-----------------------------------------------------------------------
SUMMARY: The Assistant Secretary for Elementary and Secondary
Education announces priorities, requirements, definitions, and
selection criteria under the Enhanced Assessment Instruments Grant
program, also called the Enhanced Assessment Grants (EAG) program. The
Assistant Secretary may use one or more of these priorities,
requirements, definitions, and selection criteria for competitions in
fiscal year (FY) 2011 and later years. We take these actions to focus
Federal financial assistance on the pressing need to improve the
assessment instruments and systems used by States to accurately measure
student academic achievement and growth under the Elementary and
Secondary Education Act of 1965, as amended (ESEA).
DATES: Effective Date: These priorities, requirements, definitions, and
selection criteria are effective May 19, 2011.
FOR FURTHER INFORMATION CONTACT: Collette Roney, U.S. Department of
Education, 400 Maryland Avenue, SW., Room 3W210, Washington, DC 20202.
Telephone: (202) 401-5245. E-mail: Collette.Roney@ed.gov.
If you use a telecommunications device for the deaf (TDD), call the
Federal Relay Service (FRS), toll free, at 1-800-877-8339.
SUPPLEMENTARY INFORMATION:
Purpose of Program: The purpose of the EAG program is to enhance
the quality of assessment instruments and systems used by States for
measuring the academic achievement of elementary and secondary school
students.
Program Authority: 20 U.S.C. 7301a.
Public Comment: We published a notice of proposed priorities,
requirements, definitions, and selection criteria for this program in
the Federal Register on January 7, 2011 (76 FR 1138). That notice
contained background information and our reasons for proposing the
particular priorities, requirements, definitions, and selection
criteria. In response to comments we received on the notice, we have
made revisions to Priority 1--English Language Proficiency Assessment
System (ELP Priority), Priority 2--Collaborative Efforts Among States
(Collaborative Efforts Priority), and the requirements, definitions,
and selection criteria.
Public Comment: In response to our invitation in the notice of
proposed priorities, requirements, definitions, and selection criteria,
15 parties submitted comments. We group major issues according to
subject. Generally, we do not address technical and other minor
changes.
Analysis of Comments and Changes: An analysis of the comments and
of any changes in the priorities, requirements, definitions, and
selection criteria since publication of the notice of proposed
priorities, requirements, definitions, and selection criteria follows.
Priority 1--English Language Proficiency (ELP) Assessment System
Comment: Many commenters expressed support for the ELP Priority and
its broad objective of promoting the development of high-quality ELP
assessment systems. Commenters stated that the priority addresses
assessment needs unique to English learners and that improvements in
assessments used to measure English learners' progress in and
attainment of English proficiency will support improvements in
curriculum and instruction for English learners, help raise their
educational achievement, and help close achievement gaps between
English learners and their English proficient-peers. Commenters also
stated that the priority promotes innovative, high-quality assessments
that are aligned with common college- and career-ready standards, which
will help prepare English learners for higher education and careers and
ensure that English learners have access to the same rigorous academic
content as all students. Another commenter stated that the use of
multiple measures of both academic and English proficiency will provide
more ongoing feedback to educators as well as students and their
families and offers the promise of greater validity and reliability in
assessments for the diverse population of English learners.
Discussion: We agree with the commenters that the development of
high-quality ELP assessments aligned with ELP standards that in turn
correspond to a common set of college- and career-ready standards in
English language arts and mathematics are likely to contribute to
improved teaching and learning for English learners. We appreciate the
commenters' recognition that we designed the ELP Priority to support
the development of high-quality diagnostic and summative assessments
that measure students' abilities in each of the four language domains
(reading, writing, speaking, and listening), in order to meet the
significant need for ELP assessments that correspond to college- and
career-ready standards held in common by multiple States.
Changes: None.
Comment: Some commenters recommended that the ELP assessment system
outlined in the ELP Priority be defined more explicitly and suggested
that the priority explicitly support the development of benchmark and
formative assessments as well as diagnostic and summative assessments.
The commenters expressed concern that formative assessments may be
under-emphasized in the resulting ELP assessment systems if they are
not explicitly included in the priority, and stated that many educators
prefer an ELP assessment system that includes benchmark and formative
assessments. One commenter stressed that the focus on assessments
developed under this priority should be on measuring students' progress
towards English proficiency. Another commenter recommended that the
limited amount of funds for the EAG program be focused on the
development of summative assessments only.
Discussion: We believe that two types of assessments are
particularly important for English learners: (1) Diagnostic assessments
(e.g., screener or placement tests), which can be used to determine
whether a student should be classified as an English learner, and (2)
summative assessments, which can be used to determine whether an
English learner has made progress toward and achieved grade-level
English proficiency and should no longer be classified as an English
learner. The ELP Priority does not preclude an applicant from including
benchmark or formative assessments in the ELP assessment system it
proposes to develop. However, because of the importance of diagnostic
and summative assessments to the implementation of Federal education
programs such as Title III of the ESEA, and given the limited resources
available, we decline to expand the ELP Priority to require more than
the development of diagnostic and summative ELP assessments.
We agree that clarification of the components for an ELP assessment
system developed under the ELP Priority would be helpful and have added
a definition of English language
[[Page 21987]]
proficiency (ELP) assessment system, for purposes of the ELP Priority.
Changes: We have added a definition of ELP assessment system to the
final definitions. The definition specifies that, for purposes of the
ELP Priority, ELP assessment system means a system of assessments that
includes, at a minimum, diagnostic (e.g., screener or placement) and
summative assessments at each grade level from kindergarten through
grade 12 that cover the four language domains of reading, writing,
speaking, and listening, as required by section 3113(b)(2) of the ESEA,
and that meets all other requirements of the priority. Consistent with
this change, we also have revised paragraphs (a)(2) and (a)(3) of the
ELP priority to include both screener and placement assessments as
examples of diagnostic assessments.
Comment: One commenter noted that schools implementing the ELP
assessment systems developed under the ELP Priority will need time to
transition to the new assessments and stated that the proposed
priorities, requirements, definitions, and selection criteria did not
address how an applicant would need to approach such a transition.
Discussion: Given the four-year project period we are planning for
grants under this program, we anticipate that some of the actions
needed to support the transition to new ELP assessment systems may take
place after the end of the project period, while other actions (e.g.,
developing professional capacity and outreach as described in the
selection criteria) will occur during the project period. Because
operational administration of the assessments is not required during
the project period, we are not requiring a complete transition plan.
Transition issues will be addressed by applicants, as necessary, in
response to selection criterion (e), the professional capacity and
outreach selection criterion, and we decline to add any additional
requirements relating to transition, as some of these activities may
occur outside the grant period. We note, in addition, that the
Department routinely provides guidance to the field on current
implementation issues and will continue to do so in the future.
Changes: None.
Comment: Some commenters expressed concern that the ELP Priority
does not adequately address coordination between the grants to be
awarded under the EAG program and grants already awarded under the RTTA
program. The commenters recommended that the ELP Priority require more
specific coordination between EAG and RTTA grants. They also suggested
that we ensure that ELP assessments developed under the EAG program be
embedded in work on assessments under the RTTA program, particularly
because of the academic language that students likely will need in
order to access the assessments to be developed under the RTTA grants.
Discussion: We understand the importance of ensuring that projects
funded under the EAG program and other Department programs related to
assessments coordinate efforts where appropriate. We plan to facilitate
coordination and technical assistance, as needed, across newly awarded
EAG projects and the RTTA grants. EAG and RTTA grantees will be
required to participate in such technical assistance and other
activities conducted or facilitated by the Department or its designees.
We are clarifying this expectation for coordination by adding language
to requirement (b) that will require EAG grantees to coordinate with
the RTTA program.
Changes: We have revised requirement (b) by adding a phrase that
requires EAG grantees to coordinate with the RTTA program in the
development of assessments under the EAG program.
Comment: One commenter stated that States will need guidance and
technical support from the Department on such implementation issues as
accountability, timeframes, and benchmarks for English learners'
linguistic and academic progress once States have developed their ELP
assessments. The commenter also asserted that, if the reauthorization
of the ESEA occurs prior to the development and implementation of ELP
assessment systems funded under the EAG program, a reauthorized ESEA
should not constrain such work.
Discussion: We recognize that the Department will need to work with
grantees and provide technical assistance on implementing new ELP
assessment systems. If a reauthorized ESEA requires changes to the
projects awarded under the EAG program, we will work with grantees to
make the necessary changes and provide guidance to the field, as
appropriate.
Changes: None.
Comment: A few commenters expressed concern with the examples of
linguistic components of language included in paragraph (a)(7) of the
proposed ELP Priority. One commenter suggested adding ``semantics and
graphophonemic'' to the list of examples. Another commenter suggested
removing the list of examples. One commenter stated that the linguistic
components should be embedded within ELP standards and that
determinations of students' English proficiency should not be limited
to the sum of students' abilities on any group of specific linguistic
components of language. This commenter expressed concern that the ELP
Priority could be interpreted as requiring the ELP assessment systems
to provide subscores on discrete linguistic components. Another
commenter stated that the ELP Priority should specify that measurement
of any linguistic component should be driven by the functions of
comprehension and expression. This commenter suggested revising the
priority to require the assessments to reflect the linguistic
components of language or demonstrate students' control over linguistic
components of language.
Discussion: Based on consideration of the comments and our further
review of this issue prompted by the comments, we revised the list of
the examples of linguistic components of language by removing
``vocabulary'' from the list. Use of the abbreviation ``e.g.'' in the
parenthetical indicates that the list is not exhaustive or definitive.
While a valid and reliable ELP assessment system should consider
students' control over the linguistic components of language, we do not
intend to require that the ELP assessment systems generate subscores
for the linguistic components of language. However, we do intend to
require that the ELP assessment systems generate a valid and reliable
measure of students' abilities in each of the four language domains and
are revising the priority accordingly.
Changes: We have revised paragraph (a)(7) of the ELP Priority to
indicate that the ELP assessment systems must ensure that the measures
of students' English proficiency consider students' control over the
linguistic components of language (e.g., phonology, syntax,
morphology). We also have revised paragraph (c)(2) of the ELP Priority
to state that ELP assessment systems developed under the priority must
provide a valid and reliable measure of students' abilities in each of
the four language domains and a comprehensive ELP score based on all
four domains, with each language domain score making a significant
contribution to the comprehensive ELP score, at each proficiency level.
To be consistent with revisions to paragraph (c)(2) of the ELP
Priority, we have revised paragraph (a)(9) of the ELP Priority to list
the four language domains.
Comment: Some commenters expressed concern about references to the
uses of data from the ELP
[[Page 21988]]
assessment systems for evaluations of teacher and principal
effectiveness. A few commenters outlined several concerns that may
limit the usefulness of ELP assessments in evaluating teacher and
principal effectiveness, for example: Limitations of current testing
instruments; difficulty in isolating the effects of a teacher or
principal on an individual student's scores, especially when multiple
teachers are involved in a student's instruction; a limited knowledge
base about growth in English learners' acquisition of English and how
to use measures of growth; and the complexities of using longitudinal
data, especially for English learners who tend to have high mobility
rates. One commenter noted that States could misinterpret the ELP
Priority as requiring student learning on an ELP assessment to be the
only measure of teacher effectiveness.
A few commenters suggested revising the ELP Priority to require the
use of multiple measures for evaluations of teacher and principal
effectiveness, as opposed to using ELP assessments as the sole measure
to evaluate teacher and principal effectiveness. One commenter
suggested removing the provisions of the ELP Priority that refer to the
use of ELP assessment data for informing evaluations of teacher and
principal effectiveness. A few commenters stated that ELP assessments
should be used for evaluations of teacher and principal effectiveness
only after a research base has been established to support the use of
the assessments for such purposes.
Discussion: The ELP Priority does not require that States or other
entities use data from ELP assessment systems developed under the
priority as the single measure of teacher and principal effectiveness.
The ELP Priority, in combination with the assessment design selection
criterion, is intended to signal that ELP assessment systems should be
developed so that, as appropriate, the data that they provide can be
used as one of multiple measures for teacher and principal evaluation.
We have revised the language in the ELP Priority and the assessment
design selection criterion to more clearly reflect that intent.
Changes: We have revised paragraph (c) of the ELP Priority to
distinguish those circumstances in which ELP assessment data can be
used as a single measure (paragraph (c)(3)) and those circumstances in
which ELP assessment data can be one measure along with other
appropriate measures (paragraph (c)(4)). We have included evaluations
of principal and teacher effectiveness in paragraph (c)(4). We have
also revised the assessment design selection criterion in paragraph
(b)(6)(i) to indicate that data from the assessments developed under
the EAG program should be used only as appropriate as one of multiple
measures for determinations of individual principal and teacher
effectiveness.
Comment: Many commenters raised questions about the references in
the ELP Priority to a ``common definition of `English learner'.'' One
commenter expressed support for the general approach of requiring a
common definition, noting that a common definition would ensure that
the data States provide on the total number of English learners being
served would be more accurate and consistent across the nation, thereby
allowing parents, educators, and other stakeholders to make comparisons
across States and the nation. Multiple commenters requested that the
Department clarify the meaning of the term ``common'' and had diverging
views on whether ``common'' should be defined as ``identical'' or
``similar'' (e.g., comparable and consistent). Commenters also asked
for clarification as to whether the reference to a ``common
definition'' applies to home language surveys, screening instruments,
procedures for identifying and classifying English learners,
definitions of language proficiency levels, and criteria for
determining the English proficiency of students and student exit from
English learner status.
Several commenters provided specific suggestions for how the term
``common'' should be interpreted when used in the phrase ``common
definition of English learner.'' One commenter recommended that the
common definition of English learner, including classification and exit
criteria, be based solely on the ELP assessment system and not on
academic performance. The commenter noted that excluding academic
performance measures would avoid problems of construct validity and
avoid confusing the ``English learner'' classification with non-
language-related criteria. Another commenter recommended that an
assessment of students' proficiency in their first language be
considered in the common definition of English learner. Another
commenter asked how subgroups of English learners would fit within a
common definition and how data on these subgroups would be collected,
disaggregated, reported, and used.
One commenter stated that requiring multiple States to change their
definition of English learner to a common definition would be an
unreasonable Federal administrative requirement that goes beyond the
intent of the ESEA. This commenter recommended removing paragraph
(a)(2) from the ELP Priority, which calls for States to adopt a common
definition of English learner.
Discussion: The term ``common,'' as used in a ``common definition
of English learner'' in paragraph (a)(2) of the ELP Priority, means an
identical definition of English learner with respect to certain
criteria, specifically: The diagnostic assessments and associated
achievement standards used to classify students as English learners, as
well as the summative assessments and associated achievement standards
used to exit students from English learner status. This definition is
the same for all subgroups of English learners, with the exception of
English learners with the most significant cognitive disabilities who
are eligible to participate in alternate assessments based on alternate
academic achievement standards in accordance with 34 CFR 200.6(a)(2).
Assessment of students' proficiency in their first language is beyond
the scope of the ELP Priority.
The use of a common definition of ``English learner'' and common
criteria for exiting a student from English learner status will help
ensure consistency in identifying English learners across the States in
a consortium. However, the term ``common'' for purposes of the ELP
Priority does not apply to other areas such as home language surveys,
program placement and instruction for students, and the duration of
program and support services for students. To clarify the scope of the
ELP Priority, we have added language to paragraph (a)(2) to indicate
that ``common'' means identical for purposes of the diagnostic
assessments and associated achievement standards used to classify
students as English learners as well as the summative assessments and
associated achievement standards used to exit students from English
learner status. To provide further clarity, we also substituted the
word ``common'' for the word ``uniform'' in the definition of English
learner.
We agree with the commenter that a common definition of English
learner should be based on the ELP assessments to be developed under
the priority, as reflected in paragraph (c)(3) of the ELP Priority. We
also agree that the priority should specifically reference subgroups of
English learners and, therefore, are adding language to paragraph
(c)(1) of the ELP Priority to require that the ELP assessment system
provide data that can be disaggregated by key English learner
subgroups.
Because participation in a grant under the EAG program is voluntary
and no
[[Page 21989]]
entity is required to participate and adopt a common definition of
English learner, we do not believe the requirement in the ELP Priority
regarding a common definition of English learner represents an
unreasonable Federal administrative requirement and therefore decline
to remove this provision from the priority.
Changes: We have revised paragraph (a)(2) of the ELP Priority to
indicate that ``common'' means identical for purposes of the diagnostic
and summative assessments and the associated achievement standards used
to classify students as English learners and exit students from English
learner status. We also substituted the word ``common'' for the word
``uniform'' in the definition of English learner. We have revised
paragraph (c)(1) of the ELP Priority to require that the ELP assessment
system provide data that can be disaggregated by key English learner
subgroups and to provide examples of those subgroups.
Comment: Several commenters expressed concern that the proposed ELP
Priority did not adequately address the development of ELP standards
with which assessments developed under the priority must be aligned.
The commenters recommended that the Department revise the priority to
provide that the ELP standards be of high quality. One commenter also
stated that the language of the proposed ELP Priority was unclear
regarding whether EAG applicants would be required to develop the ELP
standards to which assessments under the priority must be aligned as an
activity under a grant. A few commenters specifically recommended that
we require grantees to submit a detailed plan for developing and
implementing the ELP standards on which they would base their ELP
assessments. Another commenter recommended including in the ELP
Priority a provision requiring the development of ELP standards or a
requirement that all members of a consortium agree to the adoption and
implementation of common ELP standards as a requirement for joining a
consortium. These commenters stated that it would be impossible for a
consortium to successfully develop common ELP assessments if each State
in the consortium had its own ELP standards.
One commenter noted that linguistic components of language embedded
within ELP standards may be necessary, but are not sufficient, to
measure the extent to which English learners can process and use
language for specified purposes or situations. This commenter stated
that it is the discourse level of language that carries the ``semantic
load'' supportive of communication that is needed for college- and
career-readiness.
Another commenter stated that the ELP assessments developed under
the ELP Priority should be aligned with ELP standards that correspond
to content standards not only in English language arts but also in
other subject areas.
Another commenter noted the importance of effectively implementing
ELP standards, stating that, in an aligned assessment system, standards
are the reference point for designing proficiency measures,
interpreting and communicating assessment results, and using assessment
results to improve teaching and learning.
Discussion: We agree that high-quality ELP standards and their
implementation are a crucial foundation for the ELP assessment systems
to be developed under the ELP Priority. Section 6112 of the ESEA, which
authorizes the EAG program, does not authorize EAG funds to be used for
developing standards. Therefore, the Department can make awards under
the EAG program only to develop assessments. We are adding a program
requirement clarifying this limitation.
We expect that the assessments developed under the ELP Priority
will be aligned with high-quality ELP standards, and are revising the
ELP Priority to more specifically define the characteristics of high-
quality ELP standards to which the ELP assessments should align.
Grants under the RTTA program, which the ELP Priority is designed
to complement, are focused on assessments that are aligned with
college- and career-ready standards in English language arts and
mathematics that are held in common by a multiple States. Hence, we are
providing that the assessments developed under the ELP Priority must be
aligned with ELP standards that correspond to common, college- and
career-ready standards in English language arts and mathematics. The
ELP Priority does not preclude an applicant from proposing to align the
ELP assessments with ELP standards that include the academic language
necessary for college- and career-readiness in subjects in addition to
English language arts and mathematics. We also expect that rigorous ELP
standards that correspond to a set of college- and career-ready
standards in English language arts and mathematics that are held in
common by multiple States and that are developed with broad stakeholder
involvement will attend not only to the linguistic components of
language but also to the discourse level of language.
Changes: We have revised paragraph (a)(5) of the ELP Priority to
more specifically define the characteristics of the ELP standards to
which the ELP assessments developed under the program must align.
Specifically, we have indicated that those standards must correspond to
a common set of college- and career-ready standards in English language
arts and mathematics, and be rigorous, developed with broad stakeholder
involvement, and vetted with experts and practitioners. The standards
also must be standards for which external evaluations have documented
rigor and correspondence to a common set of college- and career-ready
standards in English language arts and mathematics.
We removed the reference to States adopting or utilizing any
standards developed under a proposed project from paragraph (d) of the
Collaborative Efforts Priority in order to clarify that EAG program
funds may not be used to develop standards. We also have added a new
requirement (e), which requires grantees to ensure that EAG funds are
not used to support the development of standards, such as under the ELP
Priority or any other priority. The subsequent requirements have been
re-numbered accordingly.
Comment: Two commenters expressed support for our approach to ELP
assessments for students with the most significant cognitive
disabilities. One commenter suggested removing paragraph (e) of the ELP
Priority, which requires applicants to include in their applications
the strategies the applicant State or, if the applicant is part of a
consortium, all States in the consortium, would use to assess the
English proficiency of English learners with the most significant
cognitive disabilities. The commenter suggested replacing this
provision with a requirement that grantees under the EAG program
coordinate with existing grantees funded under the Individuals with
Disabilities Education Act (IDEA), including the General Supervision
Enhancement Grant (GSEG) program, to address the needs of English
learners with the most significant cognitive disabilities. One
commenter suggested that we require applicants to indicate how they
would coordinate work under an EAG grant awarded under the ELP Priority
with grants awarded under the GSEG program.
Discussion: Recent awards under the GSEG program are supporting the
development of alternate assessments based on alternate achievement
standards that measure student knowledge and skills against academic
content standards in English language
arts and mathematics held in common by multiple States; these grants
are not supporting the development of alternate ELP assessments for
students with the most significant cognitive disabilities. We
acknowledge the importance of developing alternate ELP assessments for
English learners with the most significant cognitive disabilities but,
due to limited resources, are not including them in the ELP Priority.
There will be limited overlap in the focus of the projects awarded
under the ELP Priority and the projects awarded under the GSEG program
because the EAG grants will not be supporting the development of
alternate assessments and because the GSEG awards, which focus only on
alternate assessments, are not supporting the development of ELP
assessments. Accordingly, we decline to require that EAG grantees
coordinate with GSEG grantees.
To clarify the reference to English learners with the most
significant cognitive disabilities who are eligible to participate in
alternate assessments based on alternate academic achievement
standards, we added the relevant regulatory citation to paragraphs
(a)(10) and (a)(11) of the ELP Priority.
Changes: We have added the relevant regulatory citation, 34 CFR
200.6(a)(2), to paragraphs (a)(10) and (a)(11) of the ELP Priority.
Comment: One commenter recommended that we consider adding a
priority to support the development of assessments to measure
proficiency in a second language other than English for States that
support bilingual education and bi-literacy.
Discussion: We recognize that measuring student proficiency in a
second language other than English can provide useful data to educators
of such students. States already have the flexibility to develop such
assessments, which under certain circumstances may be supported by ESEA
funds in accordance with section 6111 of the ESEA.
We decline to make the suggested change because we believe that
developing new ELP assessments is a more pressing need than developing
assessments that measure student proficiency in a second language other
than English. The Department has provided funding under the RTTA
program to consortia that together include 44 States and the District
of Columbia to develop new assessment systems that measure student
knowledge and skills against a common set of college- and career-ready
standards in English language arts and mathematics. ELP assessments
corresponding to such common standards will be needed when the RTTA
assessments are implemented, and such assessments have not been
developed.
Changes: None.
Comment: One commenter suggested that issues such as the assessment
of students whose education has been interrupted might be more
appropriately addressed by the GSEG program.
Discussion: The ELP Priority requires that ELP assessment systems
developed under the priority accurately assess English learners with
limited or no formal education, including students whose education has
been interrupted. Data on the English proficiency of these students can
support efforts to improve their instruction. We decline to make a
change in response to this comment because it is beyond the scope of
this notice to make changes to other programs, such as the GSEG
program, and because the GSEG program focuses on assessments for
students with disabilities, only some of whom are English learners and
not necessarily English learners with interrupted education.
Changes: None.
Priority 2--Collaborative Efforts Among States
Comment: We received a variety of comments on the paragraph in the
Collaborative Efforts Priority that requires a consortium to include a
minimum of 15 States. One commenter stated that providing grants to
sizeable consortia of States would maximize the impact of program
funds. Another commenter suggested that the Department establish an
eligibility restriction under which only consortia would be eligible to
apply and require that a consortium include a minimum of 15 States that
represent at least 30% of the nation's English learners. Another
commenter expressed concern that the approach to consortia may result
in grants that do not include all States, including some States with
sizable English learner populations. Two additional commenters
recommended removing the proposed minimum number of States in a
consortium, suggesting that a minimum of 15 States would impose an
unfair obstacle to States and that improvement in assessment quality
would be achieved through competition in the marketplace.
Discussion: States have indicated to the Department their interest
in working together in consortia to develop assessments aligned with
common standards. Because of the complexity of developing and
implementing assessments and assessment-related instruments,
collaborative efforts between and among States can yield approaches
that build on each State's expertise and experience, as well as
approaches that generate substantial efficiencies in development,
administration, costs, and uses of results. We believe that larger
consortia will make more effective use of EAG funds by drawing on the
expertise and experience of more States, increasing the potential
impact across States, and increasing the degree to which common
assessment tools are available to States nationwide. However, we do not
want to limit States' flexibility in forming consortia by adding
requirements in the Collaborative Efforts Priority, such as a
requirement that a certain percentage of English learners be
represented by the population of consortium member States. We do not
have the authority to require all States to participate, and we decline
to prohibit individual States from applying for an award under the EAG
program; as a result, we decline to make the suggested changes in these
areas.
Changes: None.
Comment: While expressing general support for the Collaborative
Efforts Priority, one commenter expressed concern regarding the
requirement to have States sign a binding memorandum of understanding
to use assessments not yet developed. The commenter suggested that
requiring a strong and exclusive letter of support for one consortium
proposal would be a more reasonable requirement.
Discussion: Under Department regulations, all members of a
consortium applying for a grant must enter into an agreement that (1)
details the activities that each member of the consortium plans to
perform; and (2) binds each member of the consortium to every statement
and assurance made by the applicant in its application. (34 CFR
75.128). In response to the commenter's concern that States may decide
to leave a consortium after receiving the grant, we are revising
paragraph (c)(3) of the Collaborative Efforts Priority to require
applicants to include in their applications protocols for member States
to leave a consortium and for new member States to join a consortium. A
consortium of States applying for a grant would have flexibility in
determining the roles that member States may play. In addition, a State
could enter or leave a consortium according to the protocols the
consortium has established for this purpose. In light of the
Department's
regulations and the changes being made to provide flexibility to
States, we decline to require a strong and exclusive letter of support
rather than a binding memorandum of understanding.
Changes: We have revised paragraph (c)(3) of the Collaborative
Efforts Priority to require that applications from consortia include
protocols to allow States to leave the consortium and for new member
States to join the consortium. We also revised paragraph (d) of the
Collaborative Efforts Priority to indicate that, to remain in the
consortium, a State must adopt or use any instrument, including, to the
extent applicable, assessments, developed under the proposed project no
later than the end of the project period.
Selection Criteria
Comment: One commenter, expressing support for the selection
criteria, observed that the criteria include all the essential
principles needed to govern the development and implementation of high-
quality, rigorous, research-based assessment practices.
Discussion: We agree that the selection criteria should address the
key aspects of developing high-quality assessments and that the
selection criteria, as designed, address those key aspects.
Changes: None.
Comment: One commenter suggested revising paragraph (5) of the
assessment design selection criterion, which specifies the types of
data that must be provided by the assessments. The commenter suggested
adding the following categories of data: types of English learner
program services, length of time in the English learner program, and
level of English proficiency.
Discussion: Students' levels of English proficiency are already
included among the data the ELP assessments developed under the ELP
Priority must provide. However, because the selection criteria in this
notice may be used in future competitions, which may or may not include
the ELP Priority, we decline to revise the selection criteria in a
manner that relates specifically to the ELP Priority. For this same
reason, we decline to include in the selection criteria the other types
of data the commenter suggested (i.e., English learner program
services, length of time in the English learner program). In addition,
data regarding services provided by English learner programs and the
length of time students are in such programs are data that help assess
program effectiveness; they are not data that ELP assessments provide.
Changes: None.
Comment: One commenter suggested that we revise paragraph (b)(10)
of the assessment design selection criterion, which addresses methods
of scoring, to allow for self-scoring of student performance on
assessments in order to shorten the turnaround time for scoring.
Discussion: The selection criteria do not specify the scoring
methods that grantees must use. Applicants may propose to use a self-
scoring approach, as the commenter suggests, so long as the approach is
consistent with the technical quality requirements for the assessments.
Changes: None.
Comment: One commenter recommended that paragraph (11) of the
assessment design selection criterion, which addresses reports to be
produced based on the assessments, be revised to include the provision
of reports in a language and format that parents can understand.
Discussion: We agree with the commenter that reports of assessment
data should be provided to parents in an understandable and uniform
format and, to the extent practicable, in a language that parents can
understand, and have revised this paragraph accordingly.
Changes: We have revised paragraph (11) of the assessment design
selection criterion to provide for consideration of the extent to which
reports produced based on the assessments will be presented in an
understandable and uniform format and, to the extent practicable, in a
language that parents can understand.
Comment: One commenter noted that paragraph (1)(ii) of the proposed
assessment development plan selection criterion, which described the
types of personnel to be involved in each assessment development phase
and provided some examples of such personnel, did not include
references to advocates for English learners or parents of English
learners. The commenter suggested that the Department revise this
paragraph to include such stakeholders in the examples provided.
Discussion: We agree that the list of examples should include a
reference to other key stakeholders and have revised the selection
criterion accordingly. However, because the selection criteria may be
used in future competitions, which may or may not include the ELP
Priority, we decline to revise the selection criteria in a manner that
relates specifically to the ELP Priority, such as listing stakeholder
groups specific to English learners.
Changes: We have revised paragraph (1)(ii) of the assessment
development plan selection criterion to include ``other key
stakeholders'' in the list of examples provided.
Comment: One commenter expressed support for the Department's
reference to the use of representative sampling for field testing in
paragraph (5) of the assessment development plan selection criterion.
This commenter suggested that we revise this paragraph to specify
certain subgroups of English learners that may be considered in a
representative sample.
Discussion: We agree with the suggestion that the student
populations that should be considered for representative sampling
include high- and low-performing students, different types of English
learners, and students with disabilities, and that it would be helpful
for applicants to have examples of subgroups of English learners that
may be considered. We have revised this paragraph to provide examples
of the subgroups of English learners that may be considered in a
representative sample.
Changes: We have revised paragraph (5) of the assessment
development plan selection criterion to include the following examples
of subgroups of English learners that may be considered in a
representative sample: recently arrived English learners, former
English learners, migratory English learners, and English learners with
disabilities.
Comment: One commenter expressed support for the emphasis on
research and evaluation in the selection criteria.
Discussion: The Department agrees that the selection criteria
should include a research and evaluation component and believes that
the selection criteria, as designed, adequately consider whether an
applicant's research and evaluation plan will ensure that the
assessments developed are valid, reliable, and fair for their intended
purposes.
Changes: None.
Comment: One commenter, while expressing support for the emphasis
on professional capacity and outreach in the selection criteria, stated
that mainstream and content-area teachers, as well as English-as-a-
second language and bilingual program educators and administrators,
should be included in professional capacity and outreach plans. The
commenter also suggested that such plans should address additional
factors relating to ELP assessments, including the definition of
English learners, language proficiency levels, exit criteria for
programs and services, and professional development on the use of the
assessments and assessment results to inform and improve instruction.
Discussion: The activities suggested by the commenter are allowable
under the requirements for this program. However, because the selection
criteria may be used in future competitions that may or may not involve
the ELP Priority, we decline to make the recommended changes to the
selection criterion.
Changes: None.
Requirements
Comment: One commenter recommended that the requirement related to
evaluation be revised to mandate that evidence from evaluation
activities be posted on a specific Web site used by professionals who
specialize in issues related to English learners in order to improve
dissemination of findings.
Discussion: The EAG requirements do not preclude grantees from
posting information related to grant activities on Web sites (provided
that the appropriate disclaimers are included). However, we believe
that specifying the manner in which grantees make information available
would be unnecessarily prescriptive. Therefore, we decline to make the
suggested change in order to provide grantees with flexibility in how
they meet the requirement to make information related to grant
activities available to the public.
Changes: None.
Comment: Two commenters expressed concern regarding the requirement
requirement that grantees develop a strategy to make student-level data
that result from any assessments or other assessment-related
instruments developed under the ELP Priority available on an ongoing
basis for research, including for prospective linking, validity, and
program improvement studies. One commenter recommended that the
requirements affirmatively address the applicable privacy safeguards
under the ESEA and the Family Educational Rights and Privacy Act
(FERPA) to ensure that disaggregated data used to report achievement
results for subgroups cannot be traced back to an identifiable student.
Another commenter suggested removing the requirement due to concerns
about privacy issues and a concern that limited funds for the grants
might be diverted to research or other entities that have separate
access to governmental and non-governmental funding sources. The
commenter also stated that the proposed requirements included all
necessary considerations for validity, reliability, and fairness,
thereby making the need for further research duplicative and
superfluous.
Discussion: Eligible applicants awarded a grant under the EAG
program must comply with FERPA and 34 CFR Part 99, as well as State and
local requirements regarding privacy; we are adding a footnote to the
notice reminding applicants that they must comply with these
requirements. With regard to the concern that limited funds for the
grants might be diverted to research, we note that the requirement
states that grant recipients must make data available for further
research, and that grant recipients may only use grant funds on
research and evaluation activities that fall within the scope of the
activities proposed in their approved applications. In order to allow
for additional research that may prove useful, we decline to remove the
requirement.
Changes: We have added a footnote to requirement (c) (making
student-level data available for further research) reminding applicants
that they must comply with FERPA and State and local privacy
requirements should they receive an award under this program.
Comment: Two commenters expressed concern regarding the requirement
that grantees, unless otherwise protected by law or agreement as
proprietary information, make any assessment content and other
assessment-related instruments developed with EAG funds freely
available to States, technology platform providers, and others that
request it for purposes of administering assessments, provided that
those parties receiving assessment content comply with consortium or
State requirements for test or item security. One commenter reiterated
that all instruments developed with EAG funding must be open-source and
available to any State requesting the use of the tools and instruments.
The other commenter requested that we clarify that assessments would be
freely available to States and others, including local educational
agencies. This commenter recommended removing the phrase ``unless
otherwise protected by law or agreement as proprietary information''
from the requirement, and adding a reference to making the information
available to local educational agencies.
Discussion: We cannot make a change to protections of proprietary
information guaranteed by existing laws. In addition, for work funded
by the EAG program and other Department-funded discretionary grant
programs, the Department reserves a royalty-free, nonexclusive, and
irrevocable license to reproduce, publish, or otherwise use, and to
authorize others to use, for Federal Government purposes: The copyright
in any work developed under a grant from the EAG program; and any
rights of copyright to which a grantee or a contractor purchases
ownership with grant support. (34 CFR 80.34). At this time we do not
intend to exercise this license with respect to any products produced
with EAG funds. If a grantee develops a product but fails to make it
reasonably available to interested entities, however, we may exercise
our license if doing so would further the interests of the Federal
Government. We believe the requirement as originally stated, coupled
with our license with respect to any products produced with EAG funds,
will serve to make those products adequately available. Additionally,
we note that this requirement is consistent with
requirements of the RTTA program (see program requirement 6 ``Making
Work Available,'' in the RTTA program notice inviting applications, 75
FR 18175 (April 9, 2010)). As a result, we decline to make the
suggested changes.
Changes: None.
Definitions
Comment: With regard to the definition of a common set of college-
and career-ready standards, one commenter suggested revising the
definition to specify what constitutes a ``significant number of
States.''
Discussion: In using the term ``significant,'' we intended to indicate
multiple States rather than to refer to a specific number of States. We
agree that the ELP Priority should be more specific and have replaced
the phrase ``significant number of'' with the term ``multiple.''
Changes: In the definition of common set of college- and career-
ready standards, we have replaced the phrase ``significant number of''
with the term ``multiple.''
Funding
Comment: Some commenters expressed concern about the amount of
funds anticipated to be available for awards under a competition for
EAG funds involving the ELP Priority. Two commenters stated that the
information they had from interviews and press reports suggested that
funding for the development of ELP assessment systems under the ELP
Priority would be limited, especially when compared to funds available
for recent Department grants awarded under the RTTA and GSEG programs.
Another commenter expressed concern that the amount of funding that
would be available for an EAG competition involving the ELP Priority
would be too small, especially in comparison with the RTTA and
GSEG programs that the new priorities for the EAG program are designed
to complement. Some commenters recommended that the Department consider
making additional funds available to support the development of ELP
assessment systems under the EAG program. Another commenter noted that,
based on its experience in developing assessments, accomplishing the
scope and scale of work proposed in the notice of proposed priorities,
requirements, definitions, and selection criteria would cost more than
the $10.7 million appropriated for the EAG program in FY 2010 to be
awarded in FY 2011. The commenter encouraged the Department
to provide funding for grants under the EAG program comparable to the
amounts awarded under the RTTA and GSEG programs. Another commenter
stated that $10.7 million would be inadequate to address the needs of
English learners through the EAG program. Another commenter recommended
that the Department provide awards of $30 million, and suggested
decreasing the estimated number of awards if necessary to fund grantees
at this amount. None of the commenters outlined specific anticipated
costs for the various components of developing an ELP assessment
system, and only one commenter suggested a specific amount for awards.
Discussion: We cannot alter the amount of funding that Congress
appropriated for the EAG program in the FY 2010 budget. In developing
our estimates for the average size and range of awards included in the
FY 2011 notice inviting applications for new awards for FY 2010 funds,
published elsewhere in this issue of the Federal Register, we
considered the costs of efforts to develop ELP assessment systems that
the Department has previously funded, the cost estimates for activities
under programs with similar goals, and other information available for
estimating the costs of developing assessment systems.
Changes: None.
Final Priorities:
English Language Proficiency Assessment System. The Department
establishes a priority under the EAG program for an English language
proficiency assessment system. To meet this priority, an applicant must
propose a comprehensive plan to develop an English language proficiency
assessment system that is valid, reliable, and fair for its intended
purpose. Such a plan must include the following features:
(a) Design. The assessment system must--
(1) Be designed for implementation in multiple States;
(2) Be based on a common definition of English learner adopted by
the applicant State and, if the applicant applies as part of a
consortium, adopted and held in common by all States in the consortium,
where common with respect to the definition of ``English learner''
means identical for purposes of the diagnostic (e.g., screener or
placement) assessments and associated achievement standards used to
classify students as English learners as well as the summative
assessments and associated achievement standards used to exit students
from English learner status;
(3) At a minimum, include diagnostic (e.g., screener or placement)
and summative assessments;
(4) Measure students' English proficiency against a set of English
language proficiency standards held by the applicant State and, if the
applicant applies as part of a consortium, held in common by all States
in the consortium;
(5) Measure students' English proficiency against a set of English
language proficiency standards that correspond to a common set of
college- and career-ready standards (as defined in this notice) in
English language arts and mathematics, are rigorous, are developed with
broad stakeholder involvement, are vetted with experts and
practitioners, and for which external evaluations have documented rigor
and correspondence with a common set of college- and career-ready
standards in English language arts and mathematics;
(6) Cover the full range of the English language proficiency
standards across the four language domains of reading, writing,
speaking, and listening, as required by section 3113(b)(2) of the ESEA;
(7) Ensure that the measures of students' English proficiency
consider the students' control over the linguistic components of
language (e.g., phonology, syntax, morphology);
(8) Produce results that indicate whether individual students have
attained the English proficiency necessary to participate fully in
academic instruction in English and meet or exceed college- and career-
ready standards;
(9) Provide at least an annual measure of English proficiency and
student progress in learning English for English learners in
kindergarten through grade 12 in each of the four language domains of
reading, writing, speaking, and listening;
(10) Assess all English learners, including English learners who
are also students with disabilities and students with limited or no
formal education, except for English learners with the most significant
cognitive disabilities who are eligible to participate in alternate
assessments based on alternate academic achievement standards in
accordance with 34 CFR 200.6(a)(2); and
(11) Be accessible to all English learners, including by providing
appropriate accommodations for English learners with disabilities,
except for English learners with the most significant cognitive
disabilities who are eligible to participate in alternate assessments
based on alternate academic achievement standards in accordance with 34
CFR 200.6(a)(2).
(b) Technical quality. The assessment system must measure students'
English proficiency in ways that--
(1) Are consistent with nationally recognized professional and
technical standards; and
(2) As appropriate, elicit complex student demonstrations of
comprehension and production of academic English (e.g., performance
tasks, selected responses, brief or extended constructed responses).
(c) Data. The assessment system must produce data that--
(1) Include student attainment of English proficiency and student
progress in learning English (including data disaggregated by English
learner subgroups such as English learners by years in a language
instruction educational program; English learners whose formal
education has been interrupted; students who were formerly English
learners by years out of the language instruction educational program;
English learners by level of English proficiency, such as those who
initially scored proficient on the English language proficiency
assessment; English learners by disability status; and English learners
by native language);
(2) Provide a valid and reliable measure of students' abilities in
each of the four language domains (reading, writing, speaking, and
listening) and a comprehensive English proficiency score based on all
four domains, with each language domain score making a significant
contribution to the comprehensive ELP score, at each proficiency level;
and
(3) Can be used for the--
(i) Identification of students as English learners;
(ii) Decisions about whether a student should exit from English
language instruction educational programs;
(iii) Determinations of school, local educational agency, and State
effectiveness for the purposes of
accountability under Title I and Title III of the ESEA;
(4) Can be used, as appropriate, as one of multiple measures, to
inform--
(i) Evaluations of individual principals and teachers in order to
determine their effectiveness;
(ii) Determinations of principal and teacher professional
development and support needs; and
(iii) Strategies to improve teaching, learning, and language
instruction education programs.
(d) Compatibility. The assessment system must use compatible
approaches to technology, assessment administration, scoring,
reporting, and other factors that facilitate the coherent inclusion of
the assessments within States' student assessment systems.
(e) Students with the most significant cognitive disabilities. The
comprehensive plan to develop an English language proficiency
assessment system must include the strategies the applicant State and,
if the applicant is part of a consortium, all States in the consortium,
plan to use to assess the English proficiency of English learners with
the most significant cognitive disabilities who are eligible to
participate in alternate assessments based on alternate academic
achievement standards in accordance with 34 CFR 200.6(a)(2) in lieu of
including those students in the operational administration of the
assessments developed for other English learners under a grant from
this competition.
Collaborative Efforts Among States. The Department establishes a
priority under the EAG program for collaborative efforts among States.
To meet this priority, an applicant must--
(a) Include a minimum of 15 States in the consortium;
(b) Identify in its application a proposed project management
partner and provide an assurance that the proposed project management
partner is not partnered with any other eligible applicant applying for
an award under this competition; \1\
---------------------------------------------------------------------------
\1\ In selecting a proposed project management partner, an
eligible applicant must comply with the requirements for procurement
in 34 CFR 80.36.
---------------------------------------------------------------------------
(c) Provide a description of the consortium's structure and
operation. The description must include--
(1) The organizational structure of the consortium (e.g.,
differentiated roles that a member State may hold);
(2) The consortium's method and process (e.g., consensus, majority)
for making different types of decisions (e.g., policy, operational);
(3) The protocols by which the consortium will operate, including
protocols for member States to change roles in the consortium, for
member States to leave the consortium, and for new member States to
join the consortium;
(4) The consortium's plan, including the process and timeline, for
setting key policies and definitions for implementing the proposed
project, including, for any assessments developed through a project
funded by this grant, the common set of standards upon which to base
the assessments, a common set of performance-level descriptors, a
common set of achievement standards, common assessment administration
procedures, common item-release and test-security policies, and a
common set of policies and procedures for accommodations and student
participation; and
(5) The consortium's plan for managing grant funds received under
this competition; and
(d) Provide a memorandum of understanding or other binding
agreement executed by each State in the consortium that includes an
assurance that, to remain in the consortium, the State will adopt or
use any instrument, including, to the extent applicable, assessments,
developed under the proposed project no later than the end of the
project period.
Types of Priorities:
When inviting applications for a competition using one or more
priorities, we designate the type of each priority as absolute,
competitive preference, or invitational through a notice in the Federal
Register. The effect of each type of priority follows:
Absolute priority: Under an absolute priority, we consider only
applications that meet the priority (34 CFR 75.105(c)(3)).
Competitive preference priority: Under a competitive preference
priority, we give competitive preference to an application by (1)
awarding additional points, depending on the extent to which the
application meets the priority (34 CFR 75.105(c)(2)(i)); or (2)
selecting an application that meets the priority over an application of
comparable merit that does not meet the priority (34 CFR
75.105(c)(2)(ii)).
Invitational priority: Under an invitational priority, we are
particularly interested in applications that meet the priority.
However, we do not give an application that meets the priority a
preference over other applications (34 CFR 75.105(c)(1)).
Final Requirements:
The Department establishes the following requirements for the
Enhanced Assessment Grants program. We may apply one or more of these
requirements in any year in which a competition for program funds is
held. An eligible applicant awarded a grant under this program must:
(a) Evaluate the validity, reliability, and fairness of any
assessmen