Proposed Priorities-Enhanced Assessment Instruments, 22550-22555 [2016-08726]
22550
Federal Register / Vol. 81, No. 74 / Monday, April 18, 2016 / Proposed Rules
SUMMARY: This document provides a
notice of public hearing on proposed
regulations that would require annual
country-by-country reporting by certain
United States persons that are the
ultimate parent entity of a multinational
enterprise group.
DATES: The public hearing is being held
on Friday, May 13, 2016, at 10 a.m. The
IRS must receive outlines of the topics
to be discussed at the public hearing by
Friday, April 29, 2016.
ADDRESSES: The public hearing is being
held in the IRS Auditorium, Internal
Revenue Service Building, 1111
Constitution Avenue NW., Washington,
DC 20224. Due to building security
procedures, visitors must enter at the
Constitution Avenue entrance. In
addition, all visitors must present photo
identification to enter the building.
Send Submissions to CC:PA:LPD:PR
(REG–109822–15), Room 5205, Internal
Revenue Service, P.O. Box 7604, Ben
Franklin Station, Washington, DC
20044. Submissions may be hand-delivered Monday through Friday to
CC:PA:LPD:PR (REG–109822–15),
Couriers Desk, Internal Revenue
Service, 1111 Constitution Avenue NW.,
Washington, DC 20224 or sent
electronically via the Federal
eRulemaking Portal at
www.regulations.gov (IRS REG–109822–
15).
FOR FURTHER INFORMATION CONTACT:
Concerning the proposed regulations,
Melinda Harvey at (202) 317–6934;
concerning submissions of comments,
the hearing, and/or to be placed on the
building access list to attend the hearing,
Oluwafunmilayo Taylor at (202) 317–
6901 (not toll-free numbers).
SUPPLEMENTARY INFORMATION: The
subject of the public hearing is the
notice of proposed rulemaking (REG–
109822–15) that was published in the
Federal Register on Wednesday,
December 23, 2015 (80 FR 79795).
The rules of 26 CFR 601.601(a)(3)
apply to the hearing. Persons who wish
to present oral comments at the hearing
and who submitted written comments by
March 22, 2016, must submit an outline
of the topics to be addressed and the
amount of time to be devoted to each
topic by Friday, April 29, 2016.
A period of 10 minutes is allotted to
each person for presenting oral
comments. After the deadline for
receiving outlines has passed, the IRS
will prepare an agenda containing the
schedule of speakers. Copies of the
agenda will be made available, free of
charge, at the hearing or in the Freedom
of Information Reading Room (FOIA RR)
(Room 1621), which is located at the
11th and Pennsylvania Avenue NW.
entrance, 1111 Constitution Avenue
NW., Washington, DC 20224.
Because of access restrictions, the IRS
will not admit visitors beyond the
immediate entrance area more than 30
minutes before the hearing starts. For
information about having your name
placed on the building access list to
attend the hearing, see the FOR FURTHER
INFORMATION CONTACT section of this
document.
Martin V. Franks,
Chief, Publications and Regulations Branch,
Legal Processing Division, Associate Chief
Counsel, (Procedure and Administration).
[FR Doc. 2016–08882 Filed 4–15–16; 8:45 am]
BILLING CODE 4830–01–P
DEPARTMENT OF EDUCATION
34 CFR Chapter II
[Docket ID ED–2016–OESE–0004; CFDA
Number: 84.368A.]
Proposed Priorities—Enhanced
Assessment Instruments
AGENCY: Office of Elementary and
Secondary Education, Department of
Education.
ACTION: Proposed priorities.
SUMMARY: The Assistant Secretary for
Elementary and Secondary Education
proposes priorities under the Enhanced
Assessment Instruments Grant program,
also called the Enhanced Assessment
Grants (EAG) program. The Assistant
Secretary may use one or more of these
priorities for competitions using funds
from fiscal year (FY) 2016 and later
years. Depending on the availability of
funds and the use of other priorities
under the EAG authority, the Assistant
Secretary may also choose not to use
one or more of these priorities for
competitions using funds from FY 2016
and later years. These proposed
priorities are designed to support
projects to improve States’ assessment
systems.
DATES: We must receive your comments
on or before May 18, 2016.
ADDRESSES: Submit your comments
through the Federal eRulemaking Portal
or via postal mail, commercial delivery,
or hand delivery. We will not accept
comments submitted by fax or by email
or those submitted after the comment
period. To ensure that we do not receive
duplicate copies, please submit your
comments only once. In addition, please
include the Docket ID and the term
‘‘Enhanced Assessment Grants—
Comments’’ at the top of your
comments.
• Federal eRulemaking Portal: Go to
www.regulations.gov to submit your
comments electronically. Information
on using Regulations.gov, including
instructions for accessing agency
documents, submitting comments, and
viewing the docket, is available on the
site under the ‘‘Help’’ tab.
• Postal Mail, Commercial Delivery,
or Hand Delivery: If you mail or deliver
your comments about these proposed
priorities, address them to the Office of
Elementary and Secondary Education,
Attention: Enhanced Assessment
Grants—Comments, U.S. Department of
Education, 400 Maryland Avenue SW.,
Room 3e124, Washington, DC 20202–
6132.
Privacy Note: The Department of
Education’s (Department’s) policy is to make
all comments received from members of the
public available for public viewing in their
entirety on the Federal eRulemaking Portal at
www.regulations.gov. Therefore, commenters
should be careful to include in their
comments only information that they wish to
make publicly available.
FOR FURTHER INFORMATION CONTACT:
Donald Peasley. Telephone: (202) 453–
7982 or by email: donald.peasley@
ed.gov.
If you use a telecommunications
device for the deaf (TDD) or a text
telephone (TTY), call the Federal Relay
Service (FRS), toll free, at 1–800–877–
8339.
SUPPLEMENTARY INFORMATION:
Invitation to Comment: We invite you
to submit comments regarding this
notice. To ensure that your comments
have maximum effect in developing the
notice of final priorities, we urge you to
identify clearly the specific proposed
priority that each comment addresses.
We invite you to assist us in
complying with the specific
requirements of Executive Orders 12866
and 13563 and their overall requirement
of reducing regulatory burden that
might result from these proposed
priorities. Please let us know of any
further ways we could reduce potential
costs or increase potential benefits
while preserving the effective and
efficient administration of the program.
During and after the comment period,
you may inspect all public comments
about these proposed priorities by
accessing regulations.gov. You may also
inspect the comments in room 3e124,
400 Maryland Avenue SW.,
Washington, DC, between the hours of
8:30 a.m. and 4:00 p.m., Washington,
DC time, Monday through Friday of
each week except Federal holidays.
Assistance to Individuals with
Disabilities in Reviewing the
Rulemaking Record: On request we will
provide an appropriate accommodation
or auxiliary aid to an individual with a
disability who needs assistance to
review the comments or other
documents in the public rulemaking
record for this notice. If you want to
schedule an appointment for this type of
accommodation or auxiliary aid, please
contact the person listed under FOR
FURTHER INFORMATION CONTACT.
Purpose of Program: The purpose of
the EAG program is to enhance the
quality of assessment instruments and
systems used by States for measuring
the academic achievement of
elementary and secondary school
students.
Program Authority: Section 6112 of
the Elementary and Secondary
Education Act of 1965 (ESEA), as
amended by the No Child Left Behind
Act of 2001 (NCLB), and section
1203(b)(1) of the ESEA, as amended by
the Every Student Succeeds Act (Pub. L.
114–95) (ESSA).
Proposed Priorities:
This notice contains three proposed
priorities.
Background:
Section 6112 of the ESEA, as
amended by the NCLB, and section
1203(b)(1) of the ESEA, as amended by
the ESSA, authorize the Department to
make competitive grant awards to State
educational agencies (SEAs) and
consortia of SEAs to help them enhance
the quality of their assessment
instruments and assessment systems.1
Under these provisions, State grantees
must meet at least one of the program’s
statutory priorities, including
collaborating with organizations to
improve the quality, validity, reliability,
and efficiency of academic assessments;
measuring student academic
achievement using multiple measures
from multiple sources; measuring
student growth on State assessments;
and evaluating student academic
achievement through the development
of comprehensive academic assessment
instruments and methods.
The grants awarded under this
competitive grant award program in
section 6112 will also lay the
groundwork for some new opportunities
in the recently reauthorized Elementary
and Secondary Education Act of 1965,
as amended by the Every Student
Succeeds Act.1

1 The Consolidated Appropriations Act, 2016 (Pub. L. 114–113) appropriated funds for the EAG program under section 6112 of the ESEA, as amended by the NCLB. As such, the upcoming EAG competition will be conducted under that authority. The Department is also establishing these priorities under the authority in section 1203(b)(1) of the ESEA, as amended by the ESSA, which, if funded, would replace the EAG program under section 6112. These priorities may also be used in any competition conducted after FY 2016 under that authority.

For example, the
reauthorization of the ESEA will allow up
to seven States or consortia of States to
receive an initial demonstration
authority to establish an innovative
assessment and accountability system
for a new approach to assessment for a
trial period of up to five years. This can
provide SEAs with an opportunity to
demonstrate what is possible when
assessment systems are redesigned with
student learning at the center. The EAG
program provides SEAs with support to
develop innovative assessment tools
and approaches which have the
potential to be used by all States,
including those approved under the
innovative assessment and
accountability demonstration authority,
and be more widely adopted at scale. In
addition, the EAG program provides
SEAs with support in developing
innovative summative assessment tools
and approaches that can be used within
the broader context of the multiple
measures of student achievement and
school accountability of the new ESSA
and the President’s Testing Action Plan.
Through this notice, the Department
proposes three additional priorities for
the EAG program that are designed to
support States in continuously
improving their assessment systems to
measure college- and career-readiness.
We believe that an essential part of
educating students involves assessing
students’ progress toward meeting the
high standards they need to be ready for
college and the workplace. Assessments
provide necessary information for
States, districts, educators, families, the
public, and students themselves to
measure progress and improve
outcomes for all learners. As such, we
recognize the importance of
continuously improving and innovating
to ensure assessments are fair, of high
quality and not duplicative, can be
completed in the minimum necessary
time while validly and reliably
measuring a student’s knowledge and
skills, and reflect the expectation that
students will be prepared for success in
college and careers.
Proposed Priority 1—Developing
Innovative Assessment Item Types and
Design Approaches.
Background: The President’s Testing
Action Plan highlighted the need to
reduce the time spent on unnecessary,
duplicative, or low-quality testing and
improve assessment efficiency and
quality to provide educators and parents
with more timely and actionable data on
students’ progress. SEAs and LEAs need
to continue developing new methods for
collecting evidence about what students
know and are able to do as it relates to
State learning standards, including by
creating innovative item types and
design approaches, for example, by
developing modular assessments that
are given throughout the school year
instead of a single summative
assessment given at the end of the
school year.
Although traditional assessment items
such as multiple-choice questions have
advantages, innovative item types such
as performance tasks, simulations, and
interactive, multi-step, technology-rich
items that support competency-based
assessments, or portfolio assessments
that demonstrate applied skills have
the potential to provide a more
comprehensive view of a student’s
knowledge and mastery of standards.
Examples include: Items that provide
multi-step mathematics problems where
students demonstrate their approaches
to solving each step; items that permit
graphs or other visual response types;
and simulated game environments
where students interact with stimuli
and interaction information is collected.
As States implement more rigorous
standards, it is important that
assessment strategies are aligned with
the higher-level cognitive skills students
are expected to master.2 For example,
performance tasks and simulations
provide an opportunity for students to
apply their understanding and
demonstrate their abilities in real-world
scenarios. Rather than simply requiring
a student to select a response from a list
of options, competency-based
assessments can allow students to
interact with material and concepts to
formulate responses. Students’
responses to, and performance on, such
innovative item types provide insight
into their higher-level thinking and
problem-solving skills and allow
educators to better understand students’
mastery of content and concepts.3
We believe that good assessments
should require the same kind of
complex work that students do in an
effective classroom or in the real world,
including demonstration and
application of knowledge and skills.
Further, assessments should present
information and questions that push
students’ critical thinking skills so that
students gain valuable experience while
taking them. The inclusion of new,
innovative item types will help to
ensure that taking an assessment is a
worthwhile experience for students.
2 Darling-Hammond, Linda, et al. (2013). Criteria
for High-Quality Assessment (SCOPE, CRESST,
LSRI Policy Brief). https://edpolicy.stanford.edu/
sites/default/files/publications/criteria-higherquality-assessment_1.pdf.
3 Gorin, Joanna S. (2007). Test Design with
Cognition in Mind.
Modular assessment approaches also
can help SEAs and LEAs support
students and educators in a number of
significant ways by breaking down
large, summative assessment forms with
many items into smaller forms with
fewer items (e.g., testing only one
mathematics or reading competency).
This will allow students to be assessed
on specific competencies when they are
ready and capable of demonstrating
proficiency. This can allow advanced
students to move ahead rapidly while
providing students who need extra
support the flexibility and additional
time they need to learn and succeed, as
well as the opportunity to demonstrate
competence in the areas they have
mastered.
Modules can also provide educators
with more individualized, easily integrated assessments that are used
together to provide a summative
analysis of each learner.
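The notice does not prescribe any particular aggregation method. As a purely hypothetical sketch of the idea described above (the competency names, data shapes, and the 0.7 mastery cutoff are all invented for illustration; a real system would use psychometrically validated scoring models), combining modular assessment results into a summative analysis might look like:

```python
# Illustrative only: each small module tests one competency when the
# student is ready; the results are later combined into one summative
# summary per learner. All names and thresholds here are hypothetical.
from dataclasses import dataclass

@dataclass
class ModuleResult:
    competency: str   # e.g., a single mathematics or reading competency
    score: float      # proportion correct on that module, 0.0-1.0

def summative_summary(results: list[ModuleResult],
                      mastery_cutoff: float = 0.7) -> dict:
    """Aggregate per-competency module results into an overall summary."""
    mastered = [r.competency for r in results if r.score >= mastery_cutoff]
    overall = sum(r.score for r in results) / len(results)
    return {"overall": round(overall, 2), "mastered": mastered}

results = [
    ModuleResult("fractions", 0.9),
    ModuleResult("ratios", 0.6),
    ModuleResult("geometry", 0.8),
]
print(summative_summary(results))
# → {'overall': 0.77, 'mastered': ['fractions', 'geometry']}
```

The point of the sketch is only the structure: modules contribute individual, timely feedback, while the combination yields the summative analysis of each learner described above.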
Proposed Priority: Under this priority,
SEAs must:
(a) Develop, evaluate, and implement
new, innovative item types for use in
summative assessments in reading/
language arts, mathematics, or science;
(1) Development of innovative item
types under paragraph (a) may include,
for example, performance tasks;
simulations; or interactive, multi-step,
technology-rich items that can support
competency-based assessments or
portfolio projects;
(2) Projects under this priority must
be designed to develop new methods for
collecting evidence about a student’s
knowledge and abilities and ensure the
quality, validity, reliability, and fairness
of the assessment and comparability of
student data; or
(b) Develop new approaches to
transform traditional, end-of-year
summative assessment forms with many
items into a series of modular
assessment forms, each with fewer
items.
(1) To respond to paragraph (b),
applicants must develop modular
assessment approaches which can be
used to provide timely feedback to
educators and parents as well as be
combined to provide a valid, reliable,
and fair summative assessment of
individual learners.
(c) Applicants proposing projects
under either paragraph (a) or (b) must
provide a dissemination plan such that
their projects can serve as models and
resources that can be shared with States
across the Nation.
Proposed Priority 2—Improving
Assessment Scoring and Score
Reporting.
Background: By improving
assessment scoring and score reporting,
SEAs can enhance the testing
experience for students and provide
more timely and relevant information to
parents and educators. While
developing high-quality assessments
that measure student knowledge and
skills against States’ standards is an
essential part of building strong
assessment systems, ensuring that
assessment results are available sooner
and provide clear and actionable
information is also critically important.
With continued advancements in
technology to support and enhance
education in the classroom, it is also
becoming possible to improve the
testing experience for students by using
technology to automatically score non-multiple choice assessment items.
Automated scoring can decrease the
time needed for scoring and releasing
results, lower costs, improve score
consistency, and reduce the need for
training of, and coordination among,
human scorers.4 Recent research has
examined existing automated scoring
systems for short and extended
constructed responses and found these
automated scoring systems to be similar
to human scorers.5
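The systems evaluated in the research cited above are trained statistical scoring engines. As a deliberately simplified, hypothetical illustration of what automated scoring of a constructed response involves (keyword matching stands in for the far more sophisticated models those systems actually use, and the rubric here is invented), a rubric-based scorer might be pictured as:

```python
# Toy sketch only: score a short constructed response against rubric
# levels, each defined by a set of required concepts. Real automated
# scoring systems use trained models, not keyword matching.

def score_response(response: str, rubric: dict[int, set[str]]) -> int:
    """Return the highest rubric level whose required concepts all
    appear in the student's response (0 if no level is satisfied)."""
    words = set(response.lower().split())
    best = 0
    for level, required in sorted(rubric.items()):
        if required <= words:   # all required concepts are present
            best = level
    return best

# Hypothetical three-level rubric for a water-cycle question.
rubric = {
    1: {"water"},
    2: {"water", "evaporates"},
    3: {"water", "evaporates", "condenses"},
}

print(score_response(
    "Water evaporates from lakes and condenses into clouds.", rubric))
# → 3
```

Even this toy version shows why automated scoring can return results quickly and consistently: the same rubric is applied identically to every response, with no scorer training or coordination required.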
Building on the work done to date
and developing better technological
tools to score assessments would be
advantageous to SEAs, LEAs, educators,
and students. Automated scoring would
allow SEAs to incorporate more non-multiple choice items, such as essays
and constructed responses, in
assessments while not adding
significantly to the time or cost to score
the tests. Assessment results could be
returned more quickly to students and
educators, who could in turn respond to
the results data through timely
implementation of additional teaching,
supports, or interventions that would
help students master content.6 The
4 Williamson, David M., Xiaoming Xi, and F. Jay
Breyer. (2012). A Framework for Evaluation and
Use of Automated Scoring. Educational
Measurement: Issues and Practice. Volume 31, Issue
1, Pages 2–13.
5 Shermis, Mark D., and Ben Hamner. (2012). Contrasting State-of-the-Art Automated Scoring of Essays: Analysis. National Council on Measurement in Education. www.scoreright.org/NCME_2012_Paper3_29_12.pdf; Shermis, Mark D. (2013). Contrasting State-of-the-Art in the Machine Scoring of Short-Form Constructed Responses. Educational Assessment. www.tandfonline.com/doi/pdf/10.1080/10627197.2015.997617.
inclusion of additional non-multiple
choice items can also enhance the
testing experience for students by
requiring more engaging and complex
demonstrations of knowledge. To
improve scoring, applicants responding
to this priority could propose projects to
build, test, or enhance automated
scoring systems for use with
non-multiple choice items in reading/
language arts, mathematics, and science.
For example, an applicant could
propose to build, test, or improve a
system for reviewing brief or extended
student-constructed responses.
Applicants could propose projects that
will research, build, or test systems that
can score assessments and provide
diagnostic information to educators and
parents.
Score reporting, when done well,
provides valuable feedback to educators
that can be used to guide instruction
and supports for students. This feedback
is most relevant when it is available
soon after the assessment is
administered and when it is actionable
for students, parents, and educators.
The Department also recognizes a need
to improve the design and content of the
reports such that they clearly
communicate information to
stakeholders.
Efforts to improve the usefulness of
score reports could include:
Incorporating information about what
students’ results mean; including
multiple levels of information (e.g.,
overall proficiency, mastery of different
standards or skills); 7 providing
examples of questions that were likely
to be answered correctly or incorrectly
(and why); and connecting students and
their families to useful resources or aids
to address identified areas for
improvement. Improving
communications related to score
reporting could include: Presenting
information in easily comprehensible
formats (e.g., graphically or
numerically); tailoring reporting formats
to different audiences or for different
modes of dissemination; and making
results available in a timelier manner
(i.e., delivered to teachers and parents
as soon as possible after the assessments
are administered).

6 For example, the Institute of Education Sciences has recently invested in projects that are promising examples of how technology can be leveraged to improve scoring. The aim of one such project is to develop a computerized oral reading fluency assessment (see https://ies.ed.gov/funding/grantsearch/details.asp?ID=1492). Traditional oral reading fluency assessments require one-on-one administration and hand-scoring, a time-consuming and resource-intensive process that is prone to scoring errors. The assessment under development uses speech recognition software to record and score students’ oral reading fluency, making processes more efficient and less prone to scoring errors. Another such project is aimed at developing a new assessment tool to measure the science and math skills of middle school English learners (see https://ies.ed.gov/funding/grantsearch/details.asp?ID=1475). It features auto-scoring to give immediate feedback to teachers and students.

7 Zapata-Rivera, Diego, and Rebecca Zwick. (2011). Improving Test Score Reporting: Perspectives from the ETS Score Reporting Conference. www.ets.org/Media/Research/pdf/RR11-45.pdf.
Proposed Priority: Under this priority,
SEAs must:
(a) Develop innovative tools that
leverage technology to score
assessments;
(1) To respond to paragraph (a),
applicants must propose projects to
reduce the time it takes to provide test
results to educators, parents, and
students and to make it more cost-effective to include non-multiple choice
items on assessments. These innovative
tools must improve automated scoring
of student assessments, in particular
non-multiple choice items in reading/
language arts, mathematics, and science;
or
(b) Propose projects, in consultation
with organizations representing parents,
students, and teachers, to address needs
related to score reporting and improve
the utility of information about student
performance included in reports of
assessment results and provide better
and more timely information to
educators and parents;
(1) To respond to paragraph (b),
applicants must include one or more of
the following in their projects:
(i) Developing enhanced score
reporting templates or digital
mechanisms for communicating
assessment results and their meaning;
(ii) Improving the assessment literacy
of educators and parents to improve the
interpretation of test results to support
teaching and learning in the classroom;
and
(iii) Developing mechanisms for
secure transmission and individual use
of assessment results by students and
parents.
(c) Applicants proposing projects
under either paragraph (a) or (b) must
provide a dissemination plan such that
their projects can serve as models and
resources that can be shared with States
across the Nation.
Proposed Priority 3—Inventory of
State and Local Assessment Systems.
Background: Recently, there has been
significant discussion about the amount
of time students spend in formal testing,
including classroom, district, and State
assessments. While the Department
believes that assessments are important
tools for measuring progress and
improving outcomes for all students, we
also recognize that too much testing, or
unnecessary testing, takes valuable time
away from teaching and learning in the
classroom.8
8 As a part of the President’s Testing Action Plan, the Department has recently released a Dear Colleague Letter to State Chief School Officers providing examples of existing Federal funding streams, and best practices, which can be utilized at the State and local levels to improve assessment systems and reduce unnecessary testing: https://www2.ed.gov/admins/lead/account/saa/160002signedcsso222016ltr.pdf.

In response to this issue, some SEAs,
local educational agencies (LEAs), and
schools are currently in the process of
reviewing assessments administered to
students in kindergarten through grade
12 to better understand if each
assessment is of high quality, maximizes
instructional goals, has a clear purpose
and utility, and is designed to provide
information on students’ progress
toward achieving proficiency on State
standards. To support such efforts, the
Department made the development of
tools to inventory State and local
assessment systems an invitational
priority in the FY 2015 EAG
competition. Through this proposed
priority, the Department would fund
States that are reviewing and
streamlining their statewide
assessments and working with some or
all of their LEAs to review and
streamline local assessments, including
eliminating redundant and unnecessary
assessments.
This priority would support the
identification of promising practices
that could be followed by other SEAs,
LEAs, and schools to maximize the
utility of their assessments to parents,
educators, and students.
Proposed Priority:
(a) Under this priority, SEAs must—
(1) Review statewide and local
assessments to ensure that each test is
of high quality, maximizes instructional
goals, has a clear purpose and utility,
and is designed to help students
demonstrate mastery of State standards;
(2) Determine whether assessments
are serving their intended purpose to
help schools meet their goals and to
eliminate redundant and unnecessary
testing; and
(3) Review State and LEA activities
related to test preparation to make sure
those activities are focused on academic
content and not on test-taking skills.
(b) To meet the requirements in
paragraph (a), SEAs must ensure that
tests are—
(1) Worth taking, meaning that
assessments are a component of good
instruction and require students to
perform the same kind of complex work
they do in an effective classroom and
the real world;
(2) High quality, resulting in
actionable, objective information about
students’ knowledge and skills,
including by assessing the full range of
relevant State standards, eliciting
complex student demonstrations or
applications of knowledge, providing an
accurate measure of student
achievement, and producing
information that can be used to measure
student growth accurately over time;
(3) Time-limited, in order to balance
instructional time and the need for
assessments, for example, by
eliminating duplicative assessments and
assessments that incentivize low-quality
test preparation strategies that consume
valuable classroom time;
(4) Fair for all students and used to
support equity in educational
opportunity by ensuring that
accessibility features and
accommodations level the playing field
so tests accurately reflect what all
students, including students with
disabilities and English learners, know
and can do;
(5) Fully transparent to students and
parents, so that States and districts can
clearly explain to parents the purpose,
the source of the requirement (if
appropriate), and the use by teachers
and schools, and provide feedback to
parents and students on student
performance; and
(6) Tied to improving student learning
as tools in the broader work of teaching
and learning.
(c) Approaches to assessment
inventories under paragraph (a) must
include:
(1) Review of the schedule for
administration of all assessments
required at the Federal, State, and local
levels;
(2) Review of the purpose of, and legal
authority for, administration of all
assessments required at the Federal,
State, and local levels; and
(3) Feedback on the assessment
system from stakeholders, which could
include information on how teachers,
principals, other school leaders, and
administrators use assessment data to
inform and differentiate instruction,
how much time teachers spend on
assessment preparation and
administration, and the assessments that
administrators, teachers, principals,
other school leaders, parents, and
students do and do not find useful.
(d) Projects under this priority—
(1) Must be no longer than 12 months;
(2) Must include a longer-term project
plan, understanding that, beginning
with FY 2017, there may be dedicated
Federal funds for assessment audit work
as authorized under section 1202 of the
ESEA, as amended by the ESSA, and
understanding that States and LEAs may
use other Federal funds, such as the
State assessment grant funds, authorized
under section 1201 of the ESEA, as
amended by the ESSA, consistent with
the purposes for those funds, to
implement such plans; and
(3) Are eligible to receive a maximum
award of $200,000.
Types of Priorities:
When inviting applications for a
competition using one or more
priorities, we designate the type of each
priority as absolute, competitive
preference, or invitational through a
notice in the Federal Register. The
effect of each type of priority follows:
Absolute priority: Under an absolute
priority, we consider only applications
that meet the priority (34 CFR
75.105(c)(3)).
Competitive preference priority:
Under a competitive preference priority,
we give competitive preference to an
application by (1) awarding additional
points, depending on the extent to
which the application meets the priority
(34 CFR 75.105(c)(2)(i)); or (2) selecting
an application that meets the priority
over an application of comparable merit
that does not meet the priority (34 CFR
75.105(c)(2)(ii)).
Invitational priority: Under an
invitational priority, we are particularly
interested in applications that meet the
priority. However, we do not give an
application that meets the priority a
preference over other applications (34
CFR 75.105(c)(1)).
Final Priorities:
We will announce the final priorities
in a notice in the Federal Register. We
will determine the final priorities after
considering responses to this notice and
other information available to the
Department. This notice does not
preclude us from proposing additional
priorities, requirements, definitions, or
selection criteria, subject to meeting
applicable rulemaking requirements.
Note: This notice does not solicit
applications. In any year in which we choose
to use these priorities, we invite applications
through a notice in the Federal Register.
Paperwork Reduction Act of 1995
As part of its continuing effort to
reduce paperwork and respondent
burden, the Department provides the
general public and Federal agencies
with an opportunity to comment on
proposed and continuing collections of
information in accordance with the
Paperwork Reduction Act of 1995 (PRA)
(44 U.S.C. 3506(c)(2)(A)). This helps
ensure that: the public understands the
Department’s collection instructions,
respondents can provide the requested
data in the desired format, reporting
burden (time and financial resources) is
minimized, collection instruments are
clearly understood, and the Department
can properly assess the impact of
collection requirements on respondents.
These proposed priorities contain
information collection requirements that
are approved by OMB under the
Departmental application control
number 1894–0006; this proposed
regulation does not affect the currently
approved data collection.
Executive Orders 12866 and 13563
Regulatory Impact Analysis
Under Executive Order 12866, the
Secretary must determine whether this
proposed regulatory action is
‘‘significant’’ and, therefore, subject to
the requirements of the Executive order
and subject to review by the Office of
Management and Budget (OMB).
Section 3(f) of Executive Order 12866
defines a ‘‘significant regulatory action’’
as an action likely to result in a rule that
may—
(1) Have an annual effect on the
economy of $100 million or more, or
adversely affect a sector of the economy,
productivity, competition, jobs, the
environment, public health or safety, or
State, local, or tribal governments or
communities in a material way (also
referred to as an ‘‘economically
significant’’ rule);
(2) Create serious inconsistency or
otherwise interfere with an action taken
or planned by another agency;
(3) Materially alter the budgetary
impacts of entitlement grants, user fees,
or loan programs or the rights and
obligations of recipients thereof; or
(4) Raise novel legal or policy issues
arising out of legal mandates, the
President’s priorities, or the principles
stated in the Executive order.
This proposed regulatory action is not
a significant regulatory action subject to
review by OMB under section 3(f) of
Executive Order 12866.
We have also reviewed this proposed
regulatory action under Executive Order
13563, which supplements and
explicitly reaffirms the principles,
structures, and definitions governing
regulatory review established in
Executive Order 12866. To the extent
permitted by law, Executive Order
13563 requires that an agency—
(1) Propose or adopt regulations only
upon a reasoned determination that
their benefits justify their costs
(recognizing that some benefits and
costs are difficult to quantify);
(2) Tailor its regulations to impose the
least burden on society, consistent with
obtaining regulatory objectives and
taking into account—among other things
and to the extent practicable—the costs
of cumulative regulations;
(3) In choosing among alternative
regulatory approaches, select those
approaches that maximize net benefits
(including potential economic,
environmental, public health and safety,
and other advantages; distributive
impacts; and equity);
(4) To the extent feasible, specify
performance objectives, rather than the
behavior or manner of compliance a
regulated entity must adopt; and
(5) Identify and assess available
alternatives to direct regulation,
including economic incentives—such as
user fees or marketable permits—to
encourage the desired behavior, or
provide information that enables the
public to make choices.
Executive Order 13563 also requires
an agency ‘‘to use the best available
techniques to quantify anticipated
present and future benefits and costs as
accurately as possible.’’ The Office of
Information and Regulatory Affairs of
OMB has emphasized that these
techniques may include ‘‘identifying
changing future compliance costs that
might result from technological
innovation or anticipated behavioral
changes.’’
We are issuing these proposed
priorities only on a reasoned
determination that their benefits would
justify their costs. In choosing among
alternative regulatory approaches, we
selected those approaches that would
maximize net benefits. Based on the
analysis that follows, the Department
believes that this regulatory action is
consistent with the principles in
Executive Order 13563.
We also have determined that this
regulatory action would not unduly
interfere with State, local, and tribal
governments in the exercise of their
governmental functions.
In accordance with both Executive
orders, the Department has assessed the
potential costs and benefits, both
quantitative and qualitative, of this
regulatory action. The potential costs
are those resulting from statutory
requirements and those we have
determined as necessary for
administering the Department’s
programs and activities.
The proposed priorities included in
this notice would benefit students,
parents, educators, administrators, and
other stakeholders by improving the
quality of State assessment instruments
and systems. The proposed priority for
an inventory of State and local
assessment systems would encourage
States to ensure that assessments are of
high quality, maximize instructional
goals, and have clear purpose and
utility. Further, it would encourage
States to eliminate unnecessary or
redundant tests. The proposed priority
for improving assessment scoring and
score reporting would allow States to
score non-multiple choice assessment
items more quickly and at a lower cost
and ensure that assessments provide
timely, actionable feedback to students,
parents, and educators. The proposed
priority for developing innovative
assessment item types and design
approaches, including the development
of modular assessments, would yield
new, more authentic methods for
collecting evidence about what students
know and are able to do and provide
educators with more individualized,
easily integrated assessments that can
support competency-based learning and
other forms of personalized instruction.
Intergovernmental Review: This
program is subject to Executive Order
12372 and the regulations in 34 CFR
part 79. One of the objectives of the
Executive order is to foster an
intergovernmental partnership and a
strengthened federalism. The Executive
order relies on processes developed by
State and local governments for
coordination and review of proposed
Federal financial assistance.
This document provides early
notification of our specific plans and
actions for this program.
Accessible Format: Individuals with
disabilities can obtain this document in
an accessible format (e.g., braille, large
print, audiotape, or compact disc) on
request to the program contact person
listed under FOR FURTHER INFORMATION
CONTACT.
Electronic Access to This Document:
The official version of this document is
the document published in the Federal
Register. Free Internet access to the
official edition of the Federal Register
and the Code of Federal Regulations is
available via the Federal Digital System
at: www.gpo.gov/fdsys. At this site you
can view this document, as well as all
other documents of this Department
published in the Federal Register, in
text or Adobe Portable Document
Format (PDF). To use PDF you must
have Adobe Acrobat Reader, which is
available free at the site.
You may also access documents of the
Department published in the Federal
Register by using the article search
feature at: www.federalregister.gov.
Specifically, through the advanced
search feature at this site, you can limit
your search to documents published by
the Department.
Dated: April 12, 2016.
Ann Whalen,
Senior Advisor to the Secretary Delegated
the Duties of Assistant Secretary for
Elementary and Secondary Education.
[FR Doc. 2016–08726 Filed 4–15–16; 8:45 am]
BILLING CODE 4000–01–P
ENVIRONMENTAL PROTECTION
AGENCY
40 CFR Part 131
[EPA–HQ–OW–2016–0012; FRL–9944–70–
OW]
RIN 2040–AF60
Aquatic Life Criteria for Copper and
Cadmium in Oregon
AGENCY: Environmental Protection Agency (EPA).
ACTION: Proposed rule.
SUMMARY:
The Environmental Protection
Agency (EPA) proposes to establish
federal Clean Water Act (CWA) aquatic
life criteria for freshwaters under the
state of Oregon’s jurisdiction, to protect
aquatic life from the effects of exposure
to harmful levels of copper and
cadmium. In 2013, EPA determined that
the freshwater acute cadmium criterion
and freshwater acute and chronic
copper criteria that Oregon adopted in
2004 did not meet CWA requirements to
protect aquatic life in the state.
Therefore, EPA proposes to establish
federal freshwater criteria for cadmium
and copper that take into account the
best available science, EPA policies,
guidance and legal requirements, to
protect aquatic life uses in Oregon.
DATES: Comments must be received on
or before June 2, 2016.
ADDRESSES: Submit your comments,
identified by Docket ID No. EPA–HQ–
OW–2016–0012, at https://
www.regulations.gov. Follow the online
instructions for submitting comments.
Once submitted, comments cannot be
edited or removed from Regulations.gov.
EPA may publish any comment received
to its public docket. Do not submit
electronically any information you
consider to be Confidential Business
Information (CBI) or other information
whose disclosure is restricted by statute.
Multimedia submissions (audio, video,
etc.) must be accompanied by a written
comment. The written comment is
considered the official comment and
should include discussion of all points
you wish to make. EPA will generally
not consider comments or comment
contents located outside of the primary
submission (i.e. on the web, cloud, or
other file sharing system). For
additional submission methods, the full
EPA public comment policy,
information about CBI or multimedia
submissions, and general guidance on
making effective comments, please visit
https://www2.epa.gov/dockets/
commenting-epa-dockets.
EPA is offering two virtual public
hearings so that interested parties may
also provide oral comments on this
proposed rule. The first hearing will be
on Monday, May 16, 2016, from 4:00 p.m. to 6:00 p.m. Pacific Time. The second hearing will be on Tuesday, May 17, 2016, from 9:00 a.m. to 11:00 a.m. Pacific
Time. For more details on the public
hearings and a link to register, please
visit https://www.epa.gov/wqs-tech/water-quality-standards-regulations-oregon.
FOR FURTHER INFORMATION CONTACT:
Erica Fleisig, Office of Water, Standards
and Health Protection Division (4305T),
Environmental Protection Agency, 1200
Pennsylvania Avenue NW., Washington,
DC 20460; telephone number: (202)
566–1057; email address: fleisig.erica@
epa.gov.
SUPPLEMENTARY INFORMATION: This
proposed rule is organized as follows:
I. General Information
Does this action apply to me?
II. Background
A. Statutory and Regulatory Authority
B. EPA’s Disapproval of Oregon’s
Freshwater Copper and Cadmium
Criteria
C. General Recommended Approach for
Deriving Aquatic Life Criteria
III. Freshwater Cadmium Aquatic Life
Criteria
A. EPA’s National Recommended
Cadmium Criteria
B. Proposed Acute Cadmium Criterion for
Oregon’s Freshwaters
C. Implementation of Proposed Freshwater
Acute Cadmium Criterion in Oregon
IV. Freshwater Copper Aquatic Life Criteria
A. EPA’s National Recommended Copper
Criteria
B. Proposed Acute and Chronic Copper
Criteria for Oregon’s Freshwaters
C. Implementation of Proposed Freshwater
Acute and Chronic Copper Criteria in
Oregon
D. Ongoing State Efforts To Develop
Copper Criteria for Oregon’s Freshwaters
E. Incorporation by Reference
V. Critical Low-Flows and Mixing Zones
VI. Endangered Species Act
VII. Under what conditions will federal standards not be promulgated or withdrawn?
VIII. Alternative Regulatory Approaches and
Implementation Mechanisms
A. Designating Uses
B. Site-Specific Criteria
C. Variances
D. Compliance Schedules
IX. Economic Analysis
A. Identifying Affected Entities
B. Method for Estimating Costs
C. Results
X. Statutory and Executive Order Reviews
A. Executive Order 12866 (Regulatory
Planning and Review) and Executive
Order 13563 (Improving Regulation and
Regulatory Review)
B. Paperwork Reduction Act
C. Regulatory Flexibility Act
D. Unfunded Mandates Reform Act
E. Executive Order 13132 (Federalism)
=======================================================================
-----------------------------------------------------------------------
DEPARTMENT OF EDUCATION
34 CFR Chapter II
[Docket ID ED-2016-OESE-0004; CFDA Number: 84.368A.]
Proposed Priorities--Enhanced Assessment Instruments
AGENCY: Office of Elementary and Secondary Education, Department of
Education.
ACTION: Proposed priorities.
-----------------------------------------------------------------------
SUMMARY: The Assistant Secretary for Elementary and Secondary Education
proposes priorities under the Enhanced Assessment Instruments Grant
program, also called the Enhanced Assessment Grants (EAG) program. The
Assistant Secretary may use one or more of these priorities for
competitions using funds from fiscal year (FY) 2016 and later years.
Depending on the availability of funds and the use of other priorities
under the EAG authority, the Assistant Secretary may also choose not to
use one or more of these priorities for competitions using funds from
FY 2016 and later years. These proposed priorities are designed to
support projects to improve States' assessment systems.
DATES: We must receive your comments on or before May 18, 2016.
ADDRESSES: Submit your comments through the Federal eRulemaking Portal
or via postal mail, commercial delivery, or hand delivery. We will not
accept comments submitted by fax or by email or those submitted after
the comment period. To ensure that we do not receive duplicate copies,
please submit your comments only once. In addition, please include the
Docket ID and the term ``Enhanced Assessment Grants--Comments'' at the
top of your comments.
Federal eRulemaking Portal: Go to www.regulations.gov to
submit your comments electronically. Information on using
Regulations.gov, including instructions for accessing agency documents,
submitting comments, and viewing the docket, is available on the site
under the ``Help'' tab.
Postal Mail, Commercial Delivery, or Hand Delivery: If you
mail or deliver your comments about these proposed priorities, address
them to the Office of Elementary and Secondary Education, Attention:
Enhanced Assessment Grants--Comments, U.S. Department of Education, 400
Maryland Avenue SW., Room 3e124, Washington, DC 20202-6132.
Privacy Note: The Department of Education's (Department's)
policy is to make all comments received from members of the public
available for public viewing in their entirety on the Federal
eRulemaking Portal at www.regulations.gov. Therefore, commenters
should be careful to include in their comments only information that
they wish to make publicly available.
FOR FURTHER INFORMATION CONTACT: Donald Peasley. Telephone: (202) 453-
7982 or by email: donald.peasley@ed.gov.
If you use a telecommunications device for the deaf (TDD) or a text
telephone (TTY), call the Federal Relay Service (FRS), toll free, at 1-
800-877-8339.
SUPPLEMENTARY INFORMATION:
Invitation to Comment: We invite you to submit comments regarding
this notice. To ensure that your comments have maximum effect in
developing the notice of final priorities, we urge you to identify
clearly the specific proposed priority that each comment addresses.
We invite you to assist us in complying with the specific
requirements of Executive Orders 12866 and 13563 and their overall
requirement of reducing regulatory burden that might result from these
proposed priorities. Please let us know of any further ways we could
reduce potential costs or increase potential benefits while preserving
the effective and efficient administration of the program.
During and after the comment period, you may inspect all public
comments about these proposed priorities by accessing regulations.gov.
You may also inspect the comments in room 3e124, 400 Maryland Avenue
SW., Washington, DC, between the hours of 8:30 a.m. and 4:00 p.m.,
Washington, DC time, Monday through Friday of each week except Federal
holidays.
Assistance to Individuals with Disabilities in Reviewing the
Rulemaking Record: On request we will
[[Page 22551]]
provide an appropriate accommodation or auxiliary aid to an individual
with a disability who needs assistance to review the comments or other
documents in the public rulemaking record for this notice. If you want
to schedule an appointment for this type of accommodation or auxiliary
aid, please contact the person listed under FOR FURTHER INFORMATION
CONTACT.
Purpose of Program: The purpose of the EAG program is to enhance
the quality of assessment instruments and systems used by States for
measuring the academic achievement of elementary and secondary school
students.
Program Authority: Section 6112 of the Elementary and Secondary
Education Act of 1965 (ESEA), as amended by the No Child Left Behind
Act of 2001 (NCLB), and section 1203(b)(1) of the ESEA, as amended by
the Every Student Succeeds Act (Pub. L. 114-95) (ESSA).
Proposed Priorities:
This notice contains three proposed priorities.
Background:
Section 6112 of the ESEA, as amended by the NCLB, and section
1203(b)(1) of the ESEA, as amended by the ESSA, authorize the
Department to make competitive grant awards to State educational
agencies (SEAs) and consortia of SEAs to help them enhance the quality
of their assessment instruments and assessment systems.\1\ Under these
provisions, State grantees must meet at least one of the program's
statutory priorities, including collaborating with organizations to
improve the quality, validity, reliability, and efficiency of academic
assessments; measuring student academic achievement using multiple
measures from multiple sources; measuring student growth on State
assessments; and evaluating student academic achievement through the
development of comprehensive academic assessment instruments and
methods.
---------------------------------------------------------------------------
\1\ The Consolidated Appropriations Act, 2016 (Pub. L. 114-113)
appropriated funds for the EAG program under section 6112 of the
ESEA, as amended by the NCLB. As such, the upcoming EAG competition
will be conducted under that authority. The Department is also
establishing these priorities under the authority in section
1203(b)(1) of the ESEA, as amended by the ESSA, which, if funded,
would replace the EAG program under section 6112. These priorities
may also be used in any competition conducted after FY 2016 under
that authority.
---------------------------------------------------------------------------
Grants awarded under the section 6112 competitive grant program will also lay the groundwork for new opportunities in the recently reauthorized Elementary and Secondary Education Act of 1965, as amended by the Every Student Succeeds Act. For example, the reauthorized ESEA will allow up to seven States or consortia of
States to receive an initial demonstration authority to establish an
innovative assessment and accountability system for a new approach to
assessment for a trial period of up to five years. This can provide
SEAs with an opportunity to demonstrate what is possible when
assessment systems are redesigned with student learning at the center.
The EAG program provides SEAs with support to develop innovative
assessment tools and approaches which have the potential to be used by
all States, including those approved under the innovative assessment
and accountability demonstration authority, and be more widely adopted
at scale. In addition, the EAG program provides SEAs with support in
developing innovative summative assessment tools and approaches that
can be used within the broader context of the multiple measures of
student achievement and school accountability of the new ESSA and the
President's Testing Action Plan.
Through this notice, the Department proposes three additional
priorities for the EAG program that are designed to support States in
continuously improving their assessment systems to measure college- and
career-readiness. We believe that an essential part of educating
students involves assessing students' progress toward meeting the high
standards they need to be ready for college and the workplace.
Assessments provide necessary information for States, districts,
educators, families, the public, and students themselves to measure
progress and improve outcomes for all learners. As such, we recognize
the importance of continuously improving and innovating to ensure
assessments are fair, of high quality and not duplicative, can be
completed in the minimum necessary time while validly and reliably
measuring a student's knowledge and skills, and reflect the expectation
that students will be prepared for success in college and careers.
Proposed Priority 1--Developing Innovative Assessment Item Types
and Design Approaches.
Background: The President's Testing Action Plan highlighted the
need to reduce the time spent on unnecessary, duplicative, or low-
quality testing and improve assessment efficiency and quality to
provide educators and parents with more timely and actionable data on
students' progress. SEAs and LEAs need to continue developing new
methods for collecting evidence about what students know and are able
to do as it relates to State learning standards, including by creating
innovative item types and design approaches, for example, by developing
modular assessments that are given throughout the school year instead
of a single summative assessment given at the end of the school year.
Although traditional assessment items such as multiple-choice questions have advantages, innovative item types such as performance tasks, simulations, and interactive, multi-step, technology-rich items that support competency-based or portfolio assessments demonstrating applied skills have the potential to provide a more comprehensive view of a student's knowledge and mastery of standards.
Examples include: Items that provide multi-step mathematics problems
where students demonstrate their approaches to solving each step; items
that permit graphs or other visual response types; and simulated game
environments where students interact with stimuli and interaction
information is collected.
As States implement more rigorous standards, it is important that
assessment strategies are aligned with the higher-level cognitive
skills students are expected to master.\2\ For example, performance
tasks and simulations provide an opportunity for students to apply
their understanding and demonstrate their abilities in real-world
scenarios. Rather than simply requiring a student to select a response
from a list of options, competency-based assessments can allow students
to interact with material and concepts to formulate responses.
Students' responses to, and performance on, such innovative item types
provide insight into their higher-level thinking and problem-solving
skills and allow educators to better understand students' mastery of
content and concepts.\3\
---------------------------------------------------------------------------
\2\ Darling-Hammond, Linda, et al. (2013). Criteria for High-
Quality Assessment (SCOPE, CRESST, LSRI Policy Brief). https://edpolicy.stanford.edu/sites/default/files/publications/criteria-higher-quality-assessment_1.pdf.
\3\ Gorin, Joanna S. (2007). Test Design with Cognition in Mind.
---------------------------------------------------------------------------
We believe that good assessments should require the same kind of
complex work that students do in an effective classroom or in the real
world, including demonstration and application of knowledge and skills.
Further, assessments should present information and questions that push
students' critical thinking skills so that students gain valuable
experience while taking them. The inclusion of new, innovative item
types will help to ensure that taking an assessment is a worthwhile
experience for students.
[[Page 22552]]
Modular assessment approaches also can help SEAs and LEAs support
students and educators in a number of significant ways by breaking down
large, summative assessment forms with many items into smaller forms
with fewer items (e.g., testing only one mathematics or reading
competency). This will allow students to be assessed on specific
competencies when they are ready and capable of demonstrating
proficiency. This can allow advanced students to move ahead rapidly
while providing students who need extra support the flexibility and
additional time they need to learn and succeed, as well as the
opportunity to demonstrate competence in the areas they have mastered.
Modules can also provide educators with more individualized, easily integrated assessments that can be combined to provide a summative analysis of each learner.
Proposed Priority: Under this priority, SEAs must:
(a) Develop, evaluate, and implement new, innovative item types for
use in summative assessments in reading/language arts, mathematics, or
science;
(1) Development of innovative item types under paragraph (a) may
include, for example, performance tasks; simulations; or interactive,
multi-step, technology-rich items that can support competency-based
assessments or portfolio projects;
(2) Projects under this priority must be designed to develop new
methods for collecting evidence about a student's knowledge and
abilities and ensure the quality, validity, reliability, and fairness
of the assessment and comparability of student data; or
(b) Develop new approaches to transform traditional, end-of-year
summative assessment forms with many items into a series of modular
assessment forms, each with fewer items.
(1) To respond to paragraph (b), applicants must develop modular
assessment approaches which can be used to provide timely feedback to
educators and parents as well as be combined to provide a valid,
reliable, and fair summative assessment of individual learners.
(c) Applicants proposing projects under either paragraph (a) or (b)
must provide a dissemination plan such that their projects can serve as
models and resources that can be shared with States across the Nation.
Proposed Priority 2--Improving Assessment Scoring and Score
Reporting.
Background: By improving assessment scoring and score reporting,
SEAs can enhance the testing experience for students and provide more
timely and relevant information to parents and educators. While
developing high-quality assessments that measure student knowledge and
skills against States' standards is an essential part of building
strong assessment systems, ensuring that assessment results are
available sooner, and provide clear and actionable information is also
critically important.
With continued advancements in technology to support and enhance
education in the classroom, it is also becoming possible to improve the
testing experience for students by using technology to automatically
score non-multiple choice assessment items. Automated scoring can
decrease the time needed for scoring and releasing results, lower
costs, improve score consistency, and reduce the need for training of,
and coordination among, human scorers.\4\ Recent research comparing existing automated scoring systems with human scorers on short and extended constructed responses has found that the automated systems perform comparably to human scorers.\5\
---------------------------------------------------------------------------
\4\ Williamson, David M., Xiaoming Xi, and F. Jay Breyer.
(2012). A Framework for Evaluation and Use of Automated Scoring.
Educational Measurement: Issues and Practice. Volume 31, Issue 1,
Pages 2-13.
\5\ Shermis, Mark D., and Ben Hamner. (2012). Contrasting State-
of-the-Art Automated Scoring of Essays: Analysis, National Council
on Measurement in Education. www.scoreright.org/NCME_2012_Paper3_29_12.pdf; Shermis, Mark D. (2013). Contrasting
State-of-the-Art in the Machine Scoring of Short-Form Constructed
Responses. Educational Assessment. www.tandfonline.com/doi/pdf/10.1080/10627197.2015.997617.
---------------------------------------------------------------------------
Building on the work done to date and developing better
technological tools to score assessments would be advantageous to SEAs,
LEAs, educators, and students. Automated scoring would allow SEAs to
incorporate more non-multiple choice items, such as essays and
constructed responses, in assessments while not adding significantly to
the time or cost to score the tests. Assessment results could be
returned more quickly to students and educators, who could in turn
respond to the results data through timely implementation of additional
teaching, supports, or interventions that would help students master
content.\6\ The inclusion of additional non-multiple choice items can
also enhance the testing experience for students by requiring more
engaging and complex demonstrations of knowledge. To improve scoring,
applicants responding to this priority could propose projects to build,
test, or enhance automated scoring systems for use with non-multiple
choice items in reading/language arts, mathematics, and science. For
example, an applicant could propose to build, test, or improve a system
for reviewing brief or extended student-constructed responses.
Applicants could propose projects that will research, build, or test
systems that can score assessments and provide diagnostic information
to educators and parents.
---------------------------------------------------------------------------
\6\ For example, the Institute of Education Sciences has
recently invested in projects that are promising examples of how
technology can be leveraged to improve scoring. The aim of one such
project is to develop a computerized oral reading fluency assessment
(see https://ies.ed.gov/funding/grantsearch/details.asp?ID=1492).
Traditional oral reading fluency assessments require one-on-one
administration and hand-scoring, a time-consuming and resource-
intensive process that is prone to scoring errors. The assessment
under development uses speech recognition software to record and
score students' oral reading fluency, making processes more
efficient and less prone to scoring errors. Another such project is
aimed at developing a new assessment tool to measure the science and
math skills of middle school English learners (see https://ies.ed.gov/funding/grantsearch/details.asp?ID=1475). It features
auto-scoring to give immediate feedback to teachers and students.
---------------------------------------------------------------------------
Score reporting, when done well, provides valuable feedback to
educators that can be used to guide instruction and supports for
students. This feedback is most relevant when it is available soon
after the assessment is administered and when it is actionable for
students, parents, and educators. The Department also recognizes a need
to improve the design and content of the reports such that they clearly
communicate information to stakeholders.
Efforts to improve the usefulness of score reports could include:
Incorporating information about what students' results mean; including
multiple levels of information (e.g., overall proficiency, mastery of
different standards or skills); \7\ providing examples of questions
that were likely to be answered correctly or incorrectly (and why); and
connecting students and their families to useful resources or aids to
address identified areas for improvement. Improving communications
related to score reporting could include: Presenting information in
easily comprehensible formats (e.g., graphically or numerically);
tailoring reporting formats to different audiences or for different
modes of dissemination; making results available in a timelier manner
(i.e., delivered to teachers and parents as
soon as possible after the assessments are administered).
---------------------------------------------------------------------------
\7\ Zapata-Rivera, Diego, and Rebecca Zwick. (2011). Improving
Test Score Reporting: Perspectives from the ETS Score Reporting
Conference. www.ets.org/Media/Research/pdf/RR-11-45.pdf.
---------------------------------------------------------------------------
Proposed Priority: Under this priority, SEAs must:
(a) Develop innovative tools that leverage technology to score
assessments;
(1) To respond to paragraph (a), applicants must propose projects
to reduce the time it takes to provide test results to educators,
parents, and students and to make it more cost-effective to include
non-multiple choice items on assessments. These innovative tools must
improve automated scoring of student assessments, in particular non-
multiple choice items in reading/language arts, mathematics, and
science; or
(b) Propose projects, in consultation with organizations
representing parents, students, and teachers, to address needs related
to score reporting and improve the utility of information about student
performance included in reports of assessment results and provide
better and more timely information to educators and parents;
(1) To respond to paragraph (b), applicants must include one or
more of the following in their projects:
(i) Developing enhanced score reporting templates or digital
mechanisms for communicating assessment results and their meaning;
(ii) Improving the assessment literacy of educators and parents to
improve the interpretation of test results to support teaching and
learning in the classroom; and
(iii) Developing mechanisms for secure transmission and individual
use of assessment results by students and parents.
(c) Applicants proposing projects under either paragraph (a) or (b)
must provide a dissemination plan such that their projects can serve as
models and resources that can be shared with States across the Nation.
Proposed Priority 3--Inventory of State and Local Assessment
Systems.
Background: Recently, there has been significant discussion about
the amount of time students spend in formal testing, including
classroom, district, and State assessments. While the Department
believes that assessments are important tools for measuring progress
and improving outcomes for all students, we also recognize that too
much testing, or unnecessary testing, takes valuable time away from
teaching and learning in the classroom.\8\
---------------------------------------------------------------------------
    \8\ As part of the President's Testing Action Plan, the
Department recently released a Dear Colleague Letter to Chief State
School Officers providing examples of existing Federal funding
streams and best practices that can be used at the State and
local levels to improve assessment systems and reduce unnecessary
testing: https://www2.ed.gov/admins/lead/account/saa/16-0002signedcsso222016ltr.pdf.
---------------------------------------------------------------------------
In response to this issue, some SEAs, local educational agencies
(LEAs), and schools are currently in the process of reviewing
assessments administered to students in kindergarten through grade 12
to better understand whether each assessment is of high quality, maximizes
instructional goals, has clear purpose and utility, and is designed to
provide information on students' progress toward achieving proficiency
on State standards. To support such efforts, the Department made the
development of tools to inventory State and local assessment systems an
invitational priority in the FY 2015 EAG competition. Through this
proposed priority, the Department would fund States that are reviewing
and streamlining their statewide assessments and working with some or
all of their LEAs to review and streamline local assessments, including
eliminating redundant and unnecessary assessments.
This priority would support the identification of promising
practices that could be followed by other SEAs, LEAs, and schools to
maximize the utility of their assessments to parents, educators, and
students.
Proposed Priority:
(a) Under this priority, SEAs must--
(1) Review statewide and local assessments to ensure that each test
is of high quality, maximizes instructional goals, has a clear purpose
and utility, and is designed to help students demonstrate mastery of
State standards;
(2) Determine whether assessments are serving their intended
purpose to help schools meet their goals and to eliminate redundant and
unnecessary testing; and
(3) Review State and LEA activities related to test preparation to
make sure those activities are focused on academic content and not on
test-taking skills.
(b) To meet the requirements in paragraph (a), SEAs must ensure
that tests are--
(1) Worth taking, meaning that assessments are a component of good
instruction and require students to perform the same kind of complex
work they do in an effective classroom and the real world;
(2) High quality, resulting in actionable, objective information
about students' knowledge and skills, including by assessing the full
range of relevant State standards, eliciting complex student
demonstrations or applications of knowledge, providing an accurate
measure of student achievement, and producing information that can be
used to measure student growth accurately over time;
(3) Time-limited, in order to balance instructional time and the
need for assessments, for example, by eliminating duplicative
assessments and assessments that incentivize low-quality test
preparation strategies that consume valuable classroom time;
(4) Fair for all students and used to support equity in educational
opportunity by ensuring that accessibility features and accommodations
level the playing field so tests accurately reflect what all students,
including students with disabilities and English learners, know and can
do;
(5) Fully transparent to students and parents, so that States and
districts can clearly explain to parents the purpose, the source of the
requirement (if appropriate), and the use by teachers and schools, and
provide feedback to parents and students on student performance; and
(6) Tied to improving student learning as tools in the broader work
of teaching and learning.
(c) Approaches to assessment inventories under paragraph (a) must
include:
(1) Review of the schedule for administration of all assessments
required at the Federal, State, and local levels;
(2) Review of the purpose of, and legal authority for,
administration of all assessments required at the Federal, State, and
local levels; and
(3) Feedback on the assessment system from stakeholders, which
could include information on how teachers, principals, other school
leaders, and administrators use assessment data to inform and
differentiate instruction, how much time teachers spend on assessment
preparation and administration, and the assessments that
administrators, teachers, principals, other school leaders, parents,
and students do and do not find useful.
(d) Projects under this priority--
(1) Must be no longer than 12 months;
(2) Must include a longer-term project plan, understanding that,
beginning with FY 2017, there may be dedicated Federal funds for
assessment audit work as authorized under section 1202 of the ESEA, as
amended by the ESSA, and understanding that States and LEAs may use
other Federal funds, such as the State assessment grant funds,
authorized under section 1201 of the ESEA, as amended by the ESSA,
consistent with the purposes for those funds, to implement such plans;
and
(3) Are eligible to receive a maximum award of $200,000.
Types of Priorities:
When inviting applications for a competition using one or more
priorities, we designate the type of each priority as absolute,
competitive preference, or invitational through a notice in the Federal
Register. The effect of each type of priority follows:
Absolute priority: Under an absolute priority, we consider only
applications that meet the priority (34 CFR 75.105(c)(3)).
Competitive preference priority: Under a competitive preference
priority, we give competitive preference to an application by (1)
awarding additional points, depending on the extent to which the
application meets the priority (34 CFR 75.105(c)(2)(i)); or (2)
selecting an application that meets the priority over an application of
comparable merit that does not meet the priority (34 CFR
75.105(c)(2)(ii)).
Invitational priority: Under an invitational priority, we are
particularly interested in applications that meet the priority.
However, we do not give an application that meets the priority a
preference over other applications (34 CFR 75.105(c)(1)).
Final Priorities:
We will announce the final priorities in a notice in the Federal
Register. We will determine the final priorities after considering
responses to this notice and other information available to the
Department. This notice does not preclude us from proposing additional
priorities, requirements, definitions, or selection criteria, subject
to meeting applicable rulemaking requirements.
Note: This notice does not solicit applications. In any year in
which we choose to use these priorities, we invite applications
through a notice in the Federal Register.
Paperwork Reduction Act of 1995
As part of its continuing effort to reduce paperwork and respondent
burden, the Department provides the general public and Federal agencies
with an opportunity to comment on proposed and continuing collections
of information in accordance with the Paperwork Reduction Act of 1995
(PRA) (44 U.S.C. 3506(c)(2)(A)). This helps ensure that the public
understands the Department's collection instructions, respondents can
provide the requested data in the desired format, reporting burden
(time and financial resources) is minimized, collection instruments are
clearly understood, and the Department can properly assess the impact
of collection requirements on respondents.
These proposed priorities contain information collection
requirements that are approved by OMB under the Departmental
application control number 1894-0006; this proposed regulation does not
affect the currently approved data collection.
Executive Orders 12866 and 13563
Regulatory Impact Analysis
Under Executive Order 12866, the Secretary must determine whether
this proposed regulatory action is ``significant'' and, therefore,
subject to the requirements of the Executive order and subject to
review by the Office of Management and Budget (OMB). Section 3(f) of
Executive Order 12866 defines a ``significant regulatory action'' as an
action likely to result in a rule that may--
(1) Have an annual effect on the economy of $100 million or more,
or adversely affect a sector of the economy, productivity, competition,
jobs, the environment, public health or safety, or State, local, or
tribal governments or communities in a material way (also referred to
as an ``economically significant'' rule);
(2) Create serious inconsistency or otherwise interfere with an
action taken or planned by another agency;
(3) Materially alter the budgetary impacts of entitlement grants,
user fees, or loan programs or the rights and obligations of recipients
thereof; or
(4) Raise novel legal or policy issues arising out of legal
mandates, the President's priorities, or the principles stated in the
Executive order.
This proposed regulatory action is not a significant regulatory
action subject to review by OMB under section 3(f) of Executive Order
12866.
We have also reviewed this proposed regulatory action under
Executive Order 13563, which supplements and explicitly reaffirms the
principles, structures, and definitions governing regulatory review
established in Executive Order 12866. To the extent permitted by law,
Executive Order 13563 requires that an agency--
(1) Propose or adopt regulations only upon a reasoned determination
that their benefits justify their costs (recognizing that some benefits
and costs are difficult to quantify);
(2) Tailor its regulations to impose the least burden on society,
consistent with obtaining regulatory objectives and taking into
account--among other things and to the extent practicable--the costs of
cumulative regulations;
(3) In choosing among alternative regulatory approaches, select
those approaches that maximize net benefits (including potential
economic, environmental, public health and safety, and other
advantages; distributive impacts; and equity);
(4) To the extent feasible, specify performance objectives, rather
than the behavior or manner of compliance a regulated entity must
adopt; and
(5) Identify and assess available alternatives to direct
regulation, including economic incentives--such as user fees or
marketable permits--to encourage the desired behavior, or provide
information that enables the public to make choices.
Executive Order 13563 also requires an agency ``to use the best
available techniques to quantify anticipated present and future
benefits and costs as accurately as possible.'' The Office of
Information and Regulatory Affairs of OMB has emphasized that these
techniques may include ``identifying changing future compliance costs
that might result from technological innovation or anticipated
behavioral changes.''
We are issuing these proposed priorities only on a reasoned
determination that their benefits would justify their costs. In
choosing among alternative regulatory approaches, we selected those
approaches that would maximize net benefits. Based on the analysis that
follows, the Department believes that this regulatory action is
consistent with the principles in Executive Order 13563.
We also have determined that this regulatory action would not
unduly interfere with State, local, and tribal governments in the
exercise of their governmental functions.
In accordance with both Executive orders, the Department has
assessed the potential costs and benefits, both quantitative and
qualitative, of this regulatory action. The potential costs are those
resulting from statutory requirements and those we have determined as
necessary for administering the Department's programs and activities.
The proposed priorities included in this notice would benefit
students, parents, educators, administrators, and other stakeholders by
improving the quality of State assessment instruments and systems. The
proposed priority for an inventory of State and local assessment
systems would encourage States to ensure that assessments are of high
quality, maximize instructional goals, and have clear purpose and
utility. Further, it would encourage States to eliminate unnecessary or
redundant tests. The proposed priority for improving assessment scoring
and score reporting would allow States to score non-multiple choice
assessment items more quickly and at a lower cost
and ensure that assessments provide timely, actionable feedback to
students, parents, and educators. The proposed priority for developing
innovative assessment item types and design approaches, including the
development of modular assessments, would yield new, more authentic
methods for collecting evidence about what students know and are able
to do and provide educators with more individualized, easily integrated
assessments that can support competency-based learning and other forms
of personalized instruction.
Intergovernmental Review: This program is subject to Executive
Order 12372 and the regulations in 34 CFR part 79. One of the
objectives of the Executive order is to foster an intergovernmental
partnership and a strengthened federalism. The Executive order relies
on processes developed by State and local governments for coordination
and review of proposed Federal financial assistance.
This document provides early notification of our specific plans and
actions for this program.
Accessible Format: Individuals with disabilities can obtain this
document in an accessible format (e.g., braille, large print,
audiotape, or compact disc) on request to the program contact person
listed under FOR FURTHER INFORMATION CONTACT.
Electronic Access to This Document: The official version of this
document is the document published in the Federal Register. Free
Internet access to the official edition of the Federal Register and the
Code of Federal Regulations is available via the Federal Digital System
at: www.gpo.gov/fdsys. At this site you can view this document, as well
as all other documents of this Department published in the Federal
Register, in text or Adobe Portable Document Format (PDF). To use PDF
you must have Adobe Acrobat Reader, which is available free at the
site.
You may also access documents of the Department published in the
Federal Register by using the article search feature at:
www.federalregister.gov. Specifically, through the advanced search
feature at this site, you can limit your search to documents published
by the Department.
Dated: April 12, 2016.
Ann Whalen,
Senior Advisor to the Secretary Delegated the Duties of Assistant
Secretary for Elementary and Secondary Education.
[FR Doc. 2016-08726 Filed 4-15-16; 8:45 am]
BILLING CODE 4000-01-P