Measuring Educational Gain in the National Reporting System for Adult Education, 2306-2324 [08-69]
Federal Register / Vol. 73, No. 9 / Monday, January 14, 2008 / Rules and Regulations
DEPARTMENT OF EDUCATION
34 CFR Part 462
RIN 1830–ZA06
Measuring Educational Gain in the
National Reporting System for Adult
Education
AGENCY: Office of Vocational and Adult
Education, Department of Education.
ACTION: Final regulations.
SUMMARY: The Secretary establishes
procedures for determining the
suitability of tests for use in the
National Reporting System for Adult
Education (NRS). These final
regulations also include procedures that
States and local eligible providers must
follow when using suitable tests for NRS
reporting.
DATES: These regulations are effective
February 13, 2008.
The incorporation by reference of
certain publications listed in the rule is
approved by the Director of the Federal
Register as of February 13, 2008.
However, affected parties do not have to
comply with the information collection
requirements in §§ 462.10, 462.11,
462.12, 462.13, and 462.14 until the
Department of Education publishes in
the Federal Register the control number
assigned by the Office of Management
and Budget (OMB) to these information
collection requirements. Publication of
the control number notifies the public
that OMB has approved these
information collection requirements
under the Paperwork Reduction Act of
1995.
FOR FURTHER INFORMATION CONTACT:
Mike Dean, U.S. Department of
Education, 400 Maryland Avenue, SW.,
Room 11152, Potomac Center Plaza,
Washington, DC 20202–7240.
Telephone: (202) 245–7828 or via
Internet: Mike.Dean@ed.gov.
If you use a telecommunications
device for the deaf (TDD), call the
Federal Relay Service (FRS), toll free, at
1–800–877–8339.
Individuals with disabilities can
obtain this document in an alternative
format (e.g., Braille, large print,
audiotape, or computer diskette) on
request to the contact person listed in
the preceding paragraph.
SUPPLEMENTARY INFORMATION: These
final regulations further the
Department’s implementation of section
212 of the Adult Education and Family
Literacy Act (Act), 20 U.S.C. 9201 et
seq., which establishes a system to
assess the effectiveness of eligible
agencies in achieving continuous
improvement of adult education and
literacy activities.
On October 18, 2006, the Secretary
published a notice of proposed
rulemaking (NPRM) for 34 CFR part 462
in the Federal Register (71 FR 61580).
In the preamble to the NPRM, the
Secretary discussed on pages 61581 and
61582 the significant proposed
regulations. As a result of public
comment, these final regulations
contain several significant changes from
the NPRM. While we fully explain these
changes in the Analysis of Comments
and Changes section elsewhere in these
regulations, they are summarized as
follows:
• Rather than immediately
establishing, in § 462.4, a deadline for
State and local eligible providers to stop
using tests that are currently listed in
the Implementation Guidelines:
Measures and Methods for the National
Reporting System for Adult Education
(Guidelines), the Secretary will
announce a deadline in a notice
published in the Federal Register after
reviewing the first group of tests
submitted under these regulations.
• On April 14, 2008, the Secretary
will provide test publishers the first
opportunity to submit tests for review
under these final regulations. In
subsequent years, in accordance with
§ 462.10(b), test publishers must submit
applications to the Secretary by October
1 of each year.
• We have revised several sections of
the regulations to distinguish between
(1) traditional tests, which use items
that have been generated before the test
is administered, and (2) computerized
tests, which use an algorithm to select
test items while the test is being
administered. The changes affect
§§ 462.3(b) regarding the definition of
test, 462.11 regarding the information
that must be included in a test
publisher’s application, 462.12 and
462.13 regarding the Secretary’s review
of tests, and 462.41 regarding the
administration of tests.
• Section 462.12(e) has been revised
to clarify that test publishers can request
that the Secretary reconsider a decision
to revoke a determination that a test is
suitable before the Secretary makes a
final determination about the test’s
suitability for measuring educational
gain for the NRS.
Through these final regulations, we
formalize the process for the review and
approval of tests for use in the NRS. We
believe that the uniform process in these
regulations will facilitate test
publishers’ submissions of tests to the
Department for review and will help
strengthen the integrity of the NRS as a
critical tool for measuring State
performance on accountability
measures. This process also will provide
a means for examining tests that are
currently approved for use in the NRS,
but that have not been updated recently
and, therefore, need to be reassessed for
their continuing validity.
Analysis of Comments and Changes
In response to the Secretary’s
invitation in the NPRM, 13 parties
submitted comments on the proposed
regulations. An analysis of the
comments and of the changes in the
regulations since publication of the
NPRM follows.
We group and discuss issues under
the sections of the regulations to which
they pertain, with the appropriate
sections of the regulations referenced in
parentheses. Generally, we do not
address technical and minor changes—
and suggested changes the law does not
authorize the Secretary to make.
General Comment
Comments: A commenter stated that,
because each State uses its own
curriculum frameworks, the validity of
a particular test may vary to the extent
that the test aligns with a State’s
curricula. The commenter, therefore,
stated that the Department could not
approve a test for use in all States
without evaluating its validity for each
State that uses it.
Discussion: We agree that not all
States can use any single test. States are
expected to select a suitable test or tests
that best align with their particular
curricula. If a State’s curriculum is not
aligned with an existing test, the State
will need to develop its own test aligned
with the State curriculum and submit
the test to the Department for review
under these final regulations.
Changes: None.
Definitions
Adult Education (§ 462.3)
Comments: A commenter stated that
the proposed regulations incorrectly
defined adult education. The
commenter noted that the regulations
refer to students ‘‘who are not enrolled
in secondary school’’ while the Act
refers to students ‘‘who are not enrolled
or required to be enrolled in secondary
school under State law.’’ The
commenter recommended using the
definition in the Act.
Discussion: Section 462.3(a) indicates
that certain terms used in the
regulations, including adult education,
are defined in section 203 of the Act.
The language the commenter quotes is
from the definition of adult education
population in § 462.3(b), which is not
defined in the Act. Nevertheless, we
agree that the two definitions should be
consistent.
Changes: We have modified the
definition of adult education population
to include individuals who are not
required to be enrolled in secondary
school under State law in order to make
it consistent with the definition of adult
education in the Act.
Content domains and skill areas
(§ 462.3)
Comments: One commenter stated
that the term skill areas should be used
consistently throughout the regulations,
instead of the regulations using this
term interchangeably with the terms
content domain and content
specifications.
Discussion: In drafting the proposed
regulations, we used the terms content
domain and content specifications in
the sections of the regulations
applicable to test publishers because
those are terms of art in the test
publishing industry. Likewise, the term
NRS skill areas is used in the sections
of the regulations that are applicable to
States and local eligible recipients
because this is a term of art in the adult
education program. Although we used
the term content specifications in the
proposed regulations, we did not
include it as a defined term. We think
it is appropriate to do so in the final
regulations because the term has the
same meaning as the terms content
domains and NRS skill areas.
Changes: We have modified the
defined term, content domains or NRS
skill areas, in proposed § 462.3 to also
include the term content specifications.
Test publisher (§ 462.3)
Comments: A few commenters
expressed concern that the definition of
test publisher might be too restrictive
and could prevent the review of some
tests. Commenters recommended
expanding the definition to include
universities; adult education programs;
other entities that possess sufficient
expertise and capacity to develop,
document, and defend assessments;
entities in the process of copyrighting a
test; and entities holding an
unregistered copyright to a test. One
commenter agreed that the Secretary
should review only tests from test
publishers owning a registered
copyright.
Discussion: The proposed regulations
did not prohibit universities, adult
education programs, or other legitimate
test developers from submitting a test
for review. We explained in the
preamble to the NPRM that entities
submitting tests for the Secretary’s
review must be knowledgeable about
the test, be able to respond to technical
questions the Secretary raises during the
review process, and have the legal right
to submit the test for review. With
regard to the recommendation to have
the Secretary approve other entities who
can submit a test for review, it would be
inappropriate and counter-productive
for the Secretary to determine the
suitability of a test submitted for review
without the permission of the rightful
owner of a registered copyright of the
test or the entity licensed by the
copyright holder to sell or distribute the
test.
Changes: None.
June 30, 2008, deadline for transitioning
to suitable tests (§ 462.4)
Comments: Several commenters
expressed concern that States might not
have adequate time by the June 30,
2008, deadline to change assessment
instruments, particularly if the Secretary
determines that a test is no longer
suitable for use in the NRS. The
commenters stated that States need time
to rewrite assessment policies, select
replacement tests, retrain personnel,
purchase materials, modify complex
data systems, and, possibly, hire special
contractors to assist with modifying
those data systems. A different
commenter stated that it might take two
years to implement a change in State
assessment instruments. Commenters
recommended that the regulations
permit a State to negotiate a practical
transition timeline with the Secretary.
Commenters also recommended that,
because transitioning from unsuitable
tests places a burden on States’ financial
resources and professional development
capabilities, the regulations should be
deferred until the amount of funds
available for State leadership becomes
15 percent of the Federal allocation.
One commenter indicated that local
programs often pay the significant cost
of purchasing assessment instruments
and that replacing an entire assessment
system in a single budget year could
devastate a local budget.
Discussion: Proposed § 462.4 would
have permitted States and local eligible
providers to continue to measure
educational gain using a test that was
identified in the Guidelines until June
30, 2008. However, we specifically
asked for comments on whether this
deadline would provide sufficient time
for States and local eligible recipients to
make the transition to suitable tests
because we recognized that changing
tests significantly affects a State’s
accountability system. Our intention in
proposing the June 30, 2008, deadline
was to ensure that States stop using
unsuitable tests on a date certain but
still provide enough time for (1) the
Secretary to complete one review of
tests and (2) States and local eligible
recipients to transition from unsuitable
tests to suitable tests. We also intended
to impose a deadline that would result
in the efficient removal of unsuitable
tests from use in the NRS. Once the
Secretary determines that a test is
unsuitable for use in the NRS,
permitting States to continue using it for
long periods of time would be
inconsistent with the Secretary’s intent
to improve data quality.
While we understand the desire to
defer implementation of the regulations
because of cost factors and timing
constraints, improving the quality of
State accountability systems and the
data reported by the NRS is of
immediate importance and should not
be unduly delayed. Adult Education
and Family Literacy Programs, like
other Federal programs, must report on
progress made, achievements, and
overall program effectiveness using
valid and reliable measures of
performance. The regulations are
designed to improve the reliability and
validity of data used to report the
educational gains of students, and
thereby improve the reliability and
validity of data on overall program
effectiveness.
In light of the commenters’ concerns,
and to accommodate States’ needs to
make system revisions, provide training,
and acquire tests, we will not specify a
date in these regulations by which
States and local eligible providers must
cease using unsuitable tests; instead, we
have provided for the Secretary to
announce this deadline in a notice
published in the Federal Register.
Changes: We have revised § 462.4 to
provide that the Secretary will
announce, through a notice in the
Federal Register, a deadline by which
States and local eligible providers must
stop using tests that are currently listed
in the Guidelines and that the Secretary
has determined not to be suitable for use
in the NRS under these final
regulations.
Deadline for submitting tests for review
by the Secretary (§ 462.10(b))
Comments: One commenter agreed
that the regulations should provide an
annual deadline for test publishers to
submit tests to the Secretary for review.
Other commenters requested
clarification on when the review cycle
begins and ends. Another commenter
asked if the first opportunity to submit
tests would be in 2007 or in 2008. Yet
another commenter suggested that the
date for submission of tests be no sooner
than two months and no later than four
months after the effective date of the
final regulations.
Discussion: We are establishing April
14, 2008 as the first date by which test
publishers must submit tests for review
under these regulations. In subsequent
years, test publishers must submit
applications to the Secretary by October
1 of each year. However, because we
cannot predict the number of tests that
will be submitted for review or the
amount of time it will take to review the
tests, it is not possible to predict how
long the process will take from year to
year. We, therefore, do not think it is
appropriate to establish a date on which
we will announce the results of the
Secretary’s review. We will publish the
list of suitable tests well before the
program year in which they might be
used.
Changes: None.
Content of an application—General
(§ 462.11)
Comments: A commenter responded
positively to the regulations’ specific
delineation of what an application for
test review must include. Another
commenter asked whether test
publishers must use a form in addition
to submitting the information outlined
in proposed § 462.11(b) through (j).
Another commenter stated that it may
be too constraining to require test
publishers to arrange application
information in the order established by
proposed § 462.11(b) through (j).
Discussion: To facilitate the review
process, the regulations in § 462.11
describe the specific requirements for
the contents of an application. A test
publisher is not required to submit any
form or information except as required
in § 462.11. We believe that organizing
the information in the application in the
order presented in § 462.11(b) through
(j) will help to ensure that information
about a test is easily available to and
reviewable by the educational testing
and assessment experts who will review
the tests; however, to provide test
publishers with some flexibility in
organizing their applications, we will
permit them to include in their
applications a table of contents that
identifies the location of the information
requested in § 462.11(b) through (j).
Changes: We have revised
§ 462.11(a)(3)(ii) to permit test
publishers to include a table of contents
in their applications as an alternative to
presenting information in the
application in the order described in
§ 462.11(b) through (j).
Content of an application—Involvement
of the adult education population
(§ 462.11)
Comments: A commenter stated that
proposed § 462.11 would generally
require a test publisher to demonstrate
that adult educators have been involved
in a test’s development and
maintenance, and that some publishers
would not meet that requirement easily.
The commenter also stated that
compliance with the regulations would
require customized tests developed
specifically for use in adult education,
which would increase the cost and
exclude some quality assessments.
Discussion: The regulations do not
require a test publisher to demonstrate
that adult educators have been involved
in a test’s development and
maintenance. We realize that tests
developed for other populations might
not be suitable for use in the NRS
because they were not developed with
the adult education population in mind
and do not readily measure the
educational functioning levels used in
the NRS. The regulations are clear that
the Secretary reviews tests to determine
their suitability for use in the NRS. For
instance, § 462.13(a) indicates that, in
order for the Secretary to consider a test
suitable for use in the NRS, the test
must measure the NRS educational
functioning levels of members of the
adult education population.
Accordingly, § 462.11(c)(1)(ii) requires
information that demonstrates the
extent to which the adult education
population was used to develop and
evaluate a test, which is appropriate
because the tests will be used with that
population.
Changes: None.
Content of an application—Motivation
of examinees (§ 462.11(c)(1)(iii))
Comments: Two commenters were
concerned that test publishers would
have to include information in the
application on the motivation of
examinees used in the development of
a test. One commenter indicated that
‘‘there is no generally accepted method
for identifying and classifying the
degree and level of motivation of
examinees.’’ The commenter stated that
a test publisher could make some
assumptions about motivation, but
indicated that the assumptions would
be subjective and not scientifically
valid. The second commenter requested
clarification of the expectation that
examinees would be motivated.
Discussion: The regulations only
require test publishers to provide in
their applications information on the
steps, if any, taken to ensure that
examinees were motivated while
responding to the test. The regulations
do not require test publishers to take
steps to ensure that examinees were
motivated while responding to the test.
Further, if a test publisher were to take
such steps, the test publisher would not
be required to use any particular
methodology for doing so.
Changes: None.
Content of an application—Item
development (§ 462.11(c))
Comments: A commenter noted that
the proposed regulations did not require
test publishers to include in their
applications information on item
selection or form development for the
test under review.
Discussion: The commenter’s
observation is correct and calls attention
to the need for the regulations to require
test publishers to include this
information in their applications and for
the regulations to clarify the distinction
between traditional tests, which use
items that have been generated before
the test is administered, and those that
use a computerized algorithm to select
test items while the test is being
administered.
Changes: We added a new paragraph
(3) to § 462.11(c) to require test
publishers to describe in their
applications the procedures used to
assign items (1) to forms, for tests that
are constructed prior to being
administered to examinees, or (2) to
examinees, for adaptive tests in which
items are selected in real time.
Content of an application—
Maintenance: history of test use
(§ 462.11(d)(4))
Comments: A few commenters
recommended that the regulations
require test publishers to include in
their applications additional
information on the history of the test’s
use.
Discussion: The regulations require
test publishers to provide
documentation of how a test is
maintained, including a history of the
test’s use. We are particularly interested
in information on how many times the
test forms have been administered. This
information is useful in gauging how
much the test forms have been exposed
and the likelihood of test items being
compromised.
Changes: We have revised
§ 462.11(d)(4) to clarify that information
submitted in the application regarding
the history of a test’s use must include
information on the number of times the
test has been administered.
Content of an application—Maintenance
(§ 462.11(d)(5))
Comments: A commenter
recommended that the regulations
require test publishers to include in
their applications the procedures used
for computerized adaptive tests to select
subsets of items for administration,
determine the starting point and
termination conditions, score the tests,
and control item exposure.
Discussion: We agree that requiring
test publishers to provide the
recommended information will help
experts to better assess the suitability of
computerized adaptive tests for use in
the NRS.
Changes: We added a new paragraph
(5) to § 462.11(d) to require test
publishers to include in their
applications for computerized adaptive
tests the information recommended by
the commenter.
Content of an application—Match of
content to the NRS educational
functioning levels (content validity)
(§ 462.11(e)(2) and (4))
Comments: A few commenters asked
if proposed § 462.11(e)(2) and (4) were
requesting the same information, and
sought clarification regarding the
difference between the paragraphs.
Discussion: The paragraphs are
requesting the same information.
Changes: We have removed
§ 462.11(e)(4) to eliminate the duplicate
information requirement and
renumbered the remaining paragraphs.
Content of an application—Procedures
for matching scores to NRS educational
functioning levels (§ 462.11(f)(2))
Comments: A commenter stated that
requiring the judgments of subject-matter experts to translate an
examinee’s performance to the
examinee’s standing with respect to the
NRS educational functioning levels
might not prove fruitful and would
substantially increase the cost of test
development. The commenter stated
that determination of score ranges and
their fit to the existing NRS levels can
be made based on an analysis of skills
being assessed and an intimate
knowledge of the assessment tools being
used.
Discussion: Section 462.11(f) does not
require the use of subject matter experts.
It requires test publishers to document
the procedure they use to translate the
performance of an examinee to the
examinee’s standing with respect to the
NRS educational functioning levels. A
test publisher can choose the procedure
it thinks is best. However, if a test
publisher chooses to use judgment-based procedures to translate
performance, the regulations require the
publisher to provide information on that
procedure, including information
concerning the subject matter experts
the test publisher used. Requiring this
information is consistent with accepted
professional test development and
standard-setting procedures in the 1999
edition of the Standards for Educational
and Psychological Testing and will help
test publishers demonstrate the
suitability of their tests for measuring
educational gain for the NRS.
Test scores are only useful in this
context if they can accurately classify
individuals according to NRS levels.
Therefore, it is necessary for test
publishers to demonstrate how the
range of test scores maps onto the NRS
levels and to do so in a reliable and valid
fashion. In the test development
process, developers need to show that
the range of test scores produced on
their tests covers the range of skills
depicted in the NRS levels and, more
importantly, to show which range of
scores corresponds to a specific NRS
level.
Changes: None.
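For illustration only, the following minimal Python sketch shows the kind of score-to-level crosswalk a publisher might document; the scale-score ranges and level labels are hypothetical and are not drawn from these regulations or from any approved test.

# Hypothetical crosswalk from scale-score ranges to NRS educational
# functioning levels. The cut scores below are invented for illustration;
# an actual publisher would derive and document them empirically.
HYPOTHETICAL_CUT_SCORES = [
    (200, 300, "ABE Beginning Literacy"),
    (301, 400, "ABE Beginning Basic Education"),
    (401, 500, "ABE Intermediate Low"),
    (501, 600, "ABE Intermediate High"),
    (601, 700, "ASE Low"),
    (701, 800, "ASE High"),
]

def classify_scale_score(score: int) -> str:
    """Return the NRS level whose (hypothetical) score range contains `score`."""
    for low, high, level in HYPOTHETICAL_CUT_SCORES:
        if low <= score <= high:
            return level
    raise ValueError(f"Score {score} falls outside the documented score range")

# Example: a scale score of 520 would be reported as ABE Intermediate High.
print(classify_scale_score(520))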
Content of an application—Reliability
(§ 462.11(g))
Comments: A commenter noted that
in discussing reliability the proposed
regulations used the phrase ‘‘the
correlation between raw or number
correct scores.’’ The commenter noted
that this phraseology is not applicable to
tests that use an adaptive structure or a
multi-parameter item response theory
model. The commenter stated that, in
such situations, the particular items
answered correctly, not the number of
items answered correctly, determine the
score.
Discussion: We agree with the
commenter that the phrase in the
regulations is not applicable to
computerized adaptive tests.
Changes: We revised § 462.11(g)(1) to
require that, in the case of computerized
adaptive tests, test publishers document
in their applications the correlation
between raw (or scale) scores across
alternate administrations of the test.
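For illustration only, the following minimal Python sketch shows one way a correlation between scores on alternate administrations could be computed; the paired scores are invented and are not taken from any actual application.

from statistics import mean, stdev

def pearson_r(x, y):
    """Pearson correlation between paired score lists (alternate forms)."""
    if len(x) != len(y) or len(x) < 2:
        raise ValueError("Need two equally sized lists of paired scores")
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Invented scale scores for the same examinees on two alternate administrations.
form_a = [412, 455, 478, 390, 501, 463, 429, 488]
form_b = [420, 447, 470, 399, 512, 458, 435, 480]
print(f"Alternate-form correlation: {pearson_r(form_a, form_b):.3f}")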
Comments: With regard to proposed
§ 462.11(g)(2), a commenter suggested
that information about the number of
individuals classified into NRS levels
would only provide useful data if the
information were submitted after the
Department approved a test’s scores-toNRS-levels crosswalk. The commenter
stated that requiring this information
prior to test approval could produce
information that is not meaningful.
Another commenter responded
positively to the requirement for
‘‘inclusion of information about
decision/classification consistency.’’
Discussion: We do not agree that the
Department should approve the rules a
test publisher uses to transform the
scores of a test into estimates of
examinees’ NRS educational
functioning levels prior to the test
publisher providing evidence that the
transformation rules result in reliable,
i.e., consistent, educational functioning
level classifications. We believe that,
when an application is submitted, a test
publisher should be able to provide
documentation of the degree of
consistency in performance across
different forms of the test, particularly
regarding which examinees are
classified into the same NRS
educational functioning levels across
different forms of the test. By
demonstrating that a test can
consistently classify individuals into the
same NRS educational functioning
levels across different forms of the test,
the test publisher assures the
Department that assessments of
educational gain are the result of
instruction and other interventions to
improve literacy, not measurement
error. Without this demonstration of
classification consistency, reports of
educational gain are uninterpretable.
This information is very important to
determinations about the suitability of a
test and whether the test measures the
NRS educational functioning levels as
required in § 462.13(a).
Changes: None.
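For illustration only, the following minimal Python sketch summarizes classification consistency as the proportion of examinees assigned the same level by two alternate forms; the level assignments are invented, and a publisher would ordinarily supplement this with a chance-corrected index such as Cohen's kappa.

def classification_consistency(levels_form_a, levels_form_b):
    """Proportion of examinees assigned the same NRS level by both forms."""
    if len(levels_form_a) != len(levels_form_b):
        raise ValueError("Both forms must classify the same examinees")
    agree = sum(a == b for a, b in zip(levels_form_a, levels_form_b))
    return agree / len(levels_form_a)

# Invented NRS-level assignments for eight examinees on two alternate forms.
form_a = ["ABE Int Low", "ABE Int High", "ASE Low", "ABE Int Low",
          "ASE Low", "ABE Int High", "ABE Int Low", "ASE High"]
form_b = ["ABE Int Low", "ABE Int High", "ASE Low", "ABE Int High",
          "ASE Low", "ABE Int High", "ABE Int Low", "ASE High"]
print(f"Consistent classifications: {classification_consistency(form_a, form_b):.0%}")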
Content of an application—Construct
validity (§ 462.11(h))
Comments: A commenter expressed
concern that proposed § 462.11(h)
would have required the results of
several studies on the adult education
population in connection with other
tests designed to assess educational
gain, which can be useful and
meaningful, but also time-consuming
and expensive. The commenter
indicated that imposing this
requirement after, not before, test
approval would permit test publishers
to collaborate and conduct the studies
in a more cost-effective manner.
Further, the commenter stated that the
requirement could exclude some
qualified assessments. The commenter
recommended that the regulations be
rewritten so that (1) these studies would
only be required of tests that have been
approved and (2) a five-year period
could be provided for conducting the
studies.
Discussion: We do not agree that a test
should be approved for use in the NRS
prior to the test publisher providing
documentation of the appropriateness of
a test for measuring educational gain for
the NRS, i.e., documentation that the
test measures what it is intended to
measure. Section 462.11(h) is consistent
with the 1999 edition of the Standards
for Educational and Psychological
Testing, which stresses the importance
of gathering evidence of the test’s
construct validity. The Secretary cannot
determine whether a test is suitable for
use in the NRS without having evidence
of the test’s construct validity.
Changes: None.
Content of an application—Construct
validity (§ 462.11(h)(1))
Comments: A commenter expressed
concern that proposed § 462.11(h)(1)
would have required test publishers to
document the appropriateness of a given
test for measuring educational gain in
the NRS, including the correlation
between the NRS test results and the
results of other tests that assess
educational gain in the same adult
education population. The commenter
stated that this comparison could lead
to faulty conclusions if the other test is
not an accurate measure of the
construct.
Discussion: We believe that it is
appropriate to look at test correlations
as one criterion for evaluating construct
validity. Generally, a test should
correlate with other tests known to
measure the same construct, and it
should not correlate (or have a very low
correlation) with tests known to
measure different constructs. This latter
relationship depends upon the nature of
the comparison constructs. To the
extent that the two constructs are
theoretically related, a correlation that
approximates their theoretical
relationship is expected.
Changes: None.
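For illustration only, the following minimal Python sketch (which assumes Python 3.10 or later for statistics.correlation) contrasts a convergent correlation with a discriminant one using invented scores; actual construct-validity evidence would come from documented studies of the adult education population.

from statistics import correlation  # Pearson's r; requires Python 3.10+

# Invented scores: the submitted reading test, another established reading
# test (same construct), and a math test (different construct).
submitted_reading = [410, 455, 478, 390, 501, 463, 429, 488]
other_reading     = [405, 462, 470, 399, 512, 458, 433, 480]
math_test         = [520, 480, 455, 515, 470, 500, 525, 460]

convergent = correlation(submitted_reading, other_reading)
discriminant = correlation(submitted_reading, math_test)

# For construct validity, the convergent correlation is generally expected to be
# substantially higher than the discriminant one (unless the constructs are
# theoretically related, as the discussion above notes).
print(f"Convergent r (same construct):        {convergent:.2f}")
print(f"Discriminant r (different construct): {discriminant:.2f}")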
Content of an application—Construct
validity (§ 462.11(h)(2))
Comments: Two commenters stated
that proposed § 462.11(h)(2) should be
reconsidered because ‘‘hours of
instruction’’ is not a variable that
correlates highly with test scores.
Discussion: Proposed § 462.11(h)(2)
would have required test publishers to
document that a test measures what it
is intended to measure, including the
extent to which the raw or scale scores
are related to other relevant variables,
such as hours of instruction or other
important process or outcome variables.
While we are aware of data establishing
the relationship between Adult Basic
Education (ABE) test scores and hours
of instruction, the reference to ‘‘hours of
instruction’’ in § 462.11(h)(2) was only
intended to provide an example of a
possibly relevant variable.
Changes: We have revised
§ 462.11(h)(2) to clarify that ‘‘hours of
instruction’’ is an example of possibly
relevant variables.
Content of an application—Other
information (§ 462.11(i)(1))
Comments: A commenter requested
clarification of the phrase ‘‘an analysis
of the effects of time on performance’’
used in proposed § 462.11(i)(1). The
commenter thought the phrase meant
‘‘the effects of the time it takes to
administer the test or for a student to
complete it.’’
Discussion: The commenter is correct.
Section 462.11(i)(1) requires an
application to include a description of
the manner in which test administration
time was determined and an analysis of
the speededness of the test. The term
‘‘speededness’’ as used in § 462.11(i)(1)
refers to the effects on test performance
that result from the time it takes to
administer or to complete the test.
Changes: We have revised
§ 462.11(i)(1) to clarify that we require
both information on the manner in
which test administration time was
determined and an analysis of the
speededness of the test.
Content of an application—Other
information (§ 462.11(i)(5))
Comments: A commenter requested
clarification of the term ‘‘retesting
procedures’’ used in proposed
§ 462.11(i)(5). The commenter asked if
the term referred to ‘‘pre- and post-testing.’’
Discussion: The term ‘‘retesting’’ as
used in § 462.11(i)(5) refers to the re-administration of a test that might be
necessary because of problems in the
original administration (e.g., the test
taker becomes ill during the test and
cannot finish, there are external
interruptions during testing, or there are
administration errors).
Changes: We have revised
§ 462.11(i)(5) to clarify that ‘‘retesting’’
refers to the re-administration of a test
that might be necessary because of
problems in the original administration,
such as the test taker becoming ill during
the test and being unable to finish,
external interruptions during testing, or
administration errors.
Content of an application—Other
information (§ 462.11(i)(6))
Comments: A commenter noted that
proposed § 462.11(i)(6) would require
test publishers to provide such other
evidence as the Secretary determines is
necessary to establish the test’s
compliance with the criteria and
requirements in proposed § 462.13. The
commenter requested clarification
regarding what that evidence would be.
Discussion: While § 462.11 includes
the information we anticipate the
Secretary will need to determine
whether a test is suitable for use in the
NRS, we recognize that the Secretary
can require a test publisher to provide
additional information to establish a
test’s compliance with the criteria and
requirements in § 462.13. Section
462.11(i)(6) merely alerts test publishers
to this possibility.
Changes: None.
Content of an application—Previous
tests (§ 462.11(j))
Comments: A commenter asked if a
test publisher submitting an application
for a test that is currently approved for
use in the NRS would have to provide
the information in proposed § 462.11(b)
through (i) or would have to provide
only documentation of periodic review
of the content and specifications of the
test as specified in proposed
§ 462.11(j)(1).
Discussion: As indicated in
§ 462.11(a)(1), a test publisher must
include with its application information
listed in paragraphs (b) through (i) as
well as the applicable information in
paragraph (j). All applications must,
therefore, include the information in
§ 462.11(b) through (i).
Changes: None.
Comments: A commenter requested
clarification of the term ‘‘periodic
review’’ used in proposed § 462.11(j)(1)
and (2) with regard to the
documentation that must be submitted
for a previous test to ensure that it
continues to reflect NRS educational
functioning levels. The commenter
asked if annual or bi-annual reviews
constitute a ‘‘periodic review,’’ and
suggested that the regulations use the
term ‘‘current review’’ and specify the
intervals for the reviews.
The commenter also suggested that
the regulations address the possibility
that the NRS educational functioning
levels might change. The commenter
stated that realigning tests to revised
NRS educational functioning levels
would cause a burden for test
publishers, and that test publishers need
at least one year’s advance notice of any
proposed changes in the NRS
educational functioning levels.
Discussion: Section 462.11(j) requires
test publishers to provide specific
information about currently used tests
to ensure that the tests continue to
reflect NRS educational functioning
levels. The shorter the period of time
between reviews of a test, the more
relevant the results would be in
determining the test’s content validity
with regard to NRS educational
functioning levels. However, we are
reluctant to specify a time-frame for
reviews because we do not want to
create an additional burden for test
publishers by requiring reviews more
frequently than a test publisher
typically would perform.
With regard to providing advance
notice to test publishers about changes
in the NRS educational functioning
levels, in the past we have customarily
provided notice to all concerned parties,
including test publishers, well in
advance of any changes to the NRS
educational functioning levels by
posting notices on the Internet at
https://www.nrsweb.org. We intend to
continue that practice and, because the
educational functioning levels are
included in § 462.44 of these final
regulations, will publish a notice of
proposed rulemaking in the Federal
Register requesting comment on any
proposed changes to § 462.44.
Changes: None.
Comments: A commenter asked if the
phrase ‘‘previous tests used in the NRS’’
in proposed § 462.11(j)(1) referred only
to those tests previously listed in the
NRS Implementation Guidelines.
Discussion: The ‘‘previous tests’’
discussed in § 462.11(j)(1) are tests that
were listed in the Guidelines and used
to measure educational gain in the NRS
before the effective date of these final
regulations.
Changes: None.
Comments: A commenter requested
clarification of the requirement in
proposed § 462.11(j)(3) for test
publishers to submit ‘‘new data’’ for
tests that have not changed in the seven
years since the Secretary determined the
tests were suitable for use in the NRS.
Discussion: The intent of § 462.11(j) is
to ensure that a test publisher provides
specific information about a test that has
been in use for years, but that has not
changed in the seven years since the
Secretary determined the test was
suitable for use in the NRS. In this
circumstance, the regulations require
test publishers to provide new, i.e.,
updated, data that support the validity
of the test.
Changes: We have revised
§ 462.11(j)(3) to clarify that test
publishers must provide updated data to
support test validity.
Computerized tests (§§ 462.3,
462.11(c)(3), 462.12(a)(2)(iii), 462.13(e),
and 462.41(c)(3))
Comments: A commenter
acknowledged that the proposed
regulations attempted to distinguish
between traditional paper-based tests
and computerized tests. However, the
commenter recommended that the
regulations be modified to clarify that
the term ‘‘parallel forms’’ refers to
‘‘paper-based tests or computerized tests
that do not involve an item selection
algorithm, such as computerized
adaptive tests, multistage adaptive tests,
or linear on-the-fly tests.’’ The
commenter suggested specifically
changing the proposed regulatory
language from ‘‘and have two or more
secure, parallel, equated forms’’ to ‘‘and
have two or more secure, parallel forms
if the test is produced before it is
administered to a learner (e.g., paper-based tests). If the test uses a
computerized algorithm for
administering items in real time, the
size of the item pool and the method of
item selection must ensure negligible
overlap in items across pre- and post-test.’’ Another commenter also
emphasized the importance of
recognizing computerized tests.
Discussion: As we have discussed
elsewhere in this notice, we agree that
the regulations should clarify the
distinction between (1) traditional tests,
which use items that are generated
before the test is administered, and (2)
computerized tests, which use an
algorithm to select test items while the
test is being administered.
Changes: We have revised §§ 462.3
(the definition of test), 462.11(c)(3),
462.12(a)(2)(iii), 462.13(e), and
462.41(c)(3) to provide the clarification
recommended by the commenters.
Tests that measure some, but not all,
educational functioning levels
(§§ 462.12(a)(2)(iv) and 462.13(b))
Comments: A commenter requested
clarification on whether the Secretary
will review tests that currently measure
only some, but not all, educational
functioning levels.
Discussion: The regulations provide
for the Secretary to review tests that
measure some, but not all, educational
functioning levels. Sections
462.12(a)(2)(iv) and 462.13(b) indicate
that the Secretary reviews and
determines the suitability of a test if an
application includes a test that samples
one or more of the major content
domains of the NRS educational
functioning levels of Adult Basic
Education (ABE), Adult Secondary
Education (ASE), or English-As-A-Second Language (ESL) with sufficient
numbers of questions to represent
adequately the domain or domains.
Further, § 462.12(b)(2) provides
flexibility for the Secretary to determine
that a test or a sub-test is suitable for use
in the NRS if it measures the content
domain of some, but not all, of the NRS
educational functioning levels.
Changes: None.
Procedures the Secretary uses to review
the suitability of tests (§ 462.12(b))
Comments: A commenter asked if the
results of the Secretary’s review of a test
would be posted for public review.
Another commenter suggested that the
regulations should provide that test
publishers be notified at least 30 days
before the Secretary notifies States and
local eligible providers that a test is
unsuitable for use in the NRS. With
regard to tests that the Secretary
determines are unsuitable, a different
commenter recommended that the
regulations provide a time-frame by
which the Secretary would review any
additional information submitted by the
test publisher and make a final
determination.
Discussion: In accordance with
§ 462.12(c)(2), the Secretary will
annually notify the public through a
notice published in the Federal Register
and posted on the Internet at https://
www.nrsweb.org of tests that are suitable
for use in the NRS. Under § 462.12(e)(5),
the Secretary will follow the same
procedure to notify the public when a
test that was previously determined
suitable is determined to be unsuitable.
However, the Secretary will not post for
public review the Secretary’s
determination regarding a test that has
not been previously reviewed and that
the Secretary determines to be
unsuitable. We do not believe this is
necessary because the Secretary’s
determination regarding these tests will
not have an impact on the public,
States, or local eligible providers.
Proposed § 462.12(d) provided that a
test publisher would have 30 days to
request that the Secretary reconsider the
decision that a test is unsuitable.
Therefore, it will be after this 30-day
period that the Secretary will notify the
public of a decision regarding a test.
Because it is impossible to anticipate
the complexities that might be involved
in decisions regarding a test that was
initially determined to be unsuitable,
we do not believe it would be
appropriate to limit the time the
Secretary takes to review additional
information provided by a test publisher
and make a final determination
regarding the suitability of a test.
However, we intend to conduct the
review as expeditiously as possible.
Changes: None.
Publishing the list of suitable tests
(§ 462.12(c))
Comments: A commenter suggested
that if tests must be submitted for
review by October 1 of each year, the
test publisher and adult education
programs should be informed of which
tests are approved no later than
February 1 of the following year.
Discussion: Within a reasonable time-frame, the Secretary will review tests,
notify test publishers of the results of
the review, and provide States and local
eligible providers with a list of suitable
tests. However, because we cannot
predict the number of tests that will be
submitted for review or the amount of
time it will take to review the tests, it
is not possible to predict how long the
process will take from year to year. We,
therefore, do not think it is appropriate
to establish a date by which we will
announce the results of the Secretary’s
review. We will publish the list of
suitable tests well before the program
year in which they can be used.
Changes: None.
Revocation of determination that a test
is suitable (§ 462.12(e))
Comments: A commenter suggested
that test publishers be notified of the
Secretary’s decision to revoke a
determination that a test is suitable
before the Secretary notifies the general
public. The commenter stated that test
publishers should be given sufficient
time to address the Secretary’s concerns
before the Secretary revokes the
determination that the test is suitable.
This would provide a process
comparable to the process proposed for
a test that the Secretary determines is
unsuitable.
Discussion: We agree that test
publishers should have an opportunity
to address the Secretary’s decision to
revoke a determination that a test is
suitable before that determination
becomes a final decision.
Changes: We have revised § 462.12(e)
to give test publishers an opportunity to
request that the Secretary reconsider a
decision to revoke a determination that
a test is suitable.
Comments: A commenter stated that
proposed § 462.14(b) was unclear
concerning the reasons a test’s seven-year approval status might be revoked.
Discussion: Section 462.12(e), not
§ 462.14(b), establishes the reasons for
which the Secretary can revoke the
determination regarding the suitability
of a test. In the proposed regulation, we
stated that the Secretary can revoke the
determination if the Secretary
determines that the information
submitted as a basis for the Secretary’s
review of the test was inaccurate. In
proposed § 462.14(b), however, we
implied that the Secretary also could
revoke a determination regarding the
suitability of a test if the test is
substantially revised—for example, by
changing its structure, number of items,
content specifications, item types or
sub-tests.
The proposed regulations were,
therefore, unclear as to whether the
Secretary could revoke a determination
about the suitability of a test if the test
had been substantially revised. We
intended that revision of a test would be
a valid reason for revoking a
determination about test suitability.
Changes: We have revised § 462.12(e)
to clarify that a substantial change to a
test is one of the reasons the Secretary
can revoke a determination regarding
the suitability of a test.
Criteria and requirements for
determining test suitability
(§ 462.13(c)(1), (d), and (f)(3) (proposed
(g)(3)))
Comments: A commenter noted that
The Standards for Educational and
Psychological Testing referenced in
§ 462.13(c) provides specific guidelines
for the technical documentation,
including evidence of validity that
should be made available to interested
parties. The commenter recommended
clarifying that publishers should
provide the Secretary with information
regarding test development and validity.
Discussion: In proposed
§ 462.13(c)(1), we indicated that, in
order for a test to be determined suitable
for use in the NRS, the test would have
to meet all applicable and feasible
standards for test construction provided
in The Standards for Educational and
Psychological Testing. We did not
intend this language to imply that the
Secretary would review only
information that is related to test
development. We agree with the
commenter that validity is an important
factor in test evaluation and make it
clear in § 462.11(e) through (g), that test
publishers must include in their
applications information on content
validity, reliability, and construct
validity. However, we think adding the
reference to ‘‘validity’’ in § 462.13(c)(1)
will reinforce the importance of test
publishers providing information on
validity.
Changes: We have revised
§ 462.13(c)(1) to require a test to meet all
applicable and feasible standards for
test construction and validity that are
provided in the Standards for
Educational and Psychological Testing.
Comments: A commenter noted that
many factors influence the appropriate
time between testing and retesting,
including intensity of instruction,
frequency and length of class meetings,
and class size. The commenter wanted
to ‘‘decrease the emphasis on a single
protocol for post-testing’’ and suggested
that, when discussing the time between
test-taking, the regulations use the term
‘‘publisher’s recommendations’’ instead
of the term ‘‘publisher’s guidelines’’ that
was used in proposed §§ 462.13(d) and
462.40(c)(3)(ii).
Discussion: We do not agree that the
term ‘‘publisher’s guidelines’’
emphasizes the use of a single protocol
when determining when to administer a
post-test. State and local eligible
providers, like the commenter, are
aware that many factors influence the
appropriate time between pre-testing
and post-testing. These factors are taken
into consideration by test administrators
at the local level. However, because
tests differ, test administrators rely on
the guidelines, developed during test
construction and validation and
provided by test publishers, to help
ensure that a sufficient amount of time
has passed before a post-test is given in
order to optimize the measurement of
educational gains.
Changes: None.
Comments: A commenter requested
clarification of § 462.13(f)(3) (proposed
§ 462.13(g)(3)), which requires the
publisher of a test modified for an
individual with a disability to
recommend educational functioning
levels based on the previous
performance of test-takers who are
members of the adult education
population of interest to the NRS in
order for the Secretary to consider a test
suitable for use.
Discussion: In order for the Secretary
to consider a test that has been modified
for individuals with disabilities suitable
for use in the NRS, test publishers must
(1) demonstrate that adult education
students with disabilities were included
in the pilot or field test referred to in
§ 462.11(c)(1); (2) match scores to the
NRS educational functioning levels
based on the information obtained from
adult education students with the
disability who participated in the pilot
or field test and for whom the test has
been adapted; and (3) provide in the
application, as required in § 462.11(f),
documentation of the adequacy of the
procedure used to translate the
performance of adult education students
with the disability for whom the test has
been modified to an estimate of the
examinees’ standing with respect to the
NRS educational functioning levels.
Changes: In response to the comment,
we changed § 462.13(f)(3) to clarify that
the Secretary considers a test modified
for individuals with disabilities suitable
for use in the NRS if test publishers (1)
recommend educational functioning
levels based on the information
obtained from adult education students
who participated in the pilot or field
test and who have the disability for
which the test has been adapted and (2)
provide documentation of the adequacy
of the procedure used to translate the
performance of adult education students
with the disability for whom the test has
been modified to an estimate of the
examinees’ standing with respect to the
NRS educational functioning levels.
Subpart D—General (§§ 462.40–462.44)
Comments: Several commenters noted
that the Department currently provides
non-regulatory guidance to States on the
NRS. The commenters asked why the
Department is issuing regulations when
most States comply with and are
successfully implementing the non-regulatory guidance and the guidance
provides each State with needed
flexibility. The commenters stated that
‘‘as long as States are complying with
guidelines and meeting performance
standards, we understand that the
Federal role is to limit regulations and
allow the States to continually increase
their capabilities to improve program
services.’’ The commenters, therefore,
recommended that subpart D be
removed from the final regulations and
that the Secretary provide technical
assistance and resources to the few
States that are having difficulty creating
effective assessment and data
procedures. Additionally, one of these
commenters stated that while guidance
is valued, regulations sometimes seem
arbitrary, intrusive, and unnecessary.
Discussion: It is the Department’s
policy to regulate only when essential to
promote quality, and then in the most
flexible, most equitable, and least
burdensome way possible. We believe
these regulations comply with that
policy. While the regulations in subpart
D are legally binding, they largely codify
the guidance provided in the Guidelines
and the ‘‘State Assessment Policy
Guidance.’’ Although States have made
significant progress, there remains
variability in the quality of State
processes and procedures in the
collection and reporting of data on
student assessment and performance.
These regulations, like the Guidelines,
technical assistance activities, and other
efforts the Department has supported to
improve data quality, provide a
significant tool to create a standard level
of quality among all States and thereby
strengthen the integrity of the NRS as a
critical tool for measuring State
performance and the impact of adult
education.
Changes: None.
Meaning of the term ‘‘placement’’
(§§ 462.40(c)(4), (c)(6) and (c)(10),
462.41(c)(3) (proposed), 462.42(a),
(b)(2), and (c)(1), (d)(1), and (d)(2), and
462.43(a)(2) and (b))
Comments: A few commenters stated
that using the term ‘‘placement’’ in the
regulations to mean the ‘‘assignment of
a student to the appropriate NRS
educational functioning level’’ is
confusing because the traditional
meaning of placement is ‘‘placing a
student in a particular class or other
instructional offering.’’ Commenters
noted that tests designed for placement
are different from those used to measure
educational gain.
Discussion: The regulations, like the
Guidelines currently used by States, use
the terms ‘‘place’’ and ‘‘placement’’ as
terms of art to refer to the placement of
students in NRS educational
functioning levels in order to measure
and then report on educational gain in
the NRS. It is only within this context
that the terms are used, and no other
meaning should be inferred.
Changes: None.
Placing students (§ 462.42(d)(1) and (2))
Comments: A commenter stated that
placing a student in an NRS educational
functioning level using the lowest test
score could result in programs
providing instruction in skill areas that
are not most relevant to the student.
Further, the commenter stated that
programs (1) should not place students
based on one test and (2) should be
allowed to use multiple placement tools
and other pertinent information, such as
student goals, to determine placement.
Another commenter stated that all
scores have some measurement error
and that instituting a policy of using the
lowest test score might introduce
systematic, rather than random,
measurement errors. This commenter
also indicated that the policy of using
the lowest test score could encourage
programs to teach students only in an
area where gain is to be expected. The
commenter stated that a better policy
would be either to focus on the learner’s
primary subject area at pre-test, or
evaluate gain based on a composite
score across areas.
Discussion: With regard to placing a
student using the lowest test score when
the student is tested in multiple skill
areas, § 462.42(d)(1) and (2) are
consistent with the policy in the
Guidelines, which indicates: ‘‘States
should provide to local programs the
criteria for placing students at each
educational functioning level, using test
scores from the initial assessment. Not
all of the skill areas described in the
[functioning] level descriptors need to
be used to place students, but the skill
areas used should be the areas most relevant
to the students’ needs and the program’s
curriculum. If multiple skill areas are
assessed and the student has differing
abilities in each area, however, NRS
policy requires that the program place
the student according to the lowest skill
area.’’ The Department’s policy ensures
that States use a standardized approach
when reporting educational gain, which
ensures comparability of data. These
regulations use the term ‘‘placement’’ as
a term of art to refer to the placement
of students in NRS educational
functioning levels in order to measure
and then report on educational gain in
the NRS. Placement of a student in an
educational functioning level for NRS
purposes does not affect placement in
an instructional program. States can use
a variety of tools when devising an
instructional program for students.
States and local eligible providers know
that using the lowest test score is a
convention used for reporting purposes
and it is not intended to encourage
programs to teach students only in skill
areas in which students scored the
lowest on tests or only in skill areas in
which gain is expected. The policy also
is not intended to restrict the number or
type of assessments used to identify a
student’s needs and to customize an
instructional program to meet those
needs. In fact, programs are expected to
teach students in multiple areas
depending on the students’ needs and
the programs’ curricula.
We do not agree with the commenter
that using the lowest test score might
introduce systematic, rather than
random, measurement error. Using the
lowest test score to operationalize
educational gain in a consistent manner
across States for reporting purposes does
not itself introduce error. If tests
have met the standards established in
these regulations, they should be able to
assess skills at any level with minimum
error.
Changes: None.
Measuring educational gain (§ 462.43)
Comments: A commenter suggested
the Department measure significant gain
as determined by test developers. The
commenter thought this approach
would be the most accurate measure of
gain. The commenter opposed the
approach in proposed § 462.43, stating
that it would only capture learning gain
when a learner completes an
educational functioning level by
crossing artificial cut points regardless
of the starting level. Another commenter
stated that the approach for measuring
gain in § 462.43 is ‘‘too coarse, is likely
to capture only extreme gain, and will
miss very significant educational gains
that occur within an EFL [educational
functioning level].’’ The commenter
suggested that the Department consider
using other options for demonstrating
educational gain such as ‘‘achieving a
gain score that is beyond chance
expectations (using a scale score, which
is more reliable than a proficiency
classification).’’ Another commenter
stated that ‘‘awarding only one
educational gain to students completing
more than one educational functioning
level to a high degree misrepresents and
under reports the effectiveness and
accomplishment of both adult teacher
and learner.’’ This commenter also
stated that the current NRS policy of
reporting an educational gain for
obtaining a General Educational
Development (GED) diploma by learners
in ASE II, but not to those in ABE or
ASE I, misrepresents and under-reports
accomplishments. A different
commenter suggested that attainment of
a GED diploma should be recognized as
an educational gain.
Discussion: The approach for
measuring educational gain in § 462.43
is well established and accepted by the
field. Over a two-year period, the
Department consulted with States and
convened a statutorily mandated panel
in order to develop a performance
accountability system for the adult
education program. During that time,
consensus was reached on defining the
performance measures, including how
to measure educational gain. States and
other participating entities agreed that
the approach in § 462.43 is the most
effective means of obtaining accurate,
reliable, and valid information on
student performance.
Changes: None.
High Advanced ESL (§ 462.44)
Comments: Some commenters
opposed proposed § 462.44 because it,
like the Guidelines, eliminated the
‘‘High Advanced ESL’’ literacy level
from the Functioning Level Table. The
commenters noted that the change
means that State and local agencies will
no longer be able to report educational
gain for adult English language learners
above the ‘‘Advanced ESL’’ literacy
level or Student Performance Level
(SPL) 6. The commenters stated that,
based on communications with the
Department’s Office of Vocational and
Adult Education (OVAE), the change
also means that programs can no longer
provide ‘‘High Advanced ESL’’
instruction in adult education programs.
The commenters indicated that OVAE
has suggested that students who would
have received ‘‘High Advanced ESL’’
instruction should now complete their
preparation by moving from ESL classes
to either adult basic education or adult
secondary education classes. The
commenters expressed their belief that
OVAE’s suggestion is pedagogically
problematic and, more importantly,
would prevent programs from
adequately addressing the learning
needs of ESL learners. Further, the
commenters opposed stopping ESL
instruction at the ‘‘Advanced ESL’’
literacy level or SPL 6 because they
believe students at that level do not
have language and literacy skills that are
sufficient to transition successfully to
postsecondary education or to meet the
demands of the workplace—two of the
purposes for adult education that are
cited in the Act. Commenters noted that
the regulations do not place this same
restriction on native English speakers
and recommended that the ‘‘High
Advanced ESL’’ literacy level be
restored to the Functioning Level Table.
Finally, the commenters recommended
that the regulations provide States with
flexibility and not limit program
services.
Discussion: Commenters are correct in
that proposed § 462.44 would codify a
change in the Guidelines by eliminating
the ‘‘High Advanced ESL’’ literacy level
from the Functioning Level Table. The
change was made in 2006 as part of the
information collection request for the
Guidelines. However, before the change
was made, (1) OVAE consulted with
Adult Education State Directors and (2)
the Office of Management and Budget
(OMB) provided interested Federal
agencies and the public, including
States and local eligible providers, an
opportunity to comment on the
information collection request, which
included the removal of the ‘‘High
Advanced ESL’’ literacy level from the
Functioning Level Table. We received
no negative comments regarding our
intent to eliminate the ‘‘High Advanced
ESL’’ literacy level, and therefore,
changed the Functioning Level Table in
the Guidelines and in the proposed
regulations.
The change in the Functioning Level
Table does not mean that adult
education programs can no longer
provide services to ‘‘High Advanced
ESL’’ students. These regulations are
consistent with the Act, which
authorizes services below the
postsecondary level, and do not change
who can be served by adult education
programs. The educational functioning
levels largely serve to classify students
for reporting purposes and should not
be viewed as the standard for
establishing the type of instruction that
can be provided. Placement of a student
in an educational functioning level for
NRS purposes does not affect placement
in an instructional program. A student
who scores above SPL 6 might still
be served in adult education programs
if he or she has an educational need
below the postsecondary level. States
have the flexibility to determine how
programs are structured, how students
are placed in programs, and how
instruction is delivered.
Changes: None.
Student Performance Levels (SPL)
(§ 462.44)
Comments: A commenter requested
clarification of the relationship between
the NRS literacy levels and SPLs in the
Functioning Level Table. The
commenter indicated that no exact
definition of SPL levels is provided and,
therefore, their relationship to the NRS
levels is unclear.
Discussion: The SPLs were developed
by the Center for Applied Linguistics to
provide a standard description of adult
refugees’ abilities at a range of levels
and a common standard for ESL level
descriptions for use by programs
nationwide. They are nationally
recognized in the adult ESL education
community and represent a standard
metric for identifying skill levels of
adult ESL students in general language
ability, listening comprehension, and
oral communication. The Functioning
Level Table provides educational
functioning level descriptors for
students at literacy levels in ABE, ASE,
and ESL. The literacy levels are divided
into equivalent grade levels for ABE and
ASE and into SPLs for ESL. The
descriptors illustrate the types of skills
students functioning at a given level are
likely to have. The descriptors do not
provide a complete or comprehensive
delineation of all the skills at a
particular level, but instead, provide
examples to guide assessment and
instruction.
Changes: None.
Executive Order 12866
We have reviewed these final
regulations in accordance with
Executive Order 12866. Under the terms
of the order we have assessed the
potential costs and benefits of this
regulatory action.
The potential costs associated with
the final regulations are those resulting
from statutory requirements and those
we have determined to be necessary for
administering this program effectively
and efficiently.
In assessing the potential costs and
benefits—both quantitative and
qualitative—of these final regulations,
we have determined that the benefits of
the regulations justify the costs.
We have also determined that this
regulatory action does not unduly
interfere with State, local, and tribal
governments in the exercise of their
governmental functions.
We discussed the potential costs and
benefits of these final regulations in the
preamble to the NPRM under the
headings Significant Proposed
Regulations (pages 61581 and 61582),
Regulatory Flexibility Act Certification
(pages 61582 and 61583), and
Paperwork Reduction Act of 1995 (page
61583).
Paperwork Reduction Act of 1995
The Paperwork Reduction Act of 1995
does not require you to respond to a
collection of information unless it
displays a valid OMB control number.
With the exception of §§ 462.10 through
462.14, we display the valid OMB
control number assigned to the
collection of information in these final
regulations at the end of the affected
sections of the regulations.
Intergovernmental Review
These regulations are subject to
Executive Order 12372 and the
regulations in 34 CFR part 79. One of
the objectives of the Executive order is
to foster an intergovernmental
partnership and a strengthened
federalism by relying on processes
developed by State and local
governments for coordination and
review of proposed Federal financial
assistance.
In accordance with the order, we
intend this document to provide early
notification of the Department’s specific
plans and actions for this program.
Assessment of Educational Impact
In the NPRM we requested comments
on whether the proposed regulations
would require transmission of
information that any other agency or
authority of the United States gathers or
makes available.
Based on the response to the NPRM
and on our review, we have determined
that these final regulations do not
require transmission of information that
any other agency or authority of the
United States gathers or makes
available.
Electronic Access to This Document
You can view this document, as well
as all other Department of Education
documents published in the Federal
Register, in text or Adobe Portable
Document Format (PDF) on the Internet
at the following site: https://www.ed.gov/news/fedregister.
To use PDF you must have Adobe
Acrobat Reader, which is available free
at this site. If you have questions about
using PDF, call the U.S. Government
Printing Office (GPO), toll free, at 1–
888–293–6498; or in the Washington,
DC, area at (202) 512–1530.
Note: The official version of this document
is the document published in the Federal
Register. Free Internet access to the official
edition of the Federal Register and the Code
of Federal Regulations is available on GPO
Access at: https://www.gpoaccess.gov/nara/index.html.
(Catalog of Federal Domestic Assistance
Number does not apply.)
List of Subjects in 34 CFR Part 462
Administrative practice, Adult
education, Grants program—education,
Incorporation by reference, and
Reporting and recordkeeping
requirements.
Dated: January 7, 2008.
Margaret Spellings,
Secretary of Education.
For the reasons discussed in the
preamble, the Secretary amends title 34
of the Code of Federal Regulations by
adding a new part 462 to read as
follows:
PART 462—MEASURING
EDUCATIONAL GAIN IN THE
NATIONAL REPORTING SYSTEM FOR
ADULT EDUCATION
Subpart A—General
Sec.
462.1 What is the scope of this part?
462.2 What regulations apply?
462.3 What definitions apply?
462.4 What are the transition rules for using
tests to measure educational gain for the
National Reporting System for Adult
Education (NRS)?
Subpart B—What Process Does the
Secretary Use To Review the Suitability of
Tests for Use in the NRS?
462.10 How does the Secretary review
tests?
462.11 What must an application contain?
462.12 What procedures does the Secretary
use to review the suitability of tests?
462.13 What criteria and requirements does
the Secretary use for determining the
suitability of tests?
462.14 How often and under what
circumstances must a test be reviewed by
the Secretary?
Subpart C—[Reserved]
Subpart D—What Requirements Must
States and Local Eligible Providers Follow
When Measuring Educational Gain?
462.40 Must a State have an assessment
policy?
462.41 How must tests be administered in
order to accurately measure educational
gain?
462.42 How are tests used to place students
at an NRS educational functioning level?
462.43 How is educational gain measured?
462.44 Which educational functioning
levels must States and local eligible
providers use to measure and report
educational gain in the NRS?
Authority: 20 U.S.C. 9212, unless
otherwise noted.
Subpart A—General
§ 462.1
What is the scope of this part?
The regulations in this part establish
the—
(a) Procedures the Secretary uses to
determine the suitability of
standardized tests for use in the
National Reporting System for Adult
Education (NRS) to measure educational
gain of participants in an adult
education program required to report
under the NRS; and
(b) Procedures States and local
eligible providers must follow when
measuring educational gain for use in
the NRS.
(Authority: 20 U.S.C. 9212)
§ 462.2
What regulations apply?
The following regulations apply to
this part:
(a) The Education Department General
Administrative Regulations (EDGAR) as
follows:
(1) 34 CFR part 74 (Administration of
Grants and Agreements with Institutions
of Higher Education, Hospitals, and
Other Non-Profit Organizations).
(2) 34 CFR part 76 (State-Administered Programs).
(3) 34 CFR part 77 (Definitions that
Apply to Department Regulations).
(4) 34 CFR part 79 (Intergovernmental
Review of Department of Education
Programs and Activities).
(5) 34 CFR part 80 (Uniform
Administrative Requirements for Grants
and Cooperative Agreements to State
and Local Governments).
(6) 34 CFR part 81 (General Education
Provisions Act—Enforcement).
(7) 34 CFR part 82 (New Restrictions
on Lobbying).
(8) 34 CFR part 84 (Governmentwide
Requirements for Drug-Free Workplace
(Financial Assistance)).
(9) 34 CFR part 85 (Governmentwide
Debarment and Suspension
(Nonprocurement)).
(10) 34 CFR part 86 (Drug and Alcohol
Abuse Prevention).
(11) 34 CFR part 97 (Protection of
Human Subjects).
(12) 34 CFR part 98 (Student Rights in
Research, Experimental Programs, and
Testing).
(13) 34 CFR part 99 (Family
Educational Rights and Privacy).
(b) The regulations in this part 462.
(Authority: 20 U.S.C. 9212)
§ 462.3
What definitions apply?
(a) Definitions in the Adult Education
and Family Literacy Act (Act). The
following terms used in these
regulations are defined in section 203 of
the Adult Education and Family
Literacy Act, 20 U.S.C. 9202 (Act):
Adult education,
Eligible provider,
Individual of limited English
proficiency,
Individual with a disability,
Literacy.
(b) Other definitions. The following
definitions also apply to this part:
Adult basic education (ABE) means
instruction designed for an adult whose
educational functioning level is
equivalent to a particular ABE literacy
level listed in the NRS educational
functioning level table in § 462.44.
Adult education population means
individuals—
(1) Who are 16 years of age or older;
(2) Who are not enrolled or required
to be enrolled in secondary school
under State law; and
(3) Who—
(i) Lack sufficient mastery of basic
educational skills to enable the
individuals to function effectively in
society;
(ii) Do not have a secondary school
diploma or its recognized equivalent,
and have not achieved an equivalent
level of education; or
(iii) Are unable to speak, read, or
write the English language.
Adult secondary education (ASE)
means instruction designed for an adult
whose educational functioning level is
equivalent to a particular ASE literacy
level listed in the NRS educational
functioning level table in § 462.44.
Content domains, content
specifications, or NRS skill areas mean,
for the purpose of the NRS, reading,
writing, and speaking the English
language, numeracy, problem solving,
English language acquisition, and other
literacy skills as defined by the
Secretary.
Educational functioning levels mean
the ABE, ASE, and ESL literacy levels,
as provided in § 462.44, that describe a
set of skills and competencies that
students demonstrate in the NRS skill
areas.
English-as-a-second language (ESL)
means instruction designed for an adult
whose educational functioning level is
equivalent to a particular ESL literacy
level listed in the NRS educational
functioning level table in § 462.44.
Guidelines means the Implementation
Guidelines: Measures and Methods for
the National Reporting System for Adult
Education (also known as NRS
Implementation Guidelines) posted on
the Internet at: https://www.nrsweb.org.
A copy of the Guidelines is also
available from the U.S. Department of
Education, Division of Adult Education
and Literacy, 400 Maryland Avenue,
SW., room 11159, Potomac Center Plaza,
Washington, DC 20202–7240.
Local eligible provider means an
‘‘eligible provider’’ as defined in the Act
that operates an adult education
program that is required to report under
the NRS.
State means ‘‘State’’ and ‘‘Outlying
area’’ as defined in the Act.
Test means a standardized test,
assessment, or instrument that has a
formal protocol on how it is to be
administered. These protocols include,
for example, the use of parallel, equated
forms, testing conditions, time allowed
for the test, standardized scoring, and
the amount of instructional time a
student needs before post-testing.
Violation of these protocols often
invalidates the test scores. Tests are not
limited to traditional paper and pencil
(or computer-administered) instruments
for which forms are constructed prior to
administration to examinees. Tests may
also include adaptive tests that use
computerized algorithms for selecting
and administering items in real time;
however, for such instruments, the size
of the item pool and the method of item
selection must ensure negligible overlap
in items across pre- and post-testing.
Test administrator means an
individual who is trained to administer
tests the Secretary determines to be
suitable under this part.
Test publisher means an entity,
individual, organization, or agency that
owns a registered copyright of a test or
is licensed by the copyright holder to
sell or distribute a test.
(Authority: 20 U.S.C. 9202, 9212)
§ 462.4 What are the transition rules for
using tests to measure educational gain for
the National Reporting System for Adult
Education (NRS)?
A State or a local eligible provider
may continue to measure educational
gain for the NRS using a test that was
identified in the Guidelines until the
Secretary announces through a notice
published in the Federal Register a
deadline by which States and local
eligible providers must use only tests
that the Secretary has reviewed and
determined to be suitable for use in the
NRS under this part.
(Approved by the Office of Management and
Budget under control number 1830–0027)
(Authority: 20 U.S.C. 9212)
Subpart B—What Process Does the
Secretary Use To Review the
Suitability of Tests for Use in the NRS?
§ 462.10 How does the Secretary review
tests?
(a) The Secretary only reviews tests
under this part that are submitted by a
test publisher.
(b) A test publisher that wishes to
have the suitability of its test
determined by the Secretary under this
part must submit an application to the
Secretary, in the manner the Secretary
may prescribe, by April 14, 2008, and,
thereafter, by October 1 of each year.
(Authority: 20 U.S.C. 9212)
§ 462.11 What must an application
contain?
(a) Application content and format. In
order for the Secretary to determine
whether a standardized test is suitable
for measuring the gains of participants
in an adult education program required
to report under the NRS, a test publisher
must—
(1) Include with its application
information listed in paragraphs (b)
through (i) of this section, and, if
applicable, the information listed in
paragraph (j) of this section;
(2) Provide evidence that it holds a
registered copyright of a test or is
licensed by the copyright holder to sell
or distribute a test.
(3)(i) Arrange the information in its
application in the order it is presented
in paragraphs (b) through (j) of this
section; or
(ii) Include a table of contents in its
application that identifies the location
of the information required in
paragraphs (b) through (j) of this section.
(4) Submit to the Secretary three
copies of its application.
(b) General information. (1) A
statement, in the technical manual for
the test, of the intended purpose of the
test and how the test will allow
examinees to demonstrate the skills that
are associated with the NRS educational
functioning levels in § 462.44.
(2) The name, address, e-mail address,
and telephone and fax numbers of a
contact person to whom the Secretary
may address inquiries.
(3) A summary of the precise editions,
forms, levels, and, if applicable, sub-tests and abbreviated tests that the test
publisher is requesting that the
Secretary review and determine to be
suitable for use in the NRS.
(c) Development. Documentation of
how the test was developed, including
a description of—
(1) The nature of samples of
examinees administered the test during
pilot or field testing, such as—
(i) The number of examinees
administered each item;
(ii) How similar the sample or
samples of examinees used to develop
and evaluate the test were to the adult
education population of interest to the
NRS; and
(iii) The steps, if any, taken to ensure
that the examinees were motivated
while responding to the test; and
(2) The steps taken to ensure the
quality of test items or tasks, such as—
(i) The extent to which items or tasks
on the test were reviewed for fairness
and sensitivity; and
(ii) The extent to which items or tasks
on the test were screened for the
adequacy of their psychometric
properties.
(3) The procedures used to assign
items to—
(i) Forms, for tests that are
constructed prior to being administered
to examinees; or
(ii) Examinees, for adaptive tests in
which items are selected in real time.
(d) Maintenance. Documentation of
how the test is maintained, including a
description of—
(1) How frequently, if ever, new forms
of the test are developed;
(2) The steps taken to ensure the
comparability of scores across forms of
the test;
(3) The steps taken to maintain the
security of the test;
(4) A history of the test’s use,
including the number of times the test
has been administered; and
(5) For a computerized adaptive test,
the procedures used to—
(i) Select subsets of items for
administration;
(ii) Determine the starting point and
termination conditions;
(iii) Score the test; and
(iv) Control for item exposure.
(e) Match of content to the NRS
educational functioning levels (content
validity). Documentation of the extent to
which the items or tasks on the test
cover the skills in the NRS educational
functioning levels in § 462.44,
including—
(1) Whether the items or tasks on the
test require the types and levels of skills
used to describe the NRS educational
functioning levels;
(2) Whether the items or tasks
measure skills that are not associated
with the NRS educational functioning
levels;
(3) Whether aspects of a particular
NRS educational functioning level are
not covered by any of the items or tasks;
(4) The procedures used to establish
the content validity of the test;
(5) The number of subject-matter
experts who provided judgments linking
the items or tasks to the NRS
educational functioning levels and their
qualifications for doing so, particularly
their familiarity with adult education
and the NRS educational functioning
levels; and
(6) The extent to which the judgments
of the subject matter experts agree.
(f) Match of scores to NRS educational
functioning levels. Documentation of the
adequacy of the procedure used to
translate the performance of an
examinee on a particular test to an
estimate of the examinee’s standing
with respect to the NRS educational
functioning levels in § 462.44,
including—
(1) The standard-setting procedures
used to establish cut scores for
transforming raw or scale scores on the
test into estimates of an examinee’s NRS
educational functioning level;
(2) If judgment-based procedures were
used—
(i) The number of subject-matter
experts who provided judgments, and
their qualifications; and
(ii) Evidence of the extent to which
the judgments of subject-matter experts
agree;
(3) The standard error of each cut
score, and how it was established; and
(4) The extent to which the cut scores
might be expected to differ if they had
been established by a different (though
similar) panel of experts.
(g) Reliability. Documentation of the
degree of consistency in performance
across different forms of the test in the
absence of any external interventions,
including—
(1) The correlation between raw (or
scale) scores across alternate forms of
the test or, in the case of computerized
adaptive tests, across alternate
administrations of the test;
(2) The consistency with which
examinees are classified into the same
NRS educational functioning levels
across forms of the test. Information
regarding classification consistency
should be reported for each NRS
educational functioning level that the
test is being considered for use in
measuring;
(3) The adequacy of the research
design leading to the estimates of the
reliability of the test, including—
(i) The size of the sample(s);
(ii) The similarity between the
sample(s) used in the data collection
and the adult education population; and
(iii) The steps taken to ensure the
motivation of the examinees; and
(4) Any other information explaining
the methodology and procedures used
to measure the reliability of the test.
(h) Construct validity. Documentation
of the appropriateness of a given test for
measuring educational gain for the NRS,
i.e., documentation that the test
measures what it is intended to
measure, including—
(1) The extent to which the raw or
scale scores and the educational
functioning classifications associated
with the test correlate (or agree) with
scores or classifications associated with
other tests designed or intended to
assess educational gain in the same
adult education population as the NRS;
(2) The extent to which the raw or
scale scores are related to other relevant
variables, such as teacher evaluation,
hours of instruction, or other measures
that may be related to test performance;
(3) The adequacy of the research
designs associated with these sources of
evidence (see paragraph (g)(3) of this
section); and
(4) Other evidence demonstrating that
the test measures gains in educational
functioning resulting from adult
education and not from other construct-irrelevant variables, such as practice
effects.
(i) Other information. (1) A
description of the manner in which test
administration time was determined,
and an analysis of the speededness of
the test.
(2) Additional guidance on the
interpretation of scores resulting from
any modifications of the tests for an
individual with a disability.
(3) The manual provided to test
administrators containing procedures
and instructions for test security and
administration.
(4) A description of the training or
certification required of test
administrators and scorers by the test
publisher.
(5) A description of retesting (e.g., re-administration of a test because of
problems in the original administration
such as the test taker becomes ill during
the test and cannot finish, there are
external interruptions during testing, or
there are administration errors)
procedures and the analysis upon which
the criteria for retesting are based.
(6) Such other evidence as the
Secretary may determine is necessary to
establish the test’s compliance with the
criteria and requirements the Secretary
uses to determine the suitability of tests
as provided in § 462.13.
(j) Previous tests. (1) For a test used
to measure educational gain in the NRS
before the effective date of these
regulations that is submitted to the
Secretary for review under this part, the
test publisher must provide
documentation of periodic review of the
content and specifications of the test to
ensure that the test continues to reflect
NRS educational functioning levels.
(2) For a test first published five years
or more before the date it is submitted
to the Secretary for review under this
part, the test publisher must provide
documentation of periodic review of the
content and specifications of the test to
ensure that the test continues to reflect
NRS educational functioning levels.
(3) For a test that has not changed in
the seven years since the Secretary
determined, under § 462.13, that it was
suitable for use in the NRS that is again
being submitted to the Secretary for
review under this part, the test
publisher must provide updated data
supporting the validity of the test for
use in classifying adult learners with
respect to the NRS educational
functioning levels and the measurement
of educational gain as defined in
§ 462.43 of this part.
(4) If a test has been substantially
revised—for example by changing its
structure, number of items, content
specifications, item types, or sub-tests—
from the most recent edition reviewed
by the Secretary under this part, the test
publisher must provide an analysis of
the revisions, including the reasons for
the revisions, the implications of the
revisions for the comparability of scores
on the current test to scores on the
previous test, and results from validity,
reliability, and equating or standard-setting studies undertaken subsequent
to the revisions.
(Authority: 20 U.S.C. 9212)
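The reliability evidence described in § 462.11(g)(1) and (2), namely the correlation of scores across alternate forms and the consistency with which examinees are classified into the same NRS educational functioning level across forms, can be summarized from matched administration data. The following is a minimal, non-normative sketch in Python; the cut scores, level names, and sample data are hypothetical and are not drawn from any approved test.

# Illustrative sketch only: hypothetical cut scores and scores, not from any
# approved NRS test. Shows the two reliability summaries named in
# Sec. 462.11(g)(1)-(2): alternate-form score correlation and the consistency
# of educational functioning level (EFL) classifications across forms.
from statistics import correlation  # available in Python 3.10 and later

# Hypothetical scale-score lower bounds for four illustrative levels.
CUT_SCORES = [(500, "ABE Beginning"), (540, "ABE Intermediate Low"),
              (580, "ABE Intermediate High"), (620, "ASE Low")]

def efl(scale_score: float) -> str:
    """Map a scale score to an EFL using the hypothetical cut scores."""
    level = CUT_SCORES[0][1]
    for cut, name in CUT_SCORES:
        if scale_score >= cut:
            level = name
    return level

def alternate_form_summary(form_a: list[float], form_b: list[float]) -> dict:
    """Correlation of scores across forms and proportion of examinees
    classified into the same EFL by both forms."""
    same_level = sum(efl(a) == efl(b) for a, b in zip(form_a, form_b))
    return {
        "score_correlation": correlation(form_a, form_b),
        "classification_consistency": same_level / len(form_a),
    }

if __name__ == "__main__":
    form_a = [505, 552, 588, 611, 630, 545, 570, 598]
    form_b = [510, 548, 592, 605, 626, 551, 566, 603]
    print(alternate_form_summary(form_a, form_b))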
§ 462.12 What procedures does the
Secretary use to review the suitability of
tests?
(a) Review. (1) When the Secretary
receives a complete application from a
test publisher, the Secretary selects
experts in the field of educational
testing and assessment who possess
appropriate advanced degrees and
experience in test development or
psychometric research, or both, to
advise the Secretary on the extent to
which a test meets the criteria and
requirements in § 462.13.
(2) The Secretary reviews and
determines the suitability of a test only
if an application—
(i) Is submitted by a test publisher;
(ii) Meets the deadline established by
the Secretary;
(iii) Includes a test that—
(A) Has two or more secure, parallel,
equated forms of the same test—either
traditional paper and pencil or
computer-administered instruments—
for which forms are constructed prior to
administration to examinees; or
(B) Is an adaptive test that uses
computerized algorithms for selecting
and administering items in real time;
however, for such an instrument, the
size of the item pool and the method of
item selection must ensure negligible
overlap in items across pre- and post-testing;
(iv) Includes a test that samples one
or more of the major content domains of
the NRS educational functioning levels
of ABE, ESL, or ASE with sufficient
numbers of questions to represent
adequately the domain or domains; and
(v) Includes the information
prescribed by the Secretary, including
the information in § 462.11 of this part.
(b) Secretary’s determination. (1) The
Secretary determines whether a test
meets the criteria and requirements in
§ 462.13 after taking into account the
advice of the experts described in
paragraph (a)(1) of this section.
(2) For tests that contain multiple sub-tests measuring content domains other
than those of the NRS educational
functioning levels, the Secretary
determines the suitability of only those
sub-tests covering the domains of the
NRS educational functioning levels.
(c) Suitable tests. If the Secretary
determines that a test satisfies the
criteria and requirements in § 462.13
and, therefore, is suitable for use in the
NRS, the Secretary—
(1) Notifies the test publisher of the
Secretary’s decision; and
(2) Annually publishes in the Federal
Register and posts on the Internet at
https://www.nrsweb.org a list of the
names of tests and the educational
functioning levels the tests are suitable
to measure in the NRS. A copy of the
list is also available from the U.S.
Department of Education, Office of
Vocational and Adult Education,
Division of Adult Education and
Literacy, 400 Maryland Avenue, SW.,
room 11159, Potomac Center Plaza,
Washington, DC 20202–7240.
(d) Unsuitable tests. (1) If the
Secretary determines that a test does not
satisfy the criteria and requirements in
§ 462.13 and, therefore, is not suitable
for use in the NRS, the Secretary notifies
the test publisher of the Secretary’s
decision and of the reasons why the test
does not meet those criteria and
requirements.
(2) Within 30 days after the Secretary
notifies a test publisher that its test is
not suitable for use in the NRS, the test
publisher may request that the Secretary
reconsider the Secretary’s decision. This
request must be accompanied by—
(i) An analysis of why the information
and documentation submitted meet the
criteria and requirements in § 462.13,
notwithstanding the Secretary’s earlier
decision to the contrary; and
(ii) Any additional documentation
and information that address the
Secretary’s reasons for determining that
the test was unsuitable.
(3) The Secretary reviews the
additional information submitted by the
test publisher and makes a final
determination regarding the suitability
of the test for use in the NRS.
(i) If the Secretary’s decision is
unchanged and the test remains
unsuitable for use in the NRS, the
Secretary notifies the test publisher, and
this action concludes the review
process.
(ii) If the Secretary’s decision changes
and the test is determined to be suitable
for use in the NRS, the Secretary follows
the procedures in paragraph (c) of this
section.
(e) Revocation. (1) The Secretary’s
determination regarding the suitability
of a test may be revoked if the Secretary
determines that—
(i) The information the publisher
submitted as a basis for the Secretary’s
review of the test was inaccurate; or
(ii) A test has been substantially
revised—for example, by changing its
structure, number of items, content
specifications, item types, or sub-tests.
(2) The Secretary notifies the test
publisher of the—
(i) Secretary’s decision to revoke the
determination that the test is suitable for
use in the NRS; and
(ii) Reasons for the Secretary’s
revocation.
(3) Within 30 days after the Secretary
notifies a test publisher of the decision
to revoke a determination that a test is
suitable for use in the NRS, the test
publisher may request that the Secretary
reconsider the decision. This request
must be accompanied by documentation
and information that address the
Secretary’s reasons for revoking the
determination that the test is suitable for
use in the NRS.
(4) The Secretary reviews the
information submitted by the test
publisher and makes a final
determination regarding the suitability
of the test for use in the NRS.
(5) If the Secretary revokes the
determination regarding the suitability
of a test, the Secretary publishes in the
Federal Register and posts on the
Internet at https://www.nrsweb.org a
notice of that revocation along with the
date by which States and local eligible
providers must stop using the revoked
test. A copy of the notice of revocation
is also available from the U.S.
Department of Education, Office of
Vocational and Adult Education,
Division of Adult Education and
Literacy, 400 Maryland Avenue, SW.,
room 11159, Potomac Center Plaza,
Washington, DC 20202–7240.
(Authority: 20 U.S.C. 9212)
§ 462.13 What criteria and requirements
does the Secretary use for determining the
suitability of tests?
In order for the Secretary to consider
a test suitable for use in the NRS, the
test or the test publisher, if applicable,
must meet the following criteria and
requirements:
(a) The test must measure the NRS
educational functioning levels of
members of the adult education
population.
(b) The test must sample one or more
of the major content domains of the NRS
educational functioning levels of ABE,
ESL, or ASE with sufficient numbers of
questions to adequately represent the
domain or domains.
(c)(1) The test must meet all
applicable and feasible standards for
test construction and validity provided
in the 1999 edition of the Standards for
Educational and Psychological Testing,
prepared by the Joint Committee on
Standards for Educational and
Psychological Testing of the American
Educational Research Association, the
American Psychological Association,
and the National Council on
Measurement in Education incorporated
by reference in this section. The
Director of the Federal Register
approves this incorporation by reference
in accordance with 5 U.S.C. 552(a) and
1 CFR part 51. You may obtain a copy
from the American Psychological
Association, Inc., 750 First Street, NE.,
Washington, DC 20002. You may
inspect a copy at the Department of
Education, room 11159, 550 12th Street,
SW., Washington, DC 20202 or at the
National Archives and Records
Administration (NARA). For
information on the availability of this
material at NARA, call (202) 741–6030,
or go to: https://www.archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html.
(2) If requested by the Secretary, a test
publisher must explain why it believes
that certain standards in the 1999
edition of the Standards for Educational
and Psychological Testing were not
applicable or were not feasible to meet.
(d) The test must contain the
publisher’s guidelines for retesting,
including time between test-taking,
which are accompanied by appropriate
justification.
(e) The test must—
(1) Have two or more secure, parallel,
equated forms of the same test—either
traditional paper and pencil or
computer-administered instruments—
for which forms are constructed prior to
administration to examinees; or
(2) Be an adaptive test that uses
computerized algorithms for selecting
and administering items in real time;
however, for such an instrument, the
size of the item pool and the method of
item selection must ensure negligible
overlap in items across pre- and post-testing. Scores associated with these
alternate administrations must be
equivalent in meaning.
(f) For a test that has been modified
for individuals with disabilities, the test
publisher must—
(1) Provide documentation that it
followed the guidelines provided in the
Testing Individuals With Disabilities
section of the 1999 edition of the
Standards for Educational and
Psychological Testing;
(2) Provide documentation of the
appropriateness and feasibility of the
modifications relevant to test
performance; and
(3)(i) Recommend educational
functioning levels based on the
information obtained from adult
education students who participated in
the pilot or field test and who have the
disability for which the test has been
modified; and
(ii) Provide documentation of the
adequacy of the procedures used to
translate the performance of adult
education students with the disability
for whom the test has been modified to
an estimate of the examinees’ standing
with respect to the NRS educational
functioning levels.
(Authority: 20 U.S.C. 9212)
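The requirement in § 462.13(e)(2) that an adaptive test's item pool and item-selection method ensure negligible overlap in items across pre- and post-testing can be checked empirically from administration records. The sketch below is illustrative only; the item identifiers and the five percent threshold are assumptions made for the example, not values established by these regulations.

# Minimal sketch (not part of the regulations): estimate item overlap between
# a student's pre-test and post-test administrations of an adaptive test.
# The 5% threshold below is an illustrative assumption; the regulations
# require only that overlap be "negligible" and do not fix a number.

def item_overlap_rate(pre_items: set[str], post_items: set[str]) -> float:
    """Fraction of post-test items the examinee already saw at pre-test."""
    if not post_items:
        return 0.0
    return len(pre_items & post_items) / len(post_items)

def overlap_is_negligible(pre_items: set[str], post_items: set[str],
                          threshold: float = 0.05) -> bool:
    return item_overlap_rate(pre_items, post_items) <= threshold

if __name__ == "__main__":
    pre = {"R017", "R042", "R103", "R210", "R311"}
    post = {"R042", "R118", "R205", "R277", "R330"}
    rate = item_overlap_rate(pre, post)
    print(f"overlap rate: {rate:.2f}, negligible: {overlap_is_negligible(pre, post)}")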
§ 462.14 How often and under what
circumstances must a test be reviewed by
the Secretary?
(a) The Secretary’s determination that
a test is suitable for use in the NRS is
in effect for a period of seven years from
the date of the Secretary’s written
notification to the test publisher, unless
otherwise indicated by the Secretary.
After that time, if the test publisher
wants the test to be used in the NRS, the
test must be reviewed again by the
Secretary so that the Secretary can
determine whether the test continues to
be suitable for use in the NRS.
(b) If a test that the Secretary has
determined is suitable for use in the
NRS is substantially revised—for
example, by changing its structure,
number of items, content specifications,
item types, or sub-tests—and the test
publisher wants the test to continue to
be used in the NRS, the test publisher
must submit, as provided in
§ 462.11(j)(4), the substantially revised
test or version of the test to the
Secretary for review so that the
Secretary can determine whether the
test continues to be suitable for use in
the NRS.
(Authority: 20 U.S.C. 9212)
Subpart C—[Reserved]
Subpart D—What Requirements Must
States and Local Eligible Providers
Follow When Measuring Educational
Gain?
§ 462.40 Must a State have an assessment
policy?
(a) A State must have a written
assessment policy that its local eligible
providers must follow in measuring
educational gain and reporting data in
the NRS.
(b) A State must submit its assessment
policy to the Secretary for review and
approval at the time it submits its
annual statistical report for the NRS.
(c) The State’s assessment policy
must—
(1) Include a statement requiring that
local eligible providers measure the
educational gain of all students who
receive 12 hours or more of instruction
in the State’s adult education program
with a test that the Secretary has
determined is suitable for use in the
NRS;
(2) Identify the pre- and post-tests that
the State requires local eligible
providers to use to measure the
educational gain of ABE, ESL, and ASE
students;
(3)(i) Indicate when, in calendar days
or instructional hours, local eligible
providers must administer pre- and
post-tests to students; and
(ii) Ensure that the time for
administering the post-test is long
enough after the pre-test to allow the
test to measure educational gains
according to the test publisher’s
guidelines;
(4) Specify the score ranges tied to
educational functioning levels for
placement and for reporting gains for
accountability;
(5) Identify the skill areas the State
intends to require local eligible
providers to assess in order to measure
educational gain;
(6) Include the guidance the State
provides to local eligible providers on
testing and placement of an individual
with a disability or an individual who
is unable to be tested because of a
disability;
(7) Describe the training requirements
that staff must meet in order to be
qualified to administer and score each
test selected by the State to measure the
educational gains of students;
(8) Identify the alternate form or forms
of each test that local eligible providers
must use for post-testing;
(9) Indicate whether local eligible
providers must use a locator test for
guidance on identifying the appropriate
pre-test;
(10) Describe the State’s policy for the
initial placement of a student at each
NRS educational functioning level using
test scores;
(11) Describe the State’s policy for
using the post-test for measuring
educational gain and for advancing
students across educational functioning
levels;
(12) Describe the pre-service and in-service staff training that the State or
local eligible providers will provide,
including training—
(i) For staff who either administer or
score each of the tests used to measure
educational gain;
(ii) For teachers and other local staff
involved in gathering, analyzing,
compiling, and reporting data for the
NRS; and
(iii) That includes the following
topics:
(A) NRS policy, accountability
policies, and the data collection process.
(B) Definitions of measures.
(C) Conducting assessments; and
(13) Identify the State or local agency
responsible for providing pre- and in-service training.
(Approved by the Office of Management and
Budget under control number 1830–0027)
(Authority: 20 U.S.C. 9212)
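As a drafting aid only, the thirteen elements that § 462.40(c) requires a State assessment policy to address can be checked mechanically before the policy is submitted for review. The sketch below assumes a hypothetical dictionary representation of a draft policy; neither the field names nor the check itself is prescribed by this part.

# Drafting aid only: a hypothetical completeness check for the thirteen
# elements a State assessment policy must address under Sec. 462.40(c).
# Field names are illustrative; the regulations do not prescribe a format.

REQUIRED_ELEMENTS = {
    "measure_gain_with_suitable_tests",    # (c)(1)
    "pre_and_post_tests_identified",       # (c)(2)
    "test_timing_rules",                   # (c)(3)
    "score_ranges_for_levels",             # (c)(4)
    "skill_areas_to_assess",               # (c)(5)
    "guidance_for_disabilities",           # (c)(6)
    "administrator_training_requirements", # (c)(7)
    "alternate_forms_for_post_testing",    # (c)(8)
    "locator_test_policy",                 # (c)(9)
    "initial_placement_policy",            # (c)(10)
    "post_test_and_advancement_policy",    # (c)(11)
    "staff_training_plan",                 # (c)(12)
    "training_provider_identified",        # (c)(13)
}

def missing_elements(draft_policy: dict) -> set[str]:
    """Return the required elements the draft policy does not yet address."""
    return {e for e in REQUIRED_ELEMENTS if not draft_policy.get(e)}

if __name__ == "__main__":
    draft = {e: True for e in REQUIRED_ELEMENTS}
    draft["locator_test_policy"] = False  # example of an unaddressed element
    print(sorted(missing_elements(draft)))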
§ 462.41 How must tests be administered
in order to accurately measure educational
gain?
(a) General. A local eligible provider
must measure the educational gains of
students using only tests that the
Secretary has determined are suitable
for use in the NRS and that the State has
identified in its assessment policy.
(b) Pre-test. A local eligible provider
must—
(1) Administer a pre-test to measure a
student’s educational functioning level
at intake, or as soon as possible
thereafter;
(2) Administer the pre-test to students
at a uniform time, according to its
State’s assessment policy; and
(3) Administer pre-tests to students in
the skill areas identified in its State’s
assessment policy.
(c) Post-test. A local eligible provider
must—
(1) Administer a post-test to measure
a student’s educational functioning
level after a set time period or number
of instructional hours;
(2) Administer the post-test to
students at a uniform time, according to
its State’s assessment policy;
(3)(i) Administer post-tests with a
secure, parallel, equated form of the
same test—either traditional paper and
pencil or computer-administered
instruments—for which forms are
constructed prior to administration to
examinees to pre-test and determine the
initial placement of students; or
(ii) Administer post-tests with an
adaptive test that uses computerized
algorithms for selecting and
administering items in real time;
however, for such an instrument, the
size of the item pool and the method of
item selection must ensure negligible
overlap in items across pre- and post-testing; and
(4) Administer post-tests to students
in the same skill areas as the pre-test.
(d) Other requirements. (1) A local
eligible provider must administer a test
using only staff who have been trained
to administer the test.
(2) A local eligible provider may use
the results of a test in the NRS only if
the test was administered in a manner
that is consistent with the State’s
assessment policy and the test
publisher’s guidelines.
(Approved by the Office of Management and
Budget under control number 1830–0027)
(Authority: 20 U.S.C. 9212)
§ 462.42 How are tests used to place
students at an NRS educational functioning
level?
(a) A local eligible provider must use
the results of the pre-test described in
§ 462.41(b) to initially place students at
the appropriate NRS educational
functioning level.
(b) A local eligible provider must use
the results of the post-test described in
§ 462.41(c)—
(1) To determine whether students
have completed one or more
educational functioning levels or are
progressing within the same level; and
(2) To place students at the
appropriate NRS educational
functioning level.
(c)(1) States and local eligible
providers are not required to use all of
the skill areas described in the NRS
educational functioning levels to place
students.
(2) States and local eligible providers
must test and report on the skill areas
most relevant to the students’ needs and
to the programs’ curriculum.
(d)(1) If a State’s assessment policy
requires a local eligible provider to test
a student in multiple skill areas and the
student will receive instruction in all of
the skill areas, the local eligible
provider must place the student in an
educational functioning level that is
equivalent to the student’s lowest test
score for any of the skill areas tested
under § 462.41(b) and (c).
(2) If a State’s assessment policy
requires a local eligible provider to test
a student in multiple skill areas, but the
student will receive instruction in fewer
than all of the skill areas, the local
eligible provider must place the student
in an educational functioning level that
is equivalent to the student’s lowest test
score for any of the skill areas—
(i) Tested under § 462.41(b) and (c);
and
(ii) In which the student will receive
instruction.
(Approved by the Office of Management and
Budget under control number 1830–0027)
(Authority: 20 U.S.C. 9212)
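The placement convention in § 462.42(d)(1) and (2) amounts to taking the lowest educational functioning level indicated by the student's tested skill areas, restricted to the areas in which the student will receive instruction when fewer than all tested areas are taught. The following non-normative sketch illustrates that convention; the level ordering and example data are hypothetical.

# Illustrative only: the placement convention of Sec. 462.42(d)(1)-(2).
# Level names and the per-area levels are hypothetical; actual cut scores
# come from the State's assessment policy and the approved test.

EFL_ORDER = ["ABE Beginning Literacy", "ABE Beginning", "ABE Intermediate Low",
             "ABE Intermediate High", "ASE Low", "ASE High"]

def placement_level(levels_by_skill_area: dict[str, str],
                    instruction_areas: set[str]) -> str:
    """Return the NRS reporting placement: the lowest EFL among the tested
    skill areas, limited to the areas in which the student will be
    instructed when that set is smaller than the set of tested areas."""
    considered = {
        area: level for area, level in levels_by_skill_area.items()
        if area in instruction_areas
    } or levels_by_skill_area  # fall back if no instruction areas were given
    return min(considered.values(), key=EFL_ORDER.index)

if __name__ == "__main__":
    tested = {"reading": "ABE Intermediate High", "numeracy": "ABE Intermediate Low"}
    # Instruction in both areas: place at the lower of the two levels.
    print(placement_level(tested, {"reading", "numeracy"}))  # ABE Intermediate Low
    # Instruction in reading only: place using the reading level.
    print(placement_level(tested, {"reading"}))              # ABE Intermediate High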
§ 462.43 How is educational gain
measured?
(a)(1) Educational gain is measured by
comparing the student’s initial
educational functioning level, as
measured by the pre-test described in
§ 462.41(b), with the student’s
educational functioning level as
measured by the post-test described in
§ 462.41(c).
Example: A State’s assessment policy
requires its local eligible providers to test
students in reading and numeracy. The
student scores lower in reading than in
numeracy. As described in § 462.42(d)(1), the
local eligible provider would use the
student’s reading score to place the student
in an educational functioning level. To
measure educational gain, the local eligible
provider would compare the reading score on
the pre-test with the reading score on the
post-test.
(2) A student is considered to have
made an educational gain when the
student’s post-test indicates that the
student has completed one or more
educational functioning levels above the
level in which the student was placed
by the pre-test.
(b) If a student is not post-tested, then
no educational gain can be measured for
that student and the local eligible
provider must report the student in the
same educational functioning level as
initially placed for NRS reporting
purposes.
(Approved by the Office of Management and
Budget under control number 1830–0027)
(Authority: 20 U.S.C. 9212)
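Under § 462.43, an educational gain is recorded only when the post-test places the student at least one educational functioning level above the level of initial placement, and a student who is not post-tested is reported at the initial level. The following minimal sketch, with a hypothetical level ordering, illustrates that comparison.

# Minimal sketch of the gain rule in Sec. 462.43: a student shows an
# educational gain only if the post-test level is at least one level above
# the pre-test level; a student with no post-test is reported at the level
# of initial placement. The level ordering is hypothetical.

EFL_ORDER = ["ABE Beginning Literacy", "ABE Beginning", "ABE Intermediate Low",
             "ABE Intermediate High", "ASE Low", "ASE High"]

def made_gain(pretest_level: str, posttest_level: str | None) -> bool:
    """True when the post-test level exceeds the pre-test level; False when
    the levels are equal, the student regressed, or no post-test was given."""
    if posttest_level is None:          # not post-tested: no measurable gain
        return False
    return EFL_ORDER.index(posttest_level) > EFL_ORDER.index(pretest_level)

if __name__ == "__main__":
    print(made_gain("ABE Intermediate Low", "ABE Intermediate High"))  # True
    print(made_gain("ABE Intermediate Low", "ABE Intermediate Low"))   # False
    print(made_gain("ABE Intermediate Low", None))                     # False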
§ 462.44 Which educational functioning
levels must States and local eligible
providers use to measure and report
educational gain in the NRS?
States and local eligible providers
must use the NRS educational
functioning levels in the following
functioning level table:
BILLING CODE 4001–01–P
[The NRS Functioning Level Table (educational functioning level descriptors for the ABE, ASE, and ESL literacy levels) is published as graphic material at 73 FR 2321 through 2323.]
(Approved by the Office of Management and
Budget under control number 1830–0027)
(Authority: 20 U.S.C. 9212)
[FR Doc. 08–69 Filed 1–11–08; 8:45 am]
BILLING CODE 4001–01–C
Agencies
[Federal Register Volume 73, Number 9 (Monday, January 14, 2008)]
[Rules and Regulations]
[Pages 2306-2324]
From the Federal Register Online via the Government Printing Office [www.gpo.gov]
[FR Doc No: 08-69]
[[Page 2305]]
-----------------------------------------------------------------------
Part II
Department of Education
-----------------------------------------------------------------------
34 CFR Part 462
Measuring Educational Gain in the National Reporting System for Adult
Education; Final Rule
Federal Register / Vol. 73, No. 9 / Monday, January 14, 2008 / Rules
and Regulations
[[Page 2306]]
-----------------------------------------------------------------------
DEPARTMENT OF EDUCATION
34 CFR Part 462
RIN 1830-ZA06
Measuring Educational Gain in the National Reporting System for
Adult Education
AGENCY: Office of Vocational and Adult Education, Department of
Education.
ACTION: Final regulations.
-----------------------------------------------------------------------
SUMMARY: The Secretary establishes procedures for determining the
suitability of tests for use in the National Reporting System for Adult
Education (NRS). These final regulations also include procedures that
States and local eligible providers must follow when using suitable
tests for NRS reporting.
DATES: These regulations are effective February 13, 2008.
The incorporation by reference of certain publications listed in
the rule is approved by the Director of the Federal Register as of
February 13, 2008. However, affected parties do not have to comply with
the information collection requirements in Sec. Sec. 462.10, 462.11,
462.12, 462.13, and 462.14 until the Department of Education publishes
in the Federal Register the control number assigned by the Office of
Management and Budget (OMB) to these information collection
requirements. Publication of the control number notifies the public
that OMB has approved these information collection requirements under
the Paperwork Reduction Act of 1995.
FOR FURTHER INFORMATION CONTACT: Mike Dean, U.S. Department of
Education, 400 Maryland Avenue, SW., Room 11152, Potomac Center Plaza,
Washington, DC 20202-7240. Telephone: (202) 245-7828 or via Internet:
Mike.Dean@ed.gov.
If you use a telecommunications device for the deaf (TDD), call the
Federal Relay Service (FRS), toll free, at 1-800-877-8339.
Individuals with disabilities can obtain this document in an
alternative format (e.g., Braille, large print, audiotape, or computer
diskette) on request to the contact person listed in the preceding
paragraph.
SUPPLEMENTARY INFORMATION: These final regulations further the
Department's implementation of section 212 of the Adult Education and
Family Literacy Act (Act), 20 U.S.C. 9201 et seq., which establishes a
system to assess the effectiveness of eligible agencies in achieving
continuous improvement of adult education and literacy activities.
On October 18, 2006, the Secretary published a notice of proposed
rulemaking (NPRM) for 34 CFR part 462 in the Federal Register (71 FR
61580). In the preamble to the NPRM, the Secretary discussed on pages
61581 and 61582 the significant proposed regulations. As a result of
public comment, these final regulations contain several significant
changes from the NPRM. While we fully explain these changes in the
Analysis of Comments and Changes section elsewhere in these
regulations, they are summarized as follows:
Rather than immediately establishing, in Sec. 462.4, a
deadline for State and local eligible providers to stop using tests
that are currently listed in the Implementation Guidelines: Measures
and Methods for the National Reporting System for Adult Education
(Guidelines), the Secretary will announce a deadline in a notice
published in the Federal Register after reviewing the first group of
tests submitted under these regulations.
• On April 14, 2008, the Secretary will provide test
publishers the first opportunity to submit tests for review under these
final regulations. In subsequent years, in accordance with Sec.
462.10(b), test publishers must submit applications to the Secretary by
October 1 of each year.
• We have revised several sections of the regulations to
distinguish between (1) traditional tests, which use items that have
been generated before the test is administered, and (2) computerized
tests, which use an algorithm to select test items while the test is
being administered. The changes affect Sec. Sec. 462.3(b) regarding
the definition of test, 462.11 regarding the information that must be
included in a test publisher's application, 462.12 and 462.13 regarding
the Secretary's review of tests, and 462.41 regarding the
administration of tests.
• Section 462.12(e) has been revised to clarify that test
publishers can request that the Secretary reconsider a decision to
revoke a determination that a test is suitable before the Secretary
makes a final determination about the test's suitability for measuring
educational gain for the NRS.
Through these final regulations, we formalize the process for the
review and approval of tests for use in the NRS. We believe that the
uniform process in these regulations will facilitate test publishers'
submissions of tests to the Department for review and will help
strengthen the integrity of the NRS as a critical tool for measuring
State performance on accountability measures. This process also will
provide a means for examining tests that are currently approved for use
in the NRS, but that have not been updated recently and, therefore,
need to be reassessed for their continuing validity.
Analysis of Comments and Changes
In response to the Secretary's invitation in the NPRM, 13 parties
submitted comments on the proposed regulations. An analysis of the
comments and of the changes in the regulations since publication of the
NPRM follows.
We group and discuss issues under the sections of the regulations
to which they pertain, with the appropriate sections of the regulations
referenced in parentheses. Generally, we do not address technical and
minor changes, or suggested changes that the law does not authorize the
Secretary to make.
General Comment
Comments: A commenter stated that, because each State uses its own
curriculum frameworks, the validity of a particular test may vary to
the extent that the test aligns with a State's curricula. The
commenter, therefore, stated that the Department could not approve a
test for use in all States without evaluating its validity for each
State that uses it.
Discussion: We agree that a single test may not be suitable for use in all States.
States are expected to select a suitable test or tests that best align
with their particular curricula. If a State's curriculum is not aligned
with an existing test, the State will need to develop its own test
aligned with the State curriculum and submit the test to the Department
for review under these final regulations.
Changes: None.
Definitions
Adult Education (Sec. 462.3)
Comments: A commenter stated that the proposed regulations
incorrectly defined adult education. The commenter noted that the
regulations refer to students ``who are not enrolled in secondary
school'' while the Act refers to students ``who are not enrolled or
required to be enrolled in secondary school under State law.'' The
commenter recommended using the definition in the Act.
Discussion: Section 462.3(a) indicates that certain terms used in
the regulations, including adult education, are defined in section 203
of the Act. The language the commenter quotes is from the definition of
adult education population in Sec. 462.3(b), which is not defined in
the Act. Nevertheless, we agree that the two definitions should be
consistent.
Changes: We have modified the definition of adult education
population to include individuals who are not required to be enrolled
in secondary school under State law in order to make it consistent with
the definition of adult education in the Act.
Content domains and skill areas (Sec. 462.3)
Comments: One commenter stated that the term skill areas should be
used consistently throughout the regulations, rather than
interchangeably with the terms content domain and content
specifications.
Discussion: In drafting the proposed regulations, we used the terms
content domain and content specifications in the sections of the
regulations applicable to test publishers because those are terms of
art in the test publishing industry. Likewise, the term NRS skill areas
is used in the sections of the regulations that are applicable to
States and local eligible recipients because this is a term of art in
the adult education program. Although we used the term content
specifications in the proposed regulations, we did not include it as a
defined term. We think it is appropriate to do so in the final
regulations because the term has the same meaning as the terms content
domains and NRS skill areas.
Changes: We have modified the defined term, content domains or NRS
skill areas, in proposed Sec. 462.3 to also include the term content
specifications.
Test publisher (Sec. 462.3)
Comments: A few commenters expressed concern that the definition of
test publisher might be too restrictive and could prevent the review of
some tests. Commenters recommended expanding the definition to include
universities; adult education programs; other entities that possess
sufficient expertise and capacity to develop, document, and defend
assessments; entities in the process of copyrighting a test; and
entities holding an unregistered copyright to a test. One commenter
agreed that the Secretary should review only tests from test publishers
owning a registered copyright.
Discussion: The proposed regulations did not prohibit universities,
adult education programs, or other legitimate test developers from
submitting a test for review. We explained in the preamble to the NPRM
that entities submitting tests for the Secretary's review must be
knowledgeable about the test, be able to respond to technical questions
the Secretary raises during the review process, and have the legal
right to submit the test for review. With regard to the recommendation
to have the Secretary approve other entities who can submit a test for
review, it would be inappropriate and counter-productive for the
Secretary to determine the suitability of a test submitted for review
without the permission of the rightful owner of a registered copyright
of the test or the entity licensed by the copyright holder to sell or
distribute the test.
Changes: None.
June 30, 2008, deadline for transitioning to suitable tests (Sec.
462.4)
Comments: Several commenters expressed concern that States might
not have adequate time by the June 30, 2008, deadline to change
assessment instruments, particularly if the Secretary determines that a
test is no longer suitable for use in the NRS. The commenters stated
that States need time to rewrite assessment policies, select
replacement tests, retrain personnel, purchase materials, modify
complex data systems, and, possibly, hire special contractors to assist
with modifying those data systems. A different commenter stated that it
might take two years to implement a change in State assessment
instruments. Commenters recommended that the regulations permit a State
to negotiate a practical transition timeline with the Secretary.
Commenters also recommended that, because transitioning from
unsuitable tests places a burden on States' financial resources and
professional development capabilities, the regulations should be
deferred until the amount of funds available for State leadership
becomes 15 percent of the Federal allocation. One commenter indicated
that local programs often pay the significant cost of purchasing
assessment instruments and that replacing an entire assessment system
in a single budget year could devastate a local budget.
Discussion: Proposed Sec. 462.4 would have permitted States and
local eligible providers to continue to measure educational gain using
a test that was identified in the Guidelines until June 30, 2008.
However, we specifically asked for comments on whether this deadline
would provide sufficient time for States and local eligible recipients
to make the transition to suitable tests because we recognized that
changing tests significantly affects a State's accountability system.
Our intention in proposing the June 30, 2008, deadline was to ensure
that States stop using unsuitable tests on a date certain but still
provide enough time for (1) the Secretary to complete one review of
tests and (2) States and local eligible recipients to transition from
unsuitable tests to suitable tests. We also intended to impose a
deadline that would result in the efficient removal of unsuitable tests
from use in the NRS. Once the Secretary determines that a test is
unsuitable for use in the NRS, permitting States to continue using it
for long periods of time would be inconsistent with the Secretary's
intent to improve data quality.
While we understand the desire to defer implementation of the
regulations because of cost factors and timing constraints, improving
the quality of State accountability systems and the data reported by
the NRS is of immediate importance and should not be unduly delayed.
Adult Education and Family Literacy Programs, like other Federal
programs, must report on progress made, achievements, and overall
program effectiveness using valid and reliable measures of performance.
The regulations are designed to improve the reliability and validity of
data used to report the educational gains of students, and thereby
improve the reliability and validity of data on overall program
effectiveness.
In light of the commenters' concerns, and to accommodate States'
needs to make system revisions, provide training, and acquire tests, we
will not specify a date in these regulations by which States and local
eligible providers must cease using unsuitable tests; instead, we have
provided for the Secretary to announce this deadline in a notice
published in the Federal Register.
Changes: We have revised Sec. 462.4 to provide that the Secretary
will announce, through a notice in the Federal Register, a deadline by
which States and local eligible providers must stop using tests that
are currently listed in the Guidelines and that the Secretary has
determined not to be suitable for use in the NRS under these final
regulations.
Deadline for submitting tests for review by the Secretary (Sec.
462.10(b))
Comments: One commenter agreed that the regulations should provide
an annual deadline for test publishers to submit tests to the Secretary
for review. Other commenters requested clarification on when the review
cycle begins and ends. Another commenter asked if the first opportunity
to submit tests would be in 2007 or in 2008. Yet another commenter
suggested that the date for submission of tests be no sooner than two
months and no later than four months after the effective date of the
final regulations.
Discussion: We are establishing April 14, 2008 as the first date by
which test publishers must submit tests for review under these
regulations. In subsequent years, test publishers must submit
applications to the Secretary by October 1 of each year. However,
because we can predict neither the number of tests that will be
submitted for review nor the amount of time it will take to review them, it
is not possible to predict how long the process will take from year to
year. We, therefore, do not think it is appropriate to establish a date
on which we will announce the results of the Secretary's review. We
will publish the list of suitable tests well before the program year in
which they might be used.
Changes: None.
Content of an application--General (Sec. 462.11)
Comments: A commenter responded positively to the regulations'
specific delineation of what an application for test review must
include. Another commenter asked whether test publishers must use a
form in addition to submitting the information outlined in proposed
Sec. 462.11(b) through (j). Another commenter stated that it may be
too constraining to require test publishers to arrange application
information in the order established by proposed Sec. 462.11(b)
through (j).
Discussion: To facilitate the review process, the regulations in
Sec. 462.11 describe the specific requirements for the contents of an
application. A test publisher is not required to submit any form or
information except as required in Sec. 462.11. We believe that
organizing the information in the application in the order presented in
Sec. 462.11(b) through (j) will help to ensure that information about
a test is easily available to and reviewable by the educational testing
and assessment experts who will review the tests; however, to provide
test publishers with some flexibility in organizing their applications,
we will permit them to include in their applications a table of
contents that identifies the location of the information requested in
Sec. 462.11(b) through (j).
Changes: We have revised Sec. 462.11(a)(3)(ii) to permit test
publishers to include a table of contents in their applications as an
alternative to presenting information in the application in the order
described in Sec. 462.11(b) through (j).
Content of an application--Involvement of the adult education
population (Sec. 462.11)
Comments: A commenter stated that proposed Sec. 462.11 would
generally require a test publisher to demonstrate that adult educators
have been involved in a test's development and maintenance, and that
some publishers would not meet that requirement easily. The commenter
also stated that compliance with the regulations would require
customized tests developed specifically for use in adult education,
which would increase the cost and exclude some quality assessments.
Discussion: The regulations do not require a test publisher to
demonstrate that adult educators have been involved in a test's
development and maintenance. We realize that tests developed for other
populations might not be suitable for use in the NRS because they were
not developed with the adult education population in mind and do not
readily measure the educational functioning levels used in the NRS. The
regulations are clear that the Secretary reviews tests to determine
their suitability for use in the NRS. For instance, Sec. 462.13(a)
indicates that, in order for the Secretary to consider a test suitable
for use in the NRS, the test must measure the NRS educational
functioning levels of members of the adult education population.
Accordingly, Sec. 462.11(c)(1)(ii) requires information that
demonstrates the extent to which the adult education population was
used to develop and evaluate a test, which is appropriate because the
tests will be used with that population.
Changes: None.
Content of an application--Motivation of examinees (Sec.
462.11(c)(1)(iii))
Comments: Two commenters were concerned that test publishers would
have to include information in the application on the motivation of
examinees used in the development of a test. One commenter indicated
that ``there is no generally accepted method for identifying and
classifying the degree and level of motivation of examinees.'' The
commenter stated that a test publisher could make some assumptions
about motivation, but indicated that the assumptions would be
subjective and not scientifically valid. The second commenter requested
clarification of the expectation that examinees would be motivated.
Discussion: The regulations only require test publishers to provide
in their applications information on the steps, if any, taken to ensure
that examinees were motivated while responding to the test. The
regulations do not require test publishers to take steps to ensure that
examinees were motivated while responding to the test. Further, if a
test publisher were to take such steps, the test publisher would not be
required to use any particular methodology for doing so.
Changes: None.
Content of an application--Item development (Sec. 462.11(c))
Comments: A commenter noted that the proposed regulations did not
require test publishers to include in their applications information on
item selection or form development for the test under review.
Discussion: The commenter's observation is correct and calls
attention to the need for the regulations to require test publishers to
include this information in their applications and for the regulations
to clarify the distinction between traditional tests, which use items
that have been generated before the test is administered, and those
that use a computerized algorithm to select test items while the test
is being administered.
Changes: We added a new paragraph (3) to Sec. 462.11(c) to require
test publishers to describe in their applications the procedures used
to assign items (1) to forms, for tests that are constructed prior to
being administered to examinees, or (2) to examinees, for adaptive
tests in which items are selected in real time.
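Solely as an illustration of the distinction drawn above, the following Python sketch contrasts assigning pre-generated items to fixed forms with selecting items for an examinee in real time. The item pool, difficulty values, item-selection rule, and ability update are all invented for the example and are not taken from these regulations or from any publisher's procedures.

    import random

    random.seed(0)
    # Hypothetical item pool: item id -> difficulty on an arbitrary scale.
    ITEM_POOL = {f"item{i:03d}": round(random.uniform(-3, 3), 2) for i in range(1, 61)}

    def assign_items_to_forms(pool, n_forms):
        """Traditional case: spread pre-generated items across fixed, parallel forms."""
        items = sorted(pool, key=pool.get)              # order items by difficulty
        forms = {f"Form {chr(65 + k)}": [] for k in range(n_forms)}
        for rank, item in enumerate(items):             # deal items out round-robin so
            form = f"Form {chr(65 + rank % n_forms)}"   # each form spans the difficulty range
            forms[form].append(item)
        return forms

    def select_items_adaptively(pool, true_ability, n_items=10):
        """Adaptive case: pick each next item closest to the running ability estimate."""
        estimate, administered = 0.0, []
        available = dict(pool)
        for _ in range(n_items):
            item = min(available, key=lambda i: abs(available[i] - estimate))
            correct = true_ability >= available[item]   # toy response model
            estimate += 0.5 if correct else -0.5        # crude step update, illustration only
            administered.append(item)
            del available[item]                         # do not reuse an item
        return administered, estimate

    if __name__ == "__main__":
        print(assign_items_to_forms(ITEM_POOL, 2))
        print(select_items_adaptively(ITEM_POOL, true_ability=1.2))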
Content of an application--Maintenance: history of test use (Sec.
462.11(d)(4))
Comments: A few commenters recommended that the regulations require
test publishers to include in their applications additional information
on the history of the test's use.
Discussion: The regulations require test publishers to provide
documentation of how a test is maintained, including a history of the
test's use. We are particularly interested in information on how many
times the test forms have been administered. This information is useful
in gauging how much the test forms have been exposed and the likelihood
of test items being compromised.
Changes: We have revised Sec. 462.11(d)(4) to clarify that
information submitted in the application regarding the history of a
test's use must include information on the number of times the test has
been administered.
Content of an application--Maintenance (Sec. 462.11(d)(5))
Comments: A commenter recommended that the regulations require test
publishers to include in their applications the procedures used for
computerized adaptive tests to select
subsets of items for administration, determine the starting point and
termination conditions, score the tests, and control item exposure.
Discussion: We agree that requiring test publishers to provide the
recommended information will help experts to better assess the
suitability of computerized adaptive tests for use in the NRS.
Changes: We added a new paragraph (5) to Sec. 462.11(d) to require
test publishers to include in their applications for computerized
adaptive tests the information recommended by the commenter.
Content of an application--Match of content to the NRS educational
functioning levels (content validity) (Sec. 462.11(e)(2) and (4))
Comments: A few commenters asked if proposed Sec. 462.11(e)(2) and
(4) were requesting the same information, and sought clarification
regarding the difference between the paragraphs.
Discussion: The paragraphs are requesting the same information.
Changes: We have removed Sec. 462.11(e)(4) to eliminate the
duplicate information requirement and renumbered the remaining
paragraphs.
Content of an application--Procedures for matching scores to NRS
educational functioning levels (Sec. 462.11(f)(2))
Comments: A commenter stated that requiring the judgments of
subject-matter experts to translate an examinee's performance to the
examinee's standing with respect to the NRS educational functioning
levels might not prove fruitful and would substantially increase the
cost of test development. The commenter stated that determination of
score ranges and their fit to the existing NRS levels can be made based
on an analysis of skills being assessed and an intimate knowledge of
the assessment tools being used.
Discussion: Section 462.11(f) does not require the use of subject
matter experts. It requires test publishers to document the procedure
they use to translate the performance of an examinee to the examinee's
standing with respect to the NRS educational functioning levels. A test
publisher can choose the procedure it thinks is best. However, if a
test publisher chooses to use judgment-based procedures to translate
performance, the regulations require the publisher to provide
information on that procedure, including information concerning the
subject matter experts the test publisher used. Requiring this
information is consistent with accepted professional test development
and standard-setting procedures in the 1999 edition of the Standards
for Educational and Psychological Testing and will help test publishers
demonstrate the suitability of their tests for measuring educational
gain for the NRS.
Test scores are useful in this context only if they can accurately
classify individuals according to NRS levels. Therefore, test
publishers must demonstrate how the range of test scores maps onto the
NRS levels, and must do so in a reliable and valid fashion. In the test
development process, developers need to show that the range of scores
produced on their tests covers the range of skills depicted in the NRS
levels and, more importantly, which range of scores corresponds to each
NRS level.
Changes: None.
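As a purely illustrative aid to the preceding discussion, the following Python sketch shows one way a documented set of cut scores could translate a scale score into an NRS educational functioning level. The level labels and score ranges are hypothetical; actual ranges must come from the publisher's documented standard-setting work.

    # Hypothetical cut scores: each entry is (lowest scale score, NRS level label).
    # Real cut scores must come from the publisher's standard-setting documentation.
    CUT_SCORES = [
        (200, "ABE Beginning Literacy"),
        (211, "ABE Beginning Basic"),
        (223, "ABE Intermediate Low"),
        (236, "ABE Intermediate High"),
        (246, "ASE Low"),
        (256, "ASE High"),
    ]

    def scale_score_to_level(score):
        """Return the NRS level whose score range contains the given scale score."""
        level = None
        for floor, label in CUT_SCORES:
            if score >= floor:
                level = label
        return level  # None means the score falls below the lowest reported range

    print(scale_score_to_level(240))  # -> "ABE Intermediate High"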
Content of an application--Reliability (Sec. 462.11(g))
Comments: A commenter noted that in discussing reliability the
proposed regulations used the phrase ``the correlation between raw or
number correct scores.'' The commenter noted that this phraseology is
not applicable to tests that use an adaptive structure or a multi-
parameter item response theory model. The commenter stated that, in
such situations, the particular items answered correctly, not the
number of items answered correctly, determine the score.
Discussion: We agree with the commenter that the phrase in the
regulations is not applicable to computerized adaptive tests.
Changes: We revised Sec. 462.11(g)(1) to require that, in the case
of computerized adaptive tests, test publishers document in their
applications the correlation between raw (or scale) scores across
alternate administrations of the test.
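For illustration only, the following Python sketch computes the kind of alternate-form score correlation described in revised Sec. 462.11(g)(1). The scores are invented, and the regulations do not prescribe any particular software or formula.

    from math import sqrt

    def pearson(x, y):
        """Pearson correlation between paired score lists (no external libraries)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical scale scores for the same examinees on two alternate administrations.
    form_a = [212, 225, 231, 240, 247, 255, 238, 229]
    form_b = [215, 222, 235, 243, 244, 251, 240, 226]
    print(round(pearson(form_a, form_b), 3))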
Comments: With regard to proposed Sec. 462.11(g)(2), a commenter
suggested that information about the number of individuals classified
into NRS levels would only provide useful data if the information were
submitted after the Department approved a test's scores-to-NRS-levels
crosswalk. The commenter stated that requiring this information prior
to test approval could produce information that is not meaningful.
Another commenter responded positively to the requirement for
``inclusion of information about decision/classification consistency.''
Discussion: We do not agree that the Department should approve the
rules a test publisher uses to transform the scores of a test into
estimates of examinees' NRS educational functioning levels prior to the
test publisher providing evidence that the transformation rules result
in reliable, i.e., consistent, educational functioning level
classifications. We believe that, when an application is submitted, a
test publisher should be able to provide documentation of the degree of
consistency in performance across different forms of the test,
particularly regarding which examinees are classified into the same NRS
educational functioning levels across different forms of the test. By
demonstrating that a test can consistently classify individuals into
the same NRS educational functioning levels across different forms of
the test, the test publisher assures the Department that assessments of
educational gain are the result of instruction and other interventions
to improve literacy, not measurement error. Without this demonstration
of classification consistency, reports of educational gain are
uninterpretable. This information is very important to determinations
about the suitability of a test and whether the test measures the NRS
educational functioning levels as required in Sec. 462.13(a).
Changes: None.
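As an illustrative sketch of classification consistency, the following Python code computes the proportion of examinees placed into the same NRS educational functioning level by two alternate forms, along with Cohen's kappa as one common chance-corrected index. The level labels and data are invented, and the regulations do not require this particular statistic.

    from collections import Counter

    def classification_consistency(levels_a, levels_b):
        """Proportion of examinees assigned the same NRS level by two alternate
        forms, plus Cohen's kappa, which corrects that agreement for chance."""
        n = len(levels_a)
        observed = sum(a == b for a, b in zip(levels_a, levels_b)) / n
        ca, cb = Counter(levels_a), Counter(levels_b)
        expected = sum(ca[lvl] * cb[lvl] for lvl in set(ca) | set(cb)) / (n * n)
        kappa = (observed - expected) / (1 - expected)
        return observed, kappa

    # Hypothetical level placements for eight examinees on two forms of the same test.
    form_a = ["Low", "Low", "High", "High", "Low", "High", "Low", "High"]
    form_b = ["Low", "High", "High", "High", "Low", "High", "Low", "High"]
    print(classification_consistency(form_a, form_b))  # -> (0.875, 0.75)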
Content of an application--Construct validity (Sec. 462.11(h))
Comments: A commenter expressed concern that proposed Sec.
462.11(h) would have required the results of several studies on the
adult education population in connection with other tests designed to
assess educational gain, which can be useful and meaningful, but also
time-consuming and expensive. The commenter indicated that imposing
this requirement after, not before, test approval would permit test
publishers to collaborate and conduct the studies in a more cost-
effective manner. Further, the commenter stated that the requirement
could exclude some qualified assessments. The commenter recommended
that the regulations be rewritten so that (1) these studies would only
be required of tests that have been approved and (2) a five-year period
could be provided for conducting the studies.
Discussion: We do not agree that a test should be approved for use
in the NRS prior to the test publisher providing documentation of the
appropriateness of a test for measuring educational gain for the NRS,
i.e., documentation that the test measures what it is intended to
measure. Section 462.11(h) is consistent with the 1999 edition of the
Standards for Educational and Psychological Testing, which stresses the
importance of gathering evidence of the test's
construct validity. The Secretary cannot determine whether a test is
suitable for use in the NRS without having evidence of the test's
construct validity.
Changes: None.
Content of an application--Construct validity (Sec. 462.11(h)(1))
Comments: A commenter expressed concern that proposed Sec.
462.11(h)(1) would have required test publishers to document the
appropriateness of a given test for measuring educational gain in the
NRS, including the correlation between the NRS test results and the
results of other tests that assess educational gain in the same adult
education population. The commenter stated that this comparison could
lead to faulty conclusions if the other test is not an accurate measure
of the construct.
Discussion: We believe that it is appropriate to look at test
correlations as one criterion for evaluating construct validity.
Generally, a test should correlate with other tests known to measure
the same construct, and it should not correlate (or have a very low
correlation) with tests known to measure different constructs. This
latter relationship depends upon the nature of the comparison
constructs. To the extent that the two constructs are theoretically
related, a correlation that approximates their theoretical relationship
is expected.
Changes: None.
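The following Python sketch illustrates, with invented data, the convergent and discriminant pattern described above: a test should correlate more strongly with another measure of the same construct than with a measure of an unrelated construct. It assumes Python 3.10 or later for statistics.correlation, and none of the measures named here are drawn from the regulations.

    from statistics import correlation  # Python 3.10+

    # Hypothetical scores for the same examinees on three measures.
    reading_test       = [212, 225, 231, 240, 247, 255, 238, 229]
    other_reading_test = [210, 228, 233, 238, 249, 252, 241, 231]  # same construct
    typing_speed_wpm   = [35, 52, 41, 38, 60, 33, 47, 55]          # unrelated construct

    convergent = correlation(reading_test, other_reading_test)
    discriminant = correlation(reading_test, typing_speed_wpm)
    # Construct validity evidence: the convergent coefficient should be
    # substantially higher than the discriminant coefficient.
    print(round(convergent, 3), round(discriminant, 3))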
Content of an application--Construct validity (Sec. 462.11(h)(2))
Comments: Two commenters stated that proposed Sec. 462.11(h)(2)
should be reconsidered because ``hours of instruction'' is not a
variable that correlates highly with test scores.
Discussion: Proposed Sec. 462.11(h)(2) would have required test
publishers to document that a test measures what it is intended to
measure, including the extent to which the raw or scale scores are
related to other relevant variables, such as hours of instruction or
other important process or outcome variables. While we are aware of
data establishing the relationship between Adult Basic Education (ABE)
test scores and hours of instruction, the reference to ``hours of
instruction'' in Sec. 462.11(h)(2) was only intended to provide an
example of a possibly relevant variable.
Changes: We have revised Sec. 462.11(h)(2) to clarify that ``hours
of instruction'' is an example of possibly relevant variables.
Content of an application--Other information (Sec. 462.11(i)(1))
Comments: A commenter requested clarification of the phrase ``an
analysis of the effects of time on performance'' used in proposed Sec.
462.11(i)(1). The commenter thought the phrase meant ``the effects of
the time it takes to administer the test or for a student to complete
it.''
Discussion: The commenter is correct. Section 462.11(i)(1) requires
an application to include a description of the manner in which test
administration time was determined and an analysis of the speededness
of the test. The term ``speededness'' as used in Sec. 462.11(i)(1)
refers to the effects on test performance that result from the time it
takes to administer or to complete the test.
Changes: We have revised Sec. 462.11(i)(1) to clarify that we
require both information on the manner in which test administration
time was determined and an analysis of the speededness of the test.
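For illustration only, the following Python sketch shows one simple descriptive check related to speededness: the share of examinees who do not reach each item position in a timed section. The data are invented, and the regulations do not prescribe a particular speededness analysis.

    def not_reached_rates(response_matrix):
        """For each item position, the share of examinees who left the item blank
        because time ran out (responses are strings; '' marks a not-reached item)."""
        n_examinees = len(response_matrix)
        n_items = len(response_matrix[0])
        rates = []
        for pos in range(n_items):
            blank = sum(1 for row in response_matrix if row[pos] == "")
            rates.append(blank / n_examinees)
        return rates

    # Hypothetical responses for four examinees on a six-item timed section.
    responses = [
        ["A", "C", "B", "D", "A", ""],
        ["B", "C", "B", "D", "", ""],
        ["A", "C", "A", "D", "A", "B"],
        ["A", "B", "B", "", "", ""],
    ]
    print(not_reached_rates(responses))  # rising rates near the end suggest speededness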
Content of an application--Other information (Sec. 462.11(i)(5))
Comments: A commenter requested clarification of the term
``retesting procedures'' used in proposed Sec. 462.11(i)(5). The
commenter asked if the term referred to ``pre- and post-testing.''
Discussion: The term ``retesting'' as used in Sec. 462.11(i)(5)
refers to the re-administration of a test that might be necessary
because of problems in the original administration (e.g., the test
taker becomes ill during the test and cannot finish, there are external
interruptions during testing, or there are administration errors).
Changes: We have revised Sec. 462.11(i)(5) to clarify that
``retesting'' refers to the re-administration of a test that might be
necessary because of problems in the original administration, such as
the test taker becoming ill during the test and being unable to finish,
external interruptions during testing, or administration errors.
Content of an application--Other information (Sec. 462.11(i)(6))
Comments: A commenter noted that proposed Sec. 462.11(i)(6) would
require test publishers to provide such other evidence as the Secretary
determines is necessary to establish the test's compliance with the
criteria and requirements in proposed Sec. 462.13. The commenter
requested clarification regarding what that evidence would be.
Discussion: While Sec. 462.11 includes the information we
anticipate the Secretary will need to determine whether a test is
suitable for use in the NRS, we recognize that the Secretary can
require a test publisher to provide additional information to establish
a test's compliance with the criteria and requirements in Sec. 462.13.
Section 462.11(i)(6) merely alerts test publishers to this possibility.
Changes: None.
Content of an application--Previous tests (Sec. 462.11(j))
Comments: A commenter asked if a test publisher submitting an
application for a test that is currently approved for use in the NRS
would have to provide the information in proposed Sec. 462.11(b)
through (i) or would have to provide only documentation of periodic
review of the content and specifications of the test as specified in
proposed Sec. 462.11(j)(1).
Discussion: As indicated in Sec. 462.11(a)(1), a test publisher
must include with its application information listed in paragraphs (b)
through (i) as well as the applicable information in paragraph (j). All
applications must, therefore, include the information in Sec.
462.11(b) through (i).
Changes: None.
Comments: A commenter requested clarification of the term
``periodic review'' used in proposed Sec. 462.11(j)(1) and (2) with
regard to the documentation that must be submitted for a previous test
to ensure that it continues to reflect NRS educational functioning
levels. The commenter asked if annual or bi-annual reviews constitute a
``periodic review,'' and suggested that the regulations use the term
``current review'' and specify the intervals for the reviews.
The commenter also suggested that the regulations address the
possibility that the NRS educational functioning levels might change.
The commenter stated that realigning tests to revised NRS educational
functioning levels would cause a burden for test publishers, and that
test publishers need at least one year advance notice of any proposed
changes in the NRS educational functioning levels.
Discussion: Section 462.11(j) requires test publishers to provide
specific information about currently used tests to ensure that the
tests continue to reflect NRS educational functioning levels. The
shorter the period of time between reviews of a test, the more relevant
the results would be in determining the test's content validity with
regard to NRS educational functioning levels. However, we are reluctant
to specify a time-frame for reviews because we do not want to create an
additional burden for test
publishers by requiring reviews more frequently than a test publisher
typically would perform.
With regard to providing advance notice to test publishers about
changes in the NRS educational functioning levels, in the past we have
customarily provided notice to all concerned parties, including test
publishers, well in advance of any changes to the NRS educational
functioning levels by posting notices on the Internet at
https://www.nrsweb.org. We intend to continue that practice and, because the
educational functioning levels are included in Sec. 462.44 of these
final regulations, will publish a notice of proposed rulemaking in the
Federal Register requesting comment on any proposed changes to Sec.
462.44.
Changes: None.
Comments: A commenter asked if the phrase ``previous tests used in
the NRS'' in proposed Sec. 462.11(j)(1) referred only to those tests
previously listed in the NRS Implementation Guidelines.
Discussion: The ``previous tests'' discussed in Sec. 462.11(j)(1)
are tests that were listed in the Guidelines and used to measure
educational gain in the NRS before the effective date of these final
regulations.
Changes: None.
Comments: A commenter requested clarification of the requirement in
proposed Sec. 462.11(j)(3) for test publishers to submit ``new data''
for tests that have not changed in the seven years since the Secretary
determined the tests were suitable for use in the NRS.
Discussion: The intent of Sec. 462.11(j) is to ensure that a test
publisher provides specific information about a test that has been in
use for years, but that has not changed in the seven years since the
Secretary determined the test was suitable for use in the NRS. In this
circumstance, the regulations require test publishers to provide new,
i.e., updated, data that support the validity of the test.
Changes: We have revised Sec. 462.11(j)(3) to clarify that test
publishers must provide updated data to support test validity.
Computerized tests (Sec. Sec. 462.3, 462.11(c)(3), 462.12(a)(2)(iii),
462.13(e), and 462.41(c)(3))
Comments: A commenter acknowledged that the proposed regulations
attempted to distinguish between traditional paper-based tests and
computerized tests. However, the commenter recommended that the
regulations be modified to clarify that the term ``parallel forms''
refers to ``paper-based tests or computerized tests that do not involve
an item selection algorithm, such as computerized adaptive tests,
multistage adaptive tests, or linear on-the-fly tests.'' The commenter
suggested specifically changing the proposed regulatory language from
``and have two or more secure, parallel, equated forms'' to ``and have
two or more secure, parallel forms if the test is produced before it is
administered to a learner (e.g., paper-based tests). If the test uses a
computerized algorithm for administering items in real time, the size
of the item pool and the method of item selection must ensure
negligible overlap in items across pre- and post-test.'' Another
commenter also emphasized the importance of recognizing computerized
tests.
Discussion: As we have discussed elsewhere in this notice, we agree
that the regulations should clarify the distinction between (1)
traditional tests, which use items that are generated before the test
is administered, and (2) computerized tests, which use an algorithm to
select test items while the test is being administered.
Changes: We have revised Sec. Sec. 462.3 (the definition of test),
462.11(c)(3), 462.12(a)(2)(iii), 462.13(e), and 462.41(c)(3) to provide
the clarification recommended by the commenters.
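As a purely illustrative sketch of the overlap concern raised by the commenter, the following Python code computes the share of post-test items an examinee already saw at pre-test. The item identifiers are invented, and the regulations do not set a numeric overlap threshold.

    def item_overlap_rate(pretest_items, posttest_items):
        """Share of post-test items that the examinee already saw at pre-test."""
        shared = set(pretest_items) & set(posttest_items)
        return len(shared) / len(set(posttest_items))

    # Hypothetical item identifiers administered to one examinee by an adaptive test.
    pretest = ["item014", "item022", "item031", "item047", "item058"]
    posttest = ["item019", "item022", "item036", "item049", "item061"]
    print(item_overlap_rate(pretest, posttest))  # 0.2; negligible overlap is the goal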
Tests that measure some, but not all, educational functioning levels
(Sec. Sec. 462.12(a)(2)(iv) and 462.13(b))
Comments: A commenter requested clarification on whether the
Secretary will review tests that currently measure only some, but not
all, educational functioning levels.
Discussion: The regulations provide for the Secretary to review
tests that measure some, but not all, educational functioning levels.
Sections 462.12(a)(2)(iv) and 462.13(b) indicate that the Secretary
reviews and determines the suitability of a test if an application
includes a test that samples one or more of the major content domains
of the NRS educational functioning levels of Adult Basic Education
(ABE), Adult Secondary Education (ASE), or English-As-A-Second Language
(ESL) with sufficient numbers of questions to represent adequately the
domain or domains. Further, Sec. 462.12(b)(2) provides flexibility for
the Secretary to determine that a test or a sub-test is suitable for
use in the NRS if it measures the content domain of some, but not all,
of the NRS educational functioning levels.
Changes: None.
Procedures the Secretary uses to review the suitability of tests (Sec.
462.12(b))
Comments: A commenter asked if the results of the Secretary's
review of a test would be posted for public review. Another commenter
suggested that the regulations should provide that test publishers be
notified at least 30 days before the Secretary notifies States and
local eligible providers that a test is unsuitable for use in the NRS.
With regard to tests that the Secretary determines are unsuitable, a
different commenter recommended that the regulations provide a time-
frame by which the Secretary would review any additional information
submitted by the test publisher and make a final determination.
Discussion: In accordance with Sec. 462.12(c)(2), the Secretary
will annually notify the public through a notice published in the
Federal Register and posted on the Internet at https://www.nrsweb.org of
tests that are suitable for use in the NRS. Under Sec. 462.12(e)(5),
the Secretary will follow the same procedure to notify the public when
a test that was previously determined suitable is determined to be
unsuitable. However, the Secretary will not post for public review the
Secretary's determination regarding a test that has not been previously
reviewed and that the Secretary determines to be unsuitable. We do not
believe this is necessary because the Secretary's determination
regarding these tests will not have an impact on the public, States, or
local eligible providers.
Proposed Sec. 462.12(d) provided that a test publisher would have
30 days to request that the Secretary reconsider the decision that a
test is unsuitable. Therefore, it will be after this 30-day period that
the Secretary will notify the public of a decision regarding a test.
Because it is impossible to anticipate the complexities that might
be involved in decisions regarding a test that was initially determined
to be unsuitable, we do not believe it would be appropriate to limit
the time the Secretary takes to review additional information provided
by a test publisher and make a final determination regarding the
suitability of a test. However, we intend to conduct the review as
expeditiously as possible.
Changes: None.
Publishing the list of suitable tests (Sec. 462.12(c))
Comments: A commenter suggested that if tests must be submitted for
review by October 1 of each year, the test publisher and adult
education programs should be informed of which
tests are approved no later than February 1 of the following year.
Discussion: Within a reasonable time-frame, the Secretary will
review tests, notify test publishers of the results of the review, and
provide States and local eligible providers with a list of suitable
tests. However, because we can predict neither the number of tests that
will be submitted for review nor the amount of time it will take to
review them, it is not possible to predict how long the process will take
from year to year. We, therefore, do not think it is appropriate to
establish a date by which we will announce the results of the
Secretary's review. We will publish the list of suitable tests well
before the program year in which they can be used.
Changes: None.
Revocation of determination that a test is suitable (Sec. 462.12(e))
Comments: A commenter suggested that test publishers be notified of
the Secretary's decision to revoke a determination that a test is
suitable before the Secretary notifies the general public. The
commenter stated that test publishers should be given sufficient time
to address the Secretary's concerns before the Secretary revokes the
determination that the test is suitable. This would provide a process
comparable to the process proposed for a test that the Secretary
determines is unsuitable.
Discussion: We agree that test publishers should have an
opportunity to address the Secretary's decision to revoke a
determination that a test is suitable before that determination becomes
a final decision.
Changes: We have revised Sec. 462.12(e) to give test publishers an
opportunity to request that the Secretary reconsider a decision to
revoke a determination that a test is suitable.
Comments: A commenter stated that proposed Sec. 462.14(b) was
unclear concerning the reasons a test's seven-year approval status
might be revoked.
Discussion: Section 462.12(e), not Sec. 462.14(b), establishes the
reasons for which the Secretary can revoke the determination regarding
the suitability of a test. In the proposed regulation, we stated that
the Secretary can revoke the determination if the Secretary determines
that the information submitted as a basis for the Secretary's review of
the test was inaccurate. In proposed Sec. 462.14(b), however, we
implied that the Secretary also could revoke a determination regarding
the suitability of a test if the test is substantially revised--for
example, by changing its structure, number of items, content
specifications, item types or sub-tests.
The proposed regulations were, therefore, unclear as to whether the
Secretary could revoke a determination about the suitability of a test
if the test had been substantially revised. We intended that revision
of a test would be a valid reason for revoking a determination about
test suitability.
Changes: We have revised Sec. 462.12(e) to clarify that
substantial revision of a test is one of the reasons the Secretary can
revoke a determination regarding the suitability of a test.
Criteria and requirements for determining test suitability (Sec.
462.13(c)(1), (d), and (f)(3) (proposed (g)(3)))
Comments: A commenter noted that The Standards for Educational and
Psychological Testing referenced in Sec. 462.13(c) provides specific
guidelines for the technical documentation, including evidence of
validity that should be made available to interested parties. The
commenter recommended clarifying that publishers should provide the
Secretary with information regarding test development and validity.
Discussion: In proposed Sec. 462.13(c)(1), we indicated that, in
order for a test to be determined suitable for use in the NRS, the test
would have to meet all applicable and feasible standards for test
construction provided in The Standards for Educational and
Psychological Testing. We did not intend this language to imply that
the Secretary would review only information that is related to test
development. We agree with the commenter that validity is an important
factor in test evaluation and make it clear in Sec. 462.11(e) through
(g) that test publishers must include in their applications
information on content validity, reliability, and construct validity.
However, we think adding the reference to ``validity'' in Sec.
462.13(c)(1) will reinforce the importance of test publishers providing
information on validity.
Changes: We have revised Sec. 462.13(c)(1) to require a test to
meet all applicable and feasible standards for test construction and
validity that are provided in the Standards for Educational and
Psychological Testing.
Comments: A commenter noted that many factors influence the
appropriate time between testing and retesting, including intensity of
instruction, frequency and length of class meetings, and class size.
The commenter wanted to ``decrease the emphasis on a single protocol
for post-testing'' and suggested that, when discussing the time between
test-taking, the regulations use the term ``publisher's
recommendations'' instead of the term ``publisher's guidelines'' that
was used in proposed Sec. Sec. 462.13(d) and 462.40(c)(3)(ii).
Discussion: We do not agree that the term ``publisher's
guidelines'' emphasizes the use of a single protocol when determining
when to administer a post-test. State and local eligible providers,
like the commenter, are aware that many factors influence the
appropriate time between pre-testing and post-testing. These factors
are taken into consideration by test administrators at the local level.
However, because tests differ, test administrators rely on the
guidelines, developed during test construction and validation and
provided by test publishers, to help ensure that a sufficient amount of
time has passed before a post-test is given in order to optimize the
measurement of educational gains.
Changes: None.
Comments: A commenter requested clarification of Sec. 462.13(f)(3)
(proposed Sec. 462.13(g)(3)), which requires the publisher of a test
modified for an individual with a disability to recommend educational
functioning levels based on the previous performance of test-takers who
are members of the adult education population of interest to the NRS in
order for the Secretary to consider a test suitable for use.
Discussion: In order for the Secretary to consider a test that has
been modified for individuals with disabilities suitable for use in the
NRS, test publishers must (1) demonstrate that adult education students
with disabilities were included in the pilot or field test referred to
in Sec. 462.11(c)(1); (2) match scores to the NRS educational
functioning levels based on the information obtained from adult
education students with the disability who participated in the pilot or
field test and for whom the test has been adapted; and (3) provide in
the application, as required in Sec. 462.11(f), documentation of the
adequacy of the procedure used to translate the performance of adult
education students with the disability for whom the test has been
modified to an estimate of the examinees' standing with respect to the
NRS educational functioning levels.
Changes: In response to the comment, we changed Sec. 462.13(f)(3)
to clarify that the Secretary considers a test modified for individuals
with disabilities suitable for use in the NRS if test publishers (1)
recommend educational functioning levels based on the information
obtained from adult education students who participated in the pilot or
field test and who have the disability for
which the test has been adapted and (2) provide documentation of the
adequacy of the procedure used to translate the performance of adult
education students with the disability for whom the test has been
modified to an estimate of the examinees' standing with respect to the
NRS educational functioning levels.
Subpart D--General (Sec. Sec. 462.40-462.44)
Comments: Several commenters noted that the Department currently
provides non-regulatory guidance to States on the NRS. The commenters
asked why the Department is issuing regulations when most States comply
with and are successfully implementing the non-regulatory guidance and
the guidance provides each State with needed flexibility. The
commenters stated that ``as long as States are complying with
guidelines and meeting performance standards, we understand that the
Federal role is to limit regulations and allow the States to
continually increase their capabilities to improve program services.''
The commenters, therefore, recommended that subpart D be removed from
the final regulations and that the Secretary provide technical
assistance and resources to the few States that are having difficulty
creating effective assessment and data procedures. Additionally, one of
these commenters stated that while guidance is valued, regulations
sometimes seem arbitrary, intrusive, and unnecessary.
Discussion: It is the Department's policy to regulate only when
essential to promote quality, and then in the most flexible, most
equitable, and least burdensome way possible. We believe these
regulations comply with that policy. While the regulations in subpart D
are legally binding, they largely codify the guidance provided in the
Guidelines and the ``State Assessment Policy Guidance.'' Although
States have made significant progress, there remains variability in the
quality of State processes and procedures in the collection and
reporting of data on student assessment and performance. These
regulations, like the Guidelines, technical assistance activities, and
other efforts the Department has supported to improve data quality,
provide a significant tool to create a standard level of quality among
all States and thereby strengthen the integrity of the NRS as a
critical tool for measuring State performance and the impact of adult
education.
Changes: None.
Meaning of the term ``placement'' (Sec. Sec. 462.40(c)(4), (c)(6) and
(c)(10), 462.41(c)(3) (proposed), 462.42(a), (b)(2), and (c)(1),
(d)(1), and (d)(2), and 462.43(a)(2) and (b))
Comments: A few commenters stated that using the term ``placement''
in the regulations to mean the ``assignment of a student to the
appropriate NRS educational functioning level'' is confusing because
the traditional meaning of placement is ``placing a student in a
particular class or other instructional offering.'' Commenters noted
that tests designed for placement are different from those used to
measure educational gain.
Discussion: The regulations, like the Guidelines currently used by
States, use the terms ``place'' and ``placement'' as terms of art to
refer to the placement of students in NRS educational functioning
levels in order to measure and then report on educational gain in the
NRS. It is only within this context that the terms are used, and no
other meaning should be inferred.
Changes: None.
Placing students (Sec. 462.42(d)(1) and (2))
Comments: A commenter stated that placing a student in an NRS
educational functioning level using the lowest test score could result
in programs providing instruction in skill areas that are not most
relevant to the student. Further, the commenter stated that programs
(1) should not place students based on one test and (2) should be
allowed to use multiple placement tools and other pertinent
information, such as student goals, to determine placement. Another
commenter stated that all scores have some measurement error and that
instituting a policy of using the lowest test score might introduce
systematic, rather than random, measurement errors. This commenter also
indicated that the policy of using the lowest test score could
encourage programs to teach students only in an area where gain is to
be expected. The commenter stated that a better policy would be either
to focus on the learner's primary subject area at pre-test, or evaluate
gain based on a composite score across areas.
Discussion: With regard to placing a student using the lowest test
score when the student is tested in multiple skill areas, Sec.
462.42(d)(1) and (2) are consistent with the policy in the Guidelines,
which indicates: ``States should provide to local programs the criteria
for placing students at each educational functioning level, using test
scores from the initial assessment. Not all of the skill areas
described in the [functioning] level descriptors need to be used to
place students, but the skill used should be the areas most relevant to
the students' needs and the program's curriculum. If multiple skill
areas are assessed and the student has differing abilities in each
area, however, NRS policy requires that the program place the student
according to the lowest skill area.'' The Department's policy ensures
that States use a standardized approach when reporting educational
gain, which ensures comparability of data. These regulations use the
term ``placement'' as a term of art to refer to the placement of
students in NRS educational functioning levels in order to measure and
then report on educational gain in the NRS. Placement of a student in
an educational functioning level for NRS purposes does not affect
placement in an instructional program. States can use a variety of
tools when devising an instructional program for students. States and
local eligible providers know that using the lowest test score is a
convention used for reporting purposes and it is not intended to
encourage programs to teach students only in skill areas in which
students scored the lowest on tests or only in skill areas in which
gain is expected. The policy also is not intended to restrict the
number or type of assessments used to identify a student's needs and to
customize an instructional program to meet those needs. In fact,
programs are expected to teach students in multiple areas depending on
the students' needs and the programs' curricula.
We do not agree with the commenter that using the lowest test score
might introduce systematic, rather than random, measurement error.
Using the lowest test score in order to operationalize educational gain
in a consistent manner across States for reporting purposes is error
free. If tests have met the standards established in these regulations,
they should be able to assess skills at any level with minimum error.
Changes: None.
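For illustration only, the following Python sketch shows the lowest-score placement convention described above: when a student is assessed in multiple skill areas, the student is placed, for NRS reporting purposes, at the lowest resulting level. The level names, their ordering, and the function are hypothetical illustrations, not part of the regulations.

    # Hypothetical ordering of NRS educational functioning levels, lowest to highest.
    LEVEL_ORDER = [
        "ABE Beginning Literacy",
        "ABE Beginning Basic",
        "ABE Intermediate Low",
        "ABE Intermediate High",
        "ASE Low",
        "ASE High",
    ]

    def place_for_reporting(levels_by_skill_area):
        """Place the student, for NRS reporting only, at the lowest level obtained
        across the skill areas in which the student was assessed."""
        return min(levels_by_skill_area.values(), key=LEVEL_ORDER.index)

    # A student assessed in two skill areas with differing results.
    initial_assessment = {"reading": "ABE Intermediate High", "math": "ABE Intermediate Low"}
    print(place_for_reporting(initial_assessment))  # -> "ABE Intermediate Low"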
Measuring educational gain (Sec. 462.43)
Comments: A commenter suggested the Department measure significant
gain as determined by test developers. The commenter thought this
approach would be the most accurate measure of gain. The commenter
opposed the approach in proposed Sec. 462.43, stating that it would
only capture learning gain when a learner completes an educational
functioning level by crossing artificial cut points regardless of the
starting level. Another commenter stated that the approach for
measuring gain in Sec. 462.43 is ``too coarse, is likely
to capture only extreme gain, and will miss very significant
educational gains that occur within an EFL [educational functioning
level].'' The commenter suggested that the Department consider using
other options for demonstrating educational gain such as ``achieving a
gain score that is beyond chance expectations (using a scale score,
which is more reliable than a proficiency classification).'' Another
commenter stated that ``awarding only one educational gain to students
completing more than one educational functioning level to a high degree
misrepresents and under reports the effectiveness and accomplishment of
both adult teacher and learner.'' This commenter also stated that the
current NRS policy of reporting an educational gain for obtaining a
General Educational Development (GED) diploma by learners in ASE II,
but not to those in ABE or ASE I, misrepresents and under-reports
accomplishments. A different commenter suggested that attainment of a
GED diploma should be recognized as an educational gain.
Discussion: The approach for measuring educational gain in Sec.
462.43 is well established and accepted by the field. Over a two-year
period, the Department consulted with States and convened a statutorily
mandated panel in order to develop a performance accountability system
for the adult education program. During that time, consensus was
reached on defining the performance measures, including how to measure
educational gain. States and other participating entities agreed that
the approach in Sec. 462.43 is the most effective means of obtaining
accurate, reliable, and valid information on student performance.
Changes: None.
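As a purely illustrative companion to the preceding discussion, the following Python sketch records an educational gain when a post-test places a student at a higher NRS educational functioning level than the pre-test. The level names and ordering are hypothetical, and the sketch does not capture other NRS reporting rules.

    # Hypothetical ordering of NRS educational functioning levels, lowest to highest.
    LEVEL_ORDER = [
        "ABE Beginning Literacy",
        "ABE Beginning Basic",
        "ABE Intermediate Low",
        "ABE Intermediate High",
        "ASE Low",
        "ASE High",
    ]

    def completed_level(pretest_level, posttest_level):
        """True if the post-test places the student at a higher NRS level than the
        pre-test, i.e., the student completed the entry level for reporting purposes."""
        return LEVEL_ORDER.index(posttest_level) > LEVEL_ORDER.index(pretest_level)

    print(completed_level("ABE Intermediate Low", "ABE Intermediate High"))  # True
    print(completed_level("ASE Low", "ASE Low"))                             # False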
High Advanced ESL (Sec. 462.44)
Comments: Some commenters opposed proposed Sec. 462.44 because it,
like the Guidelines, eliminated the ``High Advanced ESL'' literacy
level from the Functioning Level Table. The commenters noted that the
change means that State and local agencies will no longer be able to
report educational gain for adult English language learners above the
``Advanced ESL'' literacy level or Student Performance Level (SPL) 6.
The commenters stated that, based on communications with the
Department's Office of Vocational and Adult Education (OVAE), the
change also means that programs can no longer provide ``High Advanced
ESL'' instruction in adult education programs. The commenters indicated
that OVAE has suggested that students who would have received ``High
Advanced ESL'' instruction should now complete their preparation by
moving from ESL classes to either adult basic education or adult
secondary education classes. The commenters expressed their belief that
OVAE's suggestion is pedagogically problematic and, more importantly,
would prevent programs from adequately addressing the learning needs of
ESL learners. Further, the commenters opposed stopping ESL instruction
at the ``Advanced ESL'' literacy level or SPL 6 because they believe
students at that level do not have language and literacy skills that
are sufficient to transition successfully to postsecondary education or
to meet the demands of the workplace--two of the purposes for adult
education that are cited in the Act. Commenters noted that the
regulations do not place this same restriction on native English
speakers and recommended that the ``High Advanced ESL'' literacy level
be restored to the Functioning Level Table. Finally, the commenters
recommended that the regulations provide States with flexibility and
not limit program services.
Discussion: Commenters are correct in that proposed Sec. 462.44
would codify a change in the Guidelines by eliminating the ``High
Advanced ESL'' literacy level from the Functioning Level Table. The
change was made in 2006 as part of the information collection request
for the Guidelines. However, before the change was made, (1) OVAE
consulted with Adult Education State Directors and (2) the Office of
Management and Budget (OMB) provided interested Federal agencies and
the public, including States and local eligible providers, an
opportunity to comment on the information collection request, which
included the removal of the ``High Advanced ESL'' literacy level from
the Functioning Level Table. We received no negative comments regarding
our intent to eliminate the ``High Advanced ESL'' literacy level, and
therefore, changed the Functioning Level Table in the Guidelines and in
the proposed regulations.
The change in the Functioning Level Table does not mean that adult
education programs can no longer provide services to ``High Advanced
ESL'' students. These regulations are consistent with the Act, which
authorizes services below the postsecondary level, and do not change
who can be served by adult education programs. The educational
functioning levels largely serve to classify students for reporting
purposes and should not be viewed as the standard for establishing the
type of instruction that can be provided. Placement of a student in an
educational functioning level for NRS purposes does not affect
placement in an instructional program. A student who scores above an
SPL 6 might still be served in adult education programs if he or she
has an educational need below the postsecondary level. States have the
flexibility to determine how programs are structured, how students are
placed in programs, and how instruction is delivered.
Ch