Assessment Technology Standards Request for Information (RFI), 79354-79358 [2010-31881]
79354
Federal Register / Vol. 75, No. 243 / Monday, December 20, 2010 / Notices
majority concluded that reasonable
attorney’s fees and costs for this
arbitration should be reduced to
$28,393.50.
One panel member dissented stating
that the scope and amount of an award
of attorney’s fees and costs would not
materially damage the Oregon
Commission for the Blind’s Randolph-Sheppard program. Consequently, this
panel member would award
Complainant’s attorney $65,749.33,
reducing the original amount requested
by one-third.
The views and opinions expressed by
the panel do not necessarily represent
the views and opinions of the
Department.
Electronic Access to This Document:
You can view this document, as well as
all other Department of Education
documents published in the Federal
Register, in text or Adobe Portable
Document Format (PDF) on the Internet
at the following site: https://www.ed.gov/
news/fedregister. To use PDF you must
have Adobe Acrobat Reader, which is
available free at this site.
Note: The official version of this document
is the document published in the Federal
Register. Free Internet access to the official
edition of the Federal Register and the Code
of Federal Regulations is available on GPO
Access at: https://www.gpoaccess.gov/nara/
index.html.
Dated: December 15, 2010.
Alexa Posny,
Assistant Secretary for Special Education and
Rehabilitative Services.
[FR Doc. 2010–31879 Filed 12–17–10; 8:45 am]
BILLING CODE 4000–01–P
DEPARTMENT OF EDUCATION
Assessment Technology Standards
Request for Information (RFI)
AGENCY: Office of Innovation and Improvement, U.S. Department of Education.
ACTION: Notice of request for information to gather technical expertise pertaining to assessment technology standards.
SUMMARY: The purpose of this RFI is to collect information relating to assessment technology standards. Toward that end, we are posing a series of questions to which we invite interested members of the public to respond. The Department anticipates making use of this information in the following ways. First, we expect to use this information to help determine the appropriate interoperability standards for assessments and related work developed
under the Race to the Top Assessment
(RTTA) program. Second, we expect
to use this information to help us
develop related standards-based
programs. For example, we might, in the
future, offer additional grants, contracts,
or awards and some of those offerings
may include similar interoperability
requirements. This RFI may be used to
help set the interoperability
requirements for those offerings as well
as the existing RTTA program.
Under the RTTA program, the
Department requires grantees to develop
assessments that (see https://www2.ed.gov/programs/racetothetop-assessment/executive-summary.pdf, p. 78):
‘‘5. Maximize the interoperability of
assessments across technology platforms
and the ability for States to switch their
assessments from one technology
platform to another by—
(a) Developing all assessment items to
an industry-recognized open-licensed
interoperability standard that is
approved by the Department during the
grant period, without non-standard
extensions or additions; and
(b) Producing all student-level data in
a manner consistent with an industry-recognized open-licensed
interoperability standard that is
approved by the Department during the
grant period.’’
DATES: Written submissions must be
received by the Department on or before
5 p.m., Washington, DC time, on
January 17, 2011.
ADDRESSES: We encourage submissions
by e-mail using the following address:
RTTA-RFI@ed.gov. You must include
the term ‘‘Assessment RFI response’’ in
the subject line of your e-mail. If you
prefer to send your input by mail or
hand delivery, address it to Steve
Midgley, Office of Educational
Technology, Attention: Assessment RFI,
U.S. Department of Education, 400
Maryland Avenue, SW., Room 7E202,
Washington, DC 20202–0001.
FOR FURTHER INFORMATION CONTACT:
Steve Midgley, U.S. Department of
Education, 400 Maryland Avenue, SW.,
Room 7E202, Washington, DC 20202–
0001 by phone at 202–453–6381 or e-mail at RTTA-RFI@ed.gov.
If you use a telecommunications
device for the deaf (TDD), call the
Federal Relay Service (FRS), toll free, at
1–800–877–8339.
SUPPLEMENTARY INFORMATION:
1. Introduction
The Department is seeking
information on technology standards
that may be applied to the management
and delivery of education-related
assessments, as well as those that may
be applied to the capture and reporting
of assessment results within distributed
online learning environments (i.e.,
learning environments with components
managed by more than one
organization). THIS IS A REQUEST FOR
INFORMATION (RFI) ONLY. This
document uses the term ‘‘technology
standards’’ to refer to assessment
technology standards, specifications,
technical approaches and
implementations, and any other
functional or formal descriptions of
technical functionality. (Note: This
document refers to curricular or content
standards specifically as ‘‘curricular
standards.’’) Information about non-assessment technology standards and
related issues may be relevant and
included in responses, but this RFI is
specifically inquiring into technology
standards related to assessments of
learning. For the purpose of this RFI, the
Department does not distinguish
between technology specifications and
technology standards produced by
consortia, other groups, or nationally or
internationally recognized technology
standards development organizations.
This RFI is issued solely for
information and planning purposes and
does not constitute a Request for
Proposals (RFP) or a promise to issue an
RFP or notice inviting applications
(NIA). This request for information does
not commit the Department to contract
for any supply or service whatsoever.
Further, the Department is not at this
time seeking proposals and will not
accept unsolicited proposals.
Responders are advised that the
Department will not pay for any
information or administrative costs that
a person or entity may incur in
responding to this RFI. All costs
associated with responding to this RFI
will be solely at the interested party’s
expense. Not responding to this RFI will
not preclude individuals or
organizations from applying under
future contract or grant competition. If
the Department issues an RFP or NIA,
it will be posted on the Federal Business
Opportunities (https://www.fbo.gov/)
Web site (in the case of contracts) or the
Federal Register (https://
www.gpoaccess.gov/fr/) Web site (in the
case of grants, or other awards). It is the
responsibility of the potential offerors to
monitor these sites to determine
whether the Department issues an RFP
or NIA after considering the information
received in response to this RFI. Any
company or industry proprietary
information contained in responses
should be clearly marked as such, by
paragraph, such that publicly releasable
information and proprietary information
are clearly distinguished. Any clearly
marked proprietary information
received in response to this request will
be properly protected from
unauthorized disclosure. The
Department will not use proprietary
information submitted from any one
source to establish the capability and
requirements for any future acquisition
or grant competition so as not to
inadvertently restrict competition. The
Department may publicly release or use
any or all materials submitted which are
not so marked.
The documents and information
submitted in response to this RFI
become the property of the U.S.
Government and will not be returned.
2. Background
The Department is investigating open
technology standards and specifications
to support the interoperable delivery
(that is, delivery in a way that allows
effective use across multiple systems or
components) of State- or locally selected
content and assessments for purposes of
education and training when conducted
via online learning platforms. As a part
of this effort, the Department is
investigating the availability and current
practice of open technology standards
and innovative technologies to support
management, delivery, and exchange of
assessment content, and the capture and
reporting of assessment results.
Existing technologies may serve as the
basis for the creation of new open
technology standards and specifications,
if implementation details related to
these technologies can be disclosed and
provided without restriction for
technical standardization or use. We
expect that applicable open technology
standards and specifications will be
combined with other technology
standards, current or to be developed,
providing the assessment capabilities
for online learning platforms that will
support the next generation of
technology for learning content.
Therefore, this RFI seeks information on
a range of solutions and approaches to
standardization of assessment via
technology, including deployment,
collection and reporting solutions,
techniques, and technology standards.
It is possible that RTTA grantees will
be able to use one or more existing
technology standards, or it may be that
additional development work will be
required to obtain sufficiently complete
technology standards for the program. It
is also possible that one or more existing
technology standards are suitable but
are not licensed in a way that will
permit free and open use by the public.
Through this RFI, the Department seeks
to uncover and gather information on
how to resolve as many of these issues
as possible.
The Department may engage in
additional work to address these issues
at the conclusion of its analysis of the
responses to this RFI.
There are numerous efforts underway
across the Department that can benefit
from assessment technology
standardization of assessment content,
results, and reporting interoperability.
For example, the Department is
providing significant funding for the
development of ‘‘next-generation’’
assessment systems via the RTTA
program (see https://
edocket.access.gpo.gov/2010/pdf/2010-8176.pdf; https://www2.ed.gov/
programs/racetothetop-assessment/
index.html). In order to promote
technological innovation and market
competition, the Department has
specified that all assessment content
developed under this program be
developed using an ‘‘industry
recognized open-licensed
interoperability standard’’ that is
approved by the Department. The
assessment content developed under the
program must also be made freely
available to any State, technology
platform provider, or others that request
it for purposes of administering
assessments (consistent with test
security and protection requirements).
Moreover, the standards and technology
for controlling sensitive data
(assessment results and related
information) must also maintain the
privacy of any individually identifiable
information while permitting secure
interchange among authorized systems.
The Department intends that these
requirements, taken as a whole, give
States the flexibility to switch from one
technology platform to another,
allowing multiple providers to compete
for States’ business and for States to
make better decisions about cost and
value. Use of technology standards that
meet these requirements will help
ensure that public investments in
assessment instruments and related
technology can be used in the education
sector as broadly as possible and, at the
same time, contribute to a competitive
and innovative marketplace.
Through this notice, the Department
solicits advice, technical information,
additional questions (that is, questions
in addition to those put forward later in
this notice), and other input as to how
the Department can select the best
available technology standard(s) for the
RTTA program, as well as general
information related to assessment
technology standards and technology
and policy.
3. Context for Responses
3.1 The primary intent of this RFI is
to explore existing, in-process, or
planned open technology standards,
specifications, and technology products
that support the management, delivery,
and exchange of assessment content and
the capture and exchange of assessment
results. While the focus of this RFI is
assessment technology standards, the
Department recognizes that assessment
generally occurs within the context of
broader learning activities (whether
online or offline) and, therefore, does
not wish to restrict the range of
responses to assessment-only
approaches. The Department, therefore,
also welcomes responses that address
broader technology standards or
approaches that are relevant to the
handling of assessment management,
delivery, or reporting. As mentioned
earlier, the Department has required
RTTA grantees to adopt a technical
standard (or standards) that permit
interoperability of the assessments and
technology developed by that program.
To help focus our consideration of the
comments provided in the response to
this RFI, we have developed several
questions regarding the development of
assessment technology standard(s) and
their application to the RTTA program.
Because these questions are only a guide
to help us better understand the issues
related to the development of
interoperable technology standards for
assessments, respondents do not have to
respond to any specific question.
Commenters responding to this RFI may
provide comments in a format that is
convenient to them.
3.2 Questions About Assessment
Technology Standards
General and Market Questions
3.2.1 Current Landscape. What are
the dominant or significant assessment
technology standards and platforms
(including technologies and approaches
for assessment management, delivery,
reporting, or other assessment
interoperability capabilities)? What is
the approximate market penetration of
the major, widely adopted solutions? To
what degree is there significant regional,
educational sub-sector, or international
diversity or commonality regarding the
adoption of various technology
standards and capabilities, if any?
3.2.2 Timelines. Approximately how
long would it take for technology
standards setting and adoption
processes to obtain a technology
standard that meets many or all of the
features or requirements described in
this RFI? What are the significant factors
that would affect the length of that
timeline, and how can the impact of
those factors be mitigated? More
specifically, would the acquisition of
existing intellectual property (IP),
reduction or simplification of specific
requirements, or other strategies reduce
the time required to develop these
technology standards and processes?
3.2.3 Process. What process or
processes are appropriate for the
adoption, modification, or design of the
most effective technology standard in a
manner that would answer many or all
of the questions in this RFI? We are
interested in learning the extent to
which the uses of one or another
process would affect the timeline
required to develop the technology
standards.
3.2.4 Intellectual Property. What are
the potential benefits and costs to the
Federal Government, States, and other
end-users of different IP restrictions or
permissions that could be applied to
technology standards and
specifications? Which types of licensed
or open IP (e.g., all rights reserved, MIT Open License, or GNU Public License) should be considered as a government technology standard? How should openness relating to the IP of technology standards be defined and categorized (e.g., Open Source Initiative-compatible license, free to use but not modify, non-commercial use only, or proprietary)?
3.2.4.1 Existing Intellectual
Property. What are the IP licenses and
policies of existing assessment
technology standards, specifications,
and development and maintenance
policies? Are the documents, processes,
and procedures related to these IP
licenses and policies publicly available,
and how could the Department obtain
them?
3.2.5 Customizing. Can assessment
tools developed under existing
technology standards be customized,
adapted, or enhanced for the use of
specific communities of learning
without conflicting with the technology
standard under which a particular
assessment tool was developed? Which
technology standards provide the
greatest flexibility in permitting
adaption or other enhancement to meet
the needs of different educational
communities? What specific provisions
in existing technology standards would
tend to limit flexibility to adapt or
enhance assessment tools? How easy
would it be to amend existing
technology standards to offer more
flexibility to adapt and enhance
assessment tools to meet the needs of
various communities? Do final technology standards publications include flexible IP rights that enable and permit such customizations? What are the risks and the benefits of permitting such customization within technology standards? When would it make sense to prevent or to enable customization?
3.2.6 Conformance and Testing. Do existing technology standards or technologies include specifications or testing procedures that can be used to verify that a new product, such as an assessment tool, meets the technology standards under which it was developed? What specifications or testing procedures exist for this purpose, e.g., software testing suites, detailed specification descriptions, or other verification methods? Are these verification procedures included in the costs of the technology standards, or provided on a free or fee basis, or provided on some combination of bases?
3.2.7 Best Practices. What are best practices related to the design and use of assessment interoperability technology standards? Where have these best practices been adopted, and what are the general lessons learned from those adoptions? How might such best practices be effectively used in the future?
Technological Questions Regarding Assessment Technology Standards
3.2.8 Interoperable Assessment Instruments. What techniques, such as educational markup or assessment markup languages (see also https://en.wikipedia.org/wiki/Markup_language), exist to describe, package, exchange, and deliver interoperable assessments? How do technology standards include assessments in packaged or structured formats? How can technology standards enable interoperable use with resources for learning content? How can technology standards permit assessment instruments and items to be exchanged between and used by different assessment technology systems?
3.2.9 Assessment Protection. For this RFI, ‘‘Assessment Protection’’ means keeping assessment instruments and items sufficiently controlled to ensure that their application yields valid results. (See also paragraph below, ‘‘Results Validity.’’) When assessment instruments or content are re-used or shared across organizations or publicly, are there capabilities or strategies in the technology standards to assist in item or instrument protection? What mechanisms or processes exist to ensure that assessment results are accurate and free from tampering? Do examples exist of public or semi-public assessment repositories that can provide valid tests or assessments while still sharing assessment items broadly?
3.2.10 Security and Access. In what ways do technology standards provide for core security issues, such as access logging, encryption, access levels, and inter-system single-sign-on capabilities (i.e., one login for systems managed by different organizations)?
3.2.11 Results Validity. For this RFI, ‘‘Results Validity’’ means protecting the statistical validity and reliability of assessment instruments and items. How can interoperable instruments be managed to ensure they are administered in a way that ensures valid results? Are solutions regarding assurance or management of validity appropriate for inclusion in technology standards, or should they be addressed by the communities that would use the technology standards to develop specific assessments?
3.2.12 Results Capture. How can technology standards accurately link individual learners, their assessment results, the systems where they take their assessments, and the systems where they view their results? How do technology standards accurately make these linkages when assessments, content, and other data reside across numerous, distinct learning and curriculum management systems, sometimes maintained by different organizations?
3.2.13 Results Privacy. How do technology standards enable assessment results for individual learners to be kept private, especially as assessment results are transferred across numerous, distinct learning systems? How can such results best be shared securely over a distributed set of systems managed by independent organizations that are authorized to receive the data, while still maintaining privacy from unauthorized access?
3.2.14 Anonymization. Do technology standards or technologies permit or enable anonymization of assessment results for research or data exchange and reporting? How do various technology standards accomplish these tasks? For example, where a number of students take a test, can their answers be anonymized (through aggregation or other techniques) and shared with researchers to examine factors related to the assessment (e.g., instructional inputs, curriculum, materials, validity of the instrument itself) without revealing the identity of the learners? Is this an area where technology standards can help?
3.2.15 Scoring and Analysis of Results. How can technology standards be used for the scoring, capture, recording, analysis or evaluation of assessment results?
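To make concrete the kind of structured, packaged assessment format that the questions about interoperable assessment instruments (3.2.8) ask about, here is a minimal sketch. The XML vocabulary (`item`, `prompt`, `choice`) is invented for illustration and is not drawn from any published standard; it only shows how an item and its scoring key can travel between systems as plain, parseable markup.

```python
import xml.etree.ElementTree as ET

# A hypothetical, minimal assessment-item markup. The element and
# attribute names are illustrative assumptions; real standards define
# far richer vocabularies (metadata, rubrics, accessibility supports).
ITEM_XML = """
<item id="alg-001" subject="algebra">
  <prompt>What is 2 + 2?</prompt>
  <choice key="A" correct="false">3</choice>
  <choice key="B" correct="true">4</choice>
  <choice key="C" correct="false">5</choice>
</item>
"""

def score_response(item_xml: str, selected_key: str) -> bool:
    """Return True if the selected choice is marked correct in the markup."""
    root = ET.fromstring(item_xml)
    for choice in root.findall("choice"):
        if choice.get("key") == selected_key:
            return choice.get("correct") == "true"
    raise ValueError(f"unknown choice key: {selected_key}")

print(score_response(ITEM_XML, "B"))  # True
print(score_response(ITEM_XML, "A"))  # False
```

Because the item content and its scoring key are carried in the markup itself, any delivery system that understands the format can administer and score the item, which is the interoperability property these questions probe.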
3.2.15.1 Results Aggregation and
Reporting. How can technology
standards enable assessment results to
be aggregated into statistical or other
groupings? How can technology
standards provide capabilities for
results (aggregated or raw) to be
reported across multiple technology
systems? For example, if a learner takes
an assessment in one system, but the
results are to be displayed in another,
how do technology standards address
transferring results across those
systems? How do technology standards
address aggregation of results for a
number of learners who are assessed in
one system and whose results are
displayed in yet another technology
system? Can anonymization controls be
included with aggregation and reporting
solutions to ensure individual data
privacy and protection (see also 3.2.14
above).
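One of the anonymization techniques named in 3.2.14, aggregation with suppression of small groups, can be sketched as follows. The record layout and the threshold of five records per group are assumptions for illustration only; real reporting systems apply formal disclosure-avoidance rules.

```python
from collections import defaultdict

# Hypothetical per-student result records (group, score). The field
# layout and the k=5 suppression threshold are illustrative assumptions.
results = [
    ("Lincoln Elementary", 72), ("Lincoln Elementary", 88),
    ("Lincoln Elementary", 91), ("Lincoln Elementary", 65),
    ("Lincoln Elementary", 80), ("Lincoln Elementary", 77),
    ("Adams Middle", 84), ("Adams Middle", 90),
]

def aggregate(records, k=5):
    """Report the mean score per group, suppressing groups smaller than
    k so that no individual's score can be inferred from the output."""
    groups = defaultdict(list)
    for group, score in records:
        groups[group].append(score)
    report = {}
    for group, scores in groups.items():
        if len(scores) < k:
            report[group] = "suppressed (n < 5)"
        else:
            report[group] = round(sum(scores) / len(scores), 1)
    return report

print(aggregate(results))
# {'Lincoln Elementary': 78.8, 'Adams Middle': 'suppressed (n < 5)'}
```

A technology standard could specify where such suppression happens (at capture, at transfer, or at reporting), which is one way the questions above about privacy and aggregation interact.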
3.2.16 Sequencing. How do
technology standards enable assessment
items stored within an assessment
instrument to be sequenced for
appropriate administration, when the
assessment consists of more than a
single linear sequence of items? For
example, how do technology standards
address computer-adaptive
assessments? How are the logic rules
that define such sequencing embedded
within a technology standard?
3.2.17 Computer-Driven Scoring.
How do technology standards permit,
enable, or limit the ability to integrate
computer-driven scoring systems, in
particular those using ‘‘artificial
intelligence,’’ Bayesian analysis, or other
techniques beyond traditional bubble-fill scoring?
3.2.18 Formative, Interim, and
Summative Assessments. What
technology and technology standards
exist that support formative, interim,
and summative assessments? What
technology standards support non-traditional assessment methods, such as evidence, competency, and observation-based models?
3.2.19 Learning and Training. What
applications or technology standards
exist that can apply assessment results
to support learning and training? Are
there technology standards or
applications that support more than one
of the following: Early learning,
elementary/secondary education,
postsecondary education, job training,
corporate training, and military
training?
3.2.20 Repositories. What
technology standards-based assessment
instruments, questions, or item banks
(or repositories and learning
management systems) are used to
manage and deliver assessments?
3.2.21 Content Lifecycle. How can
technology standards be employed to
support an assessment content lifecycle
(creation, storage, edit, deletion,
versioning, etc.)?
3.2.22 Interfaces and Services. What
interoperability specifications for
application program interfaces (APIs) or
Web services interfaces to assessment
management, delivery and tracking
systems have been developed? How are
they organized? What are the best
practices related to their design and
usage? How broadly have they been
adopted, and what are the lessons
learned from those who have designed
or implemented them?
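To ground the question about how assessment Web service interfaces are organized, here is a purely hypothetical sketch of a REST-style resource layout for an assessment-delivery service; none of these routes comes from an existing specification.

```python
# Hypothetical REST resource layout for an assessment service.
# Every path and verb here is an illustrative assumption, not a
# published API; it only shows one way such interfaces are organized.
API_ROUTES = {
    ("GET",  "/items/{item_id}"):          "retrieve one assessment item",
    ("GET",  "/instruments/{inst_id}"):    "retrieve a packaged instrument",
    ("POST", "/sessions"):                 "start a test-taking session",
    ("POST", "/sessions/{sid}/responses"): "submit a learner response",
    ("GET",  "/sessions/{sid}/results"):   "fetch captured results",
}

def describe(method: str, path_template: str) -> str:
    """Look up what a (method, path) pair does in this sketch of an API."""
    return API_ROUTES.get((method, path_template), "unknown route")

print(describe("POST", "/sessions"))  # start a test-taking session
```

In such a layout, items, instruments, sessions, and results are each addressable resources, which is one common answer to "how are they organized."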
3.2.23 Internal Transparency and
Ease of Use. Are there technology
standards and communication protocol
implementations that are ‘‘human
readable?’’ What are the benefits and
risks of ‘‘human readable’’ technology
standards? Some technology standards
are not comprehensible without tools to
unpack, decode, or otherwise interpret
the implementation data resulting from
use of the technology standard. Other
technology standards, such as HTML,
RTF and XML, are largely readable by
a reasonably sophisticated technical
user. RESTful-designed Web services
are often specifically intended to be
readable by, and even intuitive to, such
users as well. We ask commenters to
consider the extent to which various
technology standards possess native
‘‘human readability’’ and
comprehensibility.
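The "human readability" distinction drawn above can be shown directly: the same result record rendered as JSON (like HTML or XML, readable in any text editor) versus a packed binary encoding that is opaque without out-of-band documentation. The record's fields are hypothetical.

```python
import json
import struct

# The same hypothetical result record in two encodings.
record = {"student_id": 1042, "item_id": 7, "score": 3}

# Human-readable: self-describing field names, inspectable as text.
readable = json.dumps(record)
print(readable)  # {"student_id": 1042, "item_id": 7, "score": 3}

# Opaque: three unsigned 32-bit integers, meaningless without separate
# documentation of the field order and widths.
packed = struct.pack("!III", record["student_id"],
                     record["item_id"], record["score"])
print(packed)

# Both encodings round-trip, but only one is comprehensible on sight.
assert json.loads(readable) == record
assert struct.unpack("!III", packed) == (1042, 7, 3)
```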
3.2.24 Discovery and Search. How is
the discovery of items or instruments (or
other elements) handled within a
technology standard or technology? For
example, are there search APIs that are
provided to permit a search? How are
metadata exposed for discovery by
search engines or others?
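A minimal sketch of the metadata-driven discovery asked about in 3.2.24: an in-memory item bank filtered on metadata fields, the core operation a standards-based search API would expose. The field names (`subject`, `grade`, `format`) are illustrative assumptions, not drawn from any standard.

```python
# Hypothetical item-bank entries with descriptive metadata; the field
# names are invented for illustration.
item_bank = [
    {"id": "alg-001", "subject": "algebra",  "grade": 8, "format": "multiple-choice"},
    {"id": "geo-014", "subject": "geometry", "grade": 8, "format": "constructed-response"},
    {"id": "alg-202", "subject": "algebra",  "grade": 9, "format": "multiple-choice"},
]

def search_items(bank, **criteria):
    """Return items whose metadata match every given criterion."""
    return [item for item in bank
            if all(item.get(k) == v for k, v in criteria.items())]

hits = search_items(item_bank, subject="algebra", grade=8)
print([item["id"] for item in hits])  # ['alg-001']
```

How such metadata are exposed (to search engines, to other repositories) is precisely what the question above invites respondents to describe.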
3.2.25 Metadata. What kinds of
metadata about assessments (i.e.,
information describing assessments) are
permitted to be stored within
technology standards or technologies?
How do technology standards
accommodate structured data (such as
new State curriculum standards) that
were not anticipated when the
technology standard was designed? How
are metadata describing unstructured
(such as free-text input) and semi-structured data incorporated within
assessment technology standards?
3.2.26 Recommendation, Rating,
and Review. Do technology standards or
technologies permit rating, review, or
recommendations to be incorporated
within an item, instrument, or other
element? If so, in what ways? How are
conflicting ratings handled? Do
technology standards or technologies
permit ‘‘reviews of reviews’’ (e.g.,
‘‘thumbs up/down’’ or ‘‘Rate this review
1–5’’)? Is the rating or review system
centralized, or are multiple analyses of
the rating data permitted by distributed
participants?
3.2.27 Content and Media Diversity.
What types of diverse content types and
forms of assessment content exist that
extend beyond traditional paper-based
assessments translated to an electronic
delivery medium? We are interested in
learning more about electronic delivery
and interaction media, such as
performance-based assessments, games,
virtual worlds, mobile devices, and
simulations.
3.2.28 Accessibility. How do
technology standards ensure that the
platforms are accessible to all persons
with disabilities? How can technology
standards ensure the availability of
accommodations based on the
individual needs of persons with
disabilities? What factors are important
to consider so that accessibility
capabilities can be included within an
interoperable technology standard, both
for end-users, as well as operators,
teachers, and other administrators? How
are issues related to Universal Design
for Learning (UDL) relevant to standards
for accessible use? How can technology
standards provide for, improve, or
enhance Section 504 and 508 of the
Rehabilitation Act compliance for
assessment technology?
3.2.29 English Learners. How do
technology standards ensure that
assessment platforms support the
assessment, reporting of results, and
other capabilities related to the
assessment of English learners?
Questions about process and IP for
technology standards development
include:
3.2.30 Transparency. How do the
organizations that develop assessment
technology standards approach
development and maintenance
activities? Is it common for such work
to be performed in an unrestricted or
open public forum? Are there examples
of organizations conducting technology
standards development through private
(e.g., membership-driven) activities? Are
the final work products produced
through standards-development
activities made publicly available in a
timely manner? If not, when or for how
long is it necessary to keep these
products private? What circumstances
require, justify, or benefit from
protecting trade secrets or intellectual
property?
3.2.31 Participation. Does the
development of assessment technology
standards depend on membership fees
from individuals and organizations who
wish to contribute to development and
maintenance activities? Are there
requirements for ‘‘balance’’ within
membership across different
constituencies? What are the cost and
structure of such memberships? Are
there viable alternative methods for
generating revenue necessary to conduct
the work? What are the most realistic
and useful ways to generate
participation, fund work, and ensure
public access to a technology standards-setting process?
3.2.32 Availability. What are the
costs associated with final publication
of technology standards, and with all
supporting materials for those
standards, and can these assessment
products be made available at nominal
or no cost to users? Do technology
standards require restrictions for use or
application, including limitations on
derivation, resale, or other restrictions?
Is it appropriate to obtain patent,
copyright, or trademark protections for
assessment technology standards? Are
the publications for technology
standards and materials provided in a
machine-readable, well-defined form?
Are there restrictions or limitations on
any future application of the
publications and materials after initial
release? Are developer-assistance
materials (e.g., Document Type
Definitions, test harnesses, code
libraries, reference implementations)
also made available free under an open license? In what circumstances should
technology standards-setting
organizations retain rights or control, or
impose restrictions on the use of
publications, derivations, and resale or
developer-assistance technologies, as
opposed to open-licensing everything?
When should materials be made freely
available (that is, at no cost to the
consumer) while still retaining most or
all copyright license rights?
3.2.33 Derivation. For technology
standards, do copyright licenses for
publications and all supporting
materials and software licenses for
software artifacts permit the
unrestricted creation and dissemination
of derivative works (a.k.a. ‘‘open
licensed’’)? Do such open licenses
contain restrictions that require
publication and dissemination of such
works in a manner consistent with the
openness criteria described by, for
example, the GNU General Public License
(a.k.a. ‘‘viral licensed’’) or the MIT
License (a.k.a. ‘‘academic licensed’’)?
Are there policies or license restrictions
on derivative works intended to prevent
re-packaging, re-sale, or modifications
without re-publication for assessment
technology standards?
3.2.34 Licensing Descriptions (for
materials contained within the standard,
not for the standard’s licensing itself).
How do technology standards address
licensing terms for assessment resources
described within the technology
standard? Are there successful
technology standards or approaches for
describing a wide variety of license
types, including traditional per-use
licensing, Web-fulfillment, free (but
licensed), open (but licensed, including
commercial or non-commercial use
permitted), and public domain status?
Are there other resource licensing issues
that should be addressed within a
technology standard as a best practice?
Accessible Format: Individuals with
disabilities can obtain this document in
an accessible format (e.g., Braille, large
print, or audiotape) on request to the
program contact person listed under FOR
FURTHER INFORMATION CONTACT.
Electronic Access to This Document:
You can view this document, as well as
all other documents of this Department
published in the Federal Register, in
text or Adobe Portable Document
Format (PDF) on the Internet at the
following site: https://www.ed.gov/news/
fedregister.
To use PDF you must have Adobe
Acrobat Reader, which is available free
at this site. If you have questions about
using PDF, call the U.S. Government
Printing Office (GPO), toll free, at 1–
888–293–6498; or in the Washington,
DC, area at (202) 512–1530.
Note: The official version of this document
is the document published in the Federal
Register. Free Internet access to the official
edition of the Federal Register and the Code
of Federal Regulations is available on GPO
Access at: https://www.gpoaccess.gov/nara/
index.html.
Program Authority: 20 U.S.C. 6771.
Dated: December 15, 2010.
James Shelton, III,
Assistant Deputy Secretary for Innovation and
Improvement.
[FR Doc. 2010–31881 Filed 12–17–10; 8:45 am]
BILLING CODE 4000–01–P
Agencies
[Federal Register Volume 75, Number 243 (Monday, December 20, 2010)]
[Notices]
[Pages 79354-79358]
From the Federal Register Online via the Government Printing Office [www.gpo.gov]
[FR Doc No: 2010-31881]
-----------------------------------------------------------------------
DEPARTMENT OF EDUCATION
Assessment Technology Standards Request for Information (RFI)
AGENCY: Office of Innovation and Improvement, U.S. Department of
Education.
ACTION: Notice of request for information to gather technical expertise
pertaining to assessment technology standards.
-----------------------------------------------------------------------
SUMMARY: The purpose of this RFI is to collect information relating to
assessment technology standards. Toward that end, we are posing a
series of questions to which we invite interested members of the public
to respond. The Department anticipates making use of this information
in two ways. First, we expect to use this information to help
determine the appropriate interoperability standards for assessments
and related work developed under the Race to the Top Assessment (RTTA)
program. Second, we expect to use this information to help us develop
related standards-based programs. For example, we
might, in the future, offer additional grants, contracts, or awards and
some of those offerings may include similar interoperability
requirements. This RFI may be used to help set the interoperability
requirements for those offerings as well as the existing RTTA program.
Under the RTTA program, the Department requires grantees to develop
assessments that (see https://www2.ed.gov/programs/racetothetop-assessment/executive-summary.pdf, p. 78):
``5. Maximize the interoperability of assessments across technology
platforms and the ability for States to switch their assessments from
one technology platform to another by--
(a) Developing all assessment items to an industry-recognized open-
licensed interoperability standard that is approved by the Department
during the grant period, without non-standard extensions or additions;
and
(b) Producing all student-level data in a manner consistent with an
industry-recognized open-licensed interoperability standard that is
approved by the Department during the grant period.''
DATES: Written submissions must be received by the Department on or
before 5 p.m., Washington, DC time, on January 17, 2011.
ADDRESSES: We encourage submissions by e-mail using the following
address: RTTA-RFI@ed.gov. You must include the term ``Assessment RFI
response'' in the subject line of your e-mail. If you prefer to send
your input by mail or hand delivery, address it to Steve Midgley,
Office of Educational Technology, Attention: Assessment RFI, U.S.
Department of Education, 400 Maryland Avenue, SW., Room 7E202,
Washington, DC 20202-0001.
FOR FURTHER INFORMATION CONTACT: Steve Midgley, U.S. Department of
Education, 400 Maryland Avenue, SW., Room 7E202, Washington, DC 20202-
0001 by phone at 202-453-6381 or e-mail at RTTA-RFI@ed.gov.
If you use a telecommunications device for the deaf (TDD), call the
Federal Relay Service (FRS), toll free, at 1-800-877-8339.
SUPPLEMENTARY INFORMATION:
1. Introduction
The Department is seeking information on technology standards that
may be applied to the management and delivery of education-related
assessments, as well as those that may be applied to the capture and
reporting of assessment results within distributed online learning
environments (i.e., learning environments with components managed by
more than one organization). THIS IS A REQUEST FOR INFORMATION (RFI)
ONLY. This document uses the term ``technology standards'' to refer to
assessment technology standards, specifications, technical approaches
and implementations, and any other functional or formal descriptions of
technical functionality. (Note: This document refers to curricular or
content standards specifically as ``curricular standards.'')
Information about non-assessment technology standards and related
issues may be relevant and included in responses, but this RFI is
specifically inquiring into technology standards related to assessments
of learning. For the purpose of this RFI, the Department does not
distinguish between technology specifications and technology standards
produced by consortia, other groups, or nationally or internationally
recognized technology standards development organizations.
This RFI is issued solely for information and planning purposes and
does not constitute a Request for Proposals (RFP) or a promise to issue
an RFP or notice inviting applications (NIA). This request for
information does not commit the Department to contract for any supply
or service whatsoever. Further, the Department is not at this time
seeking proposals and will not accept unsolicited proposals. Responders
are advised that the Department will not pay for any information or
administrative costs that a person or entity may incur in responding to
this RFI. All costs associated with responding to this RFI will be
solely at the interested party's expense. Not responding to this RFI
will not preclude individuals or organizations from applying under
future contract or grant competition. If the Department issues an RFP
or NIA, it will be posted on the Federal Business Opportunities
(https://www.fbo.gov/) Web site (in the case of contracts) or the
Federal Register (https://www.gpoaccess.gov/fr/) Web site (in the case
of grants, or other awards). It is the responsibility of the potential
offerors to monitor these sites to determine whether the Department
issues an RFP or NIA after considering the information received in
response to this RFI. Any company or industry proprietary information
contained in responses should be clearly marked as such, by paragraph,
such that publicly releasable
[[Page 79355]]
information and proprietary information are clearly distinguished. Any
clearly marked proprietary information received in response to this
request will be properly protected from unauthorized disclosure. The
Department will not use proprietary information submitted from any one
source to establish the capability and requirements for any future
acquisition or grant competition so as not to inadvertently restrict
competition. The Department may publicly release or use any or all
materials submitted which are not so marked.
The documents and information submitted in response to this RFI
become the property of the U.S. Government and will not be returned.
2. Background
The Department is investigating open technology standards and
specifications to support the interoperable delivery (that is, delivery
in a way that allows effective use across multiple systems or
components) of State- or locally selected content and assessments for
purposes of education and training when conducted via online learning
platforms. As a part of this effort, the Department is investigating
the availability and current practice of open technology standards and
innovative technologies to support management, delivery, and exchange
of assessment content, and the capture and reporting of assessment
results.
Existing technologies may serve as the basis for the creation of
new open technology standards and specifications, if implementation
details related to these technologies can be disclosed and provided
without restriction for technical standardization or use. We expect
that applicable open technology standards and specifications will be
combined with other technology standards, current or to be developed,
providing the assessment capabilities for online learning platforms
that will support the next generation of technology for learning
content. Therefore, this RFI seeks information on a range of solutions
and approaches to standardization of assessment via technology,
including deployment, collection and reporting solutions, techniques,
and technology standards.
It is possible that RTTA grantees will be able to use one or more
existing technology standards, or it may be that additional development
work will be required to obtain sufficiently complete technology
standards for the program. It is also possible that one or more
existing technology standards are suitable but are not licensed in a
way that will permit free and open use by the public. Through this RFI,
the Department seeks to uncover and gather information on how to
resolve as many of these issues as possible.
The Department may engage in additional work to address these
issues at the conclusion of its analysis of the responses to this RFI.
There are numerous efforts underway across the Department that can
benefit from technology standards for assessment content, results, and
reporting interoperability. For example, the
Department is providing significant funding for the development of
``next-generation'' assessment systems via the RTTA program (see
https://edocket.access.gpo.gov/2010/pdf/2010-8176.pdf; https://www2.ed.gov/programs/racetothetop-assessment/). In order to
promote technological innovation and market competition, the Department
has specified that all assessment content developed under this program
be developed using an ``industry recognized open-licensed
interoperability standard'' that is approved by the Department. The
assessment content developed under the program must also be made freely
available to any State, technology platform provider, or others that
request it for purposes of administering assessments (consistent with
test security and protection requirements). Moreover, the standards and
technology for controlling sensitive data (assessment results and
related information) must also maintain the privacy of any individually
identifiable information while permitting secure interchange among
authorized systems. The Department intends that these requirements,
taken as a whole, give States the flexibility to switch from one
technology platform to another, allowing multiple providers to compete
for States' business and for States to make better decisions about cost
and value. Use of technology standards that meet these requirements
will help ensure that public investments in assessment instruments and
related technology can be used in the education sector as broadly as
possible and, at the same time, contribute to a competitive and
innovative market place.
Through this notice, the Department solicits advice, technical
information, additional questions (that is, questions in addition to
those put forward later in this notice), and other input as to how the
Department can select the best available technology standard(s) for the
RTTA program, as well as general information related to assessment
technology standards and technology and policy.
3. Context for Responses
3.1 The primary intent of this RFI is to explore existing, in-
process, or planned open technology standards, specifications, and
technology products that support the management, delivery, and exchange
of assessment content and the capture and exchange of assessment
results. While the focus of this RFI is assessment technology
standards, the Department recognizes that assessment generally occurs
within the context of broader learning activities (whether online or
offline) and, therefore, does not wish to restrict the range of
responses to assessment-only approaches. The Department, therefore,
also welcomes responses that address broader technology standards or
approaches that are relevant to the handling of assessment management,
delivery, or reporting. As mentioned earlier, the Department has
required RTTA grantees to adopt a technical standard (or standards)
that permit interoperability of the assessments and technology
developed by that program. To help focus our consideration of the
comments provided in the response to this RFI, we have developed
several questions regarding the development of assessment technology
standard(s) and their application to the RTTA program. Because these
questions are only a guide to help us better understand the issues
related to the development of interoperable technology standards for
assessments, respondents do not have to respond to any specific
question. Commenters responding to this RFI may provide comments in a
format that is convenient to them.
3.2 Questions About Assessment Technology Standards
General and Market Questions
3.2.1 Current Landscape. What are the dominant or significant
assessment technology standards and platforms (including technologies
and approaches for assessment management, delivery, reporting, or other
assessment interoperability capabilities)? What is the approximate
market penetration of the major, widely adopted solutions? To what
degree is there significant regional, educational sub-sector, or
international diversity or commonality regarding the adoption of
various technology standards and capabilities, if any?
3.2.2 Timelines. Approximately how long would it take for
technology standards setting and adoption processes to obtain a
technology standard that meets many or all of the features or
requirements described in this RFI? What are the significant factors
that would affect the length of that
[[Page 79356]]
timeline, and how can the impact of those factors be mitigated? More
specifically, would the acquisition of existing intellectual property
(IP), reduction or simplification of specific requirements, or other
strategies reduce the time required to develop these technology
standards and processes?
3.2.3 Process. What process or processes are appropriate for the
adoption, modification, or design of the most effective technology
standard in a manner that would answer many or all of the questions in
this RFI? We are interested in learning the extent to which the uses of
one or another process would affect the timeline required to develop
the technology standards.
3.2.4 Intellectual Property. What are the potential benefits and
costs to the Federal Government, States, and other end-users of
different IP restrictions or permissions that could be applied to
technology standards and specifications? Which types of licensed or
open IP (e.g., all rights reserved, the MIT License, or the GNU
General Public License) should be considered as a government
technology standard? How should openness relating to the IP of
technology standards be defined and categorized (e.g., Open Source
Initiative-compatible license, free to use but not modify,
non-commercial use only, or proprietary)?
3.2.4.1 Existing Intellectual Property. What are the IP licenses
and policies of existing assessment technology standards,
specifications, and development and maintenance policies? Are the
documents, processes, and procedures related to these IP licenses and
policies publicly available, and how could the Department obtain them?
3.2.5 Customizing. Can assessment tools developed under existing
technology standards be customized, adapted, or enhanced for the use of
specific communities of learning without conflicting with the
technology standard under which a particular assessment tool was
developed? Which technology standards provide the greatest flexibility
in permitting adaptation or other enhancement to meet the needs of
different educational communities? What specific provisions in existing
technology standards would tend to limit flexibility to adapt or
enhance assessment tools? How easy would it be to amend existing
technology standards to offer more flexibility to adapt and enhance
assessment tools to meet the needs of various communities? Do final
technology standards publications include flexible IP rights that
enable and permit such customizations? What are the risks and the
benefits of permitting such customization within technology standards?
When would it make sense to prevent or to enable customization?
3.2.6 Conformance and Testing. Do existing technology standards or
technologies include specifications or testing procedures that can be
used to verify that a new product, such as an assessment tool, meets
the technology standards under which it was developed? What
specifications or testing procedures exist for this purpose, e.g.,
software testing suites, detailed specification descriptions, or other
verification methods? Are these verification procedures included in the
costs of the technology standards, or provided on a free or fee-basis,
or provided on some combination of bases?
3.2.7 Best Practices. What are best practices related to the design
and use of assessment interoperability technology standards? Where have
these best practices been adopted, and what are the general lessons
learned from those adoptions? How might such best practices be
effectively used in the future?
Technological Questions Regarding Assessment Technology Standards
3.2.8 Interoperable Assessment Instruments. What techniques, such
as educational markup or assessment markup languages (see also https://en.wikipedia.org/wiki/Markup_language), exist to describe, package,
exchange, and deliver interoperable assessments? How do technology
standards include assessments in packaged or structured formats? How
can technology standards enable interoperable use with resources for
learning content? How can technology standards permit assessment
instruments and items to be exchanged between and used by different
assessment technology systems?
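By way of illustration of the packaged, structured item formats these questions contemplate, the sketch below defines a small XML fragment and parses it with Python's standard library. The element and attribute names are invented for this example; they are loosely in the spirit of published item-markup standards such as IMS QTI but are not drawn from any actual specification.

```python
import xml.etree.ElementTree as ET

# Hypothetical item markup (illustrative only; not actual QTI or any
# published assessment standard).
ITEM_XML = """
<assessmentItem identifier="item-001" title="Fractions warm-up">
  <prompt>Which fraction is equivalent to 1/2?</prompt>
  <choice identifier="A" correct="false">1/3</choice>
  <choice identifier="B" correct="true">2/4</choice>
  <choice identifier="C" correct="false">3/5</choice>
</assessmentItem>
"""

def load_item(xml_text):
    """Parse one item into a plain dict that any delivery system could use."""
    root = ET.fromstring(xml_text)
    return {
        "id": root.get("identifier"),
        "title": root.get("title"),
        "prompt": root.findtext("prompt"),
        "choices": [
            {"id": c.get("identifier"),
             "text": c.text,
             "correct": c.get("correct") == "true"}
            for c in root.findall("choice")
        ],
    }

item = load_item(ITEM_XML)
```

Because the markup is self-describing, a second system could consume the same fragment without access to the authoring tool, which is the interoperability property the questions above are probing.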
3.2.9 Assessment Protection. For this RFI, ``Assessment
Protection'' means keeping assessment instruments and items
sufficiently controlled to ensure that their application yields valid
results. (See also paragraph below, ``Results Validity.'') When
assessment instruments or content are re-used or shared across
organizations or publicly, are there capabilities or strategies in the
technology standards to assist in item or instrument protection? What
mechanisms or processes exist to ensure that assessment results are
accurate and free from tampering? Do examples exist of public or semi-
public assessment repositories that can provide valid tests or
assessments while still sharing assessment items broadly?
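One common mechanism behind the tamper-resistance question is a keyed signature attached to each result record, so that a receiving system can detect any modification in transit. The sketch below is a minimal illustration using Python's standard library; the record fields and the shared-key arrangement are assumptions made for the example, not a description of any existing assessment system.

```python
import hashlib
import hmac
import json

SHARED_KEY = b"demo-key-known-to-both-systems"  # illustrative only

def sign_record(record, key=SHARED_KEY):
    """Attach an HMAC-SHA256 signature over a canonical JSON serialization."""
    payload = json.dumps(record, sort_keys=True).encode()
    return {"record": record,
            "sig": hmac.new(key, payload, hashlib.sha256).hexdigest()}

def verify_record(signed, key=SHARED_KEY):
    """Recompute the signature; any change to the record makes this fail."""
    payload = json.dumps(signed["record"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed["sig"])

signed = sign_record({"student": "s-42", "item": "item-001", "score": 1})
# A record altered after signing (score changed) keeps the stale signature.
tampered = {"record": dict(signed["record"], score=99), "sig": signed["sig"]}
```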
3.2.10 Security and Access. In what ways do technology standards
provide for core security issues, such as access logging, encryption,
access levels, and inter-system single-sign-on capabilities (i.e., one
login for systems managed by different organizations)?
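As a minimal illustration of access levels and access logging, two of the capabilities named above, the sketch below maps hypothetical roles to permitted actions and records every authorization decision. The role and action names are invented for the example and do not describe any particular platform.

```python
# Illustrative access-level model: each role maps to a set of permitted
# actions; every access attempt is appended to a log for later audit.
ACCESS_LEVELS = {
    "student": {"take_assessment", "view_own_results"},
    "teacher": {"view_class_results", "assign_assessment"},
    "admin": {"view_class_results", "assign_assessment", "manage_items"},
}
ACCESS_LOG = []

def authorize(user, role, action):
    """Return whether the action is allowed, logging the decision either way."""
    allowed = action in ACCESS_LEVELS.get(role, set())
    ACCESS_LOG.append({"user": user, "role": role,
                       "action": action, "allowed": allowed})
    return allowed
```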
3.2.11 Results Validity. For this RFI, ``Results Validity'' means
protecting the statistical validity and reliability of assessment
instruments and items. How can interoperable instruments be managed to
ensure they are administered in a way that ensures valid results? Are
solutions regarding assurance or management of validity appropriate for
inclusion in technology standards, or should they be addressed by the
communities that would use the technology standards to develop specific
assessments?
3.2.12 Results Capture. How can technology standards accurately
link individual learners, their assessment results, the systems where
they take their assessments, and the systems where they view their
results? How do technology standards accurately make these linkages
when assessments, content, and other data reside across numerous,
distinct learning and curriculum management systems, sometimes
maintained by different organizations?
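One way such linkages are commonly made is to give every result event a globally unique identifier and to qualify each learner identifier with the system that issued it, so records held by independently managed systems can be joined unambiguously. The field names below are illustrative assumptions, not a published data model.

```python
import uuid

def make_result(learner_id, issuing_system, item_id, score):
    """Build a result record whose identifiers are globally scoped: the
    learner ID is qualified by the system that issued it, and the result
    event gets its own unique ID (field names are illustrative only)."""
    return {
        "result_id": str(uuid.uuid4()),
        "learner": {"id": learner_id, "issued_by": issuing_system},
        "item": item_id,
        "score": score,
    }

# A delivery system records the event; a separate reporting system can
# then index and join on the qualified learner identifier without
# colliding with an identically numbered learner from another issuer.
delivery_record = make_result("stu-0042", "district-sis.example", "item-001", 1)
report_index = {(delivery_record["learner"]["issued_by"],
                 delivery_record["learner"]["id"]): [delivery_record]}
```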
3.2.13 Results Privacy. How do technology standards enable
assessment results for individual learners to be kept private,
especially as assessments results are transferred across numerous,
distinct learning systems? How can such results best be shared securely
over a distributed set of systems managed by independent organizations
that are authorized to receive the data, while still maintaining
privacy from unauthorized access?
3.2.14 Anonymization. Do technology standards or technologies
permit or enable anonymization of assessment results for research or
data exchange and reporting? How do various technology standards
accomplish these tasks? For example, where a number of students take a
test, can their answers be anonymized (through aggregation or other
techniques) and shared with researchers to examine factors related to
the assessment (e.g., instructional inputs, curriculum, materials,
validity of the instrument itself) without revealing the identity of
the learners? Is this an area where technology standards can help?
3.2.15 Scoring and Analysis of Results. How can technology
standards be used for the scoring, capture, recording, analysis or
evaluation of assessment results?
[[Page 79357]]
3.2.15.1 Results Aggregation and Reporting. How can technology
standards enable assessment results to be aggregated into statistical
or other groupings? How can technology standards provide capabilities
for results (aggregated or raw) to be reported across multiple
technology systems? For example, if a learner takes an assessment in
one system, but the results are to be displayed in another, how do
technology standards address transferring results across those systems?
How do technology standards address aggregation of results for a number
of learners who are assessed in one system and whose results are
displayed in yet another technology system? Can anonymization controls
be included with aggregation and reporting solutions to ensure
individual data privacy and protection (see also 3.2.14 above)?
3.2.16 Sequencing. How do technology standards enable assessment
items stored within an assessment instrument to be sequenced for
appropriate administration, when the assessment consists of more than a
single linear sequence of items? For example, how do technology
standards address computer-adaptive assessments? How are the logic
rules that define such sequencing embedded within a technology
standard?
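A toy version of such sequencing logic: adjust a difficulty target up after a correct answer and down after an incorrect one, then serve the unused item closest to the target. Production computer-adaptive engines typically use item response theory for ability estimation; this sketch only illustrates where embedded sequencing rules would live.

```python
def next_item(items, used_ids, target):
    """Pick the unused item whose stated difficulty is closest to the target."""
    candidates = [i for i in items if i["id"] not in used_ids]
    if not candidates:
        return None
    return min(candidates, key=lambda i: abs(i["difficulty"] - target))

def run_adaptive(items, answers, start=0.5, step=0.1):
    """Serve items one at a time, nudging the difficulty target up after a
    correct answer and down after an incorrect one (a toy rule, not IRT)."""
    target, used, sequence = start, set(), []
    while True:
        item = next_item(items, used, target)
        if item is None:
            break
        used.add(item["id"])
        sequence.append(item["id"])
        target += step if answers.get(item["id"], False) else -step
    return sequence

bank = [{"id": "e1", "difficulty": 0.2},   # easy
        {"id": "m1", "difficulty": 0.5},   # medium
        {"id": "h1", "difficulty": 0.8}]   # hard
order = run_adaptive(bank, answers={"m1": True, "h1": False, "e1": True})
```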
3.2.17 Computer-Driven Scoring. How do technology standards permit,
enable, or limit the ability to integrate computer-driven scoring
systems, in particular those using ``artificial intelligence,''
Bayesian analysis, or other techniques beyond traditional bubble-fill
scoring?
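As a minimal illustration of the Bayesian analysis mentioned above, the sketch below maintains a Beta distribution over a learner's probability of answering correctly and updates it after each response. This is a textbook Beta-Bernoulli update, not the method of any particular scoring product.

```python
def beta_update(alpha, beta, responses):
    """Update a Beta(alpha, beta) prior on p(correct) with a sequence of
    correct (True) / incorrect (False) responses."""
    for correct in responses:
        if correct:
            alpha += 1
        else:
            beta += 1
    return alpha, beta

def posterior_mean(alpha, beta):
    """Point estimate of the learner's probability of a correct answer."""
    return alpha / (alpha + beta)

# Uniform prior Beta(1, 1), then 7 correct out of 10 observed responses.
a, b = beta_update(1, 1, [True] * 7 + [False] * 3)
estimate = posterior_mean(a, b)
```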
3.2.18 Formative, Interim, and Summative Assessments. What
technology and technology standards exist that support formative,
interim, and summative assessments? What technology standards support
non-traditional assessment methods, such as evidence, competency, and
observation-based models?
3.2.19 Learning and Training. What applications or technology
standards exist that can apply assessment results to support learning
and training? Are there technology standards or applications that
support more than one of the following: Early learning, elementary/
secondary education, postsecondary education, job training, corporate
training, and military training?
3.2.20 Repositories. What technology standards-based assessment
instruments, questions, or item banks (or repositories and learning
management systems) are used to manage and deliver assessments?
3.2.21 Content Lifecycle. How can technology standards be employed
to support an assessment content lifecycle (creation, storage, edit,
deletion, versioning, etc.)?
3.2.22 Interfaces and Services. What interoperability
specifications for application program interfaces (APIs) or Web
services interfaces to assessment management, delivery and tracking
systems have been developed? How are they organized? What are the best
practices related to their design and usage? How broadly have they been
adopted, and what are the lessons learned from those who have designed
or implemented them?
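The resource-oriented shape that such API specifications often pin down can be sketched as a small route table plus a dispatcher. The routes, paths, and payloads below are invented for illustration and do not quote any actual assessment API.

```python
# In-memory store standing in for an assessment-delivery back end.
ITEMS = {"item-001": {"id": "item-001", "prompt": "Which fraction equals 1/2?"}}
RESULTS = []

def get_item(item_id):
    """GET handler: fetch one item by identifier."""
    item = ITEMS.get(item_id)
    return (200, item) if item else (404, {"error": "not found"})

def post_result(payload):
    """POST handler: record one assessment result."""
    RESULTS.append(payload)
    return (201, {"stored": len(RESULTS)})

# Resource-oriented route table: the surface a Web-services spec might fix.
ROUTES = {
    ("GET", "/items/{id}"): get_item,
    ("POST", "/results"): post_result,
}

def dispatch(method, path, body=None):
    """Match a request against the route table and invoke the handler."""
    if method == "GET" and path.startswith("/items/"):
        return ROUTES[("GET", "/items/{id}")](path.split("/")[-1])
    if method == "POST" and path == "/results":
        return ROUTES[("POST", "/results")](body)
    return (404, {"error": "no route"})

status, item = dispatch("GET", "/items/item-001")
```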
3.2.23 Internal Transparency and Ease of Use. Are there technology
standards and communication protocol implementations that are ``human
readable''? What are the benefits and risks of ``human readable''
technology standards? Some technology standards are not comprehensible
without tools to unpack, decode, or otherwise interpret the
implementation data resulting from use of the technology standard.
Other technology standards, such as HTML, RTF and XML, are largely
readable by a reasonably sophisticated technical user. RESTful-designed
Web services are often specifically intended to be readable by, and
even intuitive to, such users as well. We ask commenters to consider
the extent to which various technology standards possess native ``human
readability'' and comprehensibility.
3.2.24 Discovery and Search. How is the discovery of items or
instruments (or other elements) handled within a technology standard or
technology? For example, are there search APIs that are provided to
permit a search? How are metadata exposed for discovery by search
engines or others?
3.2.25 Metadata. What kinds of metadata about assessments (i.e.,
information describing assessments) are permitted to be stored within
technology standards or technologies? How do technology standards
accommodate structured data (such as new State curriculum standards)
that were not anticipated when the technology standard was designed?
How are metadata describing unstructured (such as free-text input) and
semi-structured data incorporated within assessment technology
standards?
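One common way a technology standard accommodates structured data that was not anticipated at design time is a fixed core vocabulary plus namespaced extension fields. The field and namespace names below are illustrative assumptions, not drawn from any published metadata standard.

```python
import json

def make_metadata(core, extensions=None):
    """Combine well-known core fields with namespaced extension fields so
    later additions (e.g., new State curriculum standards) cannot collide
    with the core vocabulary or with each other."""
    record = dict(core)
    for namespace, fields in (extensions or {}).items():
        for key, value in fields.items():
            record[f"{namespace}:{key}"] = value
    return record

meta = make_metadata(
    core={"title": "Fractions warm-up", "subject": "mathematics", "grade": "4"},
    # Hypothetical State-specific namespace and curriculum code.
    extensions={"ext-state-xy": {"curriculum_code": "4.NF.1"}},
)
serialized = json.dumps(meta, sort_keys=True)
```

A consumer that does not understand the `ext-state-xy` namespace can simply ignore those keys, while the core fields remain interpretable everywhere.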
3.2.26 Recommendation, Rating, and Review. Do technology standards
or technologies permit rating, review, or recommendations to be
incorporated within an item, instrument, or other element? If so, in
what ways? How are conflicting ratings handled? Do technology standards
or technologies permit ``reviews of reviews'' (e.g., ``thumbs up/down''
or ``Rate this review 1-5'')? Is the rating or review system
centralized, or are multiple analyses of the rating data permitted by
distributed participants?
3.2.27 Content and Media Diversity. What diverse types and forms
of assessment content exist that extend beyond
traditional paper-based assessments translated to an electronic
delivery medium? We are interested in learning more about electronic
delivery and interaction media, such as performance-based assessments,
games, virtual worlds, mobile devices, and simulations.
3.2.28 Accessibility. How do technology standards ensure that the
platforms are accessible to all persons with disabilities? How can
technology standards ensure the availability of accommodations based on
the individual needs of persons with disabilities? What factors are
important to consider so that accessibility capabilities can be
included within an interoperable technology standard, both for end-
users, as well as operators, teachers, and other administrators? How
are issues related to Universal Design for Learning (UDL) relevant to
standards for accessible use? How can technology standards provide for,
improve, or enhance Section 504 and 508 of the Rehabilitation Act
compliance for assessment technology?
3.2.29 English Learners. How do technology standards ensure that
assessment platforms support the assessment, reporting of results, and
other capabilities related to the assessment of English learners?
Questions about process and IP for technology standards development
include:
3.2.30 Transparency. How do the organizations that develop
assessment technology standards approach development and maintenance
activities? Is it common for such work to be performed in an
unrestricted or open public forum? Are there examples of organizations
conducting technology standards development through private (e.g.,
membership-driven) activities? Are the final work products produced
through standards-development activities made publicly available in a
timely manner? If not, when or for how long is it necessary to keep
these products private? What circumstances require, justify, or benefit
from protecting trade secrets or intellectual property?
3.2.31 Participation. Does the development of assessment technology
standards depend on membership fees from individuals and organizations
who
wish to contribute to development and maintenance activities? Are there
requirements for ``balance'' within membership across different
constituencies? What is the cost and structure of such memberships?
Are there viable alternative methods for generating revenue necessary
to conduct the work? What are the most realistic and useful ways to
generate participation, fund work, and ensure public access to a
technology standards-setting process?
3.2.32 Availability. What are the costs associated with final
publication of technology standards and of all supporting materials for
those standards, and can these products be made available at nominal or
no cost to users? Do technology standards
require restrictions for use or application, including limitations on
derivation, resale, or other restrictions? Is it appropriate to obtain
patent, copyright, or trademark protections for assessment technology
standards? Are the publications for technology standards and materials
provided in a machine-readable, well-defined form? Are there
restrictions or limitations on any future application of the
publications and materials after initial release? Are developer-
assistance materials (e.g., Document Type Definitions, test harnesses,
code libraries, reference implementations) also made available free
under an open license? In what circumstances should technology
standards-setting organizations retain rights or control, or impose
restrictions on the use of publications, derivations, and resale or
developer-assistance technologies, as opposed to open-licensing
everything? When should materials be made freely available (that is, at
no cost to the consumer) while still retaining most or all copyright
license rights?
3.2.33 Derivation. For technology standards, do copyright licenses
for publications and all supporting materials and software licenses for
software artifacts permit the unrestricted creation and dissemination
of derivative works (a.k.a. ``open licensed'')? Do such open licenses
contain restrictions that require publication and dissemination of such
works in a manner consistent with the openness criteria described by,
for example, the GNU General Public License (a.k.a. ``viral licensed'')
or the MIT License (a.k.a. ``academic licensed'')? Are there policies or
license restrictions on derivative works intended to prevent re-
packaging, re-sale, or modifications without re-publication for
assessment technology standards?
3.2.34 Licensing Descriptions (for materials contained within the
standard, not for the standard's licensing itself). How do technology
standards address licensing terms for assessment resources described
within the technology standard? Are there successful technology
standards or approaches for describing a wide variety of license types,
including traditional per-use licensing, Web-fulfillment, free (but
licensed), open (but licensed, including commercial or non-commercial
use permitted), and public domain status? Are there other resource
licensing issues that should be addressed within a technology standard
as a best practice?
Accessible Format: Individuals with disabilities can obtain this
document in an accessible format (e.g., Braille, large print, or
audiotape) on request to the program contact person listed under FOR
FURTHER INFORMATION CONTACT.
Electronic Access to This Document: You can view this document, as
well as all other documents of this Department published in the Federal
Register, in text or Adobe Portable Document Format (PDF) on the
Internet at the following site: https://www.ed.gov/news/fedregister.
To use PDF you must have Adobe Acrobat Reader, which is available
free at this site. If you have questions about using PDF, call the U.S.
Government Printing Office (GPO), toll free, at 1-888-293-6498; or in
the Washington, DC, area at (202) 512-1530.
Note: The official version of this document is the document
published in the Federal Register. Free Internet access to the
official edition of the Federal Register and the Code of Federal
Regulations is available on GPO Access at: https://www.gpoaccess.gov/nara/.
Program Authority: 20 U.S.C. 6771.
Dated: December 15, 2010.
James Shelton, III,
Assistant Deputy Secretary for Innovation and Improvement.
[FR Doc. 2010-31881 Filed 12-17-10; 8:45 am]
BILLING CODE 4000-01-P