


[Federal Register Volume 89, Number 153 (Thursday, August 8, 2024)]
[Notices]
[Page 64878]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 2024-17614]



[[Page 64878]]

-----------------------------------------------------------------------

DEPARTMENT OF COMMERCE

National Institute of Standards and Technology

XRIN 0693-XC137


Request for Comments on the U.S. Artificial Intelligence Safety 
Institute's Draft Document: Managing Misuse Risk for Dual-Use 
Foundation Models

AGENCY: U.S. Artificial Intelligence Safety Institute (AISI), National 
Institute of Standards and Technology (NIST), U.S. Department of 
Commerce.

ACTION: Notice; request for comments.

-----------------------------------------------------------------------

SUMMARY: The U.S. Artificial Intelligence Safety Institute (AISI), 
housed within NIST at the Department of Commerce, requests comments on 
a draft document responsive to an Executive order on Safe, Secure, and 
Trustworthy Development and Use of Artificial Intelligence (AI) issued 
on October 30, 2023: NIST AI 800-1, Managing Misuse Risk for Dual-Use 
Foundation Models, found at https://nvlpubs.nist.gov/nistpubs/ai/NIST.AI.800-1.ipd.pdf.

DATES: Comments containing information in response to this notice must 
be received on or before September 9, 2024, at 11:59 p.m. eastern time. 
Submissions received after that date may not be considered.

ADDRESSES: The draft of NIST AI 800-1, Managing Misuse Risk for Dual-
Use Foundation Models, is available for review and comment on the U.S. 
AI Safety Institute website at https://nvlpubs.nist.gov/nistpubs/ai/NIST.AI.800-1.ipd.pdf and at www.regulations.gov under docket number 
240802-0209.
    Comments may be submitted:
    By email:
    • Comments on NIST AI 800-1 may be sent electronically to 
[email protected] with ``NIST AI 800-1, Managing the Risk of Misuse 
for Dual-Use Foundation Models'' in the subject line. Electronic 
submissions may be sent as an attachment in any of the following 
unlocked formats: HTML; ASCII; Word; RTF; or PDF.
    Via www.regulations.gov:
    • To submit electronic public comments via the Federal 
eRulemaking Portal:
    1. Go to www.regulations.gov and enter 240802-0209 in the search 
field;
    2. Click the ``Comment Now!'' icon, complete the required fields, 
including the relevant document number and title in the subject field; 
and
    3. Enter or attach your comments.
    • Written comments may also be submitted by mail to Information 
Technology Laboratory, ATTN: AI EO Document Comments, National 
Institute of Standards and Technology, 100 Bureau Drive, Mail Stop 
8900, Gaithersburg, MD 20899-8900.
    Comments containing references, studies, research, and other 
empirical data that are not widely published should include copies of 
the referenced materials. All submissions, including attachments and 
other supporting materials, will become part of the public record and 
subject to public disclosure.
    AISI will not accept comments accompanied by a request that part or 
all of the material be treated confidentially because of its business 
proprietary nature or for any other reason. Therefore, do not submit 
confidential business information or otherwise sensitive, protected, or 
personal information, such as account numbers, Social Security numbers, 
or names of other individuals.
    All relevant comments received by the deadline will be posted at 
https://www.regulations.gov under docket number 240802-0209 and at 
https://www.nist.gov/artificial-intelligence/executive-order-safe-secure-and-trustworthy-artificial-intelligence. Attachments and other 
supporting materials may become part of the public record and may be 
subject to public disclosure.

FOR FURTHER INFORMATION CONTACT: For questions about this request for 
comments, contact Christina Knight, U.S. Department of Commerce, 1401 
Constitution Ave. NW, Washington, DC; [email protected]; 
(240) 961-8688. Direct media inquiries to NIST's Office of Public 
Affairs at (301) 975-2762. Users of telecommunication devices for the 
deaf or a text telephone may call the Federal Relay Service toll free 
at 1-800-877-8339.
    Accessible Format: NIST will make the request for comments 
available in alternate formats, such as Braille or large print, upon 
request by persons with disabilities.

SUPPLEMENTARY INFORMATION: AISI requests comments on the draft of 
Managing Misuse Risk for Dual-Use Foundation Models.
    AISI welcomes input on all aspects of the draft guidance, such as 
modifications to the included objectives, practices, and 
recommendations; suggestions for additions or deletions; and areas 
where further empirical evidence is needed. The questions below are 
optional and intended to prompt feedback:
    1. What practical challenges exist to meeting the objectives 
outlined in the guidance?
    2. How can the guidance better address the ways in which misuse 
risks differ based on deployment (e.g., how a foundation model is 
released) and modality (text, image, audio, multimodal, and others)?
    3. How can the guidance better reflect the important role of real-
world monitoring in making risk assessments?
    4. How can the guidance's examples of documentation better support 
communication of practically useful information while adequately 
addressing confidentiality concerns, such as protecting proprietary 
information?
    5. How can the guidance better enable collaboration among actors 
across the AI supply chain, such as addressing the role of both 
developers and their third-party partners in managing misuse risk?
    Authority: Sections 4.1(a)(ii) and 4.1(a)(ii)(A) of Executive Order 
14110 of Oct. 30, 2023; 15 U.S.C. 272.

Alicia Chambers,
NIST Executive Secretariat.
[FR Doc. 2024-17614 Filed 8-7-24; 8:45 am]
BILLING CODE 3510-13-P

