Access Alert: US Department of Commerce issues Proposed Rulemaking on Cybersecurity and AI for IaaS providers

On 29 January, the US Department of Commerce, through the Bureau of Industry and Security (BIS) and its newly created Office of Information and Communications Technology and Services (OICTS), issued a Notice of Proposed Rulemaking (NPRM) primarily affecting Infrastructure as a Service (IaaS) providers and their foreign resellers, especially those involved in training large AI models. Developed under the direction of both the Biden Administration’s AI Executive Order and Executive Order 13984 on steps to address malicious cyber-enabled activities, the NPRM aims to prevent foreign malicious cyber actors from abusing US cloud infrastructure and threatening national security. It would do so by requiring IaaS providers to implement customer identification programs (CIPs) to verify the identities of foreign customers, mirroring anti-money laundering practices in financial institutions.

The NPRM defines “US IaaS providers” broadly, encompassing a wide range of entities and individuals within the US. It details the requirements of CIPs, including verifying the identities of customers and beneficial owners, and mandates procedures for detecting malicious cyber activities. Providers would report their compliance through a CIP certification form and would be responsible for ensuring that their foreign resellers comply with these rules. The NPRM would also allow Commerce to identify and regulate transactions with foreign jurisdictions and persons posing security threats, and to take action against certain foreign jurisdictions and persons involved in malicious activities using US IaaS products. IaaS providers with approved Abuse of IaaS Products Deterrence Programs (ADPs), which detect and mitigate malicious activities, would be exempt from the new CIP requirements.

The NPRM also introduces requirements for providers to report transactions involving the training of large AI models that could potentially be used for malicious activities. Notably, Commerce proposed a definition for what constitutes a “large AI model with potential capabilities that could be used in malicious cyber-enabled activity”. According to the NPRM, Commerce will use that definition to “determine the set of technical conditions that a large AI model must possess in order to have the potential capabilities that could be used in malicious cyber-enabled activity” and therefore be subject to additional requirements and regulation.

Comments are due by 29 April 2024.

Access Partnership is actively working on several AI projects, tracking global AI developments and empowering our clients to respond strategically. For more information, please contact Jacob Hafey at [email protected].
