Access Alert: US Department of Commerce issues Proposed Rulemaking on Cybersecurity and AI for IaaS providers

On 29 January, the US Department of Commerce, through the Bureau of Industry and Security (BIS) and its newly created Office of Information and Communications Technology and Services (OICTS), issued a Notice of Proposed Rulemaking (NPRM) that primarily affects Infrastructure as a Service (IaaS) providers and their foreign resellers, especially those involved in training large AI models. Developed under the direction of both the Biden Administration’s AI Executive Order and Executive Order 13984 on curbing malicious cyber-enabled activities, the NPRM aims to prevent foreign malicious cyber actors from abusing US cloud infrastructure and threatening national security. To that end, it would require IaaS providers to implement customer identification programs (CIPs) to verify the identities of foreign customers, mirroring anti-money laundering practices in financial institutions.

The NPRM defines “US IaaS providers” broadly, encompassing a wide range of entities and individuals within the US. It details the requirements of CIPs, including verification of customer and beneficial owner identities, and mandates procedures for detecting malicious cyber activity. Providers would report their compliance through a CIP certification form and would be responsible for ensuring that their foreign resellers comply with these rules. The NPRM would also allow Commerce to identify foreign jurisdictions and persons that pose security threats and to take action against those involved in malicious activities using US IaaS products. IaaS providers with approved Abuse of IaaS Products Deterrence Programs (ADPs), which detect and mitigate malicious activities, would be exempt from the new CIP requirements.

The NPRM also introduces requirements for providers to report transactions involving the training of large AI models that could potentially be used for malicious activities. Notably, Commerce proposed a definition of what constitutes a “large AI model with potential capabilities that could be used in malicious cyber-enabled activity”. According to the NPRM, Commerce will use that definition to “determine the set of technical conditions that a large AI model must possess in order to have the potential capabilities that could be used in malicious cyber-enabled activity” and therefore be subject to additional requirements and regulation.

Comments are due by 29 April 2024.

Access Partnership is actively working on several AI projects, tracking global AI developments and empowering our clients to respond strategically. For more information, please contact Jacob Hafey at [email protected].
