As companies capitalise on generative AI and digital transformation tools to improve business performance and productivity, it is crucial that they remain alert to the implications of these tools for their data and privacy obligations, as well as to the evolving policy developments in this sphere.
Existing cyber-hygiene and data protection practices may fall short, and extra measures, including enhanced guidance, frameworks, and employee training, may be necessary to ensure the security of the data collected, stored, and produced by these tools. It is essential that clients and employees subject to these new processes and systems are made explicitly aware of the new purposes and use cases for which their data is processed and have given consent where necessary.
Potential regulatory changes ahead – what you need to know
In June, G7 data protection authorities warned companies of the range of privacy concerns arising from these tools. Earlier this year, the European Data Protection Board (EDPB) also launched a dedicated task force to foster cooperation among data protection authorities and the exchange of information on potential enforcement actions relating to generative AI tools. Both groups are assessing the appropriateness of current privacy frameworks and are expected to produce guidance for policymakers and companies in the coming months.
Navigating AI regulations – extra precautions required for AI users
Several policies in development globally, such as the EU AI Act, are expected to require AI users (the companies deploying these systems) to undertake extra precautionary measures, particularly if such systems interact with employees or are used for decision-making purposes.
The development and deployment of digital transformation tools inherently require the collection and processing of larger volumes of data, increasing the risk of sensitive data leakage. Companies should be aware of the types of data these new tools collect and undertake data-minimisation measures, such as excluding sensitive data from datasets or employing anonymisation or pseudonymisation, where possible. A simple illustration of such a measure follows below.
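To illustrate one such measure, the short Python sketch below shows a simple pseudonymisation and minimisation step that could be applied to a dataset before it is passed to an AI tool. The column names, secret key, and use of the pandas library are assumptions for illustration only, not a prescribed implementation.

    # Minimal pseudonymisation sketch (hypothetical column names and key).
    # Direct identifiers are dropped, and a stable pseudonym replaces the
    # customer ID so records can still be linked without exposing it.
    import hashlib
    import hmac

    import pandas as pd

    SECRET_KEY = b"replace-with-a-securely-stored-key"  # stored separately from the data

    def pseudonymise(value: str) -> str:
        """Derive a stable, non-reversible pseudonym from an identifier."""
        return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

    def minimise(df: pd.DataFrame) -> pd.DataFrame:
        """Drop direct identifiers and pseudonymise the record key."""
        out = df.copy()
        out["customer_id"] = out["customer_id"].map(pseudonymise)
        return out.drop(columns=["name", "email", "phone"], errors="ignore")

    if __name__ == "__main__":
        sample = pd.DataFrame(
            {
                "customer_id": ["C-1001", "C-1002"],
                "name": ["Alice Example", "Bob Example"],
                "email": ["alice@example.com", "bob@example.com"],
                "purchase_total": [120.50, 89.99],
            }
        )
        print(minimise(sample))

Keyed hashing keeps the pseudonyms stable so records remain linkable for analysis, while the key, held separately from the dataset, makes straightforward reversal impractical.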
Companies must also understand where the data is stored and to whom it is transferred. Where data is transferred to other jurisdictions, whether internally to other branches or to or via third parties, companies must ensure appropriate safeguards are in place, such as Binding Corporate Rules (BCRs) and Standard Contractual Clauses (SCCs). Companies transferring data between the EU and US must remain particularly vigilant, as the adequacy of the EU-US Data Privacy Framework is once again subject to legal challenge.
Address your data, privacy, and compliance needs
Lydia Dettling is Access Partnership's EU policy lead for AI, data, privacy, and intellectual property. Based in Brussels, she is a Certified Information Privacy Professional/Europe (CIPP/E). For more information, please reach out to [email protected].