Netherlands: Government publishes guide to EU AI Act
On October 16, 2024, the Government of the Netherlands published a guide to the EU Artificial Intelligence Act (the EU AI Act). The guide states that it is intended for developers and deployers of artificial intelligence (AI) within organizations.
The guide outlines that organizations should, as a starting point, ask the following four questions:
- Risk - does the AI system fall within one of the AI risk categories?
- AI - is the system considered 'AI' under the EU AI Act?
- Role - is the organization a provider or deployer of the AI system?
- Obligations - what obligations does the organization need to adhere to?
For example, the guide provides that there are two types of high-risk AI systems:
- high-risk products - AI systems that are also subject, directly or indirectly, to existing product regulation referenced in the EU AI Act (e.g., an AI system that is a medical device); and
- high-risk apps - that are developed and deployed for specific applications in high-risk areas, eight of which are provided for under the EU AI Act.
Likewise, in considering whether an organization is a provider or a deployer under the EU AI Act, the guide clarifies that deployers of a high-risk AI system can become providers when the organization:
- puts their own name or brand on the high-risk AI system; or
- makes a significant change to the high-risk AI system that was not foreseen by the provider and, as a result, the AI system no longer meets the requirements or the intended purpose established by the provider.
You can access the press release here and the guide here, both available only in Dutch.