AI is being used in more and more organisations and is gaining influence in key processes. This makes it increasingly important to ensure that an AI application does what it is supposed to do and that it complies with all rules and ethical standards. Currently, however, there are no concrete rules for AI applications. The EU is working on this, but it will be a few years before the AI Act actually comes into force. With the AI Compliance Check, we want to anticipate this and develop a method to assess an AI application, thereby providing users of the Check with proof of compliance.
Organisations need such a Check to demonstrate to potential customers and partners that they are actively scrutinising their own product.
If you would like to contribute ideas or want more information, contact us!
Within this project, we will develop an AI Compliance Check. This will be a service to determine whether an AI application complies with applicable regulations and ethical standards. Specifically, this project will lead to:
- An assessment that can be taken;
- A report, based on the assessment, evaluating the AI application;
- Recommendations for mitigating the identified risks;
- A statement on the compliance of the AI application.
Although few concrete regulations exist at the moment, we will already take expected new regulations into account when developing the Compliance Check.
On 19 January at 15:00, the official kickoff of the project took place at the AI Innovation Centre on the High Tech Campus. Together with the main project partners, we went through the project planning. The next steps are:
- Interviews with stakeholders to gather their input for the development of the AI assessment.
- Development of the AI assessment itself, in which we will evaluate regulatory compliance and ethical implications of AI systems.
- Conducting two pilots to test the assessment in practice.
After that, work will continue through 2023 to develop and test the Check. The aim is to complete the project by the end of 2023.
The AI Compliance Check project is being led by LegalAIR and the AI Innovation Center.
LegalAIR is a knowledge platform for legal and ethical aspects of AI. The AI Innovation Center houses AI startups and scale-ups.
In addition to the two grant partners of the MRE grant, this project involves several more partners. These guarantee access to specialist legal and ethical knowledge. An auditing specialist is also involved. With this broad knowledge base, we aim to make the Check as broad as possible and as broad as necessary. The partners are:
BG.legal is involved as the programme manager for this project, as well as its legal partner.
This project is supported by a contribution from the Stimuleringsfonds Metropoolregio Eindhoven under project number 5.1043.