New legal opinion on AI law and universities published

Are universities operators or providers of AI systems? What obligations do universities have towards their members in terms of AI skills training? When does an AI model actually become an AI system? And what applies to high-risk AI or open source AI? The new legal opinion from KI:edu.nrw addresses these and many other questions.

The first key question, however, is whether the AI Regulation applies to scientific institutions at all in light of the scientific privilege. The answer is clear: mostly yes. Universities do not have to comply with the AI Regulation as long as they develop and operate an AI system exclusively for research purposes. This exemption must be interpreted narrowly, however: if later practical use is envisaged, the AI Regulation applies at the latest from the moment the AI system is put into service, even if it is initially operated for research purposes.

Accordingly, the AI Regulation does have an impact on universities: they must take measures to teach AI skills, they have obligations under the Regulation as providers or operators, and they must comply with additional requirements if an AI system is classified as high-risk. The latter is the case, for example, when AI tools are used to assess learning outcomes or to manage learning processes.


My new legal opinion on the AI Act and universities was published today; it was produced by the KI:edu.nrw project on behalf of the state of NRW.

At the same time, the state of NRW prepared the accompanying announcement, and a news item has just appeared on the KI:edu.nrw website:

https://ki-edu-nrw.ruhr-uni-bochum.de/rechtsgutachten-von-kiedu-nrw-zur-bedeutung-der-europaeischen-ki-verordnung-veroeffentlicht/

You can find the full text of the legal opinion at

https://www.itm.nrw/wp-content/uploads/2025/08/KI-edu-nrw_Rechtsgutachten-zur-Bedeutung-der-europaeischen-KI-Verordnung-fuer-Hochschulen.pdf