As of Sunday in the European Union, the bloc's regulators can ban the use of AI systems they deem to pose "unacceptable risk" or harm.
February 2 is the first compliance deadline for the EU AI Act, the comprehensive AI regulatory framework that the European Parliament approved last March after years of development. The act officially came into force August 1; what's arriving now is the first of its compliance deadlines.
The specifics are set out in Article 5, but broadly, the act is designed to cover a range of use cases where AI might appear and interact with individuals, from consumer applications to physical environments.
Under the bloc's approach, there are four broad risk levels: (1) minimal risk (e.g., email spam filters) will face no regulatory oversight; (2) limited risk, which includes customer service chatbots, will have light-touch regulatory oversight; (3) high risk (AI for healthcare recommendations is one example) will face heavy regulatory oversight; and (4) unacceptable risk applications, the focus of this month's compliance requirements, will be prohibited entirely.
Some of the unacceptable activities include:
- AI used for social scoring (e.g., building risk profiles based on a person's behavior).
- AI that manipulates a person's decisions subliminally or deceptively.
- AI that exploits vulnerabilities like age, disability, or socioeconomic status.
- AI that attempts to predict people committing crimes based on their appearance.
- AI that uses biometrics to infer a person's characteristics, like their sexual orientation.
- AI that collects "real-time" biometric data in public places for law enforcement purposes.
- AI that tries to infer people's emotions at work or school.
- AI that creates, or expands, facial recognition databases by scraping images online or from security cameras.
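The four-tier scheme described above can be summarized in a short sketch. The tier names and oversight levels come from the act; the example mappings are purely illustrative and are not legal classifications:

```python
from enum import Enum

class RiskTier(Enum):
    MINIMAL = "no regulatory oversight"           # e.g., email spam filters
    LIMITED = "light-touch regulatory oversight"  # e.g., customer service chatbots
    HIGH = "heavy regulatory oversight"           # e.g., healthcare recommendations
    UNACCEPTABLE = "prohibited outright"          # e.g., social scoring

# Illustrative mapping only; actual classification is determined by
# the act's text and forthcoming Commission guidance.
EXAMPLES = {
    "email spam filter": RiskTier.MINIMAL,
    "customer service chatbot": RiskTier.LIMITED,
    "healthcare recommendation system": RiskTier.HIGH,
    "social scoring system": RiskTier.UNACCEPTABLE,
}

print(EXAMPLES["social scoring system"].value)  # prohibited outright
```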
Companies found to be using any of the above AI applications in the EU will be subject to fines, regardless of where they are headquartered. They could be hit with fines of up to €35 million (~$36 million), or 7% of their annual revenue from the prior fiscal year, whichever is greater.
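As a quick sanity check on the numbers, the maximum penalty works out to the greater of the flat €35 million cap or 7% of prior-year revenue. A minimal illustration (not legal advice):

```python
def max_penalty_eur(annual_revenue_eur: float) -> float:
    """Greater of EUR 35M or 7% of the prior fiscal year's revenue."""
    return max(35_000_000, 0.07 * annual_revenue_eur)

# For a company with EUR 2 billion in revenue, 7% exceeds the flat cap:
print(f"{max_penalty_eur(2_000_000_000):,.0f}")  # 140,000,000
```

For smaller companies, the flat €35 million figure dominates: 7% of €100 million is only €7 million, so the cap applies instead.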
The fines won't kick in for some time, noted Rob Sumroy, head of technology at the British law firm Slaughter and May, in an interview with TechCrunch.
"Organizations are expected to be fully compliant by February 2, but ... the next big deadline that companies need to be aware of is in August," Sumroy said. "By then, we'll know who the competent authorities are, and the fines and enforcement provisions will take effect."
Preliminary pledges
The February 2 deadline is in some ways a formality.
Last September, over 100 companies signed the EU AI Pact, a voluntary pledge to start applying the principles of the AI Act ahead of its entry into application. As part of the pact, signatories, which included Amazon, Google, and OpenAI, committed to identifying AI systems likely to be categorized as high risk under the act.
Some tech giants, notably Meta and Apple, skipped the pact. French AI startup Mistral, one of the AI Act's harshest critics, also opted not to sign.
That's not to suggest that Apple, Meta, Mistral, or others that didn't agree to the pact won't meet their obligations, including the ban on unacceptably risky systems. Sumroy points out that, given the nature of the prohibited use cases laid out, most companies won't be engaging in those practices anyway.
"For organizations, a key concern around the EU AI Act is whether clear guidelines, standards, and codes of conduct will arrive in time, and crucially, whether they will provide organizations with clarity on compliance," Sumroy said. "However, the working groups are, so far, meeting their deadlines on the code of conduct for ... developers."
Possible exemptions
There are exceptions to several of the AI Act's prohibitions.
For example, the act allows law enforcement to use certain systems that collect biometrics in public places if those systems help perform a "targeted search" for, say, an abduction victim, or to help prevent a "specific, substantial, and imminent" threat to life. This exemption requires authorization from the appropriate governing body, and the act stresses that law enforcement can't make a decision that "produces an adverse legal effect" on a person based solely on these systems' outputs.
The act also carves out exceptions for systems that infer emotions in workplaces and schools where there's a "medical or safety" justification, such as systems designed for therapeutic use.
The European Commission, the executive branch of the EU, said it would release additional guidelines in "early 2025," following a consultation with stakeholders in November. However, those guidelines have yet to be published.
Sumroy said it's also unclear how other laws on the books might interact with the AI Act's prohibitions and related provisions. Clarity may not arrive until later in the year, as the enforcement window approaches.
"It's important for organizations to remember that AI regulation doesn't exist in isolation," Sumroy said. "Other legal frameworks, such as GDPR, NIS2, and DORA, will interact with the AI Act, creating potential challenges, particularly around overlapping incident notification requirements. Understanding how these laws fit together will be just as crucial as understanding the AI Act itself."