The Slovak Constitutional Court on Risk Profiling and Automated Decision-Making by the Tax Authority
The Slovak Constitutional Court has just published its eKasa decision, dealing with the constitutionality of the indiscriminate collection of store receipts and their subsequent re-use for the risk profiling of companies.
Some years ago, the Slovak legislature adopted a law mandating the collection of all store receipts in the country. Every receipt issued by a sole trader or a company is sent in its entirety, in real time, to a central database administered by the Tax Authority. The authority subsequently uses the data for the risk profiling of all companies. Using an unidentified system, it combines the data with other inputs (lists or datasets) to produce a list of risky companies (i.e., likely violators of tax rules). These lists are then used to prioritize the supervisory activities conducted by employees (i.e., individual checks). The law in question makes no mention of such systems or of the re-use of data, and consequently offers no safeguards.
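Neither the law nor the decision discloses how the Tax Authority's system actually works. Purely as an illustration of the kind of pipeline described above, here is a hypothetical sketch in Python; the field names, the external watchlist, and the scoring heuristics are all invented:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Receipt:
    company_id: str
    total: float  # receipt amount reported in real time to the central database

def risk_ranking(receipts, watchlist):
    """Hypothetical risk profiling: combine receipt data with an external
    list and rank companies from most to least 'risky'."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for r in receipts:
        totals[r.company_id] += r.total
        counts[r.company_id] += 1
    scores = {}
    for cid in totals:
        score = 0.0
        if totals[cid] / counts[cid] < 1.0:   # invented heuristic: unusually
            score += 0.5                      # low average receipt value
        if cid in watchlist:                  # invented heuristic: company
            score += 0.5                      # appears in a linked dataset
        scores[cid] = score
    # The resulting ordering would be used to prioritize manual checks.
    return sorted(scores, key=scores.get, reverse=True)
```

A real system would combine far more inputs and likely statistical models; the point is only that even such a simple pipeline "evaluates" every entrepreneur, which is what the court treats as an interference in its own right.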
While the decision is interesting in many dimensions, the issue of greatest international relevance is what the court refers to as “automated assessment”. The concept essentially captures the notion of automated decision-making, including AI-assisted decision-making, employed by public authorities in the context of their governmental responsibilities. I copy and paste the key paragraphs below.
These are the key points made by the Constitutional Court:
- “Automated assessment” of individuals based on their personal data constitutes a separate interference with their right to privacy and data protection, irrespective of whether it has any adverse impact on the concerned individuals (§ 120).
- As a result, any automated assessment of individuals based on personal data has to be authorized by the Slovak parliament and cannot be implemented merely on the basis of a managerial decision (§§ 120-123).
- Automated assessment is understood to include not only automated decision-making but also decision assistance resulting in action or inaction by the authorities (§ 123).
- The legislature must consider the proportionality of the deployment of such systems in each case. The automation of decision-making cannot be justified solely by the fact that it is useful and saves public resources (§ 125).
- “Technological progress in public administration cannot result in an impersonal state whose decisions are inexplicable, unexplained and at the same time no one is responsible for them.” (§ 127)
- Automated assessment poses special risks because it concerns a large number of individuals under great information asymmetry, and the errors are mostly systemic in nature (§ 128).
- The legislature therefore bears a special responsibility to design three types of safeguards:
- (i) transparency,
- (ii) protection of individuals, and
- (iii) collective supervision.
- These safeguards have to reflect the intensity of risks posed by such systems.
- Although the GDPR offers some safeguards, such as DPIAs, depending on the system at stake the Slovak Constitution might require that additional measures be introduced, such as registries, individual notifications, testing of the datasets or models used, appeals, access to the logic of decisions, or a more holistic assessment of the impact on fundamental rights (§§ 133-135, 138). Supervision should focus on the quality of the systems and their inputs (datasets or models used), their error rates, and scrutiny of their impact (§ 138).
- The supervision of such systems has to be both ex-ante and ex-post. The GDPR’s supervisory infrastructure is insufficient because it relies on individuals and their rights. According to the Court, “it cannot be the task of an individual to regularly correct the systematic problems of public administration through his/her situation” (§ 136).
- The Court notes that Art 22 GDPR (and Art 11 LED) regulates only a small subset of such decision-making; however, in the context of public authorities, the Slovak Constitution can offer more protection than secondary EU law, as the issue is not fully harmonized (§ 141, pointing to Art. 6(2) & Art. 6(1)(e) GDPR).
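The court's demand that supervision scrutinize error rates (§ 138) and that regulators be able to force corrections (§ 139) presupposes that the outcomes of the system's flags are measured at all. A minimal, purely hypothetical sketch of such a metric — the data shape and the 20% threshold are my inventions, not anything from the decision:

```python
def flagging_precision(audit_outcomes):
    """audit_outcomes: (company_id, violation_found) pairs for companies
    that the automated system flagged as risky and that were then audited."""
    if not audit_outcomes:
        return None  # no audits yet; nothing to supervise
    hits = sum(1 for _, found in audit_outcomes if found)
    return hits / len(audit_outcomes)

def needs_review(audit_outcomes, threshold=0.2):
    """Hypothetical supervisory rule: if fewer than `threshold` of the
    system's flags are confirmed by auditors, its criteria warrant review."""
    precision = flagging_precision(audit_outcomes)
    return precision is not None and precision < threshold
```

This is the sense in which the Court's collective supervision differs from individual GDPR rights: the statistic is computed over everyone the system touched, not over one complainant's file.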
Here are the relevant parts of the decision (the translation is Google Translate only, so the quality can vary):
3.1.6. Risk assessment of entrepreneurs by the Financial Administration:
118. According to the submissions of the Financial Administration, the data from the e-cash register are also used today “for the purpose of analytical evaluation … of the riskiness of tax subjects from the point of view of the recording of sales”. The data are then linked to other databases. The data from the e-cash register are therefore used to evaluate entrepreneurs according to their riskiness. Such analytical activity represents so-called profiling, which in the literature and in secondary legislation is understood as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements” [see Art. 4 para. 4 GDPR (emphasis added) and, more broadly, HILDEBRANDT, M. (2008) Defining Profiling: A New Type of Knowledge? In: HILDEBRANDT, M., GUTWIRTH, S. (eds) Profiling the European Citizen. Springer, Dordrecht. Available at: https://doi.org/10.1007/978-1-4020-6914-7_2].
119. The risk assessments are then used in the context of the performance of control activities by civil servants of the Financial Administration. However, the automated risk assessment does not have sufficient legal support. Although it is possible that in a particular case such an operation falls under a specific legitimate aim and purpose of data collection under the ZoERP, its implementation under current law, even within the Financial Administration, does not meet the requirements of legality and of the existence of guarantees regarding the quality of supervision.
120. An automated assessment of an individual on the basis of comprehensive data collection constitutes an interference with the right to informational self-determination, regardless of whether it has any concrete consequence for him. This is one of the reasons why, for example, the Federal Constitutional Court of Germany considered the automated checking of car registration numbers to be an interference even for persons whose check did not lead to any negative result (BVerfG, file no. 1 BvR 142/15, paragraphs 45 and 51). The CJEU likewise takes the view, in its Opinion 1/15, that the automated assessment of persons in air transport constitutes a distinct interference with the right to protection of privacy and personal data (CJEU Opinion 1/15, para. 168 et seq.). Any such assessment by the State, whether or not it causes discomfort or a negative effect on the individual, restricts his freedom. The State exercises its power over the individual by monitoring, evaluating or checking him. This creates in the individual a sense of being observed by the State (BVerfG, file no. 1 BvR 142/15, paragraph 98).
121. In their recent separate opinion in Big Brother Watch and Others v. the United Kingdom, Judges Lemmens, Vehabović and Bošnjak emphasized [that such surveillance can affect] mental and physical health: “It forces individuals to internalize their social behavior too much, making them feel guilty or ashamed of their feelings, thoughts, desires or actions that they do not want to express in public” (see the opinion of Judges Lemmens, Vehabović and Bošnjak in Big Brother Watch and Others v. the United Kingdom, nos. 58170/13, 62322/14 and 24960/15, paragraph 4).
122. Every automated assessment requires at least a minimal matching of data against certain criteria, models or other databases (e.g., a car registration number is compared with existing lists, or the activity of an entrepreneur with the usual behavior of a comparable entrepreneur). In most cases, an algorithm is used to derive further information from the data, to suggest a certain ordering, or to provide a recommendation to a public authority employee, or even to prepare a decision or part of it for him. Since the automated assessment of an individual constitutes an interference with his right to informational self-determination, it must, pursuant to Art. 13 para. 2 of the Constitution, be established by law. It is essential that the addressees of the rules be able to assess how their rights are limited and what the consequences are for them (see Leander v. Sweden, no. 9248/81, paragraph 50; Margareta and Roger Andersson v. Sweden, no. 12963/87, paragraph 75). A law restricting fundamental rights must be specific enough to make its application predictable. A problematic restriction of rights cannot be hidden under a better-sounding or implicit abstract wording (PL. ÚS 13/2020, point 84). This requirement does not in itself preclude the adoption of general legislation in the field of public administration. However, the legislation must provide the citizen with a sufficiently clear indication of the circumstances and conditions under which a public authority is empowered to interfere with his rights. Admitting a high degree of uncertainty in a legal norm would also limit real review of a statutory provision’s compliance with the Constitution, and thus prevent the Constitutional Court from exercising its control function.
123. A “silent law” cannot serve as the basis for an automated assessment of an individual, whether it results in a decision or in inaction. Automated risk assessment cannot be merely the consequence of a managerial decision of a public administration body, but should be the subject of a public debate reflected in the legislative process. The lack of explicit approval by the legislator is also reflected in the lack of regulation of the supervisory infrastructure for such assessment. It is clear from the context of the power relationship between the state and the individual that the impact of an automated assessment of the individual and his circumstances is usually crucial for its addressee. It will often be an important input for the subsequent conduct of a public authority employee, who may not sufficiently critically, or at all, correct the result of this automated assessment.
124. There is no specific or general legal norm of this type in the legislation under review.
125. Once the requirement of legality of such a restriction has been met, the legislator must also ensure the legitimacy and proportionality of its solution. If the automated assessment is comprehensive, and therefore does not cover only a limited circle of “suspects”, special consideration must be given to whether the interests protected by such an assessment sufficiently outweigh the heightened general sense of surveillance. Automation cannot be used wherever it is technically possible and useful, simply because it saves the public authorities, and therefore the taxpayers, resources. The Federal Constitutional Court of Germany, for example, concluded that unrestricted automated registration of car registration numbers for the purpose of protection against any danger would be disproportionate (BVerfG, Case 1 BvR 142/15, paragraph 104). Therefore, even in the Slovak context, the deployment of automated assessment based on personal data will need to be examined on an ad hoc basis for specific uses. The mere fact that there is a sufficient public interest in the collection of certain data does not mean that the same interest exists in their further use (by analogy, Satakunnan Markkinapörssi Oy and Satamedia Oy v. Finland, no. 931/13, paragraphs 172-178 and 198).
126. As the ECtHR points out, any State which assumes a pioneering role in developing and using new technologies bears a special responsibility for striking the right balance with fundamental rights (S. and Marper v. the United Kingdom, nos. 30562/04 and 30566/04, paragraph 112). In particular, it cannot be underestimated that the development of technology will inevitably lead to new, unpredictable ways in which the information collected and its use may affect individuals (S. and Marper v. the United Kingdom, nos. 30562/04 and 30566/04, paragraph 71). The deployment of technologies therefore requires a comprehensive assessment in the light of ever new possibilities for their use (e.g., Gaughran v. the United Kingdom, no. 45245/15, paragraphs 67 to 70, in relation to the use of face recognition on older data sets).
127. The application of technological progress in public administration cannot result in an impersonal state whose decisions are inexplicable, unexplained and at the same time no one is responsible for them.
128. The peculiarity of automated assessment lies in the fact that it often concerns a large group of persons, while the criteria, models or linked databases used are not easy for the addressee to understand, although they may contain errors or lead to erroneous conclusions about the individual. Such shortcomings of the system therefore create a systemic risk for society and for the individual (Prof. Pasquale speaks in this regard of the risk of so-called black boxes in society; see PASQUALE, F. The Black Box Society: The Secret Algorithms that Control Money and Information. Harvard University Press, 2015). Abroad, there are already cases in which such fundamental risks of using automated assessment without sufficient supervision are clear, and to which the Council of Europe has recently responded [see Recommendation CM/Rec(2020)1 of the Committee of Ministers to member states on the human rights impacts of algorithmic systems of 8 April 2020, as well as the Declaration of the Committee of Ministers of 17 March 2021 on the risks of computer-assisted or artificial-intelligence-enabled decision making in the field of the social safety net; for specific cases, see the algorithmic adjustment of school-leaving grades in Great Britain in 2020 – HUSOVEC, M., MESARČÍK, M. and ANDRAŠKO, J. Law of Information and Communication Technologies 1 (TINCT 2020), p. 108 – and the Dutch system for detecting abuse of the social welfare system known as Systeem Risico Indicatie (SyRI)].
129. Therefore, even after a proportionate deployment of the technology has been chosen, it is necessary to ensure the existence of guarantees for the supervision of its further use. In the context of the automated assessment of air passengers, the CJEU emphasized that the pre-established models and criteria should be “specific and reliable” on the one hand and “non-discriminatory” on the other, and that the databases against which the data are compared must be “reliable and up-to-date”.
130. The automated assessment based on personal data in the e-cash system, together with other criteria, models or databases, leads to the evaluation of certain entrepreneurs as riskier and therefore suitable for more detailed control. The Constitutional Court does not have the specifics concerning the functioning of this system. However, given the lack of legality of this system, these are not necessary to assess the case. It is clear from the Financial Administration’s description that, at least since the middle of the year, it has used an analytical system that automatically attributes a certain risk to a specific entrepreneur. The fact that the system itself does not make the decision on whether to carry out a tax or other control is not relevant from the point of view of the right to informational self-determination. From the perspective of that right, what is essential is that the automation concerns the evaluation of a person on the basis of his personal data.
131. If personal data did not form the basis of the state’s action, the situation would fall outside the scope of Art. 19 para. 3 and Art. 22 para. 1 of the Constitution (similarly BVerfG, file no. 1 BvR 142/15, paragraph 48), although it would still fall under other provisions of the Constitution. Relevant in this respect are, in particular, the right to a fair trial, the prohibition of unequal treatment, as well as freedom of expression or assembly. Indeed, it is clear that algorithmic assessment in particular can have an impact on an individual and his or her fundamental rights even if it is not based on personal data (see Recommendation CM/Rec(2020)1, Part A, point 4). At the same time, it should be recalled, as explained in detail in paragraphs 38 to 46 of this judgment, that the constitutional protection under Art. 19 para. 3 and Art. 22 para. 1 of the Constitution is not always limited to the personal data of natural persons, as is common in constitutional law.
132. Only the legislator has the possibility, through specific legal provisions in a general or specific regulation, to ensure that the criteria, models or linked databases used in the context of automated assessment are up-to-date, reliable and non-discriminatory. In addition to the general safeguards necessary for the processing of personal data, it does so through specific safeguards of: (i) transparency, (ii) individual protection and (iii) collective supervision [similarly BVerfG, file no. 1 BvR 142/15, paragraph 101; CJEU Opinion 1/15, para. 168 et seq.; as well as Recommendation CM/Rec(2020)1, Part B, points 4 and 5]. These guarantees must take into account the specificities of the procurement and deployment of automated systems whose operation will affect the individual and his fundamental rights and freedoms.
133. In the first place, the individual concerned must therefore be informed that the conduct of a public authority, such as its decision, is influenced by the use of such an automated system (Recommendation CM/Rec(2020)1, Part B, point 4). The individual must be able to know about the existence, scope and implications of his or her assessment by automated means, whether through public registers, individual notifications, or otherwise. Only if it is clear to the addressee that he is, to a certain extent, subject to automated assessment can he effectively defend himself against possible errors. It is up to the legislator to give concrete form to this requirement.
134. A public authority’s obligation of transparency may also partly follow from the general sub-constitutional regulation of the right to personal data protection (Articles 12 to 15 GDPR). In the context of automated assessment, a public authority will normally be required to carry out a data protection impact assessment under Art. 35 GDPR. However, unlike for other processing operations, the impact assessment must focus on the overall human rights impact of automated systems on individuals (Recommendation CM/Rec(2020)1, Part B, point 5.2). It must also identify specific risks, as well as document the respective scope of human and automated assessment in the individual steps of the data processing process, the method of testing the datasets and models used, as well as alternative, less intrusive solutions [see Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (2018), p. 29, and Recommendation CM/Rec(2020)1, Part B, points 3 and 5]. The adequacy of this general regulation will depend on the circumstances and the seriousness of the interference with the right to informational self-determination.
135. If the technical solution is provided to the state by a third party, the state must still meet the same conditions of transparency. Intellectual property rights, trade secrets or the security of the system cannot be a reason to deny effective access to the necessary information (see also Recommendation CM/Rec(2020)1, Part B, point 5.2). Otherwise, a simple decision to involve external suppliers would deprive the individual of his rights; if a public authority uses solutions from external suppliers, it must therefore ensure effective access to the necessary information. As already stated in paragraphs 97 to 104 of this judgment, the sub-constitutional elaboration of those rights provides ample scope for the exercise of the right of access to information. However, this does not reduce the obligation of public authorities to act responsibly already at the stage of public procurement of these systems (see also Recommendation CM/Rec(2020)1, Part A, point 12).
136. Secondly, there must be independent collective control over the use of such a system, operating both ex ante (pre-implementation) and ex post (post-implementation). In this respect, it is not possible to rely solely on the infrastructure of the right to the protection of personal data, as it operates on the principle of protecting the rights of the individual. However, it cannot be the task of an individual to regularly correct the systematic problems of public administration through his or her own situation. Collective control, whether through independent state institutions, certification, civil society involvement, or academia, therefore complements individual protection against collective harm (see EDWARDS, L., VEALE, M. Slave to the Algorithm? Why a “Right to an Explanation” Is Probably Not the Remedy You Are Looking For (2017). In: Duke Law & Technology Review 18, p. 22).
137. This means that the control must make it possible to assess the quality of the system, its components, errors and imperfections both before it is put into operation and after it is put into use (for example, through audits, quality review of sample decisions, reporting and statistics, and the like). The more complex the system, the deeper its collective control must be. If the system has weak or blind spots, the civil servants who will work with it must be demonstrably aware of them, as their ignorance can be a source of error. The use of the system must be subject to documentation and logging, both to enable collective supervision and to facilitate the enforcement of individual rights [BVerfG, file no. 1 BvR 142/15, paragraphs 156 and 157, as well as Recommendation CM/Rec(2020)1, Part A, paragraphs 3 and 4].
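The documentation-and-logging requirement in § 137 can be illustrated with a minimal sketch: an append-only log that records, for each automated assessment, when it ran, a hash of its inputs, its output, and whether a human reviewed it. The record format here is my invention, purely for illustration:

```python
import datetime
import hashlib
import json

def log_assessment(logfile, company_id, inputs, score, reviewed_by=None):
    """Append one assessment record to an append-only log. Inputs are
    hashed so the log can evidence what the system saw without duplicating
    the underlying personal data."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "company_id": company_id,
        "input_hash": hashlib.sha256(
            json.dumps(inputs, sort_keys=True).encode()).hexdigest(),
        "score": score,
        # Whether a human checked the result (cf. the court's concern in
        # § 123 about uncritical acceptance of automated output).
        "reviewed_by": reviewed_by,
    }
    logfile.write(json.dumps(record) + "\n")
    return record
```

Such logs would serve both purposes the court names: collective supervision (auditors can compute error statistics over them) and individual protection (a concerned entrepreneur can be shown when and on what basis he was assessed).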
138. Finally, the supervision of automated assessment must enable the individual to defend himself effectively against the system’s imperfections and errors. It is up to the legislator which exact means to choose. However, they must be effective and accessible, and at the same time entrusted to an independent supervisory authority (see also Recommendation CM/Rec(2020)1, Part A, point 4.5). Given the different modes of assessment, there is no single way to protect an individual from system errors. The effectiveness of the means, whether in the form of a right of appeal, correction of inputs or assessment criteria, or access to the logic of the decision or individual assessment, will depend on whether the automated assessment uses patterns, models or other databases and whether it leads to a decision or to inaction of a public authority. It will also depend on the riskiness of the system.
139. However, the application of the supervisory guarantees already mentioned must be able to bring about a change in the conduct of the public authority. As the Constitutional Court has emphasized in the past, good independent supervision cannot be exercised without sufficient powers, financial resources and control tools. Therefore, if a public authority is unable to quickly correct errors or explain the discriminatory effects of the assessment, it must be possible for the supervisory authority or a court to order the public authority to rectify those errors or imperfections. As a last resort, there must also be the possibility of ordering that the system cease to be used in its problematic form.
140. The space for the Slovak legislature is not exhausted by these two provisions. Although the GDPR represents full harmonization on many issues (see paragraph 30 of this judgment), this is not the case for this legal issue. The GDPR harmonizes the processing of personal data by public authorities for the purpose of exercising public authority only incompletely. This follows from Art. 6 para. 2 in conjunction with Art. 6 para. 1 let. (e) GDPR, which explicitly provides that Member States may maintain or introduce more specific provisions to adapt the application of the rules of the Regulation where processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. The Slovak legislator is therefore entitled to set out the specific processing requirements more precisely, as well as to take further measures to “ensure lawful and fair processing”, possibly by transposing parts of the Regulation into Slovak legislation (Art. 6(2) and recital 8 GDPR). The constitutional requirements already mentioned are therefore fully compatible with the implementation space of the Slovak legislator, take due account of its guidance set out in Art. 6 para. 2, and at the same time develop the requirements of Art. 6 para. 3 GDPR. Therefore, in contrast to the case pending before the CJEU concerning the automated assessment of individual creditworthiness by non-state actors in the context of their business, the question of the Slovak legislature’s implementation space is relatively clear (see SCHUFA Holding, C-634/21, reference of the Wiesbaden Administrative Court of 1 October 2021, file number 6 K 788/20.WI, available at: https://www.jurpc.de/jurpc/show?id=20210144).