Automated Decision-Making Systems in Austrian Administrative Law


1/2023


Automated decision-making has been discussed in Austrian administrative law for more than 40 years. The focus has always been on the administrative act (in the sense of a formal individual decision) and the pertaining procedure. In this area, there are established principles, although new technologies raise new questions. Beyond the administrative act, on which most studies have concentrated, we are still very much in the dark.

Summary. 1. Automated administrative acts; – 1.1. Attribution to the administration; – 1.2. Legality; – 1.3. Comments; – 2. Administrative automation beyond the administrative act; – 2.1. The Job Center algorithm; – 2.2. Predictive analytics against tax fraud; – 3. Conclusion

1. Automated administrative acts

1.1. Attribution to the administration

The discussion[1] started in 1980, when the Constitutional Court had to decide whether some municipalities could outsource their IT to a common external service provider. The court saw no problem with this, or even with using the external service provider to produce administrative acts, as long as (1) the decisions are clearly recognizable as being issued under the name of the competent municipal body, and (2) the external handling of the matter can be traced back in each individual case to the will of the body appointed by law to make the decision. «It is easy to ensure that the computer-assisted decisions can actually be traced back to the will of the legal decision-maker», the court added. «[T]his can be achieved, for example, by submitting the printouts created by the association’s data processing system to the competent municipal body for approval. On the other hand, it is possible to have the program required for the use of the EDP approved by the legal decision-maker and to design it in such a way that the personnel operating the data processing system are not left any leeway for decision-making»[2].

With that, the court also answered the next question that came up. Traditionally, the Austrian Administrative Procedure Act required that, before being served, an administrative act be approved and signed by a person within the administration who is authorized to do so. The approval was a constitutive feature of the administrative act: without it, a decision was not an act of the administration. Therefore, a fully automated decision was not possible. New statutory rules on automatic speed controls and predetermined penalties for violations triggered a discussion of whether the approval was a constitutional necessity. The Constitutional Court did not think so, but repeated that «[t]he authority to which the act is legally attributable and which is therefore responsible for it must also actually be able to exert a determining influence on the computer-assisted process of issuing the act»[3].

Legislation reacted in a twofold way. On the one hand, for purposes of the general administrative procedure, it upheld the tradition and explicitly required the approval of an authorized person also for electronically generated acts. However, a (hand) signature on a printout copy is no longer necessary; the approval can also be given and documented electronically in these cases, e.g., with an electronic signature certifying the identity of the approving person and the authenticity of the act[4]. So unless there is a statutory exception, an administrative act within the scope of application of the general rules is still bound to the intervention of a person; fully automated decisions remain excluded. Consequently, the courts denied a purely software-produced decision the quality of an administrative act[5] while accepting another decision where the authorized functionary had to hit an approval button before delivery[6].

On the other hand, fully automated decisions were permitted in tax law. As early as 1969, a provision of the tax procedure code had provided that a decision «that was produced using punch card technology or similar processes» needed no signature[7]; whether this also meant that it needed no approval was unclear. As the Supreme Administrative Court insisted on an approval[8], creating problems for the fiscal authorities, which at that time were already serving 600,000 fully automated decisions per year[9], the provision was amended in 1987[10]. It now reads as follows:

«Copies prepared by means of computerized data processing […] do not require a signature or certification and, if they do not bear a signature or certification, they shall be deemed to have been approved by the head of the tax authority designated on the copy».

The amended provision is somewhat ambiguous: linguistically, it respects tradition because it still speaks of approval; substantively, it breaks with tradition by irrefutably presuming approval, so that approval no longer matters. The reaction of the Supreme Administrative Court is likewise somewhat contradictory: while theoretically upholding that the new version of § 96 BAO, too, «presupposes» that the individual decision is actually initiated by an authorized person within the administration[11], the Court in practice accepted fully automated tax decisions (on surcharges for late payment) in a case where the programming had been outsourced by the ministry of finance and the tax authority figuring as the issuer of the decision did not even know of its existence[12]. In the meantime, millions of automated tax decisions, e.g. on the annual adjustment of income taxes, have been issued each year[13], but it is not clear how many of them are fully automated rather than initiated or approved by a person in the administration. All this should be seen in the light of the tax administration’s power to modify a tax decision in any direction within a whole year after its issuance[14].

Outside of tax law, a similar provision was enacted for the determination of study grants «on the basis of the submitted application form without further investigation»[15].

1.2. Legality

Where the law does not provide for exceptions, (fully or partially) automated administrative acts are subject to the same legal requirements as traditional acts. For instance, an automated decision on a surcharge for a delayed tax return was voided by a lower tax court because the computer program based the decision only on the delay whereas the respective law required an exercise of discretion, taking into account the extent to which the time limit was exceeded, the degree of fault and the amount of any advantage the taxpayer gained from the non-compliant behavior[16].

Procedural guarantees like the right to be heard, the right to a reasoning and legal protection are particularly important in this context. As they have a basis in constitutional and/or EU law, they limit the possibilities of permitting fully automated decisions or using existing authorizations[17]. The right to be heard before the decision is not guaranteed without limitations, though. Under Austrian law, it extends only to the facts of the case and not to their legal classification. Decisions which are exclusively favorable to the addressee, which correspond to an application, or which rely entirely on the facts given by the party are exempted. This allows automating decisions based on a tax return or a study grant application. In other cases, the right to be heard can be limited for a good reason in a proportionate way. The Act on Administrative Penalties, for instance, provides for abbreviated procedures for certain minor penalties, such as those for traffic offences, if there is reasonably reliable evidence, such as a photo from radar surveillance[18]. These procedures do not require the participation of the concerned person but produce a merely preliminary decision, because a simple objection by the addressee within a certain deadline suffices to nullify the decision and to start a regular procedure. Beyond these special situations, though, the mere convenience of fully automated decisions is not a legitimate reason to restrict participation.

The requirement of a reasoning is treated in a similar way. Under the general rules, a reasoning is not necessary if the decision fully corresponds to the position of the addressee. If, however, a reasoning is required, it cannot be dispensed with by reference to automation. «The legality of a decision cannot be measured against the capabilities of the existing IT system. Rather, the IT system must be adapted to the legal requirements», the Supreme Administrative Court ruled[19]. Unexplainable AI results are therefore not an option under Austrian law.

Finally, in almost all cited decisions the courts stressed that the right to an appeal or an effective remedy must be guaranteed. In particular, the holdings on the reasoning requirements are motivated to a large degree by the necessity to secure effective legal protection. At the same time, this should fulfill the requirements of article 22 GDPR.

1.3. Comments

Treating the automation of administrative decisions foremost as a problem of attribution to the administration may be a peculiarity of Austrian law, where the term for the administrative act (“Bescheid”) appears in the constitution and is often interpreted in the sense of the Pure Theory of Law, which (at least in certain versions) considers an expression of the will of a human being to be constitutive for every normative act[20]. Attributability, however, is a means to achieve a well-established goal of most legal systems: the accountability of the administration. Since decisions are accepted as administrative acts only if the administration can effectively control them, the administration can be meaningfully held accountable for all administrative acts. Less clear than the principle is what effective control means when dealing with the new technologies of automation. Doctrine proposes as requirements for full automation that, once the decision to automate is made, the administration must be able to choose and understand the program, the program must operate according to predetermined rules, and the administration must be able to intervene in the automatic process at any point[21].

As shown, the requirements of legality severely limit the scope of application for fully automated decisions. In particular, they seem to preclude decisions involving discretion or the weighing of facts or interests, because even in the unlikely event that the law exhaustively listed all relevant aspects and provided a formal scale for weighing them, it could not foresee all the information arising in individual cases. If, on the other hand, there is only a single way to decide the matter, respect for the right to be heard very often excludes automation. Of course, the lawmaker could try to avoid discretion, base decisions on formal criteria and make them mere preliminary acts that must be reconsidered in a regular administrative procedure if challenged. Where it does not do so, however, the administration can only resort to partial automation, leaving the final decision to a human being.

This can lead to new difficulties, though: algorithms can be flawed and discriminatory, and in more complex programs this is not easy to detect. Under time pressure, without sufficient understanding of the programs, and prone to automation bias like all of us, administrators often routinely rely on them when they should be making their own judgments. The addressees of their decisions may not even be aware of the automation, and courts are usually unable to compensate for its shortcomings[22]. These problems have not yet been addressed by the Austrian legislator.

2. Administrative automation beyond the administrative act

Much of the current automation of the administration concerns activities that do not regularly lead to an administrative act and are therefore not governed by comparatively strict procedural rules. Rather simple uses, at least from a legal point of view, are automatic speed limits based on the current concentration of pollutants in the air[23] or automatic payments such as child support in a so-called no-stop procedure, i.e., without an application or any other participation of the concerned person and without an administrative act, based only on register data[24]. But not all cases are so simple. Two examples demonstrate some of the problems of the new techniques.

2.1. The Job Center algorithm

A while ago, the government job center (Arbeitsmarktservice, AMS), an entity acting under private law, had an algorithm developed to classify its clients with regard to their chances of getting a job. The classification was intended to help the center’s counsellors focus their efforts not on those clients who would soon get a job anyway or, on the contrary, would probably never find one, but on those in between, for whom the center’s support would really make a difference. When information on the plan became public, a big discussion arose because the algorithm, among other things, relied on very sparse data as proxies and assessed women’s job prospects lower than those of men[25]. The dispute centered on whether the algorithm discriminated or just described a sad reality, and whether the job center was allowed to accept and prolong this reality.

The data protection authority started an investigation and finally issued a ban on the use of the algorithm[26]. It did not say anything about a possible discrimination but found that the use of the algorithm lacked the specific statutory basis it deemed necessary, because it qualified the processing as profiling exceeding the allowed data uses under the respective statute, and as automated decision-making under article 22 GDPR. However, the federal administrative court (of first instance), to which the AMS turned, did not share this view and voided the authority’s decision. It held that the job center was entitled to use the personal data of its clients also for a sound labor market policy, a reason of substantial public interest in the sense of article 9 para 2 (g) GDPR, whether the processing constituted profiling or not. It further found that article 22 GDPR applied only to fully automated decisions, whereas the algorithm was only intended to assist the counsellors who, according to the service instructions, should discuss the result with the client and then, if necessary, correct it. That they would routinely follow the algorithm’s recommendation, as the data protection authority had argued, was considered irrelevant for the legality of the algorithm, but potentially the subject of a separate investigation[27]. Obviously, that leaves some questions open, and the court decision is probably not the final word on the matter. An appeal is pending before the Supreme Administrative Court.

2.2. Predictive analytics against tax fraud

In 2022, the federal Ministry of Finance finally confirmed what for years had been an open secret: that its Predictive Analytics Competence Center had been systematically using AI to check millions of tax returns (and Covid-19 aid applications) for plausibility and to identify thousands of suspected cases of tax and customs evasion and subsidy fraud[28].

So far, the courts have not had an opportunity to deal with this practice, but there is an academic discussion[29] revolving around the following questions: Is the legal basis specific enough to satisfy the requirements of data protection law and the Austrian constitution (which provides that the entire public administration must be based on the law)? In particular: do the existing statutes providing for the use of taxpayers’ personal data also cover its use to control the tax returns of other persons? What criteria are used to single out suspicious cases? How much transparency is possible without jeopardizing the functioning of the control? Does the algorithm punish deviant or merely statistically unusual behavior? How do we come to know of, and avoid, discriminatory results? Is it adequate to still consider the control mechanism a mere internal procedure, given that taxpayers have no right not to be controlled? And how could legal protection be organized?

As one can easily see, these questions are relevant for all kinds of automated control mechanisms, and they deserve a legislative answer.

3. Conclusion

Perhaps unsurprisingly, administrative law can tame automation most readily in the context of a formal administrative procedure for issuing an administrative act. Unexpectedly, however, it is not fully automated but automation-assisted decision-making that causes the biggest problems. Most of all, we need, at least in critical areas, an ex-ante quality control of administrative algorithms. Where the EU AI Act will probably not suffice, e.g. because of its narrow scope of application, the European Law Institute’s Model Rules on Impact Assessment of Algorithmic Decision-Making Systems Used by Public Administration[30] could serve as a blueprint.

  1. For a more detailed account, see M. Denk, Der maschinell erstellte Bescheid (Teil I), in Zeitschrift für Energie- und Technikrecht 2019, 189.
  2. VfSlg (Collection of the decisions of the Constitutional Court) 8844/1980. All citations were translated with www.DeepL.com/Translator; some corrections by the author. All cited court decisions and statutes are available at https://www.ris.bka.gv.at.
  3. VfSlg 11.590/1987.
  4. § 18 para 2 Allgemeines Verwaltungsverfahrensgesetz (AVG) (General Administrative Procedure Act) in the version of BGBl (Federal Law Gazette) 1990/357; now § 18 para 3 AVG in the version of BGBl I 2008/5.
  5. VwSlg (Collection of the decisions of the Supreme Administrative Court) 18.949 A/2014.
  6. VwSlg 19.196 A/2015.
  7. § 96 Bundesabgabenordnung (BAO) (Federal Tax Code) in the version of BGBl 1969/143.
  8. VwGH (Supreme Administrative Court), 24 April 1986, 86/17/00720.
  9. Explanatory comments on the bill RV 108 BlgNR XVII.GP, 41.
  10. § 96 BAO in the version of BGBl 1987/312.
  11. VwGH, 16 December 2010, 2009/15/0002; VwSlg 8680 F/2011.
  12. VwSlg 8184 F/2006.
  13. A press release of the Ministry of Finance of 11 April 2022 speaks of 1.8 million “fully automated” adjustments without application in 2021: https://www.bmf.gv.at/presse/pressemeldungen/2022/April/bilanz-finanzamt.html.
  14. §§ 299, 302 BAO.
  15. § 41 para 4 Studienförderungsgesetz (Student Funding Act), BGBl 1992/305, later § 41 para 3 in the version of BGBl I 2016/54.
  16. Unabhängiger Finanzsenat (Independent Finance Senate), 15 June 2007, RV/0374-L/07.
  17. For a discussion in detail, see M. Mayrhofer, P. Parycek, Digitalisierung des Rechts – Herausforderungen und Voraussetzungen, in Verhandlungen des Einundzwanzigsten Österreichischen Juristentags, Manz, Wien, 2022, 77.
  18. § 47 Verwaltungsstrafgesetz (VStG) (Administrative Penal Act), BGBl 1991/52 in the version of BGBl I 2018/57.
  19. VwSlg 11.728/1985.
  20. See, e.g., R. Walter, Die Verwaltungsstrafgesetznovelle 1987, in Österreichische Juristen-Zeitung 1988, 361; C. Jabloner, Kein Imperativ ohne Imperator. Anmerkungen zu einer These Kelsens, in R. Walter (ed.), Untersuchungen zur Reinen Rechtslehre, Manz, Wien, 1998, 75.
  21. Mayrhofer and Parycek (note 17) 68.
  22. For an example, see part 2.1. below.
  23. § 2 no. 23, § 3a and annex 8, § 14 para. 1, 6, 6a ff Immissionsschutzgesetz-Luft (Immission Control Act – Air), BGBl I 1997/115 in the version of BGBl I 2010/77; federal regulation BGBl II 2007/302 in the version of BGBl II 2021/153 and regional regulations, e.g., for Styria, LGBl (Provincial Law Gazette) 2014/117 in the version of LGBl 2019/72.
  24. § 10a Familienlastenausgleichsgesetz (Family Burden Sharing Act).
  25. P. Lopez, Reinforcing Intersectional Inequality via the AMS Algorithm in Austria, in G. Getzinger, M. Jahrbacher (eds.), Critical Issues in Science, Technology and Science Studies. Proceedings of the 18th Annual IAS-STS Conference 6-7 May 2019, Technische Universität, Graz, 2019, 289 (diglib.tugraz.at/download.php?id=5e29a88e0e34f&location=browse); D. Allhutter, F. Cech, F. Fischer, G. Grill, A. Mager, Der AMS-Algorithmus. Eine soziotechnische Analyse des Arbeitsmarktchancen-Assistenz-Systems, Österreichische Akademie der Wissenschaften, Wien, 2020 (epub.oeaw.ac.at/0xc1aa5576_0x003bfdf3.pdf); K. Bachberger-Strolz, Profiling, Targeting, Algorithmen, künstliche Intelligenz – über die Irrwege einer Debatte in der Arbeitsmarktpolitik, in Wirtschaft und Gesellschaft 2020, 329; D. Allhutter, Ein Algorithmus zur effizienten Förderung der Chancen auf dem Arbeitsmarkt?, in Zeitschrift für Sozial- und Wirtschaftswissenschaften 2021, 82.
  26. The decision is not available to the public but extensively cited in the following court decision: Bundesverwaltungsgericht, 12 December 2020, W256 2235360-1.
  27. Bundesverwaltungsgericht, 12 December 2020, W256 2235360-1.
  28. See the Ministry of Finance’s press release of 11 September 2022, https://www.bmf.gv.at/presse/pressemeldungen/2022/September/pacc-bilanz.html. For the international context, see OECD, Tax Administration 2022. Comparative Information on OECD and Other Advanced and Emerging Economies, OECD, 2022 (https://www.oecd.org/ctp/administration/tax-administration-23077727.htm) chapter 6.
  29. T. Ehrke-Rabel, Die Automatisierung des Verwaltungsverfahrens am Beispiel des österreichischen Abgabenrechts, in N. Braun-Binder, P. Bußjäger, M. Eller (eds.), Auswirkungen der Digitalisierung auf die Zuordnung und Erlassung behördlicher Entscheidungen, new academic press, Wien, 2021, 21; B. Gunacker-Slawitsch, Algorithmen im Steuerrecht, in M. Holoubek, M. Lang (eds.), Algorithmen im Wirtschaftsrecht, Linde, Wien, 2022, in print.
  30. See ELI_Model_Rules_on_Impact_Assessment_of_ADMSs_Used_by_Public_Administration.pdf, available at europeanlawinstitute.eu.

Franz Merli

Full Professor of Public Law at the University of Vienna.