alGOVrithms - Usage of Automated Decision Making

Policy Document | 28 May 2019

These policy recommendations are the result of extensive research performed in Czechia, Georgia, Hungary, Poland, Serbia and Slovakia, compiled by the ePaństwo Foundation and published in the report alGOVrithms - the State of Play.

The report consists of data and analysis gathered between November 2018 and April 2019 by researchers from the ePaństwo Foundation, KohoVolit.eu (for Czechia and Slovakia), IDFI, K-Monitor and CRTA.

Findings

We have not identified an overall state policy on the implementation of alGOVrithms in any of the countries participating in the research. While some of the countries, such as Poland1 or, collectively, the V4 member states2, are working on Artificial Intelligence strategies, none of them has introduced any comprehensive document regulating the transparency and accountability of automated decision making. The report is probably the first document describing this phenomenon from a broad perspective, and we hope that our policy recommendations will be taken into consideration by decision-makers working on the implementation of such tools in the future. In Poland, neither the Ministry of Digital Affairs nor the Chancellery of the Prime Minister has worked on the topic. We have not found any examples of ethical frameworks being introduced in any of the countries covered by the research.

Nor have we found any examples of a legal framework comprehensively describing the rights and obligations of states and citizens in this regard. Where legal documents do exist, they address only specific instances of alGOVrithms, such as the allocation of judges to court cases. This is the case in Georgia, where the Organic Law of Georgia on Common Courts was amended3; in Poland, where the Regulation of the Minister of Justice of 28 December 2017 amending the Regulation – Rules for the operation of common courts was introduced; and in Serbia, which regulates its system of selecting judges in the Court Rules of Procedure (2009).

A general but still not comprehensive regulation of automated decision making can be found in Hungary, where “the legislation on decision-making in general public administration procedures” includes rules on automated decision making in decisions on requests by clients.

In European Union countries, general rules on automated decision making were introduced with the implementation of the General Data Protection Regulation (GDPR) in May 2018.5 According to Article 22(1) of the GDPR, “the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.” In most cases, however, this provision does not seem to apply, either because a “human factor” is involved or because the algorithm has no direct impact on a citizen’s situation. The fully automated speed-measurement systems identified in all countries also appear to fall outside these provisions, as explained in the case of Prague, Czechia: “all the decisions should be overviewed first by a human (a member of the Municipal police)”.

We have found that the algorithms used in software created for automated decision making are not subject to transparency requirements, and access to these algorithms, or to the source code implementing them, is not possible. In Poland, the Minister of Justice refused to provide the information requested by the ePaństwo Foundation, pointing out that the algorithm of the Random Allocation of Judges System consists of technical information and is not public information within the meaning of the Polish Act on Access to Public Information, and is therefore not subject to disclosure. Under the provisions of the Act Amending Certain Acts in Order to Counteract the Use of the Financial Sector for Committing Tax Frauds, which introduced STIR, the Clearance Chamber ICT System7, access to the algorithm describing its operation is not public for security reasons.

Access to the source code of similar solutions in other countries was also denied for security or copyright reasons. Sometimes the product is owned by an external company, as was the case with the tool for the Judiciary Council in Slovakia, where the Council informed the researcher that it is not in possession of the source code. In Czechia, the code of such algorithms is generally not public; it is covered by copyright that is not owned by the public body (with the exception of procedures defined directly in the law).

We have also not identified any single institution that oversees, or even possesses comprehensive knowledge of, the automated decision-making systems existing in the country. In every researched country the situation is the same as in Georgia, where the researcher noted that there is no public institution directly responsible for adopting and implementing policies regarding algorithm usage in the public sector. On the contrary, each government organization can develop any software according to its own needs and programs.

Apart from the Serbian system of allocating judges to cases, where the donor (the EU) has audited the system, no external and independent audits are in place to monitor the accuracy and fairness of algorithmic operations.

Recommendations

It is high time to elaborate and implement consistent policies on automated decision making. We propose that they consist of the following principles:

- Introducing policies on alGOVrithm implementation

As discovered during the research, coordinating authorities, such as Prime Ministers, are unaware that algorithms have already been introduced by their subordinate entities, not to mention other public institutions. Governments should introduce complementary policies, including ethical guidelines, to make sure that algorithms are not created in silos and that the system is synergic. The policy should also introduce obligatory audits of systems performed by external and independent bodies. To begin, we recommend:

- Setting up a coordination body within the government

- Implementing a clear and possibly complex legal framework on automated decision making

- Introducing ethical guidelines

- Engaging civil society representatives and external experts during the whole process of creating alGOVrithms

- Introducing Algorithmic Impact Assessments

- Introducing transparency clauses in contracts with companies delivering the software and providing open access to the source code

- Issuing guidelines explaining the operation of algorithms

- Elaborating a review and remedy system

See the full report here:

/public/upload/IDFI_2019/General/alGOVrithms - Recommendations EN.pdf