
From Predictive Policing to Legal Tech

A CONSTITUTIONAL STATE OF TOMORROW?

Volker Eick

The round-the-clock use of information technologies generates data. For some years now, this has led to the comprehensive recording and logging of human actions and communication. Commercial enterprises, but also police and law enforcement authorities, demand and/or simply take full access to this ›wealth of data‹, to Big Data. Until now, the focus has mainly been on people’s past; for some years, however, it has also been on their possible future behavior. It comes as no surprise that the legal professions are affected as well. This article gives an overview of some trends and raises questions in this regard.

Algorithms and Artificial Intelligence (AI) are used to store, search (retrospectively or in real time), analyze, and apply data in a variety of ways, drawing on computing power, storage capacity, and the power of quantitative data analysis (Big Data)(1). These market-driven(2) trends(3) also aim at a technologization of the criminal justice system and of police work.
Legal Tech (short for Legal Technology) refers to software and online services that support or fully automate legal work processes, including algorithms allowing for what is called ›predictive justice‹ (›Pre-Jus‹). ›Pre-Pol‹ (short for ›predictive policing‹) is based on statistics and on considerations drawn from ›sociophysics‹ and refers to software intended to enable the precise deployment of police forces as well as the analysis of case data to calculate the probability of who will commit a crime in the future and where the crime scenes will be located(4). Cities such as Chicago(5), London(6) and Zurich(7) already deploy ›predictive‹ software that targets future offenders and is thus feeding the market of digital tech providers(8). In the Federal Republic of Germany, the BKA is currently involved in police projects in six Länder carrying out feasibility studies, model projects and large-scale deployments based on location-related data, particularly in the area of burglary(9).
These technical possibilities give rise to areas of application which have so far hardly been discussed in ethical terms, from the perspective of fundamental rights, or with a view to basic procedural principles (see below). These applications become particularly critical where ›Pre-Pol‹ and ›Pre-Jus‹ meet and merge.

PREDICTIVE POLICING (›PRE-POL‹)

In Chicago, for example, to take a brief look outside Europe, the police, using data mining and machine learning, keep a ›Strategic Subjects List‹ (also called ›Heat List‹)(10), i.e. a ranking of potential offenders and victims of crime. It is based on data including employment and unemployment history, previous convictions, place of residence and contacts ›into the milieu‹. Persons who ascend in the ranking of this list are placed under (permanent) police control(11). According to the same logic, the ›no fly list‹ is drawn up in the US and now also in the EU, listing persons who are not allowed to board an aircraft – not even as passengers – if they are associated, according to criteria comparable to the above, with ›terror‹(12). Even in programs that would run in Germany under the slogan ›Socially Integrative City‹, community policing in the US is carried out with forecasting software(13). Meanwhile, people-oriented ›Pre-Pol‹ is quite widespread in Europe: the Automating Society project(14), developed by the NGO AlgorithmWatch, offers a first overview, but current studies also exist for Austria(15), France(16), Italy(17), Switzerland(18), Turkey(19), and the UK(20). In the UK, to take one example, three police forces – Kent, West Midlands, and Avon and Somerset – use different types of ›predictive‹ police technology to figure out where best to concentrate their manpower and financial resources. Durham deploys an algorithm to determine the likelihood of offenders’ recidivism, while probation officers across the country use a mechanized assessment tool to measure the recidivism risk of detainees prior to release. The Law Society(21), among others, reports on these tendencies and concludes that algorithms have the potential to establish or reinforce civil and human rights violations and distortions of the judicial system: the already existing tendency to under-supply certain parts of the population with police services, while policing other parts ever more abusively (e.g. through racial profiling), might intensify due to the use of algorithms(22).
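To make the mechanics of such person-based scoring tangible, the following Python sketch shows how a ranking of this kind could work in principle; all features, weights and the threshold are invented, since the actual Chicago model has never been disclosed. Note how the weight attached to the place of residence feeds a proxy for social marginalization directly into the ›risk‹ score.

```python
# Minimal sketch of a person-based 'heat list' ranking.
# All features, weights and the threshold are hypothetical;
# the actual Chicago model has never been disclosed.
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    prior_arrests: int      # previous convictions/arrests
    months_unemployed: int  # employment history
    milieu_contacts: int    # contacts 'into the milieu'
    deprived_area: bool     # place of residence

WEIGHTS = {"prior_arrests": 3.0, "months_unemployed": 0.2,
           "milieu_contacts": 2.5, "deprived_area": 4.0}
THRESHOLD = 15.0  # above this score, a person is 'monitored'

def risk_score(p: Person) -> float:
    return (WEIGHTS["prior_arrests"] * p.prior_arrests
            + WEIGHTS["months_unemployed"] * p.months_unemployed
            + WEIGHTS["milieu_contacts"] * p.milieu_contacts
            + WEIGHTS["deprived_area"] * p.deprived_area)

persons = [
    Person("A", prior_arrests=2, months_unemployed=18,
           milieu_contacts=1, deprived_area=True),
    Person("B", prior_arrests=0, months_unemployed=2,
           milieu_contacts=0, deprived_area=False),
]
# Whoever 'ascends' past the threshold is placed under surveillance.
for p in sorted(persons, key=risk_score, reverse=True):
    flag = "-> monitored" if risk_score(p) > THRESHOLD else ""
    print(p.name, round(risk_score(p), 1), flag)
```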

THE SITUATION IN GERMANY

›Pre-Pol‹ is currently deployed or tested in six of the German Länder(23). Unlike in the USA, however, no personal data are collected or evaluated – except for so-called »Gefährder« (roughly: potential attackers) and »relevante Personen« (literally: relevant persons) with an Islamist background, for whom the BKA runs its own, explicitly person-focused program called RADAR-iTE, which is now used by the federal and Länder police forces(24); similar considerations apply to Hesse (see below). In general, though, only spatial or time-related data have been used to date; the former is referred to as ›predictive mapping‹, the latter as ›predictive identification‹(25).
There are basically three approaches and theoretical understandings to be distinguished at the Länder level: firstly, the use of commercial standard software essentially based on Near-Repeat theory (Bavaria and Baden-Württemberg: PRECOBS; Hesse: KLB-operativ); secondly, the use of standard software further developed in-house, likewise essentially based on Near-Repeat theory (Niedersachsen: PreMAP); thirdly, the use of self-developed systems that consider other indicators besides police data and Near-Repeat theory (Nordrhein-Westfalen, NRW: SKALA; Berlin: KrimPro). The Länder Hesse and North Rhine-Westphalia represent two special cases: since 2017, the LKA Hesse has worked with the software KLB-operativ (Kriminalitätslagebild-operativ), developed by the start-up company Palantir(26); here personal data are processed as well(27). The company was founded by PayPal founder and Facebook investor Peter Thiel. As for data sources, the software exploits police databases, data from telephone surveillance, data from mobile phones, and Facebook profiles(28). The LKA NRW (State Criminal Police Office of NRW) with its SKALA software(29) is also a special case in this respect, because »crime theories are systematically examined« and »information on the infrastructure and socio-economic composition of residential neighborhoods is processed« – there is talk of 15 million data sets(30).
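The Near-Repeat theory underlying PRECOBS and its relatives boils down to a simple trigger logic: a burglary temporarily raises the predicted risk in its spatial and temporal vicinity. A minimal Python sketch of this logic follows, with radius, time window and coordinates invented for illustration (no deployed system publishes its parameters):

```python
# Minimal near-repeat trigger sketch: a burglary raises the predicted
# risk in its spatial and temporal vicinity for a few days. Radius,
# time window and coordinates are illustrative, not PRECOBS's.
from math import hypot

RADIUS_M = 500    # spatial near-repeat window (meters)
WINDOW_DAYS = 7   # temporal near-repeat window

# (x, y) position in meters and day of the offence
burglaries = [(1200, 3400, 10), (1350, 3550, 12), (8000, 900, 11)]

def alert_zones(incidents, today):
    """Centers of the zones flagged for extra patrols today."""
    return [(x, y) for x, y, day in incidents
            if 0 <= today - day <= WINDOW_DAYS]

def is_flagged(x, y, zones):
    return any(hypot(x - zx, y - zy) <= RADIUS_M for zx, zy in zones)

zones = alert_zones(burglaries, today=13)
print(is_flagged(1300, 3500, zones))  # True: inside a recent cluster
print(is_flagged(5000, 5000, zones))  # False: no recent incident nearby
```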
Apart from the developers, nobody knows how the pre-crime algorithms work. From this perspective, we are dealing with a rare case of ›self-disempowerment of the police‹: police officers act on the basis of software they do not know and therefore cannot understand, which allows companies like Palantir and Alphabet Inc. to monopolize such knowledge. There is still a dispute about the effectiveness of such ›Minority Report‹ software – recently, the Ministry of the Interior of Baden-Württemberg stopped using PRECOBS and abandoned prognosis software altogether(31).
Whether monitoring on the basis of algorithms brings more social minorities or people in socio-economically deprived urban areas into the focus of the police (keyword: racial profiling) remains a mystery to German ministries and police forces, as does the question of how much the increasing control pressure exerted by the police contributes to stigmatizing these populations and neighborhoods.
It also remains difficult to assess the power that ›objective‹ data exert on police forces, which are under pressure to prevent future crimes and lack the self-confidence and knowledge to defend themselves against the (presumptive) claims of sections of the public and politicians and against the (presumed) technical constraints of algorithms (after all, such calculations increasingly determine duty rosters and staffing targets and are thus an intrinsic part of the police’s everyday work)(32). Other challenges and unresolved issues arise from the lack of transparency of algorithms and from questions concerning the right of access to data when software is developed by public authorities together with commercial companies. In addition, the liability of Artificial Intelligence is at stake once the software makes ›mistakes‹, suggests unlawful actions to users, or even commits crimes directly. This can not only lead to data protection problems but also make it more difficult or even impossible to challenge algorithms in court (see below, Legal Tech).

LEGAL TECH – ›MACHINE-MADE LAW‹?

In general, ›legal tech‹ refers to computer-aided, digital technologies that automate and simplify the process of finding, applying, accessing and administering the law through innovations; it is all about increasing efficiency and effectiveness(33). The term ›legal tech‹ therefore first and foremost covers the technological handling of changing consumer behavior, for example when lawyers are brought together with their clients via Internet platforms. But it also includes the electronic file (beA, the special electronic lawyer’s mailbox), the paperless office, forms of secure communication with courts, authorities, colleagues and clients, as well as the electronically supported processing and indexing of materials that could be relevant to proceedings.
›Legal tech‹ also plays a role in the collection of legally relevant data, for example in money laundering prevention – under headings such as Customer Due Diligence and Know Your Customer, it is now legally stipulated that data be digitally collected, stored and, if necessary, passed on to state authorities in real estate purchases and other financial transactions (§§ 2 sections 1 and 2, 10, 11 section 4 GwG).
Automatically generated tax assessments or dynamically modified contracts designed by algorithms (smart contracts) are already part of everyday legal business(34); further legal questions will arise from the use of self-driving vehicles(35).
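As a rough technical illustration, a ›smart contract‹ in this sense is little more than a clause expressed as executable code. The following Python sketch shows a self-executing delay penalty; the trigger and the penalty formula are hypothetical:

```python
# Minimal 'smart contract' sketch: a clause that executes itself when
# a measurable condition occurs, without renegotiation. The trigger
# and the penalty formula are hypothetical.
from dataclasses import dataclass

@dataclass
class FreightContract:
    base_fee: float = 1000.0     # EUR per delivery
    delay_penalty: float = 0.02  # 2% fee reduction per hour of delay

    def fee_due(self, hours_late: float) -> float:
        """Self-executing clause: the fee shrinks with documented delay."""
        reduction = min(self.delay_penalty * hours_late, 0.5)  # capped at 50%
        return self.base_fee * (1 - reduction)

contract = FreightContract()
print(contract.fee_due(hours_late=0))  # 1000.0: delivered on time
print(contract.fee_due(hours_late=5))  # 900.0: penalty applied automatically
```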
Finally, reference should be made to the challenges criminal law will face arising from the digital accusation, perpetration and investigation of criminal offences in criminal proceedings, particularly across borders, because »it is virtually indisputable that technology-driven digitization of internal security and criminal justice impedes or prevents the understanding and evaluation of digital evidence, affects the independence of the judiciary and undermines equality of arms for criminal defense«(36). This applies not only to access to evidence generated with the aid of software, but also to the reliability, validity, objectivity and plausibility of digital data in general.

CYBERJUSTICE (›CY-JUS‹)

A further application of ›legal tech‹ is digital legal services performed by machines, which the EU calls ›cyberjustice‹. The European Commission for the Efficiency of Justice (CEPEJ) cites facilitating access to justice and supporting judges or administrative courts as examples(37). France, for example, developed Sagace, an administrative service that enables plaintiffs to obtain summary information about their case online. The Netherlands offers a conciliation and mediation platform called Rechtwijzer, designed to help settle tenancy, neighborhood or family disputes before proceedings start. The United Kingdom, in turn, provides ›Make a plea‹, an online service for dealing with traffic offences; it makes it possible to plead guilty and receive judgements online. The benefit of these legal services for clients is obvious and is expressed in a massive reduction in the costs of the individual services, whether for standardized legal information or for the assessment and collection of fines without the involvement of lawyers. The ongoing standardization of such legal information and of judgements will also make deviations less likely and more difficult to prove in court. Deviations will thus become more expensive and ultimately be available only to those who can afford to treat their case as an individual case. Finally, EU legislation such as the ›eEvidence‹ proposals also needs to be considered(38).
This type of ›cyber justice‹, which is aimed at a constantly intensifying standardization of legal subjects, is to be distinguished from ›predictive justice‹, which has only recently emerged at the interface between artificial intelligence, big data and publicly available data sources.

PREDICTIVE JUSTICE (›PRE-JUS‹)

Thanks to increasing progress in the field of Artificial Intelligence, which enables machines to process ever larger amounts of data, and thanks to governments that publish ever more legally relevant data, start-ups have appeared on the ›justice market‹ with a double promise: to reduce legal uncertainty and to make big data-driven legal predictions. »The idea is to destroy the time-consuming search for legal decisions«, the president of Doctrine (https://www.doctrine.fr/) pointed out during a symposium on ›Pre-Jus‹ at the Catholic University of Lille(39). The transformation of case law and legal procedures into a data-driven projection of law and judgement will make it possible to assess the chances of success in a given case, to determine in advance the amount of possible compensation and, on this basis, to align the lawyer’s strategy (if not already offered via ›cyberjustice‹) with what the machine has identified as successful in previous cases.
These start-ups include highly specialized providers that digitally check, collect and process claims for damages ›automatically‹ and in a highly standardized way – companies such as Mietright, MyRight, or Unfallhelden are examples. As a result, this legal tech instrument aims to bring about mass legal decisions for which the software has identified prospects of success. Such analyses identify connections between input data (law, facts of the case, justification) and output data (formal judgement, e.g. compensation amount). Correlations considered relevant make it possible to create models that, when fed with new input data (new facts, such as the duration of the contractual relationship), yield a prediction of the decision (such as the compensation amount). These providers are not newcomers on the ›justice market‹, but they are a challenge insofar as they are completely exempt from any professional ethics, and it is hard to predict where the boundaries of this market based on standardization and mass communication will be drawn – and to what extent it will transform legal procedures and ultimately fundamental rights.
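A minimal Python sketch of this input-output correlation: a model is fitted to the (invented) outcomes of past cases and then used to ›predict‹ the award in a new case from a single input feature, the duration of the contractual relationship:

```python
# Minimal 'predictive justice' sketch: fit a line through the outcomes
# of past cases and 'predict' the award in a new one. The cases and the
# single input feature (contract duration) are invented.
past_cases = [  # (contract duration in years, compensation awarded in EUR)
    (1, 2000), (2, 3800), (3, 6100), (5, 9900), (8, 16500),
]

n = len(past_cases)
mean_x = sum(x for x, _ in past_cases) / n
mean_y = sum(y for _, y in past_cases) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in past_cases)
         / sum((x - mean_x) ** 2 for x, _ in past_cases))
intercept = mean_y - slope * mean_x

def predict_compensation(duration_years: float) -> float:
    """Predicted award for a new case, based purely on past correlations."""
    return intercept + slope * duration_years

print(round(predict_compensation(4)))  # roughly 8000 EUR
```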
›Legal tech‹ goes even further if one looks at the algorithmic application developed by the French company Predictice (https://predictice.com/), whose software is currently being tested by the Courts of Appeal of Rennes and Douai and the Lille Bar Association. The company promises to put an end to »unpredictable, accidental and unequal jurisprudence« nationwide and instead to »move towards something more logical, more scientific or at least a little better controlled«(40). As a result, this legal tech instrument aims to eliminate or at least reduce legal uncertainty; this is what is called artificial legal tech.
Essentially, a computer is taught (partly using machine learning, which allows a system to learn via an algorithm without being explicitly programmed) to perform routine tasks and/or search large amounts of data. A practical implementation of AI within the legal profession is predictive coding, a form of technology-assisted review of large numbers of documents to determine their relevance (e-disclosure). Predictive coding is now even mandatory in certain legal cases (EWHC 1464 (Ch))(41).
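What predictive coding means in practice can be sketched in a few lines of Python using the scikit-learn library: a classifier is trained on a handful of documents that lawyers have already coded as relevant or irrelevant and then ranks unreviewed documents by predicted relevance. Documents and labels are invented here; real e-disclosure corpora run into the millions:

```python
# Minimal predictive-coding sketch: a classifier is trained on documents
# already coded relevant/irrelevant by lawyers and ranks the unreviewed
# rest by predicted relevance. Documents and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

reviewed_docs = [
    "email about the disputed supply contract and late delivery",
    "invoice for the disputed delivery, penalty clause attached",
    "canteen menu for the last week of June",
    "newsletter about the company summer party",
]
labels = [1, 1, 0, 0]  # 1 = relevant, 0 = irrelevant (lawyer-coded)

vectorizer = TfidfVectorizer()
model = LogisticRegression().fit(vectorizer.fit_transform(reviewed_docs), labels)

unreviewed = ["memo on the penalty clause in the supply contract",
              "reminder to update parking permits"]
scores = model.predict_proba(vectorizer.transform(unreviewed))[:, 1]
for doc, score in sorted(zip(unreviewed, scores), key=lambda t: -t[1]):
    print(f"{score:.2f}  {doc}")  # review queue, most relevant first
```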
When it comes to the use of more complex and self-learning AI systems deployed for ›Legal Tech‹, the industry faces another difficulty, namely the liability of such systems. If AI produces a serious error, it is currently unclear who can be held accountable.
The US and the UK already operate such programs for criminal justice purposes: several US states use software such as Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) or the Level of Service Inventory-Revised (LSI-R) to decide whether defendants are to be detained before trial, or to assess the likelihood of recidivism, which in turn can influence the level of punishment(42). Durham’s police introduced the Harm Assessment Risk Tool (HART), a program that determines whether a suspect should be held in pre-trial detention; it takes into account thirty different territorial and personal factors(43). Thus, both the logic and approach of ›Pre-Jus‹ are basically identical to those of ›Pre-Pol‹: programs are fed with perpetrator profiles, personal data are linked with possible crimes, and such data are correlated on the basis of model assumptions or a theory. It is not surprising that this results in discriminatory patterns, as the data are based on the results of social exclusion, containment, discrimination, and racism. It is from such algorithm-bolstered data material, via assumptions and forecasts, that a new ›objective‹ system of jurisprudence and preemptively ›identified‹ criminals is successively created in a way that is hardly comprehensible to the public.
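That biased inputs produce ›objectively‹ biased outputs can be demonstrated in a short simulation. In the following Python sketch, two groups offend at exactly the same rate, but one is policed twice as intensively, so any risk tool trained on the recorded data will ›learn‹ that it is twice as dangerous; all numbers are invented:

```python
# Minimal simulation of biased inputs becoming 'objective' outputs:
# two groups offend at the same rate, but group B is policed twice as
# intensively, so more of its offences enter the database. A risk tool
# trained on these records 'learns' that B is twice as dangerous.
# All numbers are invented.
import random
random.seed(1)

TRUE_OFFENCE_RATE = 0.10          # identical for both groups
DETECTION = {"A": 0.2, "B": 0.4}  # group B is over-policed

def recorded_rate(group, n=10_000):
    """Offence rate as it appears in the database (not in reality)."""
    recorded = sum(
        1 for _ in range(n)
        if random.random() < TRUE_OFFENCE_RATE   # offence occurs ...
        and random.random() < DETECTION[group]   # ... and is recorded
    )
    return recorded / n

for group in ("A", "B"):
    print(group, f"recorded rate: {recorded_rate(group):.3f}")
# -> roughly 0.02 vs. 0.04, although actual behavior is identical
```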

PREDICTIVE WORKFARE (›PRE-WOR‹)

Algorithms are also used beyond criminal law, for example in the workfare regimes introduced in the wake of the variegated forms of neoliberalism, which aim on the one hand at compelling people to work and on the other at ›weeding out‹ welfare recipients. Here, too, the US led the way(44).
In Austria – and this will surely raise legal questions – the Labor Market Service (AMS) will, from 2020 onwards, use a statistical model to calculate the chances of unemployed people finding work again on the basis of their education, gender, previous employment career, age, citizenship and other criteria. For this purpose, unemployed persons are divided into categories and supported or ›weeded out‹ according to their alleged chances of finding employment. If, according to the algorithm, childcare in a female-headed household is considered an obstacle to job placement, and if the same applies to the person’s age, this can lead directly from eligibility for support to the end of labor market integration attempts: the AMS has thus decided to perpetuate existing injustices via algorithms and thereby reinforce them(45).
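The following Python sketch shows the kind of logistic scoring such a model amounts to; the coefficients are invented, but their negative signs for gender, age and childcare obligations mirror what has been reported about the AMS model:

```python
# Minimal sketch of an AMS-style employment-chance score: a logistic
# model turns personal attributes into a 'reintegration probability'
# that sorts people into support categories. The coefficients are
# invented, but their negative signs mirror what has been reported
# about the AMS model.
from math import exp

COEFFS = {"base": 0.5, "female": -0.4, "age_over_50": -0.9,
          "childcare": -0.6, "non_citizen": -0.5}

def employment_chance(female, age_over_50, childcare, non_citizen):
    z = (COEFFS["base"] + COEFFS["female"] * female
         + COEFFS["age_over_50"] * age_over_50
         + COEFFS["childcare"] * childcare
         + COEFFS["non_citizen"] * non_citizen)
    return 1 / (1 + exp(-z))  # logistic link

def category(p):
    """Segmentation: low-chance clients are 'weeded out' of costly measures."""
    if p > 0.66:
        return "high chance"
    if p > 0.25:
        return "medium chance"
    return "low chance (support reduced)"

p_a = employment_chance(female=1, age_over_50=1, childcare=1, non_citizen=0)
p_b = employment_chance(female=0, age_over_50=0, childcare=0, non_citizen=0)
print(f"{p_a:.2f} -> {category(p_a)}")  # ~0.20: 'weeded out'
print(f"{p_b:.2f} -> {category(p_b)}")  # ~0.62: same CV, different attributes
```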
A comparable program in the Netherlands – here directed against ›welfare fraud‹ under the title System Risk Indication (SyRI)(46) – deploys 17 categories of personal data to be processed by algorithms and aims to steer unemployed citizens out of the social welfare system.

SOME QUESTIONS REGARDING ›PROGRESS‹

Various authors have criticized both the form and the content of such approaches. According to them, the mathematical modelling of complex social phenomena is fundamentally not a task that can be equated with other, more easily quantifiable activities (e.g. playing Go or recognizing an image): here the risk of spurious correlations or of digitally perpetuating discriminatory prejudices is much higher. Furthermore, in legal theory, two contradictory decisions can prove valid at the same time if each legal justification is well-founded. Consequently, a prediction is mere information that cannot claim the status of a rule.
The British Law Society therefore sees the need to develop a legal framework for the use of algorithms, especially complex ones. Part of this must also be a national register of algorithmic systems in use, which must disclose how the algorithms arrive at their respective results(47).
The ultimate question is whether justice should be predictable. With regard to algorithms, however, a rule is not predictable if one does not know the rules for applying that rule, i.e. does not know how a law will be interpreted in an individual case. These subordinate rules are much more difficult to recognize and much less formalized than the former. »Therefore, they offer lawyers a certain margin of discretion«(48). If the law is to guarantee a certain degree of predictability, justice must be sought on a case-by-case basis.
In spring 2020, the AED-EDL will hold a meeting to discuss these and other issues, and the RAV should indeed dig deeper into the issues at stake – as it is foreseeable that we will be confronted with these topics not only in the courtroom.

Volker Eick is a political scientist and member of the extended RAV board.

(1) See A. Završnik (ed.), Big Data, Crime and Social Control. London 2018; R. Reichert (ed.), Big Data. Analysen zum digitalen Wandel von Wissen, Macht und Ökonomie. Bielefeld 2014; J. Vlahos, The Department of Pre-Crime. In: Scientific American 306(1), 2012.
(2) See, on two subtopics, V. Eick, BodyCams in den USA und der BRD. In: Bürgerrechte & Polizei/CILIP 112, 2017; V. Eick, Weiche Waffen für eine harte Zeit? In: Kritische Justiz 45(1), 2012.
(3) M. McGuire & T.J. Holt (eds), The Routledge Handbook of Technology, Crime and Justice. London 2017; see (on crime mapping) P.K. Manning, The Technology of Policing. New York 2008.
(4) See S. Egbert & S. Krasmann, Predictive Policing. Eine ethnographische Studie neuer Technologien zur Vorhersage von Straftaten und ihre Folgen für die polizeiliche Praxis. Hamburg 2019; T. Knobloch, Vor die Lage kommen: Predictive Policing in Deutschland. Gütersloh 2018; A. Gluba, Predictive Policing – eine Bestandsaufnahme. Hannover 2014; W.L. Perry, B. McInnis, C.C. Price, S.C. Smith, J.S. Hollywood, Predictive Policing. The Role of Crime Forecasting in Law Enforcement Operations. Santa Monica, CA 2013; C.D. Uchida, A National Discussion on Predictive Policing. Los Angeles, CA 2009.
(5) See R. Richardson, J. Schultz, K. Crawford, Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice. In: New York University Law Review Online, February 13, 2019 (forthcoming), papers.ssrn.com/sol3/papers.cfm; A.G. Ferguson, Policing Predictive Policing. In: Washington University Law Review 94(5), 2017; J. Saunders, P. Hunt, J.S. Hollywood, Predictions Put into Practice. A Quasi-Experimental Evaluation of Chicago’s Predictive Policing Pilot. In: Journal of Experimental Criminology 12(3), 2016.
(6) See Statewatch, UK: Predictive policing in London: commercial interests trump accountability, Statewatch News Online, August 2014, http://database.statewatch.org/article.asp?aid=33881.
(7) See https://www.ifmpt.de/project_zuerich.html; for Basel-Land – each using PRECOBS – https://www.ifmpt.de/project_basel-land.html.
(8) In a research report on the global market for predictive analytics, the company Market Research Future (MRFR) estimates the market to grow by 21 percent annually between 2017 and 2023. Sales are estimated at 13 billion US dollars by the end of the forecast period. Statistics Market Research Consulting (SMRC) expects the figures to be more than twice as high; sales are expected to grow from USD 5.72 billion in 2017 to USD 28.71 billion in 2026 (+19.6%); the area of criminal prosecution and justice is unlikely to play a relevant role in this.
(9) D. Gerstner, Predictive Policing. Theorie, Anwendung und Erkenntnisse am Beispiel des Wohnungseinbruchdiebstahls. In: S. Ellebrecht, S. Kaufmann, P. Zoche (eds), (Un-)Sicherheiten im Wandel. Berlin 2019; see Deutscher Bundestag, Predictive Policing in Deutschland (BT-Drs. 19/1513 of 03.04.2018); D. Gerstner, Predictive Policing in the Context of Residential Burglary: An Empirical Illustration on the Basis of a Pilot Project in Baden-Württemberg, Germany. In: European Journal for Security Research 3(2), 2018; Deutscher Bundestag, Tests, Recherchen und Marktsichtungen zur Einführung polizeilicher Vorhersagesoftware (BT-Drs. 18/3703 of 07.01.2015).
(10) B. Sheehey, Algorithmic Paranoia: The Temporal Governmentality of Predictive Policing. In: Ethics and Information Technology 21(1), 2019; see D. Robinson & L. Koepke, Stuck in a Pattern: Early Evidence on ›Predictive Policing‹ and Civil Rights. Washington D.C. 2016.
(11) M. Stroud, The Minority Report: Chicago’s new police computer predicts crimes, but is it racist? In: The Verge, 09.02.2014, https://www.theverge.com/2014/2/19/5419854/the-minority-report-this-computer-predicts-crime-but-is-it-racist.
(12) S. Ackerman, No-fly list uses ›predictive assessments‹ instead of hard evidence, US admits. In: The Guardian (10.08.2015); Statewatch, Note on Big Data, Crime and Security: Civil Liberties, Data Protection and Privacy Concerns (03.04.2014), https://www.statewatch.org/analyses/no-242-big-data.pdf; J. Florence, Making the No Fly List Fly: A Due Process Model for Terrorist Watchlists. In: The Yale Law Journal 115(8), 2006.
(13) N. Ross & K. Pease, Community Policing and Prediction. In: T. Williamson (ed.), The Handbook of Knowledge-Based Policing. Current Conceptions and Future Directions. West Sussex 2008.
(14) On Belgium: R. van Brakel, https://algorithmwatch.org/en/automating-society-belgium/; on Denmark: B. Alfter, https://algorithmwatch.org/en/automating-society-denmark/; on Italy: F. Chiusi, https://algorithmwatch.org/en/automating-society-italy/; on the Netherlands: G. van Til, https://algorithmwatch.org/en/automating-society-netherlands/; on Sweden: A. Kaun & J. Velkova, https://algorithmwatch.org/en/automating-society-sweden/; on Spain: K. Peiró, https://algorithmwatch.org/en/automating-society-spain/.
(15) A. Adensamer & L.D. Klausner, Ich weiß, was du nächsten Sommer getan haben wirst: Predictive Policing in Österreich. In: juridikum 3/2019.
(16) Institut d’aménagement et d’urbanisme de la région de l’Île-de-France, La Police Prédictive. Enjeux soulevés par l’usage des algorithmes prédictifs en matière de sécurité publique. Paris 2019; P. Perrot, What about AI in criminal intelligence? In: European Police Science and Research Bulletin 16/2017.
(17) With a similar focus on burglary as in Germany: M. Dugato, S. Caneppele, S. Favarin, M. Rotondi, Prevedere i furti in abitazione. Trento 2015.
(18) M. Leese, Predictive Policing in der Schweiz: Chancen, Herausforderungen, Risiken. In: C. Nünlist & O. Thränert (eds), Bulletin 2018 zur schweizerischen Sicherheitspolitik. Zürich 2018; T. Grossenbacher, Polizei-Software verdächtigt zwei von drei Personen falsch (05.04.2018); J. Gerth, Risk-Assessment bei Gewalt- und Sexualdelinquenz. Konstanz 2015, https://kops.uni-konstanz.de/bitstream/handle/123456789/30292/Gerth_0-284132.pdf?sequence=3.
(19) G. Bediroglu, S. Bediroglu, H.E. Colak, T. Yomralioglu, A Crime Prevention System in Spatiotemporal Principles with Repeat, Near-Repeat Analysis and Crime Density Mapping: Case Study Turkey, Trabzon. In: Crime & Delinquency 64(14), 2018.
(20) H. Couchman, Policing by Machine. Predictive Policing and the Threat to Our Rights. London 2019.
(21) The Law Society, Algorithms in the Criminal Justice System. London 2019.
(22) Ibid., p. 35.
(23) See, among others, S. Egbert, About Discursive Storylines and Techno-Fixes: The Political Framing of the Implementation of Predictive Policing in Germany. In: European Journal for Security Research 3(2), 2018; T. Singelnstein, Predictive Policing: Algorithmenbasierte Straftatprognosen zur vorausschauenden Kriminalintervention. In: Neue Zeitschrift für Strafrecht 38(1), 2018; B. Belina, Predictive Policing. In: Monatsschrift für Kriminologie und Strafrechtsreform 2/2016.
(24) The ›predictive policing system‹ RADAR-iTE (Regelbasierte Analyse potentiell destruktiver Täter zur Einschätzung des akuten Risikos – islamistischer Terrorismus; rule-based analysis of potentially destructive perpetrators to assess the acute risk – Islamist terrorism) is based on questions relating to 73 characteristics, such as socialization or attitudes to violence. So-called protective factors such as family ties, good integration or a secure job are also covered. RADAR-iTE, now available in version 2.0, is intended to help the security authorities focus their surveillance on particularly relevant persons; in August 2019, the Federal Criminal Police Office (BKA) had a total of 497 evaluation sheets at its disposal. 186 persons (37%) were assigned to the high-risk group with regard to the probability of committing a violent crime; 311 persons (63%) were in the ›moderate risk‹ range. For twelve persons (2%), a future re-evaluation by means of RADAR-iTE was recommended, see Deutscher Bundestag, Zweijahresbilanz des Instruments RADAR-iTE (BT-Drs. 19/12401 of 30.08.2019), p. 2.
(25) F. Jansen, Data Driven Policing in the Context of Europe (DATAJUSTICE Working Paper). Cardiff 2018, https://datajusticeproject.net/wp-content/uploads/sites/30/2019/05/Report-Data-Driven-Policing-EU.pdf.
(26) »In addition to KLB-operativ, the analysis platform ›hessenDATA‹, based on Palantir’s software Gotham, has been deployed since 2017; among other tasks, it aims at the predictive analysis of terrorist attacks and similar risk scenarios in the area of organized crime and thus clearly overlaps with RADAR-iTE«, see Egbert & Krasmann (fn. 4), p. 31.
(27) R. Chan, Here’s what you need to know about Palantir, the secretive $20 billion data-analysis company (19.07.2019), https://www.businessinsider.de/palantir-ice-explainer-data-startup-2019-7?r=US&IR=T; S. Dörner, Wer ist der Kopf hinter dem Überwachungskonzern Palantir? (19.10.2018), https://www.gruenderszene.de/allgemein/palantir-alexander-karp-die-story.
(28) Ibid.
(29) Landtag Nordrhein-Westfalen, Wohnungseinbruchdiebstahl - A09 - 27.10.2016. Stellungnahme des Landeskriminalamtes Nordrhein-Westfalen. Düsseldorf 2016, p. 24ff.
(30) S. Egbert, Predictive Policing in Deutschland. Grundlagen, Risiken, (mögliche Zukunft). In: Strafverteidigervereinigungen (Hg.), Räume der Unfreiheit. Texte und Ergebnisse des 42. Strafverteidigertages Münster (2.-4. März). Berlin 2018.
(31) See Gewerkschaft kann Aus für Precobs-Software nachvollziehen (12.09.2019), https://www.stimme.de/suedwesten/nachrichten/pl/Gewerkschaft-kann-Aus-fuer-Precobs-Software-nachvollziehen;art19070,4245520.
(32) K. Briken, Ein verbetriebswirtschaftlichtes Gewaltmonopol? New Police Management im europäischen Vergleich. In: Kriminologisches Journal 46(4), 2014.
(33) M. Corrales, M. Fenwick, H. Haapio (eds), Legal Tech, Smart Contracts and Blockchain. Singapore 2019; K. Jacob, D. Schindler, R. Strathausen (eds), Liquid Legal. Transforming legal into a business savvy, information enabled and performance driven industry. Cham, CH 2017.
(34) M. Hartung, M.-M. Bues, G. Halbleib (eds.), Legal Tech. How Technology is Changing the Legal World. München 2018; in German: Legal Tech – Die Digitalisierung des Rechtsmarkts. München 2018.
(35) E.E. Joh, Automated Seizures: Police Stops of Self-Driving Cars. In: New York University Law Review Online, forthcoming 2019, http://dx.doi.org/10.2139/ssrn.3354800.
(36) Ewald explains that the »procedural and practical contours in the handling of digital evidence by the legal parties involved in the proceedings, for example in the issuing of smartphone images as part of the right to inspect files or in the verification of the admissibility, reliability and the probative value of digital evidence, are only just beginning to emerge«, see U. Ewald, Digitale Beweismittel und neue Wege der Strafverteidigung. In: Strafverteidigervereinigungen (Hg.), Räume der Unfreiheit. Texte und Ergebnisse des 42. Strafverteidigertages Münster (2.-4. März). Berlin 2018, p. 269.
(37) European Commission for the Efficiency of Justice (CEPEJ), Guidelines on how to Drive Change towards Cyberjustice. Stock-taking of Tools Deployed and Summary of Good Practices. Strasbourg 2016.
(38) See the proposals of the EU Parliament and the EU Council on the digital collection, release and preservation of evidence, COM(2018) 225 final, 17.04.2018 and COM(2018) 226 final, 17.04.2018.
(39) Cit. in: Paris Innovation Review, Predictive Justice: When Algorithms Pervade the Law (09.06.2017), http://parisinnovationreview.com/articles-en/predictive-justice-when-algorithms-pervade-the-law. Doctrine collects and centralizes legal data with its search engine to sell them.
(40) Ibid.
(41) A combination of keyword search and iterative machine learning is used to assess the relevance of each individual document; see http://go.recommind.com/hubfs/BCA_Trading_UK_PC_Order_2016_EWHC_1464_CH_5-17-2016.pdf.
(42) See https://epic.org/algorithmic-transparency/crim-justice/.
(43) M. Oswald, J. Grace, S. Urwin, G.C. Barnes, Algorithmic risk assessment policing models: lessons from the Durham HART model and ›experimental‹ proportionality. In: Information & Communications Technology Law, 27(2), 2018.
(44) K.S. Gustafson, Cheating Welfare: Public Assistance and the Criminalization of Poverty. New York 2011; V. Eick, B. Grell, M. Mayer, J. Sambale, Nonprofit-Organisationen und die Transformation lokaler Beschäftigungspolitik. Münster 2004; J. Gilliom, Overseers of the Poor. Chicago 2001.
(45) M. Spielkamp, Wenn Algorithmen über den Job entscheiden. In: Die Presse v. 20.07.2019, https://algorithmwatch.org/wenn-algorithmen-ueber-den-job-entscheiden/.
(46) See https://wetten.overheid.nl/BWBR0013060/2019-07-08.
(47) On challenges and consequences within the UK, https://www.lawsociety.org.uk/support-services/research-trends/algorithms-in-the-justice-system/.
(48) Cit. in: Les enjeux de la justice prédictive. In: La Semaine Juridique (Édition Générale) 31(1-2), 2017.