
Privacy, Non-Discrimination and Equal Treatment: Developing a Fundamental Rights Response to Behavioural Profiling

Chapter in Algorithmic Governance and Governance of Algorithms

Part of the book series: Data Science, Machine Intelligence, and Law (DSMIL, volume 1)

Abstract

In the diverse attempts to identify the fundamental rights implications of behavioural profiling, the lines between the right to privacy, non-discrimination and equal treatment have been blurred. Scholars have struggled to develop coherent approaches to the widespread practice of evaluating and differentiating between individuals on the basis of correlative relations between random, causally unrelated categories in large data sets. This chapter suggests a response to these practices and establishes clear boundaries between the rights. It is argued that the right to non-discrimination should be interpreted narrowly and thus does not apply to large parts of behavioural profiling. Extending its scope to random categories would jeopardise the right's distinctive capacity to effectively prohibit the most appalling and morally reprehensible differentiations. The scope of the right to privacy, conversely, has an open-ended structure and can evolve further in order to respond to novel threats. However, it is first and foremost the right to equal treatment that carries great, though little acknowledged, potential to provide a normative framework for behavioural profiling. Engagement with it can encourage, frame and respond to a much-needed societal debate on machine-learning-based data analysis. Difficulties that arise in applying the right to equal treatment are manifestations of a larger societal challenge. The shared question that both fundamental rights lawyers and society at large must answer is how to accommodate this new way of generating knowledge and of differentiating between individuals within our conventional ways of understanding the world, of reasoning and of differentiating.


Notes

  1. For a sober and expansive analysis of the functioning and impact of ‘Big Data’, see Mayer-Schönberger and Cukier (2013).

  2. For a detailed description and critical analysis see Zuboff (2019).

  3. See for example Wei et al. (2015); on algorithms deciding the likelihood of recidivism, see Corbett-Davies et al. (2016).

  4. For a variety of perspectives on profiling see Hildebrandt and Gutwirth (2008); for an overview of fields in which profiling is used, see Wachter and Mittelstadt (2018).

  5. For the distinction between data and information this chapter relies on the differentiation of Mayer-Schönberger and Cukier (2013).

  6. The brief account here cannot be achieved without major simplifications. For a more detailed account see Hildebrandt and Koops (2010).

  7. For a detailed account of what kinds of “groups” machine learning algorithms create, see Kammourieh et al. (2017).

  8. For an early but convincing account see Hildebrandt (2006).

  9. For detailed accounts of the legal framework of profiling under the GDPR see Wachter et al. (2017b, 2018) and Wachter and Mittelstadt (2018).

  10. For a brief but precise explanation of the process of generating knowledge, see Hildebrandt (2006).

  11. Kammourieh et al. (2017).

  12. Wachter and Mittelstadt (2018).

  13. Mayer-Schönberger and Cukier (2013).

  14. For overviews see Hildebrandt and Koops (2010), Mittelstadt et al. (2016) and Gonçalves (2017).

  15. For overviews of AI’s implications for privacy see Schermer (2011) and Rubinstein (2013); for contributions on group privacy, see Taylor et al. (2017a).

  16. On the legal status of the generated information see Wachter and Mittelstadt (2018).

  17. Mantelero (2016) and Kammourieh et al. (2017).

  18. Wachter et al. (2017b).

  19. Wachter et al. (2017a).

  20. Wachter and Mittelstadt (2018).

  21. Floridi (2017).

  22. Among the wide-ranging scholarly engagement see for example Floridi (2014), Kammourieh et al. (2017) and Mittelstadt (2017); for an overview see Taylor et al. (2017b) and Van der Sloot (2017).

  23. See for example Leese (2014), Roberts (2015), Barocas and Selbst (2016), Kim (2017) and Hacker (2018).

  24. Also discussed under the notion of indirect discrimination; on this term see Schabas (2015).

  25. Noble (2018).

  26. See for example Hacker and Wiedemann (2017) and Zarsky (2016).

  27. Hacker (2018).

  28. “Everyone has the right to respect for his or her private and family life, home and communications.”

  29. “Everyone has the right to respect for his private and family life, his home and his correspondence.”

  30. Before the CFR entered into force, the ECJ ruled that the right to respect for private life derives from the constitutional traditions of the member states and is therefore protected by the European legal order: ECJ 5 October 1994, X v Commission (C-404/92 P).

  31. ECJ 8 April 2014, Digital Rights Ireland and Seitlinger a.o. (C-293/12 and C-594/12).

  32. Kokott and Sobotta (2015).

  33. For an overview see Jacobs et al. (1996).

  34. For an overview of co-existing definitions, see Jacobs et al. (1996).

  35. ECtHR 13 June 1979, Marckx/Belgium (6833/74); Jacobs et al. (1996).

  36. ECtHR 16 December 1992, Niemietz/Germany (13710/88) para 29.

  37. ECtHR 26 March 1985, X and Y/the Netherlands (8978/80) para 22; on psychological integrity see for example ECtHR 6 February 2001, Bensaid/the United Kingdom (44599/98).

  38. See for example ECtHR 24 June 2004, Von Hannover/Germany (59320/00).

  39. See for example ECtHR 7 February 2012, Axel Springer AG/Germany (39954/08).

  40. On this, see the decision of the ECJ: ECJ 8 April 2014, Digital Rights Ireland and Seitlinger a.o. (C-293/12 and C-594/12).

  41. ECtHR 21 June 2011, Shimovolos/Russia (30194/09).

  42. See for example ECtHR 16 July 2014, Hämäläinen/Finland (37359/09).

  43. See for example ECtHR 13 February 2003, Odièvre/France (42326/98).

  44. See for example ECtHR 18 January 2001, Chapman/the United Kingdom (27238/95).

  45. On the debate whether privacy has intrinsic value at all, see Solove (2002); for an overview of protected interests see Harris et al. (2009).

  46. ECtHR 29 April 2002, Pretty/the United Kingdom (2346/02) para 61.

  47. ECtHR 29 April 2002, Pretty/the United Kingdom (2346/02).

  48. Whitman (2004).

  49. BVerfG 15 December 1983, Volkszählung (1 BvR 209, 269, 362, 420, 440, 484/83); BVerfG 6 December 2005, Transsexuelle III (1 BvL 3/03).

  50. Whitman (2004).

  51. “Any discrimination based on any ground such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation shall be prohibited.”

  52. “Everyone is equal before the law.”

  53. Art. 14: “The enjoyment of the rights and freedoms set forth in this Convention shall be secured without discrimination on any ground such as sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth or other status.”; Protocol 12 Art. 1: “The enjoyment of any right set forth by law shall be secured without discrimination on any ground such as sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth or other status.”

  54. Early drafts of Art. 14 still included a first paragraph establishing a right to equal treatment. On the drafting of the provision, see Schabas (2015).

  55. For an extensive commentary on Art. 14 see Jacobs et al. (1996).

  56. ECtHR 7 December 1976, Kjeldsen, Busk Madsen and Pedersen/Denmark (5095/71; 5920/72; 5926/72) para 56.

  57. ECtHR 7 December 1976, Kjeldsen, Busk Madsen and Pedersen/Denmark (5095/71; 5920/72; 5926/72) para 56; see also ECtHR 15 March 2016, Novruk a.o./Russia (31039/11, 48511/11, 76810/12, 14618/13 and 13817/14).

  58. ECtHR 24 May 2016, Biao/Denmark (38590/10) para 89.

  59. ECtHR 15 March 2016, Novruk a.o./Russia (31039/11, 48511/11, 76810/12, 14618/13 and 13817/14).

  60. Ward (2018); ruling that obesity does not in itself constitute a “disability” within the meaning of Directive 2000/78: ECJ 18 December 2014, FOA (C-354/13); for a narrow interpretation of the scope of application of Directive 2000/78/EC: ECJ 2 June 2016, C (C-122/15).

  61. Criticising the low impact of the ECJ’s application: Ward (2018).

  62. ECJ 19 October 1977, Albert Ruckdeschel & Co. and Hansa-Lagerhaus Ströh & Co. v Hauptzollamt Hamburg-St. Annen; Diamalt AG v Hauptzollamt Itzehoe (117/76 and 16/77) para 8.

  63. ECJ 11 April 2013, Soukupová (C-401/11) para 29; see also ECJ 14 April 1994, A. v Commission (T-10/93) para 42.

  64. ECJ 14 April 1994, A. v Commission (T-10/93) para 42.

  65. ECJ 14 April 1994, A. v Commission (T-10/93) para 42.

  66. See also ECJ 7 February 1991, Tagaras v Court of Justice of the European Communities (T-18/89 and T-24/89) para 68; the ECtHR, in the application of Art. 14 ECHR, requires that situations are ‘relevantly’ similar or different: ECtHR 29 April 2008, Burden/the United Kingdom (13378/05) para 60.

  67. Schabas (2015).

  68. ECtHR 15 March 2016, Novruk a.o./Russia (31039/11, 48511/11, 76810/12, 14618/13 and 13817/14) para 90.

  69. ECtHR 27 July 2004, Sidabras a. Dziautas/Lithuania (55480/00 a. 59330/00) para 51ff.

  70. A number of judges of the ECtHR also support a narrower view; see the dissenting opinions in ECtHR 27 July 2004, Sidabras a. Dziautas/Lithuania (55480/00 a. 59330/00).

  71. In her dissenting opinion, Judge Thomassen argues in a similar way: “The principle of non-discrimination (…) refers above all to a denial of opportunities on grounds of personal choices in so far as these choices should be respected as elements of someone’s personality, such as religion, political opinion, sexual orientation and gender identity, or, on the contrary, on grounds of personal features in respect of which no choice at all can be made, such as sex, race, disability and age.” ECtHR 27 July 2004, Sidabras a. Dziautas/Lithuania (55480/00 a. 59330/00).

  72. See for example Barnard and Hepple (2000).

  73. ECJ 11 April 2013, Soukupová (C-401/11) para 29; see also ECJ 14 April 1994, A. v Commission (T-10/93) para 42.

  74. ECJ 14 April 1994, A. v Commission (T-10/93) para 42.

  75. Schabas (2015).

  76. ECJ 22 May 2014, Glatzel (C-356/12).

  77. The causal reasoning is also apparent in the justification: ECJ 22 May 2014, Glatzel (C-356/12) para 54.

  78. Article 8(1): “Everyone has the right to the protection of personal data concerning him or her.”

  79. See the similar definition in Directive 2016/680 of the EU (27 April 2016), para 21.

  80. Arguing the opposite: Lynskey (2014).

  81. See the explanations in Sect. 2.2.

  82. Idem.

  83. Floridi (2017).

  84. Floridi (2011, 2017).

  85. Kammourieh et al. (2017).

  86. ECtHR 15 March 2016, Novruk a.o./Russia (31039/11, 48511/11, 76810/12, 14618/13 and 13817/14).

  87. For the type of protected groups, see Kammourieh et al. (2017).

  88. Leese (2014).

  89. Hildebrandt and Koops (2010).

  90. Hildebrandt and Koops (2010) and Mayer-Schönberger and Cukier (2013).

  91. For the right to privacy, see ECtHR 13 June 1979, Marckx/Belgium (6833/74).

  92. Mayer-Schönberger and Cukier (2013).

  93. For a suggestion of how to incorporate explainability into the technology, see Wachter et al. (2018).

  94. On the value of big data analytics see Mayer-Schönberger and Cukier (2013).

  95. For an understanding of how that happens, see Hildebrandt and Koops (2010).

References

  • Barnard C, Hepple B (2000) Substantive equality. Cambridge Law J 59:562

  • Barocas S, Selbst AD (2016) Big data’s disparate impact. California Law Rev 104:671

  • Corbett-Davies S et al (2016) A computer program used for bail and sentencing decisions was labelled biased against blacks. It’s actually not that clear. The Washington Post, 17th October 2016. https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-be-racist-our-analysis-is-more-cautious-than-propublicas/?noredirect=on&utm_term=.9c25816db76c. Accessed Apr 2019

  • Floridi L (2011) The informational nature of personal identity. Minds Mach 21:549

  • Floridi L (2014) Open data, data protection, and group privacy. Philos Technol 27:1

  • Floridi L (2017) Group privacy: a defense and an interpretation. In: Taylor L et al (eds) Group privacy. Springer, Dordrecht

  • Gonçalves ME (2017) The EU data protection reform and the challenges of big data: remaining uncertainties and ways forward. Inf Commun Technol Law 26:90

  • Hacker P (2018) Teaching fairness to artificial intelligence: existing and novel strategies against algorithmic discrimination under EU law. Common Mark Law Rev 55:1143

  • Hacker P, Wiedemann E (2017) A continuous framework for fairness. arXiv.org. http://arxiv.org/abs/1712.07924

  • Harris D et al (2009) Law of the European Convention on human rights. Oxford University Press, New York

  • Hildebrandt M (2006) Profiling: from data to knowledge. Datenschutz und Datensicherheit 30:548

  • Hildebrandt M, Gutwirth S (eds) (2008) Profiling the European citizen. Springer, Dordrecht

  • Hildebrandt M, Koops EJ (2010) The challenges of ambient law and legal protection in the profiling era. Mod Law Rev 73:428

  • Jacobs FG et al (1996) The European Convention on human rights. Clarendon Press, Oxford

  • Kammourieh L et al (2017) Group privacy in the age of big data. In: Taylor L et al (eds) Group privacy. Springer, Dordrecht

  • Kim P (2017) Data-driven discrimination at work. William Mary Law Rev 58:857

  • Kokott J, Sobotta C (2015) Protection of fundamental rights in the European Union: on the relationship between EU fundamental rights, the European Convention and national standards of protection. Yearb Eur Law 34:60

  • Leese M (2014) The new profiling: algorithms, black boxes, and the failure of anti-discriminatory safeguards in the European Union. Secur Dialogue 45:494

  • Lynskey O (2014) Deconstructing data protection: the “added-value” of a right to data protection in the EU legal order. Int Comput Law Q 63:569

  • Mantelero A (2016) Personal data for decisional purposes in the age of analytics: from an individual to a collective dimension of data protection. Comput Law Secur Rev 32:238

  • Mayer-Schönberger V, Cukier K (2013) Big data: a revolution that will transform how we live, work and think. John Murray, London

  • Mittelstadt B (2017) From individual to group privacy in big data analytics. Philos Technol 30:475

  • Mittelstadt B et al (2016) The ethics of algorithms: mapping the debate. Big Data Soc 3:1

  • Noble SU (2018) Algorithms of oppression: how search engines reinforce racism. New York University Press, New York

  • Roberts JL (2015) Protecting privacy to prevent discrimination. William Mary Law Rev 56:2132

  • Rubinstein IS (2013) Big data: the end of privacy or a new beginning? Int Data Priv Law 3:74

  • Schabas WA (2015) The European convention on human rights: a commentary. Oxford University Press, Oxford

  • Schermer BW (2011) The limits of privacy in automated profiling and data mining. Comput Law Secur Rev 27:45

  • Solove DJ (2002) Conceptualizing privacy. California Law Rev 90:1087

  • Taylor L et al (2017a) Introduction: a new perspective on privacy. In: Taylor L et al (eds) Group privacy. Springer, Dordrecht

  • Taylor L et al (eds) (2017b) Group privacy. Springer, Dordrecht

  • Van der Sloot B (2017) Do groups have a right to protect their group interest in privacy and should they? Peeling the onion of rights and interests protected under article 8 ECHR. In: Taylor L et al (eds) Group privacy. Springer, Dordrecht

  • Wachter S, Mittelstadt B (2018) A right to reasonable inferences: re-thinking data protection in the age of big data and AI. Columbia Bus Law Rev. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3248829

  • Wachter S et al (2017a) Transparent, explainable, and accountable AI for robotics. Sci Robot 2:1

  • Wachter S et al (2017b) Why a right to explanation of automated decision-making does not exist in the general data protection regulation. Int Data Priv Law 7:76

  • Wachter S et al (2018) Counterfactual explanations without opening the black box: automated decisions and the GDPR. Harv J Law Technol 31:841

  • Ward A (2018) The impact of the EU Charter of fundamental rights on anti-discrimination law: more a whimper than a bang? Cambridge Yearb Eur Legal Stud 20:32

  • Wei Y et al (2015) Credit scoring with social network data. Mark Sci 35:234

  • Whitman JQ (2004) The two western cultures of privacy: dignity versus liberty. Yale Law J 113:1151

  • Zarsky T (2016) The trouble with algorithmic decisions: an analytic road map to examine efficiency and fairness in automated and opaque decision making. Sci Technol Hum Values 41:118

  • Zuboff S (2019) The age of surveillance capitalism: the fight for a human future at the new frontier of power. Public Affairs, New York


Author information

Correspondence to Niklas Eder.


Copyright information

© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Eder, N. (2021). Privacy, Non-Discrimination and Equal Treatment: Developing a Fundamental Rights Response to Behavioural Profiling. In: Ebers, M., Cantero Gamito, M. (eds) Algorithmic Governance and Governance of Algorithms. Data Science, Machine Intelligence, and Law, vol 1. Springer, Cham. https://doi.org/10.1007/978-3-030-50559-2_2


  • DOI: https://doi.org/10.1007/978-3-030-50559-2_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-50558-5

  • Online ISBN: 978-3-030-50559-2

  • eBook Packages: Law and Criminology (R0)
