
Autonomy Challenges in the Age of Big Data
Sofia Grafanaki*
Article

  The full text of this Article may be found here.

27 Fordham Intell. Prop. Media & Ent. L.J. 803

ABSTRACT

This Article examines how technological advances in the field of “Big Data” challenge meaningful individual autonomy (and, by extension, democracy), redefine the process of self-formation and the relationship between self and society, and can cause harm that cannot be addressed under current regulatory frameworks. Adopting a theory of autonomy that encompasses both the exploration process an individual goes through in order to develop the authentic and independent desires that lead to his actions, and the independence of those actions and decisions themselves, this Article identifies three distinct categories of autonomy challenges that Big Data technologies present. The first is the proliferation of “little brothers,” which places individuals in a state of constant surveillance, the very knowledge of which undermines individual self-determination. In the governmental context, the sense of always being watched has long been recognized as a threat to freedom of expression, free speech, “intellectual privacy,” and associational freedoms. The discussion does not focus on government surveillance per se, but draws on the same reasoning to illustrate how similar dangers arise even when it is not the government or a single entity behind the surveillance. The second is an algorithmic self-reinforcing loop in every aspect of our lives: in a world where everything is tracked, the “choices” one is offered are based on assumptions about him, and those same “choices” in turn determine and become the new assumptions, creating a continually reinforced self-fulfilling prophecy. The very structure of the algorithms used rests on statistical models trained to ignore outliers, collect (im)perfect information about the past, and use it to recreate the future. This is true both at the individual level and for society more generally. The third is the use of persuasive computing techniques, which allow companies to move beyond simply measuring customer behavior to designing products with the specific goal of forming new habits. Finally, this Article demonstrates the need for a vocabulary to assess the ethical, political, and sociological values embedded in these algorithms, and for a full set of ethical norms that can lay the foundations of democracy on the web.


*LL.M., New York University Law School; MBA, Columbia Business School; B.A., Oxford University. The author is Chief Operations Officer of Data Elite, an accelerator and incubator that makes seed investments, providing early-stage funding and counseling for Big Data start-ups.