New Practices of Quantification - between Politics, Science and Technology

Organizers
Université de Namur, Université libre de Bruxelles, ETH Zürich, Université de Liège, KU Leuven, STS Network Belgium, STS working group Oxford University
Venue
Location
Namur
Country
Belgium
From - To
08.02.2016 - 12.02.2016
Deadline
01.12.2015
Website
By
Weist, Anne-Marie

What happens to the data in 'Big Data' ?

Within the framework of the FNRS-funded “Algorithmic Governmentality” research project carried out by the Centre de Recherche en Philosophie de l’Université Libre de Bruxelles, the Centre de Recherche Information, Droit et Société de l’Université de Namur, and the Université Saint-Louis-Bruxelles, the 2015 Interdisciplinary Winter School on “the 'data' from 'Big Data'” seeks to gather historians, sociologists, philosophers and lawyers, as well as statisticians and computer scientists, to collaborate in a reflection upon some of the most pressing material, political, technical and epistemic questions raised by contemporary forms of data and quantification processes. While topics such as “Big Data” and “data-mining” will be of particular interest, the aim is also to gain historical, sociological and philosophical perspective on the immediacy of such questions through interdisciplinary discussions. Applicants to this Winter School therefore do not necessarily need to be working on 'Big Data'; the main requirement is to address subjects where the notions of data and quantification arise and are problematic. In the space of a few years, the apparently inoffensive word 'data' seems to have become a ubiquitous prefix – 'data-driven', 'data-scientists', 'data-mining'. Indeed, 'data', and more particularly “Big Data”, are mobilizing resources and reshaping institutions, both private and public. It is therefore worth remembering that data, just like oil [wired], requires labor and infrastructure: nothing is fully automatic or purely virtual. Obviously, the emblematic massification of “data centers”, added to the miniaturization of electronic circuits and the proliferation of digital devices, seems to mark an inescapable turning point whereby the production and harvesting of data appears to be inextricably linked to an increasing number of human activities.

Yet, we propose to consider this turning point both as a pretext and a non-event. In terms of the non-event: instead of taking for granted the radical changes claimed by the proponents of “Big Data”, this heuristic hypothesis will force us to determine the thresholds where certain political and epistemological shifts operate, as well as the continuities that articulate these shifts. As a pretext: there is no doubt that emerging statistical and algorithmic practices will lead us towards new perspectives on old questions of counting, sorting and quantifying the “real” into objects of knowledge.

These two goals, rethinking the constellation of notions around “data” and refining the qualification of the “Big Data event”, can be addressed in at least three different ways. First, by grasping a synoptic view of data practices within the different fields of science [Sociology]. Second, by improving our understanding of the scientific concepts, practices and technical objects at work in Big Data [Epistemology]. Third, by inscribing our reflection in a historical continuum and situating the concepts and the shifts they have undergone [History].

PROGRAM

The Winter School proposes three inevitably intertwined themes that should guide the investigations. Proposals do not need to address all three, but should focus on at least one of them. In case an approach does not directly correspond to one of the themes but can nonetheless be relevant to a specific subject, applicants are encouraged to send their proposal along with a brief justification. The questions proposed within each theme are certainly not exhaustive. They should rather serve as general indications of the kinds of issues that will be addressed. Participants should feel free to suggest different problems and topics or to rephrase certain questions; the aim is to foster constructive confrontations between fields that struggle with the same challenges, albeit in different forms.

Whichever theme is chosen, we assume that a certain attention is paid to the materiality of the practices studied, whether academic references come from the history or anthropology of material practices (Goody, Yates, Johns), science and technology studies (B. Latour, J. Law), the so-called Stanford school (I. Hacking, P. Galison, L. Daston, N. Cartwright, M. Morgan), the more recent media studies (Kittler, Hayles, Gitelman) or the French school of technology (A. Leroi-Gourhan, G. Simondon, X. Guchet). Throughout the Winter School, we would like to shed light on the objects involved in ‘data practices’ and on how ‘data’ and ‘practices’ mutually structure one another.

Theme A – Practices of quantification

The move here would be to focus more closely on the process of quantification throughout its phases of production, collection, cleaning, clustering and correlating.

- What does data count and what counts for data in the different practices through which it is constituted? Who defines what shall be measured?
- Who were the 'data-scientists' before the word was even coined?
- What does producing 'clean data' (noise- or residue-less, for instance) for a given practice involve?
- What are the significant shifts in the ways in which scientists use and share data (standardization; public and private databases, etc.)?
- How has the evolution of data practices changed the ways in which social science legitimizes its research and communicates to experts and the broader public?

Theme B – Objects of quantification

The aim here is to question data in their ontological dimensions, that is to say, in the forms through which they are manifest, as well as in their relationships to the phenomena and objects they presumably represent. The challenge will be to frame Big Data representations as modes of intervention and normative regimes so as to understand the political and juridical tensions at stake.

- Are processes of quantification separable from that which is being quantified?
- If distinctions must be made, then according to which modalities (ontological, epistemological)?
- If it existed, what would raw, amorphous data look like? What is a data form – is it reducible to a standardized format; does the device with which it is produced, manipulated and recorded determine its nature; is it the result of different contextual interpretations?
- And if data should always be deconstructed by placing it back within the context of its construction, then how can we conceptualize its relationship to what it is supposed to represent?

Theme C – Powers of quantification

It would be a mistake to understand quantification in purely instrumental terms, as a tool suited for a certain purpose or adequate for a certain task. Quantification processes do not operate upon inert matter, but rather mold and stabilize targeted objects. This relative technical and scientific stabilization has always been a condition for governing through quantification. We intend to focus here on this precise knot where technoscientific and juridico-political norms entangle.

- How is the rhetoric of 'Big Data' articulated to concrete practices? How do data miners, computer scientists, statisticians, experts and policy-makers negotiate models and their pertinent indicators?
- How is the “trust in numbers” (T. Porter) expressed in competing Big Data practices?
- How is it reshaped by this naturalization or immanentization of data?
- How does behavioral and/or genetic profiling modify the relation of the subject to itself? And does this profiling affect the way in which medical diagnoses are carried out or the bases of our solidarities and mutualizing of risks, already partly institutionalized by insurances?
- How do suggestion/recommendation algorithms weave their way between computer science, cognitive psychology and marketing? What happens to the notions of subject and collective experience?
- What are the responses “the law” can advance in light of such strategies of manipulation?
- How can we understand the paradox and underlying anthropological presuppositions that make it possible for the same data to be the motor of emerging surveillance strategies, economic models and public health debates?

KEYNOTE SPEAKERS

The four keynote speakers who have kindly accepted to participate in the event are:

- Emmanuel Didier, Permanent Researcher CNRS (University of California Los Angeles).
- Xavier Guchet, Professor of Philosophy at Paris 1 Panthéon-Sorbonne.
- Christoph Hoffman, Professor of Science Studies at Luzern University.
- Donald Mackenzie, Professor of Sociology at Edinburgh University.

APPLICATION

Applications should be sent to “algorithmicgovernmentality@unamur.be” before 1 December 2015 and must contain:
1. an abstract summarizing your proposal (max. 2 pages);
2. a motivation letter for participating in the residential seminar;
3. a curriculum vitae.

The schedule of the event is as follows:

- Proposals should be sent by 1 December 2015
- Confirmation will be given by 7 December 2015
- Papers shall be sent to each participant by 25 January 2016

ORGANIZATIONAL INFORMATION

Each participant will have a 45-minute slot dedicated to their work (presentation and discussion) within one of the organized morning panels. We require full papers to be sent to all participants at least two weeks in advance. Keynote speakers give a public lecture and lead one morning panel, commenting on presentations and organizing the discussion.

COSTS

The fee for the residential seminar is set at 200 euros and will essentially cover accommodation and breakfast for the full week. Travel expenses must be covered individually. Other food expenses will be organized collectively on site.

Contact

Jérémy Grosman

rue de Bruxelles 61
B-5000 Namur, Belgium

jeremy.grosman@unamur.be