Societal Challenges for Search: Privacy, Bias, Accountability, Transparency and some other scary things
"While, traditionally, the IR community has been focused on building systems that support a variety of applications and needs, it is becoming imperative that we focus as much on the human, social, and economic impact of these systems as we do on the underlying algorithms and systems. We argue that an IR system should be fair (e.g., a system should avoid discriminating across people), accountable (e.g., a system should be reliable and be able to justify the actions it takes), confidential (e.g., a system should not reveal secrets), and transparent (e.g., a system should be able to explain why results are returned)."
-- Text excerpt from the SWIRL 2018 report.
Information Retrieval provides the technology for searching the World Wide Web, social media platforms, news portals, and many other sources. Meanwhile, search engines have started to suggest what to do in certain situations, or to present arguments for and against issues of societal importance. Information Retrieval technology has thus become not only a means of informing users with a certain information need, but can nowadays also be (mis)used to influence people or to drive public opinion.
The planned panel is intended to take a critical look at the opportunities and risks that come with this development. A special focus will be placed on recent developments in Europe, e.g. the GDPR (General Data Protection Regulation, German: “Datenschutz-Grundverordnung”).
Panelists come from academia, industry, and associations related to IR:
- Paula Helm: Postdoc in the project "Strukturwandel des Privaten", Goethe University Frankfurt, Germany
- Peter Janacik: Product Owner Big Data, REWE digital, Germany
- Nicola Ferro: University of Padua, Italy
- Claudia Hauff: Delft University of Technology, The Netherlands
- Alexander Rabe: Managing Director, eco – Verband der Internetwirtschaft (Association of the Internet Industry), Germany