This track invites contributions that enhance understanding of Ethical, Legal and Social Issues (ELSI) around information and communication technology in crisis response and management. The aim is to inform high-quality, transformative, careful and responsible innovation. We seek papers that explore ELSI at the juncture of lived practice with organisational, technological, policy and social innovation.
The track combines conventional (20-minute) and short (7-minute) presentations with a facilitated interactive conversation session. The conversation will be driven by short (3-minute) provocations from track speakers, from audience members who feel passionate about a particular issue, and/or from other registered ISCRAM 2019 participants with relevant expertise, followed by facilitated small-group conversations among ‘provocateurs’, speakers, and members of the audience. Documenters of these conversations will be selected from the participants. The conversation will be facilitated with creative methods to produce questions, examples, approaches and responses, documented in ways that can be shared more widely during and after the conference, e.g. through a live cartoonist and on social media.
This year the ISCRAM conference focuses on “individual-centric emergency management systems” and “intelligent context processing”. The technologies involved range from advanced mobile localization of a victim’s mobile phone, to warning systems that provide situationally local information, to personalised decision-support based on machine learning. Such systems concern the whole diversity of actors in crises: managers, rescue teams, citizens as victims, bystanders or volunteers. They also concern a diversity of data, data sources, data gathering methods, decisional needs, and local situational needs.
These technologies, and the social and organisational innovations that enable their utilisation, present complex ethical, legal, and social challenges and opportunities. Who will and who will not have access to personalised support? What aspects of a ‘person’ are captured through individual-centric techniques? What cannot be captured? What happens to collective and social practices of disaster risk management as these technologies are adopted? Does personalisation have to come at the cost of increased surveillance?
As practitioners, humanitarian organisations, and citizens seek IT that can support diverse needs and actors in ways that enable more effective crisis management in a turbulent age, it is important that the design of IT tools critically, carefully and creatively considers ethical, legal and social issues. What innovative solutions are needed to make the most of the potential of individual-centric techniques, and avoid negative unintended consequences?
We invite papers that bring ELSI challenges and opportunities to life. Papers may address questions of responsibility, liability, privacy, power relations, exclusion, accessibility or any other relevant themes. They can critically assess broader issues, such as what makes good disaster IT, who has the power to determine that, or how to address unintentional bias in the design of such complex systems; present in-depth studies of examples; or outline innovative responses to ELSI.
We welcome papers from academic, designer and practitioner perspectives. We value a range of contributions from critical discussions of ethical tensions to direct experiences and points of view.
Questions this track considers include:
- Who is personalization for?
- If disasters strike the poorest the hardest, what kind of personalization is available to them? Will personalization make disaster risk management more unequal?
- What are the implications of profiling different types of users to provide them with personalised services?
- What kind of data or knowledge should be included?
- How much data needs to be processed to enable such personalisation or contextual information?
- What surveillance will these systems bring?
- Is the ethical use of such technology an individual, an organisational, or an institutional responsibility? What about liability? What responsibilities do designers have?
- How might it be possible to address inclusivity/accessibility and unintentional bias in intelligent system design?
- How do users consent or exercise their rights as data subjects?
- Whose definition of disaster or emergency is being used in the design details?
The ELSI track chairs are all experienced and have been involved in the ELSI track in previous editions of the conference. They are also members of the ELSI working group within ISCRAM.