Digital dilemmas surrounding internet connectivity, artificial intelligence, and data surveillance are becoming ubiquitous in our everyday lives as technology advances. It’s difficult to keep pace with those changes. But now picture yourself with one of these digital dilemmas in the midst of an armed conflict or other situation of violence anywhere in the world. This week on the podcast, how do technologies affect our choices and options in a crisis? And what is the International Committee of the Red Cross doing to alleviate some of these issues? We walk through a hypothetical scenario based on real-life stories with our senior advisor on digital technology and data protection, Laura Walker McDonald, and then interview Philippe Marc Stoll, the ICRC’s senior techplomacy delegate in Geneva.

Important Links

Walk through the digital dilemmas experience on your own.

Enroll in “Humanitarian Action in the Digital Age,” a massive open online course for anyone interested in learning more about these issues.

This episode is now available for the hearing impaired.

[Electronic music plays]

[DOMINIQUE] Many of us have been in one of these scenarios: A call with a loved one drops in the middle of a conversation. Your personal information gets caught up in a data breach. Or maybe, you struggle to determine whether that viral piece of news circulating on social media is real or fake.

But now picture yourself with one of these digital dilemmas in the midst of an armed conflict or other situation of violence anywhere in the world. How do technologies affect our choices and options in a crisis? And what is the International Committee of the Red Cross doing to alleviate some of these issues?

Today, we unpack digital dilemmas, which for me, will get really personal as a participant in a hypothetical situation.

I’m Dominique Maria Bonessi, and this is Intercross, a podcast that offers a window into the work of the ICRC and shares the stories of those affected by armed conflict and other situations of violence.

[Intercross musical interlude plays]

[DOMINIQUE:] Hi Laura. How are you today?

[LAURA:] I’m good thanks.

[DOMINIQUE:] This is my colleague Laura Walker McDonald. She works with the Washington Delegation of the ICRC as the senior advisor on digital technology and data protection. Laura what are we going to be doing today?

[LAURA:] Well in a little while you’ll be speaking with my colleague Philippe Stoll, who is the ICRC’s senior techplomacy delegate in Geneva, where we are headquartered. But first I wanted to go through some hypothetical digital dilemmas with you. These are things that we’ve been thinking about at the ICRC for many years now, as we see how new technology is being used, and misused, in warfare.

And as part of that we’ve created different hypothetical scenarios based on real stories. These help explain how issues like connectivity, hate speech, data protection, and artificial intelligence (or AI) can play out during conflict. We really want to get people thinking about how to mitigate these digital risks for us all.

[DOMINIQUE:] And just a note for our listeners. We’ll be putting the link up to the online experience and other material on the Intercross website so you can try this at home.

[LAURA:] Great! So today, I have a modified version for us to walk through. Keep in mind these are things that have taken place, that have actually happened to people affected by armed conflict and other situations of violence, but we don’t mention any specific places or people. It’s just a hypothetical. Are you ready?

[DOMINIQUE:] Ready!

[Background music plays]

[LAURA:] You wake up on a nice day in your city. You get out of bed. Get dressed. And head out the door.

[Sound of an explosion]

Your city is under attack. But if that weren’t bad enough, in the same week, your city is hit by a natural disaster. You are alone. When you step outside, you see that people are suffering.

[Voice of man] “We all left. Rockets are falling every day. The shelling is really random, killing women, children. Everything has been devastated.”

[LAURA:] You survive the crisis. However, in the ensuing chaos, you lose track of your family.

[Sound of walking through rubble]

You’re walking through the rubble of your old neighborhood to find your family. In the rubble, you can’t seem to find anyone you know. Luckily, you still have your phone with you.

So, here’s the first question. What do you do? Do you call your family or try sending them a message?

[DOMINIQUE:] I would probably call them right away.

[LAURA:] OK, you’re calling them, hoping that one of them will pick up so you can hear their voice and know that they’re okay. The phone searches for a signal.

But there is no connectivity in your local area.

It seems that either owing to the conflict or disaster, the network has failed and there is no mobile connection. As you move around looking for your family (or other ways of contacting them), you see a friend. She tells you that the only place that still has internet connection is the military base, over six miles out of town.

So now, what do you do? Do you keep searching in the rubble or go to the military base to find a connection?

[DOMINIQUE:] I’m not sure I’d be able to continue looking with much luck. The area is desolate and without phone connectivity. I guess I’ll go to the military base.

[LAURA:] You arrive at the military base with your friend. There are a lot of people from your area looking for ways to reconnect with their loved ones. You are beyond relieved to learn your parents are there, too.

But after you’ve reconnected with them, they have bad news to share. Your younger brother is still missing. They haven’t seen him or heard from him.

You are able to connect to the base’s WiFi. Some of the other people are saying that a group has been set up on social media to help families find each other in the chaos caused by the crises.

So here’s the next choice you have to make. Do you join the group and post a picture of your brother and the last place you saw him?

Or, do you not join the group, and mark yourself as “safe” on social media and ask about your brother on your personal social media wall? Or do you do something else entirely?

[DOMINIQUE:] Personally, I would choose to join the group and add a picture of where I last saw my brother, and see if anyone would respond to my post…

[Music fades]

Looks like people are responding…

[Sound of phone notifications and messages pop up on the screen]

[VOICE READING MESSAGE] “I don’t know it’s such a mess out here…”

[DING]

[VOICE READING MESSAGE]  “There’s a place near town hall where the red cross/crescent has set up, they’re helping people find families.”

[Sound of additional notification dings increasing in volume and frequency]

[DOMINIQUE:] After choosing to post the photo in the group, the scenario provided some super useful information in trying to reunite with my brother.

But it also included other, hate-filled messages that make a very different point…

[Sound of phone notifications and messages pop up on the screen]

[VOICE READING MESSAGE] “Your filthy community is making this country sick.”

[DING]

[VOICE READING MESSAGE] “We don’t want any more of your people!”

[DING]

[VOICE READING MESSAGE] “You should be punished.”

[LAURA:] And it looks like there are a lot of low-quality photos of your brother in a detention center. The photos have gone viral. Okay, Dominique. You have to decide: do you reply to defend yourself and your family, or do you just try to ignore them?

[DOMINIQUE:] I think I would try to ignore them, but that photo is hard to ignore. Could my brother really be in a detention center? I don’t want to escalate the situation.

[Music fades in]

Looks like it’s really heating up on social media, but some people are trying to help.

[Sound of phone notifications and messages pop up on the screen]

[VOICE READING MESSAGE] “Found your brother, we’re at the station.”

[DING]

[VOICE READING MESSAGE]  “You should be ashamed of yourself and what you’ve done to this country.”

[LAURA:] You are able to find your brother, but the city is ravaged by bands of armed people, and more are joining in the clashes.

[Voice of news broadcaster] “Seems to be another case of another loner who found community in his online bigotry. And then his online life, his online hate became real world action.”

[LAURA:] You and your brother return to the military base to get access to the WiFi again, but the base is under attack. Some people with guns have started shooting.

In the confusion, your brother gets injured but you manage to pull him away. With no doctor available and no cell or internet connection, you get to the nearest town. There’s open WiFi available in one of the bars, but many people have already evacuated.

Here’s your next choice to make: Do you connect to WiFi and search for doctors and hospitals nearby?

Or, do you connect and look for local emergency numbers?

[DOMINIQUE:] I think it would be better to search for hospitals nearby since my phone doesn’t seem to have much connectivity outside of this WiFi hotspot.

[Sound of typing on a smartphone]

[LAURA:] Looks like the nearby hospital is closed and much of the information online isn’t useful for seeking humanitarian assistance. But you learn from someone at the bar that there is a non-governmental humanitarian organization at a displaced persons camp not too far from town. They might be able to offer you food, shelter, and medical attention for your brother.

Do you try to find the camp or go back to the military base?

[DOMINIQUE:] I think I’ll try getting help from the humanitarian organization at the camp.

[LAURA:] You arrive at the camp, and before you can get some food and medical assistance for your brother, an aid worker says you’re required to provide your fingerprints.

Do you let them scan your fingerprints, or refuse and forgo any food and medical assistance for your brother?

[DOMINIQUE:] I guess I’ll have to give them my fingerprints if I want to get help for my brother. But that information is sensitive, couldn’t someone steal it? How do I know my personal information is safe?

[LAURA:] After scanning your fingerprints, your brother receives medical care and you both get some food rations. The camp provides shelter as well, but you still have your family back at home. It looks like people are able to return home to start rebuilding their communities, despite some ongoing conflicts around the area.

Do you leave the camp and the humanitarian aid you’re receiving? Or do you try to go back home to be reunited with your family?

[DOMINIQUE:] I think I’ll try to go back home. Now that my brother has received medical care, I think it would be best for us to return to our family.

[Sound of walking on gravel]

[LAURA:] You return home. In the months ahead, you find out that the country’s armed-opposition group took over the displaced persons’ camp where your brother received medical care. The fingerprints you provided to them are now under the authority of the opposition group, which has been killing members of your ethnic group.

[DOMINIQUE:] What am I supposed to do about that? If I try to confront the group, I could put myself and my family at risk!

[LAURA:] Okay yeah how do you feel?

[DOMINIQUE:] I felt a lot of ambiguity about the choice I didn’t make and how things could have been different and/or better.

[LAURA:] Yeah, and for many people these are real choices that they are having to make. And that’s what we were trying to show with these digital dilemmas: to create an awareness around the massive impact of these technologies for people who are affected by conflict or disaster or both, and to encourage people and decision-makers to think through these types of situations and mitigate risks so that people can choose safely.

[DOMINIQUE:] Yeah, and it really put me in the shoes of someone having to make those decisions and seeing how digital technologies, from social networks to biometrics and connectivity, are transforming lives in certain crisis situations. I realized it wasn’t so much about making the right decision as it was about having to accept the real-life consequences of the decision you needed to make during an emergency. Thanks for this, Laura.

[LAURA:] You’re welcome.

[DOMINIQUE:] Now we’re going to turn to an interview with Philippe Marc Stoll, the ICRC’s senior techplomacy delegate in Geneva, who really spearheaded this whole project.

But first, I’d like to take a quick break to have you listen to our sister podcast Humanity in War, hosted by Elizabeth Rushing.

[AUDIO Humanity in War podcast trailer]

[DOMINIQUE:] I’m now with Philippe Marc Stoll, the mastermind behind the scenario we just went through. Thanks for joining us Philippe.

[Philippe Marc Stoll] Thank you for having me. It’s a great pleasure.

[DOMINIQUE:] Can you talk about why the ICRC is talking about digital dilemmas, and how this is part of our mandate and our work?

[Philippe Marc Stoll] “We saw over the past decade a growing impact of digital technologies and digital tools in humanitarian crises, and more specifically in complex settings. You can imagine the growing use of autonomous weapons; we see a growing number of cyberattacks. So that is on the side of how conflicts are fought. We also see a growing use of digital technologies by humanitarian actors, by organizations, where we use different tools to register people, or to try to connect families through communication means. The pressure, or the need, comes also from people themselves, those who are affected: they want connectivity, they want to send information and news to their family members. But all these new pushes come with new risks, where we see a growing impact of surveillance, of misuse of certain data.”

[DOMINIQUE:] And then bringing this into an experience that people can actually do from the comfort of their office or home computer, wherever they are. Why? What was the thought behind that?

[Philippe Marc Stoll] “There was clearly a need to explain what the consequences of all this digitalization are. Unlike the classic physical world, where if I speak about hunger or about torture anyone can kind of understand what it means, because you can physically feel it, when we speak about hacking, about surveillance, about misuse of identity and so on, it’s less easy to understand, because very few people have witnessed it or lived it in our world. And even if you have lived it, the consequences in, let’s say, Washington or Geneva are not the same. One example I like to give is geolocalization. I guess everyone has this impression that whatever you do is recorded on your phone, and someone, or some company, knows where you go, whether you prefer this grocery shop to that grocery shop, where you go to eat, and so on. If it’s in a context that’s pretty safe, you can live with it. But if you are, let’s say, in South Sudan, in Syria, in Ukraine, it has a totally different dimension. You might not like so much that people know where you go for your religious habits; you might not want people to know that you are friends with this person or that person, or that you come from this neighborhood. So these are the little differences between a classic or basic use of technology and its use in conflict settings, where the consequences are much more complex.”

[DOMINIQUE:] In the experience we just went through one of the first dilemmas I had to face was a connectivity issue, and not being able to get hold of family. It might seem really obvious, but why is it critical to have access to internet in a conflict zone?

[Philippe Marc Stoll] “I don’t know how often you travel, Dominique, or how often our listeners are traveling, but I guess the first thing you do when you land at an airport or reach a place is send a message to your family, to your friends, just to say, ‘I’ve landed safely,’ ‘I’m here safely,’ and so on. Conflict situations are not different from that. You want to remain connected with your family, with your friends. You want to say where you are in case something happens, but you also want to know how your family members are, whether they have stayed behind or, on the contrary, whether they have moved. So I think the question of access to connectivity is really a lifeline. And that feeling of not being able to be connected, I guess everyone has lived it in one way or another. But it didn’t last for long. In certain places where we work, it lasts for weeks, if not months, and in some places it was half a year.”

[DOMINIQUE:] What about humanitarians themselves and the data they collect from people they’re assisting in emergencies. How can humanitarian organizations provide connectivity to civilians and also ensure that it’s safe?

[Philippe Marc Stoll] “So this is a great question, because that’s one of the biggest dilemmas a humanitarian organization faces: how can we harvest the best of technologies without creating new harms? And here data, as you most probably know, is an issue that many people have an interest in, especially the data that we collect in complex situations where there might not be much data available. So we have to find a way that when we collect information, let’s say about a family member you have lost contact with, with names, age, potentially some medical information, this information is stored and used in a very safe way. So for us, it’s about finding the balance between using something that will help solve a case as fast as possible, because you can cross-check information, because you can connect with people in a third country, and so on, while being sure that the safeguards on access to this information are very strong. It means we have to work in an environment where humanitarian data are protected from a legal perspective, but also from a technical perspective. And that is becoming a strong challenge for humanitarian organizations, because very often people don’t give money to a humanitarian organization to buy servers or upgrade cybersecurity tools, because it costs a lot; donors, or private donors, want the money to go directly to the field. So this is one of the challenges and dilemmas, as you mentioned, that organizations face quite a lot: how to be safe and work according to the rules, but also take advantage of digital technologies.”

[DOMINIQUE:] Once this hypothetical scenario played out and I got connected to the internet, I was looking for information on my brother’s whereabouts on social media, and I encountered artificially generated and real hate speech that turned into violence. How does artificial intelligence affect hate speech and disinformation online?

[Philippe Marc Stoll] “So we have yet to fully understand the impact of misinformation, disinformation, and hate speech in the digital and AI age. It’s not a new phenomenon; I think everyone remembers that there were many instances of propaganda and hate speech in conflicts over the past centuries. But today, with these new communication means, it takes on a whole new dimension in terms of speed, span, and scope, and we have yet to measure fully and completely how it plays out in real life, the connection between online misinformation, disinformation, and hate speech and real-life consequences. But what we see right now is a growing concern, a growing spread of these kinds of tendencies, and this is definitely impacting the behavior of people. And we often say that not all misinformation leads to a genocide, but all genocides started with misinformation, disinformation, and hate speech.”

[DOMINIQUE:] In the hypothetical example the person’s fingerprints are being requested of them to receive humanitarian assistance from an aid organization. This is considered biometric data. I think this is one of the more surprising instances of a digital dilemma because you might not see the effects unless your information is caught up in a data breach or under the authority of a party outside of the aid organization. How can states adhere to international humanitarian law in this instance and ensure that data isn’t being misused?

[Philippe Marc Stoll] “I think here we are speaking about very, very sensitive data. Biometric data is something that contains quite a lot of information about you, including some health information, especially if it’s an iris scan, for example. But it’s also something that, unlike a password, you can’t change if someone gets access to it. It’s there forever. So the responsibility a humanitarian organization takes on when it collects such information is even bigger. So what we are trying to pass on as a message to states, but also to other humanitarian organizations, is that these data should not be requested and should not be sought. We want to have an agreement at the international level that data collected for humanitarian purposes should be used only for those purposes, and that no one else should ask for or get access to them, lawfully or unlawfully. In October 2024, we will try to bring this to the attention of states during a big international conference, to have a resolution, as we call it, so that data collected for any type of humanitarian activity is protected.”

[DOMINIQUE:] And just to be clear, I want to go back to something that you said: the ICRC doesn’t use biometric data as a sort of exchange for humanitarian aid, is that correct?

[Philippe Marc Stoll] “So the ICRC has a very specific policy of using biometrics only if it is helping to solve a humanitarian problem. We might use biometric data in the case of reuniting family members, or especially in identifying dead bodies, so we might use DNA testing and these kinds of technologies. But for registering someone to get food, no, that is, so far, not something we will do. So it’s this kind of finding the checks and balances between the pros and the cons, and then taking an informed decision.”

[DOMINIQUE:] And you’ve sort of been mentioning this all along, but how can private industry and states work to help the ICRC and other humanitarian organizations achieve safety for civilians in cyberspace?

[Philippe Marc Stoll] “For us, it’s a mix of being true to very old principles, the fundamental principles of the Red Cross and Red Crescent, and they are more than 50 years old. It touches the questions of neutrality and independence, but it’s also our way of working: we want to be confidential, we don’t want to create additional problems for the affected people, so we have this as a basis. But then it means investing in new capacities. We need to have tech colleagues who understand how the technology functions, and we need to work together to translate this into the requests and questions we have. And therefore we try to engage with states, with armed groups, with academic institutions, but also with the private sector, to try to bring to their attention the challenges and dilemmas that we face, but also to try to make them understand that the consequences of some of these tools, or the issues that we see in technology, cause real harm. And I think this is really the purpose of such an experience as the digital dilemmas, and I hope people feel it: that the impact of something that sounds pretty simple in a context such as Washington or Geneva is totally different in a conflict setting.”

[DOMINIQUE:] So how can we, as humanitarian workers and humanitarian organizations, learn more about digital technologies? How can we be digitally literate?

[Philippe Marc Stoll] “We see this as a huge gap and need, because people, especially those in the field, have other priorities than learning how the cloud functions, what exactly data is, and so on. And I’m very happy to say that we have teamed up with Doctors Without Borders and the federal polytechnic school of Switzerland based in Lausanne to create a massive open online course, or MOOC, called ‘Humanitarian Action in the Digital Age.’ It should help colleagues or young humanitarian workers who want to know more to unpack, in quite a simple manner, digital technologies, what they are, and also how to mitigate the risks and take advantage of the opportunities.”

[DOMINIQUE:] Final question is, do you have a message for the international community that you’d like to impart?

[Philippe Marc Stoll] “For me, the first element is that we should all agree that digital harms are real harms, that a cyberattack on civilian infrastructure is illegal. We need to be sure that something that happens today is not going to have a consequence in 5 or 10 years’ time, when the conflict is over but there is still a trace that this person was injured or was benefiting from ICRC assistance. So that’s the first thing. We also have to try to make them understand that we as a humanitarian organization cannot solve all these problems by ourselves, that we need new means, new funding, to get there. And ultimately, it is to be sure that digital technologies are just a tool, one that should be spared from any attacks, especially when it is used by a humanitarian organization.”

[Background music fades in]

That was Philippe Stoll, the ICRC’s senior techplomacy delegate in Geneva.

Thank yous this week go to Laura Walker McDonald and Jonathan Horowitz, our in-house digital technology experts.

If you want to learn more about digital dilemmas you can visit us at intercrossblog.icrc.org. There you can also subscribe to our newsletter so you never miss a podcast.

You can also follow us on Twitter @ICRC_DC to read more about our work with digital technologies.

See you next time on Intercross.

[Outro music]