During Alter.Next 2022, Professor Yuval Noah Harari, historian, philosopher, and bestselling author, joined us to talk about how to incorporate ethics into data and analytics. Check out some of the hot issues he covered in our session recap.
Professor Harari is an expert on how technology impacts our lives. He has won countless awards for his thoughtful and sometimes provocative thinking, and his books are recommended by top minds in the technology world, from Bill Gates to Mark Zuckerberg, as well as political leaders such as Barack Obama.
AYX: You’ve said when people have control of data, they can have a profound impact on society and you view coders and engineers as philosophers of our society going forward. Can you elaborate on what you mean?
Professor Harari: Think of the self-driving vehicle. If a kid jumps in front of a self-driving car, and the only way to avoid running over the kid is to swerve to the side and run over an old lady, or to risk killing the owner of the car, who is asleep in the backseat, what should the car do?
To put a self-driving car on the road, the coders need to decide. They need to program the algorithm. The amazing thing is that 99% of the time, algorithms will do what the coders decide, so the responsibility of coders is much, much bigger.
AYX: What’s the danger of data and analytics? What bad can data and analytics lead to, even when we have good intentions?
Professor Harari: There’s the potential for digital dictatorships, a new type of regime in human history that is able to follow people around 24 hours a day, analyze that data, and, based on it, create a totalitarian society where you have zero privacy, even inside your head, because the regime knows what you think and what you feel.
Another very big danger is data colonialism. To conquer a country, you don’t need to send soldiers. You just need to take the data out. Imagine a situation in which the data of a large part of the world is harvested and sent to an imperial center, where it is used to create sophisticated technology, AI, algorithms, and smart machines, which are then sold back to the colonies.
We’re seeing this situation develop rapidly now, and these scenarios are quite frightening in their potential. But it’s not inevitable. It depends on the decisions that people, engineers, technicians, and entrepreneurs make.
AYX: Will humanity go wrong with technology in the future?
Professor Harari: No. Technology doesn’t force us to use it in just one particular way. You can use a knife to murder somebody, to cut salad, or to save a life. The knife doesn’t force you to do this or that.
The same is true of technology. AI, big data, and surveillance can be used to create the worst totalitarian societies in history or the most equal and prosperous societies. It depends on us. We have agency. We can decide which kind of tools we are creating.
The big question is then — what do you do with this kind of power?
AYX: In a world of constant change, is there a way to make learning new things easier?
Professor Harari: As individuals, there are limits to how much we can change just by ourselves. But when we connect with a lot of other people, enormous change is possible, because very often you rely on strangers for your very survival. You don’t need to know how to do everything by yourself.
I think a key skill is to get to know yourself better: who you really are, what is really happening in your mind and body. This is the key to continuing to change throughout life and to protecting yourself against increasingly aggressive manipulation.
Want to re-experience Alter.Next 2022?