Social networks, AI, big data: digital technology has upended social structures and undermined what holds them together: trust. How can it be redefined? To answer this question at an FIC plenary session, Jean-Gabriel Ganascia, Michel Bauwens and Éric Salobir drew on history, philosophy, sociology and their own digital expertise.

When three intellectuals tackle the topic of trust in the digital world, the outcome opens up dizzying perspectives. For those who were unable to attend the two-hour plenary session at FIC 2023 devoted to this issue, inCyber is pleased to offer you a quick synopsis. We give the floor to Jean-Gabriel Ganascia, professor at the Sorbonne and president of the CNRS ethics committee, Michel Bauwens, computer scientist and cyberphilosopher, and Éric Salobir, priest and founder of the OPTIC network, which promotes technology for the benefit of humanity and the common good.

Jean-Gabriel Ganascia: Do trust and digital technology go together? The question is ambiguous. It can mean « can digital technology absorb trust? », in other words, does it produce more trust? Or it can mean that we want to create digital technology we can trust. Careful: trust does not mean fidelity, nor is it proof: when you trust someone, you risk being wrong. There are several types of trust. There is trust in individuals, there is trust in institutions, and then there is trust in machines.

Éric Salobir: Indeed, and the whole challenge with trust in digital technology is creating the conditions for trust in things that we cannot see, whereas by definition, we always tend to trust what we see. With generative AI, for example, we are now saying « I can't believe my eyes », and in fact, we cannot believe our eyes anymore. Digital technology has completely upended the conditions for trust.

Michel Bauwens: As society has become more complex, people have lost the ability to rely on trusting those around them, their direct acquaintances (the famous Dunbar's number).

Distributed trust with the blockchain

Today, we are in a different sort of peer-to-peer environment: we must coordinate non-locally. We have to trust people with whom we share a project or a belief, but who are not nearby. So we are forced to connect with our peers via proprietary platforms, for which we are more or less livestock for data extraction. This is fundamental: there is no institution representing this new sociology. Our institutions are essentially geographical, such as the nation-state.

Éric Salobir: The question is why these business models have emerged. Ultimately, we did not want to be paying customers, so we became the product. And the question now is, « how do we think about these new business models? ».

Jean-Gabriel Ganascia: In ancient times, trust was based on one's word, and a witness was worth more than writing. As groups expanded, writing became more important. Now, the major transformation is that trust will pass through machines. With blockchain, for example, there will be new types of trust, and this trust will be distributed, since it will no longer rely on trusted third parties such as an institution, a central bank for currency, or a government.

Éric Salobir: What is disturbing is that this trust in machines comes at the expense of trusting people: « trust the blockchain so you no longer have to trust your neighbours ».

ChatGPT, the avatar of the « golden calf »?

The trust we had in fiduciary money, from « fides », or faith, was both trust in the person and trust in the economy, in the group. All of this is disappearing, in a rather Hobbesian perspective: if man is a wolf to man, then I prefer the blockchain.

How will we build a society on this type of technology? How will we benefit from these technologies? I don’t want to get to a point where smart contracts end up killing the social contract.

Jean-Gabriel Ganascia: When I was talking about the blockchain, it wasn't about trust in the machine, but trust through the machine. Trust in the machine, that's ChatGPT, which is seen as an oracle. It has a special status, like divination. When we say « I asked ChatGPT », that is exactly what we mean.

Éric Salobir: Yes, we anthropomorphise this machine by asking it questions. It appears to be endowed with speech, with a discrepancy between its formal perfection (it speaks well and gives the impression of being well argued) and its complete lack of common sense. ChatGPT can say any number of things; it's a gasbag, but that does not seem to be a problem. People put their trust in AI in the same way that, in the Hebrew tradition, the people made the golden calf their god. The goldsmiths didn't make it an idol; the people did. Are we not building the same relationship with technology?

« Capitalist » American tech firms

Michel Bauwens: To avoid this, we need to build new institutions that reflect this virtual reality. These are actually being created in the open source community, such as the FLOSS foundations, which manage collective infrastructure in a non-territorial and often democratic way; the Linux Foundation is one example that comes to mind. I call them magistrates of the commons. The same could apply to data, with data trusts, data commons and data cooperatives, as a way to partially escape those « capitalist » American tech giants that capture our attention and our data.

Éric Salobir: We are indeed witnessing an impoverishment of pre-existing institutions, while new ones are struggling to become established. These foundations are great, but unfortunately they remain too marginal. To create new trusted third parties, I think we need three characteristics. The first is independence, including financial independence, which open source foundations lack.

The second is transparency: experts must be able to verify algorithms, ChatGPT metaprompts and how our data is used. The third is that this governance needs to be participative. And here I very much agree with you. The problem is that the scale is global. There will not be one trusted third party but many. The question is, « what architecture do we need so they can all talk to each other? ».

Social media and mob psychology

Jean-Gabriel Ganascia: These pillars you have listed seem absolutely essential to me. Trust is being completely rewritten in our digital societies, and it is up to us to redefine all these criteria. You talked about data. The challenge is that data can be duplicated and falsified. So we need to think about the processes we are going to put in place to rebuild trust in spite of its very fluid nature.

Michel Bauwens: Beyond these issues, I think we need to introduce the notion of online civility, because the online world is very fragmented. Everyone is in their own little tribe of affinities that has access to different information. Each community is battling against the information coming from another tribe. You can’t create a society with this attitude.

Jean-Gabriel Ganascia: We do have communities, but no longer in the old sense, i.e. communities of people condemned by fate to live in the same place, with a duty of solidarity. Today, these online communities are communities of interest. The problem is collective deliberation. Within these groups, we can see concepts from mob psychology at work, as described by Gustave Le Bon and Freud: in a mob, as on social media, people become sensitive and aggressive, and get worked up over nothing. There is no longer one public space, but spaces that are somewhere between public and private.

Covid-19, a « painful » crisis of trust for scientists

Éric Salobir: We have gone from a world of received identity (« I am so-and-so, son of so-and-so ») to a world of chosen identity and multiple identities. Everyone forges their own path. While this gives us a lot of freedom, it also happens somewhat by force, and it contributes to this rather agitated, violent dimension of the public space.

Jean-Gabriel Ganascia: We saw this during the Covid crisis, when trust withered away, and it was particularly painful for us scientists at the time. Doubt is part of a scientist's nature, but here the wider public took scientists hostage over their own doubts: if you can't say for sure, they said, that means you have no idea, and so on.

Michel Bauwens: I don't want to be too negative, but we have entered an era of digital surveillance. When I am on Facebook, I feel a bit like I'm in China: you can't even share scientific articles, because they get filtered.

On the one hand, there are the « monotheistic » media, in the sense that they follow a dominant narrative; online, algorithmic controls work against you. Even if we get the impression of fragmentation, we really have trouble getting the word out. And the danger, of course, is that where there is no speech, there is violence. Will we end up like the Romans, with a breakdown of our structures, or will it be like the 16th century, when we found a crucial solution in the nation-state?

« Monotheistic » media versus a « fragmented » Internet

Jean-Gabriel Ganascia: Besides this fragmentation into groups within societies, digital technology also brings out divisions between cultural zones. For example, I took part in UNESCO's ethics committee when it was working on the ethics of artificial intelligence. I read a certain number of charters, and I can tell you that the European view is not the American or the Chinese one.

Éric Salobir: Exactly. For the Chinese, chaos is the absolute evil, not dictatorship, and that says a lot about their social organisation. The Americans have a very consequentialist view: basically, as long as there's no class action, everything's fine. In Europe, we apply Kant's principle that « your maxim should become a universal law », i.e. if you do something, you should want everyone to do the same.

Michel Bauwens: Even the technological system is being split in two. Huawei can no longer invest here, and the Americans have a law that punishes Americans with up to fifteen years in prison for working for Chinese microchip companies. Even the Internet is breaking up.

Trust in digital technology already needs to be reinvented at the regional and national levels, so at the global level it seems very hypothetical.
