President Barack Obama affirmed as much in a speech given at Stanford University in California on April 25. It was against a backdrop of unprecedented democratic tension that the president spoke about the threats social networks can pose to public life, even though these tools are also an undeniable source of innovation. So what is to be done? The actors of this world, whatever their position on the social, economic and technological chessboard, must act in the right direction. Owners of these innovations, designers, engineers, scientists, users, citizens and political leaders: everyone must understand their rights, duties and obligations.
Among the algorithms that run on social networks are content recommendation algorithms, which suggest posts to see or users to follow; user classification algorithms, which categorize individuals according to their profile and their actions on the platform; and algorithms that detect violent or pornographic content, for example. Without access to the inner workings of these platforms, it is difficult to describe their algorithmic underpinnings with certainty, but let's attempt an exploration together.
Categorization and recommendation algorithms
Categorization algorithms use statistical learning to build classes of individuals with similar profiles, whether static (their age, region, number of connections) or dynamic (their frequency of content sharing, the average sentiment of content liked and shared). Content suggestion algorithms rely, among other things, on your profile (through your class) and on the type of content in question. They can suggest posts that other users in your class have liked, assuming you will like them in turn. When these algorithms are poorly designed or poorly tested, they can lock you into bubbles of observation of the world, limiting you to the content consumed by those who resemble you. This is how conspiracy theories and fake news explode across the networks, endangering democracy.
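The mechanism described above can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration, not any platform's actual algorithm: users are bucketed by toy profile features, and a user is recommended the posts most liked by others in the same bucket.

```python
# Hypothetical sketch of class-based recommendation (illustrative only):
# users are grouped into "classes" by simple profile features, and a user
# is recommended posts liked by other members of the same class.

from collections import Counter

# Toy features: a static/dynamic "bucket" plus the set of posts each user liked.
users = {
    "ana":  {"bucket": ("18-25", "high-sharing"), "liked": {"p1", "p2"}},
    "bob":  {"bucket": ("18-25", "high-sharing"), "liked": {"p2", "p3"}},
    "carl": {"bucket": ("40-60", "low-sharing"),  "liked": {"p4"}},
}

def recommend(user, users, k=2):
    """Suggest up to k posts most liked by same-class users,
    excluding posts the user has already liked."""
    me = users[user]
    votes = Counter()
    for name, other in users.items():
        if name != user and other["bucket"] == me["bucket"]:
            votes.update(other["liked"] - me["liked"])
    return [post for post, _ in votes.most_common(k)]

print(recommend("ana", users))  # ['p3'] — only bob shares ana's class
```

Note how the sketch also exhibits the bubble effect the text warns about: ana is never shown carl's post, because carl sits in a different class.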
Recommendation algorithms can also highlight content, regardless of your behavior on the platform, that is massively shared over short periods of time, on the theory that it will engage you more. Indeed, we tend to share, comment on or like controversial content. This is how hateful and transgressive remarks contaminate the networks, propagating in time and space with a certain nervous energy. These algorithms work this way because they underpin an economic model that relies mainly, if not solely, on the economy of our collective attention.
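A minimal sketch of such burst-based ranking, assuming a made-up scoring rule (real platforms do not document their formulas): posts are ranked by how many shares they received in a recent time window, so a sudden burst of sharing, however inflammatory, floats to the top.

```python
# Illustrative "trending" ranking (a simplification, not a documented
# platform formula): count each post's shares in the last hour and sort.

def trending_score(share_times, now, window=3600.0):
    """Number of shares in the last `window` seconds before `now`."""
    return sum(1 for t in share_times if now - t <= window)

# Share timestamps in seconds: one post shared slowly, one in a recent burst.
posts = {
    "calm_essay":   [100.0, 5000.0],
    "outrage_bait": [9000.0, 9100.0, 9200.0],
}
now = 9300.0
ranked = sorted(posts, key=lambda p: trending_score(posts[p], now), reverse=True)
print(ranked)  # ['outrage_bait', 'calm_essay']
```

The design choice is the point: because the score ignores who you are and rewards only share velocity, content engineered to provoke reactions systematically outranks slower, calmer material.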
Now consider the algorithms that detect inappropriate content such as violent and pornographic posts. This problem is much harder than it seems. First of all, the inappropriate content that we do see is content that escaped semi-automatic detection (human moderators also intervene, as we know), and it perhaps represents only a small fraction of all such posts uploaded to the web. Even if 90% of this content were detected, the remaining 10% would still be, rightly, unbearable for users. That said, we must admit that the owners of these networks sometimes adopt a posture closer to ideology than to objective analysis, as when Meta decides that a mere nipple is synonymous with pornography.
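A back-of-envelope calculation makes the scale problem above concrete. The volume figure below is invented purely for illustration; the arithmetic is the point: even a very high detection rate leaves an enormous absolute number of posts uncaught.

```python
# Illustration of why "90% detected" is not enough: at large volumes,
# the residual percentage is still a huge absolute number of posts.
# The daily volume here is a made-up figure, chosen only for scale.

posts_per_day = 10_000_000  # hypothetical volume of inappropriate posts

for detection_rate in (0.90, 0.99, 0.999):
    missed = posts_per_day * (1 - detection_rate)
    print(f"{detection_rate:.1%} detected -> {missed:,.0f} posts slip through daily")
```

Run as written, even a 99.9% detection rate still lets thousands of posts through every day at this hypothetical volume, which is why moderation is judged on absolute exposure, not percentages.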
Designing good development practices
So how can these many problems be solved in order to protect democracy? First of all, it is not the algorithms that are at fault but those who conceive them, design them, sell them and use them. The solution lies not in rejecting these technologies but in how we build and use them. The avenues to explore involve all types of actors (tech giants, users, political leaders, etc.) at every scale (from the individual to the State). These solutions must also coexist; otherwise their mutual effectiveness is greatly reduced. It is essential that States build relevant regulations that protect the fundamental rights of individuals, and therefore democracy, while encouraging innovation.
What Europe and the United States have achieved with the GDPR (General Data Protection Regulation) and the CCPA (California Consumer Privacy Act), respectively, should be replicated for algorithmic systems. To do this, technology players must be required to design and apply good practices for developing, testing and using these technologies. This also means explaining to users the types of algorithms that run on these networks (without revealing intellectual property) as well as the types of data collected to make them work, with the option for users to deactivate certain algorithmic components, such as recommendation and therefore the personalized editorialization of content.
To hope to eliminate inappropriate posts to within a margin of error, the tech giants must share the design and source code of these algorithms (which is in no way an economic differentiator) with the rest of the world's scientific and technological community, including researchers in university laboratories. At the same time, leaders must build public policies for funding private and public research and development to support state-driven innovation, including at the European level.
These must of course be accompanied by major projects and national plans for educating the next generations of citizens in algorithmics, so that they can defend their rights and understand their duties. Together, these solutions sketch out an algorithmic governance for the legal and natural persons on this chessboard. A complementary approach would be for the owners of these tools to revisit their business model. Recall that some GAFAM companies post staggering revenues ($1.567 billion in 2021 for Twitter). Doing less in order to do better then seems reasonable. We should recognize that this is what we overwhelmingly do at an individual scale in our professional and social lives, without conceptualizing it that way.
Well constructed and well used, social networks are powerful vectors of communication, sharing and discovery. One need only see the number of people who have met on these platforms through intellectual and emotional affinities, or the number of talented individuals and inspiring initiatives they have revealed, to affirm that these networks deserve to exist. But this will inevitably require a transformation of the system leading to a healthy connected world. In this specific case, innovation can even support democracy, by freeing speech and confronting ideas in the service of profound social progress.
Democracy and social networks can coexist (Aurélie Jean)