Experts say AI could one day trigger a major nuclear disaster
The last decade has seen remarkable advances in technologies based on artificial intelligence (AI). These systems are now advanced enough to be used in a wide variety of fields, such as medicine, art, security, and management. However, certain events show that AIs, as purely logic-based systems, can sometimes make decisions that do not align with our moral values. In a preliminary study of AI risks, 36% of the experts surveyed believe that AI could cause a catastrophe on the scale of a nuclear disaster within this century. Moreover, given the speed at which AI is evolving, the magnitude of these risks may be underestimated.
Thanks to machine learning algorithms, AIs learn by assimilating huge amounts of data and are now used in many fields. An AI can complete in a few weeks work that would take human experts far longer. For example, some systems can detect more than 100 types of tumors with greater reliability than an expert with years of experience. AI systems can also support risk and disaster management, for example by assessing the probability of a bridge collapsing, and thus save thousands of lives by improving prevention measures.
Large language models are even making inroads into domains previously thought to be uniquely human, including art, by generating images on demand. Their ability to make rational, logical decisions has even led to them being placed in major decision-making roles, such as CEO of a large corporation.
The authors of the new study, posted as a preprint on arXiv, nevertheless believe that military use of AI would present a danger. Decision-making based solely on logic could pose risks to humanity, because it would not necessarily take our moral and societal values into account. What decision would an AI tasked with protecting the planet make if it were in command of deadly nuclear or bacteriological weapons? The risk to humans could be substantial, because such a system might at some point conclude that humans are a factor to be eradicated in order to save the Earth.
In a less extreme scenario, AI-driven automation could bring about major societal changes comparable to an industrial revolution. Millions of people could find themselves out of work, much as thousands of workers did during earlier waves of industrial automation.
A survey of 327 researchers
The new study, led by researchers at the New York University Center for Data Science, surveyed the opinions of 327 researchers, all of whom have authored research on AI in natural language processing. The survey found that 36% of these experts believed that a nuclear-level disaster involving AI would be possible within this century.
This doomsday fear was more pronounced among the women who took part in the survey, as well as among participants from minority groups: 46% of the women and 53% of minority-group respondents considered such an event possible. These respondents were also more pessimistic about our ability to manage potentially dangerous future technologies.
In addition, 57% of the scientists surveyed believe that large AI models could one day exceed the intellectual capacities of humans, and 73% believe that the automation of work through AI will bring about profound societal changes. The survey's authors are more concerned about the direct risks of AI than about an ensuing all-out nuclear war. It should also be kept in mind that the survey covered only a few hundred researchers, so its figures may not be representative.
Source: arXiv