By Oyintari Ben
Leaders in the field of artificial intelligence want the development of powerful AI systems to be halted, warning that humanity could be at risk.
They say the race to build AI systems is out of control, and have signed an open letter warning of the potential consequences.
Elon Musk, the CEO of Twitter, is among those who want the training of AIs above a certain capability to be halted for at least six months.
Steve Wozniak, a co-founder of Apple, and a few DeepMind researchers also signed.
OpenAI, the developer of ChatGPT, recently unveiled GPT-4, a state-of-the-art model that has impressed observers with its aptitude for tasks such as identifying objects in photos.
The letter, published by the Future of Life Institute and signed by these figures, asks for development at that level to be paused temporarily, and warns of the dangers that future, more sophisticated systems could pose.
The letter warns that AI systems with intelligence on a par with humans pose serious risks to society and humanity.
The Future of Life Institute is a non-profit organisation which says its mission is to “steer transformative technologies away from extreme, large-scale risks and towards benefiting life.”
Mr Musk, the owner of Twitter and chief executive of the carmaker Tesla, is listed as an external adviser to the organisation.
The letter says advanced AIs should be developed with careful planning, but that instead, “AI labs have been locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict, or reliably control.”
The letter warns that AI could be used to automate jobs and flood information channels with false information.
The letter follows a recent report by the investment bank Goldman Sachs, which predicted that while AI would probably boost productivity, it could also automate millions of jobs.
Other experts told the BBC that it was very difficult to predict how AI would affect the labour market.