Despite scaling challenges at major AI labs such as OpenAI, Anthropic, and Google, where a shortage of high-quality training data has slowed efforts to build more sophisticated models, progress in generative AI remains robust. Remarkably, OpenAI CEO Sam Altman has hinted that Artificial General Intelligence (AGI) might arrive sooner than expected, with superintelligence just "a few thousand days" away.
Beyond worries about privacy and security, many people are apprehensive about AI because of the possibility that it could bring about human extinction. Roman Yampolskiy, an AI safety researcher and director of the Cyber Security Laboratory at the University of Louisville, estimates there is a 99.999999% chance that AI could lead to human extinction. In his view, the only reliable way to avoid that outcome may be not to build advanced AI in the first place.
To keep AI from running amok or becoming too powerful, Vitalik Buterin, co-founder of Ethereum, suggests implementing a worldwide "soft pause" mechanism: an emergency stop that could halt the technology globally if needed, safeguarding humanity from being overrun by it.
According to Buterin:
The objective is to have the capability to reduce globally available computational power by 90-99% for one to two years at a crucial phase. This temporary reduction would buy humanity additional time to prepare: a single year of emergency preparation in "wartime mode" could be worth a century's worth of work under conditions of complacency. Various methods for implementing such a "pause" have been considered, including mandating registration of hardware and verifying its location.
The Canadian computer programmer suggests a cryptographic technique could provide a more sophisticated way to mitigate AI risk: integrating a trusted chip into industrial-scale AI hardware. The chip would keep functioning only if it received three digital signatures each week from major international bodies, at least one of which is non-military.
Buterin emphasizes that the signatures would be device-independent (if desired, the scheme could even require a zero-knowledge proof that they were published on a blockchain). This makes the mechanism all-or-nothing: there would be no practical way to authorize one device to keep running without authorizing every other device.
He also notes that the requirement to go online once a week to fetch fresh signatures would strongly disincentivize extending the scheme to everyday consumer hardware, since the inconvenience of mandatory regular connectivity would deter adoption in that context.
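The weekly three-signature check Buterin describes can be sketched in code. This is a hypothetical illustration only: the signer names and keys are invented, and symmetric HMACs stand in for the real digital signatures a hardened chip would verify with public keys.

```python
import hmac
import hashlib

# Hypothetical registry of international signing bodies (names and keys
# are illustrative stand-ins, not part of any real proposal spec).
SIGNERS = {
    "intl_body_a": {"key": b"key-a", "military": False},
    "intl_body_b": {"key": b"key-b", "military": True},
    "intl_body_c": {"key": b"key-c", "military": False},
}

def sign(name: str, week: int) -> bytes:
    """Produce a signer's weekly authorization over the week number.
    (HMAC is a simplification; real chips would verify asymmetric signatures.)"""
    key = SIGNERS[name]["key"]
    return hmac.new(key, str(week).encode(), hashlib.sha256).digest()

def chip_enabled(week: int, signatures: dict) -> bool:
    """Chip keeps running only if all three bodies signed the current week
    and at least one valid signer is non-military."""
    valid = {
        name for name, sig in signatures.items()
        if name in SIGNERS and hmac.compare_digest(sig, sign(name, week))
    }
    if len(valid) < 3:
        return False
    return any(not SIGNERS[n]["military"] for n in valid)

week = 2841
sigs = {n: sign(n, week) for n in SIGNERS}
print(chip_enabled(week, sigs))      # fresh signatures: chip enabled
print(chip_enabled(week + 1, sigs))  # stale signatures: chip disabled
```

Because every chip checks the same published signatures, authorization is inherently all-or-nothing, matching the device-independence Buterin describes.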
Sam Altman, CEO of OpenAI, predicts that advanced AI will itself be intelligent enough to address the repercussions of rapid technological change, even those that threaten humanity's existence. Interestingly, he suggests that safety concerns may not materialize at the moment AGI is achieved, because its immediate societal impact might turn out to be "surprisingly minimal." Nevertheless, Altman argues that AI development should be regulated at the international level, much as aviation is overseen by global agencies, to ensure proper safety testing and oversight.
Buterin favors this approach for several reasons: it could halt development at the first early signs of potentially catastrophic harm, while causing minimal disruption to developers in the meantime.
2025-01-06 14:09