Decentralized AI is key for self-sovereignty — Onicai executives

As a seasoned analyst with a background in both technology and sociology, I find the manifesto for Decentralized AI by Dfinity Foundation and Onicai to be an exciting development in our rapidly evolving digital landscape. The potential for abuse of centralized AI by powerful entities is a concern that resonates deeply with my experiences observing the impact of technology on society.

Researchers at the Dfinity Foundation (the organization behind the Internet Computer Protocol) and leaders from decentralized AI company Onicai have unveiled a document called “The Manifesto for Decentralized Artificial Intelligence.” The manifesto lays out seven key principles for user-centric AI that serves individual users rather than large institutions.

During an interview with CryptoMoon, Onicai CEO Patrick Friedrich warned that the centralization of AI could hand powerful corporations or government entities nearly unchecked power, allowing them to manipulate and exert near-absolute control over populations.

“Going forward, with more and more AI agents that act autonomously, we don’t know what all of them can do, and we want to make sure they are not biased by some bigger interest — whether those be governments, political parties, or huge organizations and companies.”

The Onicai executive suggested that one way to address information suppression or distortion is to run decentralized AI as smart contracts on open, permissionless networks, which are persistent, open source and highly transparent.

Such a transparent set of rules would govern and constrain AI behavior while letting users run their AI with custom parameters on local storage, decentralized clouds or hybrid setups, giving them control over the entire software stack behind their AI.
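To make that idea more concrete, here is a minimal sketch in plain Python. It is not Onicai's or DFINITY's actual interface; the names StorageBackend, TransparencyPolicy and UserAIDeployment are hypothetical, and the point is only to show how such rules and user-chosen storage backends could be expressed as ordinary, auditable configuration rather than hidden provider settings.

```python
# Illustrative sketch only (not Onicai's or DFINITY's API): the rules constraining
# an AI agent are plain, inspectable data, and the user decides where the model
# state lives (local disk, a decentralized network, or a hybrid of both).

from dataclasses import dataclass, field
from enum import Enum


class StorageBackend(Enum):
    LOCAL = "local"            # model weights and history on the user's machine
    DECENTRALIZED = "onchain"  # e.g. a smart-contract-hosted deployment
    HYBRID = "hybrid"          # weights on-chain, private data kept local


@dataclass
class TransparencyPolicy:
    """Human-readable rules the deployment must satisfy; because they are plain
    data, anyone can audit them before or while the agent runs."""
    open_source_model: bool = True
    log_all_prompts: bool = False  # the user decides whether usage is recorded
    allowed_data_sources: list[str] = field(
        default_factory=lambda: ["user_documents"]
    )


@dataclass
class UserAIDeployment:
    model_name: str
    backend: StorageBackend
    policy: TransparencyPolicy
    # Generation parameters controlled by the user, not a provider.
    temperature: float = 0.7
    max_tokens: int = 512

    def describe(self) -> str:
        return (
            f"{self.model_name} on {self.backend.value} storage, "
            f"open-source={self.policy.open_source_model}, "
            f"temperature={self.temperature}"
        )


if __name__ == "__main__":
    # A user spins up their own agent with parameters they chose themselves.
    deployment = UserAIDeployment(
        model_name="example-open-weights-llm",  # placeholder, not a real model
        backend=StorageBackend.HYBRID,
        policy=TransparencyPolicy(log_all_prompts=False),
        temperature=0.2,
    )
    print(deployment.describe())
```

In this framing, switching from a centralized provider to a local or hybrid deployment is a one-line configuration change the user makes, rather than a decision made on their behalf.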

Stifling innovation and ignoring the niche

According to Onicai chief technology officer Arjaan Buijk, the project is not driven by fear but by the possibilities that artificial intelligence offers.

Buijk told CryptoMoon that most AI companies concentrate on versatile, general-purpose models rather than specialized models tailored to niche tasks, because those niche applications rarely generate substantial profit. As a result, fewer unique and specialized AI implementations ever get built, a lack of diversity that could hinder future advances and limit the technology's overall growth.

In November 2024, the Artificial Superintelligence Alliance (ASI) introduced Cortex, a decentralized AI system designed for industrial use. Cortex lets businesses customize AI models to their own requirements, reducing dependence on standardized, centralized AI systems.

Michael Casey, a co-founder of the Decentralized AI Society, told CryptoMoon that the non-profit organization is exploring financing options for AI initiatives as an alternative to projects primarily backed by venture capital.

Anthropic CEO Dario Amodei told Lex Fridman on November 11 that AI with human-like capabilities could arrive sometime between 2026 and 2027.

By Amodei's estimate, artificial general intelligence (AGI) could emerge as early as 2027, significantly sooner than the 2030s or later that many AI researchers had previously forecast.
