As AI "sinks" to the device, is it time for Web3 to show its prowess?

The AI industry is shifting from centralized, large-scale models to localized small models and edge computing, as seen with Apple Intelligence, Microsoft's Mu model, and Google DeepMind's offline robots. This trend highlights the importance of privacy, reliability, and practicality over sheer computing power.

  • Local AI vs. Cloud AI: Cloud AI competes on parameter scale and training data, while local AI emphasizes engineering optimization, scenario adaptation, and privacy.
  • Opportunity for Web3: Decentralized solutions can address trust and collaboration challenges in local AI, such as verifying untampered outputs and enabling privacy-preserving model collaboration.
  • Emerging Web3 AI Projects: Initiatives like Lattica (data communication protocol) and PublicAI's HeadCap (real human data collection) tackle centralized AI's monopoly and credibility issues.

The article suggests that as AI becomes deeply embedded in devices, decentralized collaboration will transition from a concept to a necessity, offering Web3 a chance to shine in infrastructure support for localized AI.


Author: Haotian

Recently, I have been watching the AI industry and noticed an increasingly "downward" shift: from the original mainstream consensus of competing over concentrated computing power and ever-larger models, a branch has emerged that favors local small models and edge computing.

You can see this in Apple Intelligence covering 500 million devices, Microsoft launching the 330-million-parameter small model Mu for Windows 11, and Google DeepMind's robots operating "offline".

What is the difference? Cloud AI competes on parameter scale and training data; the ability to spend money is its core competitiveness. Local AI competes on engineering optimization and scenario adaptation, and goes a step further on privacy protection, reliability, and practicality. (The hallucination problem of large general-purpose models seriously limits their penetration into vertical scenarios.)

This actually opens up a bigger opportunity for Web3 AI. When everyone was competing on "generalized" capabilities (compute, data, algorithms), those were naturally monopolized by the traditional giants. Hoping to take on Google, AWS, OpenAI and the like simply by invoking the concept of decentralization was a pipe dream: no resource advantage, no technical advantage, and no user base.

But in a world of localized models plus edge computing, the situation facing blockchain-based services is very different.

When an AI model runs on a user's device, how can we prove that its output has not been tampered with? How can models collaborate while preserving privacy? These are exactly the problems where blockchain technology plays to its strengths...
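To make the first question concrete, here is a minimal sketch of one common pattern, in Python with hypothetical function names (it assumes the `cryptography` package and a local model weights file): the device hashes its model weights, binds that hash to each output, and signs the result, so anyone holding the device's public key and the officially published model hash can check that an answer came from an untampered model.

```python
# Minimal sketch: attesting that a local model's output came from a known,
# untampered model file. Names like attest_output / verify_attestation are
# illustrative, not an existing protocol.
import hashlib
import json

from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def attest_output(device_key: Ed25519PrivateKey, model_path: str, output: str) -> dict:
    """Runs on-device: bind the output to a hash of the model weights and sign it."""
    with open(model_path, "rb") as f:
        model_hash = hashlib.sha256(f.read()).hexdigest()
    payload = json.dumps({"model_sha256": model_hash, "output": output}, sort_keys=True)
    signature = device_key.sign(payload.encode())
    return {"payload": payload, "signature": signature.hex()}


def verify_attestation(device_pub: Ed25519PublicKey, attestation: dict,
                       expected_model_hash: str) -> bool:
    """Runs at the verifier: check the signature, then check that the claimed
    model hash matches the published one."""
    try:
        device_pub.verify(bytes.fromhex(attestation["signature"]),
                          attestation["payload"].encode())
    except Exception:
        return False
    return json.loads(attestation["payload"])["model_sha256"] == expected_model_hash


# Example usage (device side, then verifier side):
# key = Ed25519PrivateKey.generate()
# att = attest_output(key, "model.bin", "the model's answer")
# ok = verify_attestation(key.public_key(), att, "<published model sha256>")
```

Publishing the expected model hash and the device public keys somewhere tamper-resistant, for example on a chain, is where decentralized infrastructure would come in; the sketch only covers the device-side and verifier-side logic.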

I have noticed some new Web3 AI projects heading in this direction. The data communication protocol Lattica, recently launched by @Gradient_HQ with a $10M investment from Pantera, targets the data monopoly and black-box problems of centralized AI platforms; @PublicAI_'s brain-wave device HeadCap collects real human data to build a "human verification layer" and has already generated $14M in revenue. In effect, both are trying to solve the "credibility" problem of local AI.

In a word: only when AI has truly "sunk" into every device will decentralized collaboration go from being a concept to being a necessity.

Instead of continuing to compete in the generalization track, why not seriously consider how to provide infrastructure support for the localized AI wave?


Author: 链上观

This article represents the views of the PANews columnist and does not represent PANews' position; PANews assumes no legal liability.

The article and opinions do not constitute investment advice

Image source: 链上观. Please contact the author for removal if there is infringement.
