PANews reported on December 21 that Ethereum founder Vitalik Buterin posted on the X platform: "My definition of AGI (artificial general intelligence) is AI powerful enough that, if all humans suddenly disappeared one day and the AI were uploaded into robot bodies, it could continue civilization on its own. Obviously this is a very difficult definition to measure, but I think it captures the core of the intuitive distinction in many people's minds between 'the AI we are used to' and 'AGI'. It marks the transition from a tool that constantly depends on human input to a self-sufficient life form. ASI (artificial superintelligence) is another matter entirely: my definition is when humans no longer contribute value to productivity in the loop (as in chess, a point we have actually only reached in the past decade). Yes, ASI scares me - even AGI as I define it scares me, because it carries obvious risks of loss of control. I support focusing our work on building intelligence-augmenting tools for humans, rather than building superintelligent life forms."