Near Protocol has announced plans to build the world's largest open-source artificial intelligence model. The platform unveiled the initiative on the opening day of its Redacted conference, held in Bangkok, Thailand.
The initiative, a 1.4 trillion parameter model, would be bigger than Llama, Meta's open-source model. Near Protocol noted that the project will include crowdsourced research inputs from researchers and contributors on the new Near AI hub. It also said participants can join the training of a smaller, 500 million parameter model starting on November 10.
Near Protocol unveils its AI model plan
Near Protocol revealed that the project will grow in size across seven successively larger models, and that at each stage only the best researchers and contributors will be retained. The model will also be monetized, with privacy protections built in: Near Protocol intends to use an encrypted Trusted Execution Environment to reward contributors while encouraging constant updates throughout the process.
Near Protocol co-founder Illia Polosukhin said at the event in Bangkok that the firm intends to fund the expensive training through token sales. He noted that training the model will cost roughly $160 million, a large sum, but one he believes can be raised in the crypto market.
He clarified that tokenholders will recoup their funds from the inference fees generated when the model is put to use. “So we have a business model, we have a way to monetize it, we have a way to raise money, and we have a way to put this in a loop. And so people can reinvest back into the next model as well.”
Near may well be able to raise the funds, considering that Polosukhin was one of the authors of the research paper that birthed ChatGPT. Co-founder Alex Skidanov also worked at OpenAI in the period leading up to ChatGPT's release in 2022. Skidanov, who is now one of the executives at Near, said the task is doable but that there will be challenges.
Decentralized AI to deal with privacy issues
The firm will have to commit substantial resources to the project to realize its ambitions. For example, concentrating that many GPUs in a single location is far from ideal, but using a decentralized network for the compute would require technology that does not presently exist.
The distributed training technique that is needed also requires fast interconnects. However, Skidanov added that research from DeepMind shows it can be done. Polosukhin mentioned that he has yet to interact with projects like the Artificial Superintelligence Alliance, but would be happy if both projects ended up on the same path.
He said that no matter what happens, decentralized AI technology needs to win so that everyone can benefit from it. Guest speaker Edward Snowden also weighed in on the topic, warning that centralized AI could turn the world into a giant surveillance state.
“This is probably the most important technology right now and probably in the future. And the reality is, if AI is controlled by one company, we effectively are going to do whatever that company says,” he explained.
“If all AI and effectively, all of the economy is being done by one company, there’s no decentralization at that point. So it is effectively like the only way that Web3 is still relevant, philosophically, if we have AI that also follows the same principles,” Snowden added.