Quotes
"During the recent Davos conference, a new term was introduced: FOBO, Fear Of Becoming Obsolete — individuals having this anxiety [that] their skills and their roles become redundant because of artificial intelligence."
"Together, these world-leading technology giants are announcing the formation of Stargate, so put that name down in your books, a new American company that will invest in AI infrastructure. This is a resounding vote of confidence in America's potential. It will ensure the future of AI technology and keep it in this country," Trump said at the announcement of Stargate.

"You can continue to try and contain access to chips and close the walls off. While you're doing that, you're doubling down on investment into data infrastructure, supporting the development of AI in the U.S. and being first in that race."
"DeepSeek's cost efficiency is praiseworthy, but the privacy implications of its data collection would raise significant concerns," said Saeed Rehman, senior lecturer in cybersecurity and networking at Flinders University.

"It is interesting that this breakthrough was achieved not by government-backed research institutes and large [state-owned enterprises], but by a hedge fund with no government subsidies," noted Zhiwei Zhang, president and chief economist at Pinpoint Asset Management.

"This situation may evoke similar concerns to those raised for TikTok, where data privacy and security have been hotly debated."
"The release of DeepSeek AI from a Chinese company should be a wake-up call for our industries that we need to be laser-focused on competing."

"Every dollar and gig of data that flows into Chinese AI are dollars and data that will ultimately be used against the United States," said Hawley, R-Mo., in a statement.

"It's plausible to me that they can train a model with $6m."
"The model itself gives away a few details of how it works, but the costs of the main changes that they claim – that I understand – don't 'show up' in the model itself so much."

"The breakthrough is incredible – almost a 'too good to be true' style. The breakdown of costs is unclear."

"It's very much an open question whether DeepSeek's claims can be taken at face value. The AI community will be digging into them and we'll find out," Pedro Domingos, professor emeritus of computer science and engineering at the University of Washington, told Al Jazeera.

"DeepSeek made R1 by taking a base model – in this case V3 – and applying some clever methods to teach that base model to think more carefully."
"GPT-4 finished training late 2022. There have been a lot of algorithmic and hardware improvements since 2022, driving down the cost of training a GPT-4 class model. A similar situation happened for GPT-2. At the time it was a serious undertaking to train, but now you can train it for $20 in 90 minutes."

"If they'd spent more time working on the code and reproduced the DeepSeek idea themselves, it would be better than talking on the paper," Wang said, using an English translation of a Chinese idiom about people who engage in idle talk.

"These massive-scale models are a very recent phenomenon, so efficiencies are bound to be found."
"It's easy to criticize," Wang said on X in response to questions from Al Jazeera about the suggestion that DeepSeek's claims should not be taken at face value.

"The constraints on China's access to chips forced the DeepSeek team to train more efficient models that could still be competitive without huge compute training costs," George Washington University's Jeffrey Ding told AFP.

"AI models have consistently become cheaper to train over time - this isn't new."
"DeepSeek V3's training costs, while competitive, fall within historical efficiency trends," Lennart Heim, an associate information scientist at the RAND Corporation, told AFP, referring to R1's previous iteration.