A model like DeepSeek was inevitable. US restrictions on advanced Nvidia chips forced China to innovate, and it did. Necessity is the mother of invention: when faced with constraints, people find new ways, and DeepSeek is one such example.
But what are the implications of DeepSeek for generative AI as a whole? Here are some of the ways it might affect the industry at large.
Accelerating AI accessibility
DeepSeek doesn’t just maintain the pace of AI development; it accelerates it. By making AI more accessible, it helps the technology reach a broader audience faster. That accessibility means more problems can be solved with AI, especially as the cost of AI APIs is projected to fall tenfold or more over the next six months.
Higher ROI for big capital spenders
For major capital spenders like Meta, Microsoft, Stargate, and xAI, the return on investment (ROI) will be higher and realised faster. Within six months, every major model developer will be able to present its own version of DeepSeek, driving API costs down significantly.
Debunking the LLM wall myth
Just weeks before DeepSeek’s debut, there was widespread debate about large language models (LLMs) hitting a wall. The answer is now clear: they didn’t. Scaling can occur across several dimensions: training compute, inference compute, networking, algorithms, data, and capital. DeepSeek exemplifies one such dimension.
Diverse scaling breakthroughs
Not all new scaling breakthroughs will resemble DeepSeek. Some will be significant step changes, while others will be subtle improvements that may never make headlines. However, each contributes to the overall advancement of AI technology.
China’s role in global tech innovation
China has a substantial pool of risk capital dedicated to new tech startups, second only to the US. The DeepSeek story should serve as a blueprint for regions with more limited risk capital. However, the real cost of DeepSeek likely exceeds the quoted figure of US$6 million.
Impact on GPT-wrappers and trust issues
DeepSeek enhances the margin story for so-called “GPT-wrappers,” transforming them into higher-margin businesses overnight. As scaling continues, margins will improve further, and the application layer will flourish. However, China, as a software exporter, will continue to face trust issues. Long-term adoption of LLMs from China will be hindered by these trust problems, and DeepSeek won’t change that. For more on the vulnerabilities of LLMs, search for “Sleeper Agent Attack in LLMs.”
In conclusion, DeepSeek represents a significant milestone in AI innovation, driving down costs, improving accessibility, and setting the stage for future advancements. While challenges remain, particularly regarding trust and real costs, the potential benefits are immense. The AI landscape is poised for rapid transformation, and DeepSeek is at the forefront of this exciting journey.
—
Editor’s note: e27 aims to foster thought leadership by publishing views from the community. Share your opinion by submitting an article, video, podcast, or infographic.
Image credit: Canva Pro