DeepSeek
What's next for Chinese open-source AI
Chinese open models are spreading fast, from Hugging Face to Silicon Valley. In this photo illustration, the DeepSeek app is seen on a phone in front of a flag of China on January 28, 2025 in Hong Kong, China. The past year has marked a turning point for Chinese AI. Since DeepSeek released its R1 reasoning model in January 2025, Chinese companies have repeatedly delivered AI models that match the performance of leading Western models at a fraction of the cost. Just last week, the Chinese firm Moonshot AI released its latest open-weight model, Kimi K2.5, which came close to top proprietary systems such as Anthropic's Claude Opus on some early benchmarks. The difference: K2.5 is roughly one-seventh of Opus's price.
- North America > United States > California (0.25)
- Asia > China > Hong Kong (0.25)
- South America > Brazil (0.04)
- (5 more...)
- Information Technology (1.00)
- Banking & Finance (0.95)
6 Graphs That Show Where the U.S. Leads China on AI--and Where It Doesn't
Two important things happened on January 20, 2025. In Washington, D.C., Donald Trump was inaugurated as President of the United States. In Hangzhou, China, a little-known Chinese firm called DeepSeek released R1, an AI model that industry watchers called a "Sputnik moment" for the country's AI industry. "Whether we like it or not, we're suddenly engaged in a fast-paced competition to build and define this groundbreaking technology that will determine so much about the future of civilization," said Trump later that year, as he announced his administration's AI action plan, which was titled "Winning the Race." There are many interpretations of what AI companies and their governments are racing towards, says AI policy researcher Lennart Heim: to deploy AI systems in the economy, to build robots, to create human-like artificial general intelligence.
- North America > United States > District of Columbia > Washington (0.25)
- Asia > China > Zhejiang Province > Hangzhou (0.25)
- Europe > France (0.05)
- Africa (0.05)
How China Caught Up on AI--and May Now Win the Future
He Xiaopeng launches Xpeng's next-gen Iron humanoid robot during a press conference at the company's headquarters in Guangzhou on November 5, 2025. It was a controversy laced with pride for He Xiaopeng. In November, He, the founder and CEO of Chinese physical AI firm XPeng, had just debuted his new humanoid robot, IRON, whose balance, posture shifts, and coquettish swagger mirrored human motion with such eerie precision that a slew of netizens accused him of faking the demonstration by putting a human in a bodysuit. To silence the naysayers, He boldly cut open the robot's leg live on stage to reveal the intricate mechanical systems that allow it to adapt to uneven surfaces and maintain stability just like the human body. "At first, it made me sad," He tells TIME in his Guangzhou headquarters.
- Asia > China > Guangdong Province > Guangzhou (0.65)
- Asia > Russia (0.14)
- Asia > North Korea (0.14)
- (17 more...)
- Law (1.00)
- Information Technology (1.00)
- Government > Military (0.94)
- (4 more...)
Thousands of Companies Are Driving China's AI Boom. A Government Registry Tracks Them All
How the Cyberspace Administration of China inadvertently made a guide to the country's homegrown AI revolution. When DeepSeek burst onto the global stage in January 2025, it seemed to appear out of nowhere. But the large language model was just one of the thousands of generative AI tools that have been released in China since 2023--and there's a public archive of every single one of them.
- Health & Medicine (1.00)
- Energy (1.00)
- Government > Regional Government (0.69)
- (2 more...)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (0.93)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning > Generative AI (0.36)
The Race to Build the DeepSeek of Europe Is On
As Europe's longstanding alliance with the US falters, its push to become a self-sufficient AI superpower has become more urgent. As the relationship between the US and its European allies shows signs of strain, AI labs across the continent are searching for inventive ways to close the gap with American rivals that have so far dominated the field. With rare exceptions, US-based firms outstrip European competitors across the AI production line--from processor design and manufacturing, to datacenter capacity, to model and application development. Likewise, the US has captured a massive proportion of the money pouring into AI, reflected in the performance last year of its homegrown stocks and the growth of its economy. The belief in some quarters is that the US-based leaders--Nvidia, Google, Meta, OpenAI, Anthropic, and the like--are already so entrenched as to make it impossible for European nations to break their dependency on American AI, mirroring the pattern in cloud services.
- Information Technology (1.00)
- Government > Regional Government > North America Government > United States Government (1.00)
- Energy (0.96)
- Government > Military (0.69)
So Long, GPT-5. Hello, Qwen
In the AI boom, chatbots and GPTs come and go quickly. On a drizzly and windswept afternoon this summer, I visited the headquarters of Rokid, a startup developing smart glasses in Hangzhou, China. As I chatted with engineers, their words were swiftly translated from Mandarin to English, and then transcribed onto a tiny translucent screen just above my right eye using one of the company's new prototype devices. Rokid's high-tech spectacles use Qwen, an open-weight large language model developed by the Chinese ecommerce giant Alibaba. OpenAI's GPT-5, Google's Gemini 3, and Anthropic's Claude often score higher on benchmarks designed to gauge different dimensions of machine cleverness.
- Asia > China > Zhejiang Province > Hangzhou (0.25)
- North America > United States > Michigan (0.05)
- North America > United States > California (0.05)
- (2 more...)
WIRED Roundup: The 5 Tech and Politics Trends That Shaped 2025
In today's episode of Uncanny Valley, we dive into five stories--from AI to DOGE--that encapsulate the year and give us clues as to what might unfold in 2026. For better or for worse, this year had it all--from the AI industry shaping the global economy and our lives, to the so-called Department of Government Efficiency taking over US federal agencies under Elon Musk's leadership. In today's episode, host Zoë Schiffer and executive editor Brian Barrett get together to reflect on some of this year's key moments--and how they give us important clues as to what we can expect this upcoming year. Write to us at uncannyvalley@wired.com. Today on the show, we're wrapping up our news episode series by reflecting on the trends and stories that shaped 2025. And who better to do that with than Brian Barrett, our executive editor who works tirelessly in the shadows? Thank you for having me. Happy to emerge from my shadowy lair. What a year it's been, and I'm so excited for it to be almost over. Because it's been quite a year news-wise, safe to say, especially in tech and politics. Honestly, it was a little bit tricky to pick which trends we should discuss today, but we settled on five stories that encapsulate this year pretty well, and I think give us clues as to what is going to be unfolding in 2026. The first one that I want to talk about is dear to my heart, and it's about AI data centers. We all know that the amount of money being spent on data centers is absolutely staggering, with companies like Meta, Google, and Microsoft tripling down on AI infrastructure spending this year. But it's not just about the money that's being spent.
- North America > United States > California (0.14)
- Asia > China (0.06)
- Europe > Middle East (0.04)
- (6 more...)
- Information Technology > Communications > Mobile (1.00)
- Information Technology > Artificial Intelligence (1.00)
The Download: why 2025 has been the year of AI hype correction, and fighting GPS jamming
When OpenAI released a free web app called ChatGPT in late 2022, it changed the course of an entire industry--and several world economies. Millions of people started talking to their computers, and their computers started talking back. We were enchanted, and we expected more. Well, 2025 has been a year of reckoning. For a start, the heads of the top AI companies made promises they couldn't keep. At the same time, updates to the core technology are no longer the step changes they once were.
- Asia > China (0.06)
- North America > United States > New York (0.05)
- North America > United States > Massachusetts (0.05)
- (2 more...)
- Government (1.00)
- Information Technology > Services (0.30)
- Information Technology > Communications > Social Media (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (0.75)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.75)
Edge Deployment of Small Language Models: A Comprehensive Comparison of CPU, GPU, and NPU Backends
Edge computing processes data where it is generated, enabling faster decisions, lower bandwidth usage, and improved privacy. However, edge devices typically operate under strict constraints on processing power, memory, and energy consumption, making them unsuitable for large language models (LLMs). Fortunately, Small Language Models (SLMs) offer lightweight alternatives that bring AI inference to resource-constrained environments by significantly reducing computational cost while remaining suitable for specialization and customization. In this scenario, selecting the hardware platform that best balances performance and efficiency for SLM inference is challenging due to strict resource limitations. To address this issue, this study evaluates the inference performance and energy efficiency of commercial CPUs (Intel and ARM), GPUs (NVIDIA), and NPUs (RaiderChip) for running SLMs. GPUs, the usual platform of choice, are compared against commercial NPUs and recent multi-core CPUs. While NPUs leverage custom hardware designs optimized for computation, modern CPUs increasingly incorporate dedicated features targeting language-model workloads. Using a common execution framework and a suite of state-of-the-art SLMs, we analyze both maximum achievable performance and processing and energy efficiency across commercial solutions available for each platform. The results indicate that specialized backends outperform general-purpose CPUs, with NPUs achieving the highest performance by a wide margin. Bandwidth normalization proves essential for fair cross-architecture comparisons. Although low-power ARM processors deliver competitive results when energy usage is considered, metrics that combine performance and power (such as EDP) again highlight NPUs as the dominant architecture. These findings show that designs optimized for both efficiency and performance offer a clear advantage for edge workloads.
- Information Technology (0.67)
- Energy (0.50)
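The abstract above leans on two comparison metrics: the energy-delay product (EDP), which penalizes backends that are either slow or power-hungry, and bandwidth-normalized throughput, which divides raw tokens-per-second by memory bandwidth for fair cross-architecture comparison. A minimal sketch of how such a comparison might be computed, using hypothetical backend figures (the device names and numbers below are illustrative assumptions, not the paper's measurements):

```python
def energy_delay_product(energy_j: float, latency_s: float) -> float:
    """EDP = energy x delay; lower is better for edge workloads."""
    return energy_j * latency_s

def bandwidth_normalized_throughput(tokens_per_s: float, mem_bw_gbps: float) -> float:
    """Throughput per unit of memory bandwidth, for cross-architecture fairness."""
    return tokens_per_s / mem_bw_gbps

# Hypothetical backends: (tokens/s, power draw in W, memory bandwidth in GB/s)
backends = {
    "cpu": (20.0, 65.0, 80.0),
    "gpu": (120.0, 250.0, 900.0),
    "npu": (90.0, 15.0, 100.0),
}

for name, (tps, watts, bw) in backends.items():
    latency = 1.0 / tps            # seconds per generated token
    energy = watts * latency       # joules per generated token
    edp = energy_delay_product(energy, latency)
    norm = bandwidth_normalized_throughput(tps, bw)
    print(f"{name}: EDP {edp:.2e} J*s/token, {norm:.2f} tokens/s per GB/s")
```

With these illustrative numbers, the low-power NPU wins on EDP despite lower raw throughput than the GPU, mirroring the abstract's observation that combined performance-and-power metrics favor NPUs.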
Large Language Models for Education and Research: An Empirical and User Survey-based Analysis
Rahman, Md Mostafizer, Shiplu, Ariful Islam, Amin, Md Faizul Ibne, Watanobe, Yutaka, Peng, Lu
Pretrained Large Language Models (LLMs) have achieved remarkable success across diverse domains, with education and research emerging as particularly impactful areas. Among current state-of-the-art LLMs, ChatGPT and DeepSeek exhibit strong capabilities in mathematics, science, medicine, literature, and programming. In this study, we present a comprehensive evaluation of these two LLMs through background technology analysis, empirical experiments, and a real-world user survey. The evaluation explores trade-offs among model accuracy, computational efficiency, and user experience in educational and research settings. We benchmarked the LLMs' performance in text generation, programming, and specialized problem-solving. Experimental results show that ChatGPT excels in general language understanding and text generation, while DeepSeek demonstrates superior performance in programming tasks due to its efficiency-focused design. Moreover, both models deliver medically accurate diagnostic outputs and effectively solve complex mathematical problems. Complementing these quantitative findings, a survey of students, educators, and researchers highlights the practical benefits and limitations of these models, offering deeper insights into their role in advancing education and research.
- North America > United States > Texas > Gaines County (0.04)
- North America > United States > Louisiana > Orleans Parish > New Orleans (0.04)
- Asia > Japan > Honshū > Tōhoku > Fukushima Prefecture > Fukushima (0.04)
- Asia > Bangladesh > Dhaka Division > Dhaka District > Dhaka (0.04)
- Research Report > New Finding (1.00)
- Questionnaire & Opinion Survey (1.00)