As access to advanced chips narrows, Chinese AI developers are focusing on fixing an algorithmic bottleneck at the heart of large language models (LLMs) – hoping that smarter code, not more powerful hardware, will help them steal a march on their Western rivals. By experimenting with hybrid forms of “attention” – the mechanism that allows LLMs to process and recall information – start-ups such as Moonshot AI and DeepSeek aim to stretch limited computing resources, while keeping pace with global…
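The "algorithmic bottleneck" the article alludes to is the cost of standard attention, which compares every token against every other token and so scales quadratically with sequence length. As a purely illustrative sketch (not any specific company's hybrid variant), standard scaled dot-product attention can be written in a few lines of NumPy:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    # The (n_q, n_k) score matrix is what makes full attention quadratic.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # pairwise query-key similarities
    weights = softmax(scores, axis=-1)  # each query's weights sum to 1
    return weights @ V                  # weighted mix of value vectors

# Tiny demo: 2 queries attending over 3 keys/values, head dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = attention(Q, K, V)
print(out.shape)  # (2, 4)
```

Hybrid approaches of the kind the article describes generally mix this full mechanism with cheaper approximations (e.g. restricting which keys each query attends to) so that long contexts fit within limited compute.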


From China - South China Morning Post via This RSS Feed.