➀ Li Mu discusses the evolution of language models, emphasizing the central roles of compute, data, and algorithms. ➁ He highlights the challenges of scaling models, including memory limitations and the rising cost of compute. ➂ He shares his view that the field's core technical problem is shifting from large-scale pre-training to post-training, where high-quality data and improved algorithms matter most.