Ramin Hasani, Liquid AI | theCUBE +NYSE Wired: AI Factories - Data Centers of the Future
Show Notes
In this episode of theCUBE + NYSE Wired: AI Factories, Ramin Hasani, co-founder and CEO of Liquid AI, discusses the transformative role that AI and edge computing play in modern enterprise infrastructure. A spinout of MIT CSAIL, Liquid AI is pioneering "liquid neural networks" — biologically inspired models designed to keep adapting after training. Hasani explains why smaller models running at the network edge matter, complementing larger hyperscale training approaches. The discussion covers the rise of hybrid device-cloud orchestration and the importance of efficient networking and memory management in designing and operating AI factories. Liquid AI is also building foundation models that run across a range of hardware platforms with multimodal capabilities spanning text, audio, and vision.
Key Topics Covered:
- Liquid AI's origins at MIT CSAIL and the creation of liquid neural networks.
- Importance of smaller, adaptable AI models for edge computing.
- Hybrid device-cloud systems and networking's central role in AI operations.
- Development of foundation models that run on platforms from Qualcomm, AMD, and NVIDIA.
- Liquid AI's achievement of sub-10 ms activation times for its AI models.
- Practical applications of AI in devices requiring low latency and high efficiency.
- Impact of AI factories on data center strategies and enterprise architecture.
- Vision for AI's role in consumer electronics and financial services through localized intelligence.
