Manvinder Singh, Redis | Robotics & AI Infrastructure Leaders

Episode 75
Jun 21, 2025 · 21 minutes

Show Notes

In this episode, Manvinder Singh, Vice President of AI Product Management at Redis, discusses the evolving landscape of high-performance databases and their critical role in modern AI applications. Joining hosts John Furrier and Dave Vellante at the CUBE + NYSE Wired: Robotics & AI Infrastructure Leaders 2025 event, Singh details Redis's advances in semantic caching, vector search, and AI-ready infrastructure. He explains how Redis addresses challenges such as latency, inference speed, and real-time data retrieval, all of which are vital to supporting growing AI demands. The episode highlights Redis's work to optimize its core systems for the speed and complexity that advanced AI applications require, and Singh discusses how Redis balances new capabilities such as vector search against the low-latency performance the platform is known for.

Key Topics Covered:
  • Redis's strategic enhancements in AI product offerings.
  • The importance of low latency and fast data retrieval for AI applications.
  • Challenges businesses face when productionizing AI technologies.
  • Transitioning from traditional databases to AI-ready infrastructures.
  • Semantic caching and its impact on AI inference performance.
  • Redis's new architecture features, including vector search and two-tier caching.
  • The evolving role of Redis in both cloud and on-premises environments.
  • Future direction of AI application development and the impact on Redis's core design principles.