Size Matters: The rise of small LMs
About the show
Hosted by Nikola Mrkšić, Co-founder and CEO of PolyAI, the Deep Learning with PolyAI podcast is the window into AI for CX leaders. We cut through hype in customer experience, support, and contact center AI — helping decision-makers understand what really matters.
In this episode of ‘Deep Learning with PolyAI,’ Damien and Nikola discuss the evolution and significance of small language models (SLMs) amid the ever-growing landscape of large language models (LLMs). They dissect the predicted rise of SLMs, their applications in privacy-focused and low-power devices, and the implications for enterprises balancing cost, efficiency, and modernization. They also explore the enduring relevance of SLMs, the pursuit of artificial general intelligence (AGI), and what the future holds for localized and on-premises machine learning solutions.
There's a growing trend toward using small language models (SLMs) alongside large language models (LLMs), reflecting a shift from the "bigger is better" approach to more practical, specialized models.
Smaller models, like the original language models behind PolyAI’s product, were once the norm because compute power and data access were limited; modern advances have since expanded the capabilities of both small and large models.
SLMs offer advantages such as faster processing, lower energy consumption, and the ability to operate offline, which can enhance privacy and security.