🚀 Release 2.37.0 – Real-time AI streaming & chat improvements
We’re excited to introduce Next Lab 2.37.0, which brings a major upgrade to how you interact with AI in the Lab, along with stability improvements and a better overall chat experience.
⚡ Real-time AI responses (Streaming)
AI responses are now streamed across all supported providers:
OpenAI
Claude
Mistral
This means you’ll see answers as they are being generated, instead of waiting for the full response, making interactions feel faster and more natural.
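Conceptually, streaming means the response is consumed chunk by chunk rather than as one final payload. A minimal sketch of the idea in Python (this is an illustration only, not the Lab’s actual implementation; `stream_response` is a hypothetical helper):

```python
from typing import Iterator

def stream_response(tokens: list[str]) -> Iterator[str]:
    """Yield tokens one at a time, simulating a streamed model response."""
    for token in tokens:
        yield token

# The UI can render each chunk the moment it arrives,
# instead of blocking until the whole answer is ready.
rendered = []
for chunk in stream_response(["Hello", ", ", "world", "!"]):
    rendered.append(chunk)  # in a real UI, append to the chat bubble here

print("".join(rendered))  # → Hello, world!
```

The same pattern applies regardless of provider: each emits partial tokens, and the client assembles them incrementally.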
💬 Improved chat experience
We improved how conversations are handled by sending the full chat context with each request, so models can better follow the flow of the conversation and give more relevant, consistent answers.
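In practice, this means each request carries the entire message history rather than just the latest user turn. A minimal sketch, assuming an OpenAI-style list of role/content messages (the function name and shapes here are illustrative, not the Lab’s actual code):

```python
def build_request(history: list[dict], user_message: str) -> list[dict]:
    """Append the new user turn and return the full context to send to the model."""
    history.append({"role": "user", "content": user_message})
    return history  # the entire conversation, not only the last message

history = [{"role": "system", "content": "You are a helpful assistant."}]
messages = build_request(history, "What is streaming?")

# The model now sees every prior turn on each request,
# so follow-up questions resolve correctly against earlier answers.
print(len(messages))  # → 2
```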
🛠️ Chat reliability & performance
We paid down technical debt in LLM chats and improved internal message and service handling, resulting in a more stable and scalable experience.
🔧 Fixes
Fixed an issue where the “New Session” button on the instigators home incorrectly triggered the tour.
General bug fixes and UX improvements across chat flows.
🙌 Thanks
Thanks for your continued feedback!