🚀 Release 2.31.0 – LLM chat stability, deep links & UX improvements
We’re excited to announce Next Lab 2.31.0. This release focuses on the stability, usability, and shareability of LLM chats, along with a few behind-the-scenes enhancements.
⏳ Clearer feedback when sending messages
We added a loading indicator when sending messages, including when multiple images are uploaded, so it’s always clear that your content is being processed.
🔧 LLM chat reliability improvements
We paid down technical debt in LLM chats, improving stability, error handling, and overall reliability across chat flows.
🔗 Shareable deep links for chats
LLM chats now support deep links: share a conversation URL and the recipient can reload that chat directly in the Lab.
📊 Better internal insights
We introduced internal tooling that identifies our most engaged users, giving us a clearer picture of how the Lab is used and a better basis for prioritizing future improvements.
🙌 Thanks
Thanks for all the feedback that shaped this release. Keep it coming!