Cogynt.ai v2.21 Release Notes

Cogility is delighted to announce that Cogynt.ai version 2.21 is generally available for AWS customers as of April 2026. See what's new below.

For more information, or to have your Cogynt.ai instance updated, please contact us.

Authoring and HCEP

SSL/TLS Support

  • TLS/SSL authentication is now available for Kafka and OpenSearch.

  • Currently, only the JKS keystore format is supported.
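For reference, TLS for a Kafka client is typically enabled with properties like the following. This is a generic sketch using standard Kafka client configuration keys, not Cogynt.ai-specific settings; paths and passwords are placeholders.

```properties
# Use SSL/TLS for broker connections
security.protocol=SSL

# Truststore (JKS) containing the broker's CA certificate
ssl.truststore.type=JKS
ssl.truststore.location=/path/to/truststore.jks
ssl.truststore.password=changeit

# Keystore (JKS) for client authentication, if mutual TLS is required
ssl.keystore.type=JKS
ssl.keystore.location=/path/to/keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
```
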

Authoring UI

  • Function description tooltips are available in function elements.

  • "Sets" have been renamed to "groups" to make model concepts more intuitive and easier to understand.

  • When duplicating projects, the owner of the duplicate is now the user that performed the duplicate action, and not the owner of the original project.

  • Projects and project components can now be opened by clicking on the title directly.

  • Added a heartbeat.timeout setting to deployment configurations to help users troubleshoot Flink deployment timeout issues.

  • The JDBC option has been removed from data source types.
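The heartbeat.timeout option mentioned above corresponds to Flink's standard heartbeat configuration, which controls how long the JobManager and TaskManagers wait for a heartbeat before considering the peer unreachable. A minimal sketch of the underlying Flink settings (values in milliseconds are illustrative; how they surface in a Cogynt.ai deployment configuration may differ):

```yaml
# Flink config: interval between heartbeat requests
heartbeat.interval: 10000
# Flink config: time without a heartbeat before the target is marked dead
heartbeat.timeout: 50000
```

Raising heartbeat.timeout can help deployments survive transient pauses (for example, long garbage-collection stalls) at the cost of slower failure detection.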

Bug Fixes

  • Missing field_name fields in error topic messages have now been restored.

  • The right-click menu no longer disappears immediately on Windows.

  • getURLPart functions now correctly output integer arrays when input is an array of URLs and the port part is selected.

  • JSON objects can now be mapped to the output of constraint computations.

  • "Sticky" connection highlights have been fixed. Highlights are now removed when the mouse moves off the connection line.

  • Constraint Delete buttons no longer appear over other node icons.

Workstation

Chatbot

  • Configurable AI Models (Allowed + Default): AI Configs now support specifying allowed and default models, enabling dynamic model switching within a single chatbot experience. This gives teams greater control over performance, cost, and use-case alignment without requiring separate configurations.

  • General Chat (No Event Required): Users can now start chat sessions without attaching them to an event, making LLM capabilities more accessible for ad-hoc analysis, questions, and workflow support.

  • Quick Access to Chat: A new AI chat button in the Workstation app bar allows users to instantly launch a general chat session, reducing friction and improving discoverability of AI features.

  • Model Switching During Chat: Users can switch between LLMs mid-conversation (when enabled in the AI Config), allowing them to adapt responses based on speed, quality, or task requirements without restarting sessions.

  • Redesigned Chat Interface: The updated chat window includes a side panel for viewing, searching, and filtering chat sessions, making it easier to manage conversations and revisit prior work.

  • Rename Chat Sessions: Users can rename chat sessions for better organization and context tracking, improving collaboration and recall.

  • Cancelable Streaming Responses: In-progress AI responses can now be cancelled, giving users more control and saving time when responses are no longer needed.

UX Improvements

  • Added a scale to the map to help visualize the distance between points.

  • A tooltip next to the Attach Files button now previews the list of allowed file upload types, making the system's upload rules easier to discover.

Data Management Tool

The DMT table has been updated to behave the same way as Authoring tables. For example, by default, the table now sorts objects by name.