Smarter Support Starts With AI

By Cheryl Brown

20 Oct, 2025

At Q2, we’re pushing the boundaries of what AI can do for financial institutions, and one area where AI can make a meaningful difference is technical support.  

At our recent Dev Days 2025 event, we showcased four use cases that highlight our approach to AI: responsible, ethical, and practical. Among them was an AI-powered assistant that enables Q2 customers to cut through the noise of multiple systems and find fast, trusted answers.

Why it matters 

Getting technical answers or product support can mean digging through long documentation or waiting for a case response. Applying a large language model (LLM) would change that experience completely. With AI-powered search and source-cited responses, financial institution staff could ask natural-language questions such as "Does Q2 have a product to help me fight check fraud, and are there any version dependencies to deploy it?" and get instant, reliable results.

How it works 

At Dev Days, Q2 Chief Data Scientist Jesse Barbour demonstrated how our internal knowledge platform, Kraglin, is transforming from a smart search tool into an intelligent, evolving system of record. The solution unifies content across Confluence, Salesforce, SharePoint, and Teams into one trusted interface where employees—and soon, financial institution staff—can ask natural-language questions and get answers with precise citations. Within its first few months of use at Q2, Kraglin has already attracted 1,500 active users.

What makes this innovation powerful is the LLM framework beneath it. Rather than simply retrieving documents, the model interprets the intent of a question, retrieves relevant data, and synthesizes an accurate response, complete with source references for transparency. It operates through a retrieval-augmented generation (RAG) pipeline, meaning the model is continuously grounded in verified, domain-specific knowledge, minimizing the risk of hallucinations while improving trust in every interaction. 
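To make the retrieval-augmented generation idea concrete, here is a minimal sketch of the retrieval-and-grounding step. Everything in it is illustrative: the document names, the bag-of-words similarity, and the prompt format are assumptions for demonstration only; a production system like the one described would use dense vector embeddings and an LLM to synthesize the final answer.

```python
# Toy RAG retrieval step: rank documents against a query, then build a
# grounded prompt that forces the model to answer only from cited sources.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; real systems use dense vector models."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, corpus: dict[str, str], k: int = 2) -> list[tuple[str, float]]:
    """Rank corpus documents by similarity to the query; return the top k."""
    q = embed(query)
    scored = [(src, cosine(q, embed(doc))) for src, doc in corpus.items()]
    return sorted(scored, key=lambda p: p[1], reverse=True)[:k]

def build_prompt(query: str, corpus: dict[str, str], k: int = 2) -> str:
    """Ground the LLM: it only sees retrieved passages, each tagged with its source."""
    hits = retrieve(query, corpus, k)
    context = "\n".join(f"[{src}] {corpus[src]}" for src, _ in hits)
    return (
        "Answer using ONLY the sources below, citing them by name.\n"
        f"{context}\n\nQuestion: {query}"
    )

# Hypothetical corpus entries standing in for unified Confluence/Salesforce content.
corpus = {
    "Confluence/check-fraud.md": "Q2 offers a check fraud detection product; see version dependencies.",
    "Salesforce/case-1042": "Customer resolved a login issue by clearing the browser cache.",
}
print(build_prompt("check fraud product version dependencies", corpus))
```

Because the prompt is assembled only from retrieved, source-tagged passages, the model's answer stays grounded in verified content and every claim can carry a citation, which is what keeps hallucination risk low.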

Over time, the model will also learn and expand through continuous ingestion of new data. Each resolved customer support case, technical update, or knowledge article adds to its corpus, allowing it to detect emerging patterns and respond to similar questions more effectively in the future. The vision: Every financial institution will have its own instance, tailored to its history and context, creating a self-improving ecosystem of knowledge. As the model accumulates new cases and user interactions, its responses become increasingly personalized, relevant, and predictive—turning institutional memory into institutional intelligence.


The bigger picture 

Looking ahead, we envision this tool being trained on millions of historical support cases, learning over time to provide personalized, context-aware help for each institution. But this is more than a search tool. It’s the foundation for a new model of AI-assisted customer service in banking.  

As part of Q2’s broader AI strategy, delivering Q2 knowledge through an LLM represents how we’re putting AI directly in the customer loop to help financial institutions move faster, reduce friction, and focus human expertise where it’s needed most. It’s a tangible example of what we mean when we say we’re experimenting boldly and delivering responsibly.

Ready to explore more AI innovation? 

This is just one of the forward-looking AI prototypes unveiled at Q2 Dev Days 2025. Each represents a different way we’re applying AI to make digital banking smarter, faster, and more human. Read more about it.