4 Ways to Leverage Past User Data to Predict Future Behavior
As companies recognize the need for more user-centric and evidence-based strategies, user experience (UX) research is becoming an increasingly integrated part of most modern product organizations’ development process. UX research can help teams ensure they’re solving for real needs, reduce risk by catching problems early, and spark creativity through increased empathy for customers. Yet, there are many challenges facing those who conduct UX research. In this series, Braze UX Researcher Sofia Linse tackles common issues and explores how companies can approach this essential work more effectively.
Anyone who regularly conducts user experience (UX) research is well aware of the challenges that come with attempting to predict future behavior. You only have to look at the subscriptions that go unused every year to understand that humans aren’t that great at forecasting what we’ll do in the future. Yet the most critical questions in research often concern future behavior: Will customers use it? How will they use it? Will they choose us over competitors’ products? Will they pay for it?
The good news is that today’s digital products are making it easier to analyze actual user behavior—through activity logs and tools for viewing real interactions with your product—instead of relying on hunches or anecdotal claims. At Braze, we constantly seek ways to leverage these opportunities in our UX research. And over the years, we’ve found several successful approaches to incorporating quantitative and qualitative behavioral data to more confidently approach the difficult questions about future usage:
1. Identifying User Workarounds
Seeing the many creative ways customers use Braze is an invaluable source of insight and inspiration for our Product team—and one of the best ways to understand what that activity looks like is by examining how our customers interact with the platform.
One Braze feature, Canvas, allows our users (marketers) to orchestrate communication by mapping out customer journeys and the messages recipients should receive at each stage of that journey. Canvas is a flexible tool that largely gives marketers free rein to customize messaging flows in accordance with their companies’ strategies. As a result, each Canvas can serve as a reflection of that company's current priorities and goals. In a recent project, we examined a variety of customer Canvases to identify inefficient build-outs and workarounds where our platform wasn’t fully meeting a given customer’s needs. After the analysis, we built on the understanding we’d gained by conducting interviews with selected customers to validate whether our interpretation of the data was actually correct. Even though it was painful at times to see how customers had struggled to accomplish their goals using our technology, their testimonies provided strong evidence of how important these use cases were to them—increasing our confidence that we were tackling the right problem, and would see strong feature adoption once we solved it.
2. Conducting Experiments Directly in the Product
At Braze, we often preach the importance of reaching your customers where they are—and that’s just as true for a B2B company like ours.
One example? Our reporting product team recently launched a survey inside the Braze platform asking customers whether they were interested in receiving either of two reports the team was considering building. Once customers signed up, our team manually created minimum viable products (MVPs) of these reports based on existing Braze data, sent them to interested customers, and followed up with qualitative interviews a week later. This made it possible to observe which of the two reports customers preferred and how they’d actually used them (if at all), instead of asking customers to predict what they’d be likely to do in a hypothetical future situation. The upshot? This experiment required very little engineering effort, was extremely quick to turn around, and nearly eliminated the risk of building the wrong product.
3. Getting into the Game with an Early MVP
Sometimes interviews with customers about potential solutions result in very vague and hypothetical discussions, especially if we’re approaching new territory. Without tangible references, it’s easy to imagine the perfect product in a perfect context, as we tend to forget about the realities of technical feasibility, competing priorities, and the difficulties associated with using something new and unfamiliar. It’s also hard for a customer to communicate the details of that perfect experience. During a given interview, we in the product team might imagine something very different from what’s in our customers' heads—without even realizing it.
That’s what happened during the research effort that preceded a new AI-based feature. Instead of going through our usual robust development cycle (i.e. discover, design, evaluate, and iterate), we quickly designed and released an MVP to a handful of customers who agreed to complete assignments and meet with us weekly to provide feedback. Over the coming months, this ongoing feedback will let us identify issues, test out new ideas, and gain a concrete understanding of how customers actually use the feature—before we commit to building a scalable product for all customers beyond the MVP.
4. Leveraging External Tools to Capture Real Product Interactions
At Braze, we run a weekly “FullStory Fridays” internal event, where people from departments across the company come together to watch, analyze, and discuss recordings of customers using the Braze product through the tool FullStory. This approach regularly triggers insights and questions that would be hard to uncover in customer interviews (either because we wouldn’t know to ask about a given issue or because customers are so used to their routines that they no longer actively reflect on them).
Another external tool we leverage to analyze product interactions is Looker, a business intelligence and data application platform. We use it to identify relevant companies to approach for research recruitment—based on their behavior or usage of specific Braze features—and to quantify insights discovered during qualitative research.
That could mean answering questions like:
How common is this behavior across the broader customer base?
Are there certain customer segments who do this more frequently than others?
How are customers who use feature X different from those who don’t?
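Under the hood, questions like these usually reduce to simple aggregations over usage data. As a rough illustration (not Braze’s actual pipeline), here’s a minimal Python sketch with made-up column names and inline sample data—in practice, the numbers would come from a BI tool like Looker rather than a hand-built DataFrame:

```python
import pandas as pd

# Hypothetical usage data: one row per customer company.
# Column names ("segment", "uses_feature_x") are invented for illustration.
usage = pd.DataFrame({
    "company_id": [1, 2, 3, 4, 5, 6],
    "segment": ["enterprise", "enterprise", "smb", "smb", "smb", "enterprise"],
    "uses_feature_x": [True, False, True, True, False, True],
})

# How common is this behavior across the broader customer base?
overall_rate = usage["uses_feature_x"].mean()

# Are there certain customer segments who do this more frequently than others?
by_segment = usage.groupby("segment")["uses_feature_x"].mean()

print(f"Overall adoption: {overall_rate:.0%}")
print(by_segment)
```

The same pattern extends to the third question: splitting customers on `uses_feature_x` and comparing the two groups’ other attributes side by side.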
To make sure we spot opportunities for quantitative analysis in connection with our UX research projects, we also have regular meetings with the Business Intelligence team to discuss upcoming initiatives and current research questions.
When it comes to UX research, there’s always going to be data you wish you had and situations where clarity can feel hard to come by. Hopefully, these four approaches can inspire new ways to get more reliable, data-driven conclusions that can improve your product over time.
Interested in learning more about the product development process? Check out Braze SVP of Product Kevin Wang’s look at the key questions you need to ask before making product decisions.
Sofia Linse is a Senior UX Researcher based out of our NYC headquarters. When she’s not digging into customer needs and problems, you can find her hunting out the next Asian restaurant to add to her favorites list or debating which park has the prettiest view of the New York skyline.