CONTRIBUTOR
Managing Editor and Podcast Host,
Techstrong

Synopsis

In this Leadership Insights video interview, Amanda Razani speaks with Andy MacMillan, CEO of UserTesting, Ketaki Shriram, co-founder and CTO of Krikey, and Jhanvi Shriram, CEO of Krikey, about the importance of data for understanding and improving user experiences.


Transcript

Amanda Razani: Hello and welcome to the Leadership Insights series. I’m Amanda Razani and I’m excited to be here today with a few different guests. First is Andy MacMillan. He’s the CEO of UserTesting. Hello.

Andy MacMillan: Hi, Amanda.

Amanda Razani: And next I have Ketaki Shriram and Jhanvi Shriram who are the CTO and CEO of Krikey. How are you all doing today?

Jhanvi Shriram: Great. Nice to meet you.

Ketaki Shriram: Well, thank you for having us.

Amanda Razani: Well, first, Andy, can you share a little bit about UserTesting and what services do you provide?

Andy MacMillan: Sure. UserTesting is a technology platform, a SaaS solution if you will, where our customers can use our platform to have their customers show them what it’s like to be a customer, to be a user. If you’re building a digital product, a website, a mobile app, it can sometimes even be a physical product and you want to know what’s it like to walk a mile in my user’s shoes, you can use UserTesting. We actually help you find the participants. We have this large network of people that will walk through your product. Those folks will record themselves so they know they’re being recorded. They’re opting into sharing their feedback and they’ll actually walk through your product. They’ll share their screen and they’ll say, “This makes sense. This doesn’t make sense. I find this confusing. This is really helpful,” and things like that. And then lately, we’ve been using AI to help you analyze all that feedback, and we do it all very quickly. Most of the results on our platform come back in an hour or two. You can imagine having lots of different people go through your product quite quickly. One of the things that’s helpful is to allow you to go through all that feedback and really find those interesting insights to help make for a better experience for your users.

Amanda Razani: Awesome. Okay. Ketaki and Jhanvi, can you share a little bit about your company, Krikey, and what services do you provide?

Jhanvi Shriram: Yes. We are an AI animation company, and so we enable people to animate a 3D character and build out a short film within our tool. We have two AI models. One is text-to-animation, so the user can type in a text prompt like “jumping jacks” and then generate a 3D character animation based off the text. The other is video-to-animation: you can upload a video of yourself dancing and we will apply that to a 3D character. Our tool doesn’t require any knowledge of coding or programming or prior animation experience, so it’s really democratizing the process of 3D animation. It empowers anybody to animate a character and create a video.

Around our AI tool, we have a 3D editor. Once a user has generated their AI animation, they can customize their avatar, they can add facial expressions, hand gestures, they can even add lip sync to dialogue in over 20 languages. There’s lots of other tools that we provide for people to create this content. We actually started as a gaming company. We’ve been around for about seven years, and so five of those years we spent building mobile games. Just about a year and a half ago, we ended up pivoting partially through our UserTesting data and seeing where the consumer was, what they were looking for, and taking a part of our own animation pipeline and turning that into our main product. What we have today is a result of that business decision of the pivot and of all the data we’ve seen through UserTesting that enabled us to make that decision.

Amanda Razani: Oh, cool. That sounds really awesome. Andy, they’re talking about the data that you gathered, so can you share a little bit about what data did you gather? And then we can talk about what a difference that made and maybe some use case scenarios that you all took that data and moved with.

Andy MacMillan: Sure. I think one of the things that I find important and fascinating about what we do at UserTesting is there are so many different ways that folks collect data about their prospects, their users, their customers. I think every one of us knows that as you wander around the internet, you’re constantly being tracked and data’s being collected. But interestingly, what a lot of people don’t do is the simple task of asking their users to share their experience with them. What I really like about UserTesting is that, as I said before, it’s all opt-in.

You’re asking people. That can be people in our network, or it could be people you just send a link to, a user, a friend, a prospect, and say, “Hey, could you walk through this thing, a prototype, a game I’m building, a design I’m interested in, and share with me your feedback?” I think what’s really powerful about that is sometimes the best feedback is when somebody is directly sharing with you what works and what doesn’t. What it’s not is kind of traditional big data where people are trying to sift through all of the bits and bytes and try to figure out, “What in the world were people doing in my app or what were people doing on my website? Why is this happening?”

We hear that all the time. Why are people not clicking the checkout button? Why are people not subscribing to my newsletter? What we do is help people actually get to that why. Often, they’ll take some of that traditional big data and they’ll learn something, “I wonder why people are doing this.” And then they’ll come on to UserTesting and they’ll say, “Well, let’s run 10, 20, 30 people through that experience.” We collect all that click data, the screens that they’re sharing. Again, they’re all opting into sharing this, their voice, and then we pull all that together into an analytics platform that analyzes that, that tags it, that understands is somebody sharing frustration or confusion. Is there a good idea in here? Is it feedback?

And then we actually bundle that up and you can create what we call highlight reels. Imagine being able to share around your organization short little video clips or compilations of video clips of real people going through that experience. I think that’s really important because it’s hard to be your user. You’re really close to the problem. It’s usually obvious to most people who built something how it was supposed to work. We’ve all had that experience of watching somebody then try to use the thing you built. You’re like, “Well, they realize they’re supposed to do this. Why aren’t they doing that?” The ability to create that video package, if you will, and share that around the organization, we think it’s really helped people get aligned on what do people really think of our product, what really works and what doesn’t, what other idea, what pivot might be there if we were to put ourselves in the shoes of that user.

Amanda Razani: Well, and I know that with most business leaders, it always comes down to the data: getting as much data as possible and harnessing it, especially in this age of everyone being online a lot more. The digital transformation initiatives that are happening all revolve around data, good data, and utilizing it as best as possible. That being said, how did you all then take this information? Can you share some use cases and maybe the outcomes?

Ketaki Shriram: Yeah. I think in the Krikey context, first of all, we’ve been using the UserTesting tool since 2020 and it totally transformed, first, our mobile app gaming business. And now, as a B2B tools business, we’ve been able to continue using this tool but in a totally new way. I’ll share a couple of examples of how this has helped our current product. We’ve been talking a lot about use cases, and we found that the homepage for a new product we launched in August was not converting people to signups the way we thought it would. It discussed the features of our product, what it could do for customers, but for some reason people just weren’t signing up in the numbers we thought they would. We used the UserTesting open text entry question type, and we asked people, “What are five things that you can do with the Krikey AI animation maker?”

We immediately found out what the issue was, which was that our homepage was discussing product features instead of end customer use cases. When customers would come and look at the homepage, they would say, “This is really well-designed. It looks interesting.” But when we asked them, “Can you write down five things that you can do with this product that are useful,” most people couldn’t even list three things.

We realized immediately that even though our homepage looked beautiful, it wasn’t necessarily serving the end customer goal of, “I’ve come to this product page. What are the things I can immediately get from this product and how is this valuable to me?” And so we made a change on the homepage where we listed out all the use cases in these beautiful tiles and we immediately saw signups double, because people finally understood what the value of the product could be for them. That’s one example.

A second example is our loading screen. We were seeing a drop-off in our analytics. Looking at Google Analytics, people are coming to our homepage, but they’re not going into our main product. What’s happening on the loading screen? We can see that there’s a drop-off there, but as Andy pointed out, you don’t know the why. You don’t understand why people are dropping out of a certain step in the funnel. All you can do with numbers is really diagnose that there is an issue. But to get at some of these implicit problems, you really need to see customers interacting with the product. We found right away that when people were coming into the tool on certain computers, it was just taking a little bit too long to load, and we didn’t have any type of media in the loading view to entertain people while they were waiting. Unfortunately, customers were leaving at that step in the process.

Now if we didn’t have UserTesting, we would never have known that people couldn’t name five things they could do with our product, or that they were leaving during the loading process. Just by running a small, simple user test, we were able to get these results. I would say a third part of the UserTesting tool that’s really important for us is the highlight reels. This, I think, is an age-old issue in most companies where developers and product teams and designers are all coming from different perspectives. We all want what’s best for the product, but people don’t often see eye to eye on how to get there. Having highlight reels where we can actually show people, “Hey, here’s a customer interacting with our product and here are some of the pain points that they’re running into,” is really helpful for us to unify our team and make sure that everyone understands, “Why are we making this product change? Why do we need to redesign our homepage so urgently?” “Well, if you watch this user test, 10 out of 10 people cannot list five ways to use our product. This is why we’re doing this.”

I think it helps make the team understand that we’re only doing what’s best for the customer and for the product. This is all based on facts and mixed methods analysis and we couldn’t do that and couldn’t unify our team without the UserTesting tool.

Amanda Razani: Awesome.

Andy MacMillan: I think one of the things that last example really highlights, that I like, is it’s often the data along with seeing it that’s so powerful for teams. I use the analogy of when politicians talk, they give us these numbers we don’t always understand. Something’s a trillion-dollar problem, and we all kind of go, “Okay.” And then they’ll tell you a story about somebody they met who’s dealing with this healthcare problem and you’re like, “Well, we should fix that.” And I think we see that often when we’re trying to figure out our users, and that’s a great example. If we said, “10 out of 10 people didn’t understand this,” versus “I’m going to show you 10 people unable to describe it,” your empathy picks up and you’re like, “Well, I don’t want that. That doesn’t feel good.” I think there’s an aspect of seeing and feeling that disconnect that’s so powerful. I really like that example of alignment around visibility. It’s a great example.

Amanda Razani: Yeah. That personal aspect brings more to it. Andy, with that being said, as you’ve dealt with company leaders that are trying to improve the customer experience, what are some other roadblocks or issues they tend to face and how does harnessing and analyzing the data a little bit better help in these scenarios?

Andy MacMillan: I think the two big things that I see senior leaders struggle with are, one, this alignment issue we were just talking about. How many companies have we talked about who have struggled before, where everybody, their customers, everybody knows what the problem is but they can’t see it for some reason? And it’s not that they can’t see it. They can’t get aligned around the answer. Everybody internally has their own opinion on what they should do differently, and those opinions aren’t really grounded in what the users want; they’re grounded in the opinions of people inside the building. I think one is getting that outside perspective of real users and getting aligned on that.

I think the second is actually having a program in place that builds empathy with end users versus simply driving a data program. Nobody gets up in the morning and says to their family, “I can’t wait to go to the office today. We’re going to take the net promoter score from 20 to 22.” That’s not what we all try to do. I really do believe most people want to do something great. They want the thing they build, the thing they work on, the customer service call they’re supporting to be good. Yet, we all have so many experiences that we deal with that are just terrible.

I think a lot of it is that those companies just aren’t able to put themselves in the shoes of the end user and build that level of empathy. I think if there’s a way to elevate programs that show that empathy, show what it’s like to be a user, and put that alongside these data programs, I think that’s what can really empower leaders to drive change, where they can get up and align people around a shared narrative and say, “This is the thing we’re going to go fix. This is the experience we want. We want an experience like this.” I think most people in an organization would rally to that. I think it’s very hard to do that when you’re simply talking about the net promoter score or the conversion number on the website. I think that’s a really important change management tool that a lot of organizations lack.

Amanda Razani: Ketaki or Jhanvi, do you all want to weigh in on this topic? What’s your experience been when it comes to improving that digital customer experience?

Ketaki Shriram: Yeah. I think one of the things we really get from UserTesting is how our product makes people feel. That’s something that we can never understand just by looking at funnels and charts. We really like the feature where you can actually see someone’s face as they’re using your product alongside the screen share of your tool. Many times, the feedback doesn’t even need to be spoken; it’s someone’s furrowed brow or a long pause while they’re trying to figure out, “What button do I click next?” Those moments are really important to us, because we always want people to leave our product feeling inspired and excited, wanting to return and create another experience. Anytime people are confused, that’s maybe not the experience we want to leave them with.

Again, these are, I think, implicit feelings and findings and we would never be able to know those without actually watching people interact live with our tools. It’s been incredibly helpful. We do actually show highlight reels at our team meetings and I think that’s been really useful for people to understand again, “How does our product make people feel today and where are the areas…” These are all micro changes, but where are the areas in our different flows where we can make sure that we’re leaving people with a positive inspirational feeling?

Jhanvi Shriram: And just to add on to that, I think one of the things we talk about as a team is that we want to surprise and delight our users. But what does that mean and what does it look like? The videos that come with the UserTesting tool are so impactful for our team, because they can immediately see when we have changed the home page, there’s delight on the user’s face because they’re like, “Oh, cool, I can do this and I can do that. Wow.” That moment is really hard to capture just in hard numbers data or in a written report. But to actually see the user’s face in that moment of excitement makes a really big difference to our team and gets them excited about building the product, about making these changes, even if it’s something as small as the text on a button. They can immediately see that this can delight or change the emotion of the user very viscerally.

Andy MacMillan: I think one of the things that I hear sometimes that I really enjoy is when folks say that they saw something that was maybe counterintuitive to them internally but felt very obvious in retrospect. An example I’ll give is with one customer. They’re a leading brand in their space. They were seeing declining conversions over time on their main website, their main product landing page, and they said they kept iterating and changing stuff and adding more features and adding more features and explaining their value prop more and more and more, and the rates kept going down and down and down until they ran some user tests on it. What they discovered was most people came to their site believing they were the best brand ready to buy, but the more stuff they put on the page, the more the user stopped and thought, “Oh, this is a much more considered purchase than I thought. I should go shop around.”

So they took a whole bunch of stuff off their site and their conversion shot up 33%. Again, it’s a bit obvious in retrospect. When I tell that story, everybody nods like, “Well, that makes tons of sense,” but you can imagine being on that team and you’re trying so hard to put an offer in front of people that would drive the behavior. Again, all the data was telling them, they’re A/B testing all this stuff, and they’re trying new offers, but sometimes it’s obvious in retrospect and it’s such a simple adage. Go talk to your customers and they’ll lead you the right way, but it’s hard to talk to your customers when you are disintermediated by technology when your user’s coming to a website. I love the stories, much like these examples, where you get to see what it’s like to be a customer and it illuminates a whole different perspective.

Amanda Razani: Absolutely. Let’s talk a little bit about AI. We mentioned that earlier. From your experience with AI rapidly advancing over the last year and a half or so, how is this impacting the customer experience, the digital experience, and what do you anticipate in, say, the next year?

Andy MacMillan: From our point of view as UserTesting, I think the thing that we see the most is a really rapidly changing landscape of not only user expectations but users’ understanding of what AI does. I think about ChatGPT being launched, and AI went from this sci-fi concept that nobody outside of people who really knew machine learning understood to everybody I know having some idea of what AI is. They are open to using it, they want to apply it to problems. If you are building an experience, think of the amount of change, not just in your users’ expectations but in their behaviors and norms, in that short period of time.

I think a lot about how do we help our customers rapidly iterate their experience, but also their understanding of where their customer is at on this journey and getting that rapid feedback. You can’t spend six months building something then find out you missed the mark and the market’s gone a different direction. You really have to figure out a way in every one of your sprints, in every one of your designs. Where’s my user at? What’s their expectation around this? How do I create that sort of delightful experience that Jhanvi talked about? How do you do that in a rapidly changing world? I think you lean more into the feedback from your users.

Amanda Razani: And as far as Krikey’s side, what are you all seeing?

Ketaki Shriram: Yeah. That’s a great question. I think, for us, our tool is designed to help people who have no 3D content experience generate animations using AI and then quickly edit them into a beautiful final product. The biggest hurdle that we face always is, “How do we onboard people to not just building 3D content but using AI to generate 3D animations?” These are two totally new concepts, and most customers who come to our product don’t know anything about either of them. There are a lot of fine-grained design changes involved in making sure that customers don’t feel like the tool is too complicated for them.

One small change we made recently was around our tool tip copy. When people come in, we have tool tips to guide them through the onboarding, and we found that we were using too many technical terms in our tool tips. For people who don’t know anything about 3D content or AI, they would often feel like maybe the tool was designed for somebody more advanced than them, and they weren’t sure if the product was quite right for them or if they were ready to use it. Maybe they had to go learn other skills first. When we changed our tool tip copy, we found that people felt like the product was more welcoming. They felt positive going through the onboarding. That seems like a very small change, but it’s actually extremely important, because to onboard people and make sure that they feel empowered to utilize AI inside of our tool, we have to give them all the information they need in a truly accessible way.

Amanda Razani: Wonderful. Well, if there’s one key takeaway for our audience today in regards to any of the topics we’ve discussed, what would that be if you all want to give each of your key takeaways?

Jhanvi Shriram: Yeah. I can take the first one. I think definitely to surprise and delight users, you have to user test and share back those findings with the team. One of the things that we found early on is we were trying to do user testing locally, going to schools and colleges, but that wasn’t our audience. I think one of the great things about this tool is you can actually sort for the audience you expect to be coming to your tool, whether that’s teachers or specific types of students, filmmakers, and immediately get their feedback. I’d strongly recommend it even to small startups. We were a small startup when we started using this four years ago and it’s dramatically changed our business and we could not exist or thrive without UserTesting.

Andy MacMillan: I think that’s well said. I just love the idea of focusing on delighting users in a rapidly changing space like AI. I think that anchoring around that mission is such a great idea and I think it’s why you all have been so successful. Congrats on all the success and I think those are great words of advice.

Amanda Razani: Yes, absolutely. Thank you all for coming on and sharing these scenarios. I enjoyed speaking with you, and have a great weekend.

Andy MacMillan: Thank you.

Jhanvi Shriram: Thank you.

Ketaki Shriram: Thank you so much.

Amanda Razani: All right.