Summary
In this episode, we look at the latest trends in experience management with a focus on Qualtrics' new solution, "Discover." Benjamin Granger, Chief Workplace Psychologist, shares insights on Qualtrics' pivotal role and the significance of employee listening. We talk with him to understand the use case behind Discover, who should consider purchasing the solution, and the business case they should make to their C-suite.
Discover's approach combines surveys and social listening, revolutionizing how organizations comprehend employee experiences.
Takeaways:
- Qualtrics' evolution into an experience management company is exemplified by Discover, a pivotal product in their toolkit.
- Discover's integration of surveys and social listening offers a holistic view of employee experiences.
- Open-ended text analysis yields valuable insights into sentiment and intention, enhancing understanding of employee experiences.
- Prioritizing transparency and consent is essential in employee listening initiatives, with clear communication of the benefits to employees.
- Combining shorter pulse surveys with diverse listening methods can mitigate survey exhaustion, while providing value to respondents combats survey fatigue.
Chapters:
00:00 Introduction to Qualtrics and Discover
03:02 Evolution of Qualtrics and Experience Management
05:01 Overview of Discover and its Acquisition
06:24 Difference between Surveys and Social Listening
07:48 Combining Surveys and Social Listening for Employee Experience
09:43 Dealing with Conflicting Data
13:04 Understanding the Value of Open-Ended Text
15:10 Transparency and Consent in Employee Listening
21:12 Employee Comfort with Open-Ended Comment Analysis
22:15 Differentiating Monitoring and Listening
23:05 Survey Exhaustion, Pulse Surveys, and Anonymous Surveys
23:20 Survey Fatigue and Value
24:18 Open Dialogue and Employee-Driven Listening
25:14 Connecting the Dots for Employees
26:20 Data Dashboards and Actionable Insights
27:14 Segmentation and Targeted Actions
28:24 Prescriptive Analytics and AI Recommendations
29:18 Tying Data Dashboards to Action
30:12 Listening to Candidates and the Hiring Process
31:17 Listening to Alumni and Continuous Improvement
32:12 The Impact of Employee Listening on Hiring
33:07 The Range of Listening Tools
34:14 The Context of Survey Responses
35:17 Turning Data into Actionable Insights
36:18 Different Listening Channels
37:16 Listening to Work Channels and External Sources
38:26 The Impact of Mood on Survey Responses
39:27 Relational Listening and Transactional Listening
41:26 Finding Meaning and Value in Work
44:43 Work as a Currency Exchange
Learn more about your ad choices. Visit megaphone.fm/adchoices
Powered by the WRKdefined Podcast Network.
[00:00:00] Well, we have some data on this as you might have suspected. What? We have data. No way. So we did this study. We do this study at the end of every year of our Employee Experience Trends Research.
[00:00:14] And one of the things we asked about specifically this year was how people feel about open-ended comment analysis, and more specifically, we were asking about their comfort with it. How comfortable were they with their company monitoring, basically listening to them,
[00:00:32] via certain channels, and we broke it down, things like work email. You almost said the word monitoring. And I think people have made that a bad word. It's almost like a curse. Oh my goodness. Bad touching, harassment, sex, violence, fraud, threats, all things that could have been avoided
[00:00:56] if you had Fama. Stop hiring dangerous people. Fama.io. You know what I like about isolved? isolved is people-centric, and in a people-centric world, you need a people-centric solution. isolved People Cloud is a comprehensive human capital management solution that helps
[00:01:17] you employ, enable, and empower your workforce throughout the entire employment lifecycle, from attracting to recruiting to onboarding, from payroll to benefits to time and labor management. Transform your employee experience for a better today and a better tomorrow with isolved.
[00:01:34] For more information, go to isolvedhcm.com. Hey, this is William Tincup, along with Ryan Leary. You're listening to the Use Case Podcast. Today we have Benjamin on from Qualtrics, so we'll be talking about a product that they have called Discover. Benjamin's been on the show many times.
[00:01:55] Benjamin, would you do us a favor and introduce yourself? Yeah, of course. I'm Ben Granger. I'm the Chief Workplace Psychologist here at Qualtrics, and I've been here for going on nine years now.
[00:02:07] I've seen a lot of the history of the company, but at its heart, Qualtrics is an experience management company. We help organizations gather data and information from their employees, their suppliers, their customers, to make better decisions faster.
[00:02:21] And this product, Discover, is part of the toolkit we have to help organizations do exactly that. So if people were to look back maybe five, six years ago, they'd think of Qualtrics as the best kind of survey company, right?
[00:02:34] They kind of came up that way, right? So we fast forward. We're not just into surveys. We still do a lot of surveys. We're still great at surveys. However, the business has expanded. That's exactly right.
[00:02:50] I'd say this was probably 2015 or so when I started with the company. We had a shade under 500 people, and that is exactly right, William. We were, I think, an exceptional Swiss Army knife of survey tools then.
[00:03:08] But at that point we really hadn't focused on what the actual use cases and applications to business were going to be. So we've continually refined that over the years until we eventually coined the space experience management and decided that this is the space that we're in.
[00:03:27] This is the space we're creating and hyper-focus on that. And then as you pointed out, we've moved beyond just surveys and doing a lot more than that today. Walk us down experience management. What does that mean?
[00:03:43] The way we define it, we have a very specific definition: the discipline of gathering information about people's experiences to improve those experiences. And so there's a couple of key words there I like to highlight.
[00:03:59] The first is discipline, a word we use intentionally because it really embeds the idea that this is continuous and it has to be done with intention. We do this intentionally.
[00:04:12] It's not something that's just one-off, where the feedback happens to come in. We're intentionally having conversations, essentially, with our consumers, our customers, and our employees. And we're taking that information in not to say, ooh, that's interesting, but to
[00:04:28] actually go back and improve those experiences that we've created for our employees and our customers, knowing that there are always blind spots in how we do that. That gives us access to the individuals going through the experiences that we're creating,
[00:04:44] getting that feedback and improving those experiences. So Discover: how did the product come about, and how are customers using it? The product was actually an acquisition of a company that many people are familiar with in the space, Clarabridge.
[00:05:03] Very, very powerful tool and a competitor of ours at the time. Really. And a tremendous amount of respect for Clarabridge in the space around that time, in the early 2020s. They were just extremely capable in terms of their ability to collect open-ended
[00:05:24] information and use natural language processing and natural language understanding to pick out, well, what are people talking about? How do they feel? And so we made that acquisition a few years ago, which did a few things.
[00:05:36] One, I might say, we brought in a tremendous amount of talent from Clarabridge, just some really good people and thinkers that have helped us evolve our approach. But also this amazing technology that we were able to add.
[00:05:52] And like you said, William, at that point, we were very survey focused, collecting data in a structured way. And this gave us a really powerful tool that was already well known in the market,
[00:06:02] especially on the customer experience side to then take unstructured data and make sense of it. So we're obviously talking about employee experience, but you mentioned that you can talk about customer experience. The application of Discover can be pointed in a lot of different ways.
[00:06:22] But for this podcast, we're really talking about the employee. Maybe you can't do it, I don't know, but definitely employees. So if I understand this correctly, surveys are you go out and you can either do it anonymously or otherwise, but you're going out and asking an audience.
[00:06:41] They respond or they don't, et cetera. And then that comes back. Then you can kind of go through that data. Social listening or listening is looking at probably connected to tools that they already used, possibly Slack, possibly email, et cetera.
[00:06:57] It's connected to tools, and it's listening to those things to then bubble up things that could make for a better experience, if you know those things are going on. Again, surveys
[00:07:09] are wonderful, but they assume you already know what's going on with your employees, right? Whereas with social listening, you don't have to make that assumption. And if you combine the two, again, if I have this correct in my mind, if
[00:07:23] you combine the two, you've got an understanding that they've given you in terms of surveys, and then they've given you data that you can bubble up to then say, okay, I think these are the top three priorities with our employees. That's extremely, extremely well said, William.
[00:07:39] That is exactly how we think about the use of Discover. And there's a couple of things I might add to that. The first point you made is that we're applying this to employee listening today in this conversation.
[00:07:54] And it is interesting to point out that when Clarabridge was really dominating the space of unstructured listening, most of that work was done on the customer side. Oh, really? Customer data. And they were starting to build models that were employee focused.
[00:08:14] They were getting some demand from their customers. And then when we acquired them, me being really focused on the employee experience at Qualtrics, that's a big part of my job is to focus on that. We were starting to bring that in and figure out how can we integrate
[00:08:28] that in more and ramp up the work that they were already starting to do, applying that same logic that was already being done on the customer side. Why not start using that on the employee side? And so that's, yeah. Right. Just a quick follow up.
[00:08:44] Just quick. What if the data tells you conflicting things? So you survey them and say, you know, let's just say, do you like the return to work or are you okay with the less flexible schedule? And the employee says, absolutely. Five hundred percent. Absolutely agree.
[00:09:05] Yet the social listening comes back and says, I hate RTO. You know that they're lying. You know, that's an easier one, right? It's like, okay, so that's Hitler, bad person. Okay. So that's super easy. But how do you deal with conflict in the data?
[00:09:22] So where one of them tells you one thing and another tells you completely the opposite, how does leadership decide what to believe? Well, like any good consultant, I'm going to say, oh, it depends. It depends. Right. But I'm not going to stop there.
[00:09:39] Well, wait, there's more. If you say rubber meets the road, I'm coming through this screen, I swear to God. Just kidding. Well, essentially, what I'll do is provide some research to allow the audience to make their own decision about this.
[00:10:03] And what I would say is that we know that in certain environments, and I'm going to say we occasionally see this. We don't see a lot of it, but occasionally when we work with a company,
[00:10:17] or usually it's prospective customers, because we tend not to work with companies that think this way. But essentially some companies create this environment when they're doing employee listening where, hey, William and Ryan, you better score a five on these. Okay.
[00:10:36] So, like, survey time's coming up and I'm metric'd on this. That's a shout-out to Great Place to Work, best-place-to-work awards and things like that. Because when they take surveys, you basically go around with a gun and you hold it to their head.
[00:10:53] You say, take this survey. It's literal. I mean, not literal. It's a metaphoric gun, not a real gun. You've got to clarify that today. Yeah. Yeah. I'm in Texas. Absolutely. True. So. Fair.
[00:11:05] That's the idea. And I had a conversation with a CHRO not that long ago, and she was saying, you know, I always think about the survey as a stock price. Where do I get my next two points from?
[00:11:25] I couldn't possibly disagree with that mindset more. Right. I could not possibly disagree with that more. That is totally the wrong objective. Why do it? Why do it? That's exactly right. It's a vanity metric at that point.
[00:11:39] So in that environment, that data you're getting from the survey isn't worth a dang. Right. It's GIGO, right? Garbage in, garbage out. So in a case like that, if you're collecting quantitative information, and then you have some sort of open-ended channel where
[00:11:57] employees can go talk about whatever they want to talk about, well, then in that case I'm going to anchor on the latter. Right. Now, companies who have that mindset don't often open up channels for employees to say so. You don't. Right.
[00:12:13] So you don't get that. They don't care, because what they care about is the score and not the experience. Now on the flip side, we do know, for example, one other point on that: when we're doing surveys, we sort of do this odd thing.
[00:12:29] We don't often think about it in this way, but I think a lot about this because I spend my day thinking about this stuff. Right. What we're doing is we're essentially going into your mind, Ryan, and your mind, William, and we're saying, okay, we're going
[00:12:42] to have a conversation, but we're going to guide you through it. We're going to have these questions or items you think about each one of them and then map your judgment about that experience onto the digital piece of paper that has your response.
[00:12:57] So what we're essentially doing is we're taking something very personal in your mind and we're abstracting it. We're taking it immediately down to a number. When you do that, you lose a lot of information. Right. When we go through that process of surveying each of you, we lose
[00:13:14] a lot of very personal information. Now, we're willing to make that trade-off because of the computational power: we can then take those numbers and start making good decisions. So we accept that trade-off. But now let's contrast it with open-ended text.
[00:13:29] What we're doing with natural language processing and natural language understanding in Discover is taking a really important intermediate step, where we're taking what's in your mind more directly from you. And then before we quantify it, we're breaking it down to say,
[00:13:46] what topic were you talking about? What was your sentiment around it? What was the intensity around your sentiment? What's your intention? What do you think you're going to do about this? We're breaking it down into very personal and easily understandable
[00:14:01] components, and then we're quantifying it and abstracting it. So in a sense, if you think about the comparison when you're doing the natural language processing, you're capturing more of the information that's in the mind of the person. The caveat, last point I'll make here, when you do have
[00:14:21] those open-ended channels where people are just coming in and saying, I have this channel where I can put whatever I want whenever I want, what you tend to get is very polarized responses, because people aren't going to take the time unless they feel strongly about it. Right? Right.
[00:14:36] If you're really negative about it or really positive, you might then take the time. So you lose the middle of the distribution. So that's why it depends. And I wanted to provide those data points, because depending on what situation
[00:14:50] you're in, what environment you're in, you might make one decision or another as to do I anchor on the open-ended as being more reflective of the truth? Or do I anchor on the quantitative? It really does depend.
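To make Ben's pipeline concrete (code each comment into topic, sentiment, and intensity first, and only then quantify), here is a deliberately tiny sketch. It is hypothetical and not Qualtrics' implementation: Discover relies on trained natural language models, while this toy version uses hard-coded keyword lexicons purely to illustrate the intermediate coding step.

```python
from dataclasses import dataclass

# Toy lexicons for illustration only; a production system like Discover
# would use trained NLP models, not keyword lookup tables.
TOPIC_KEYWORDS = {
    "leadership": {"leadership", "manager", "executive"},
    "flexibility": {"remote", "rto", "schedule", "office"},
}
SENTIMENT_WORDS = {"hate": -2.0, "frustrated": -1.0, "okay": 0.0, "like": 1.0, "love": 2.0}

@dataclass
class CodedComment:
    topic: str
    sentiment: float  # -1 (negative) .. +1 (positive)
    intensity: float  # 0 (mild) .. 1 (strong)

def code_comment(text: str) -> CodedComment:
    """Intermediate step: break a comment into human-readable components
    (topic, sentiment, intensity) before collapsing it to numbers."""
    tokens = {t.strip(".,!?").lower() for t in text.split()}
    topic = next((name for name, kws in TOPIC_KEYWORDS.items() if tokens & kws), "other")
    scores = [SENTIMENT_WORDS[t] for t in tokens if t in SENTIMENT_WORDS]
    raw = sum(scores) / len(scores) if scores else 0.0
    return CodedComment(topic, max(-1.0, min(1.0, raw)), min(1.0, abs(raw) / 2))

def summarize(comments: list[str]) -> dict[str, dict[str, float]]:
    """Quantify only after coding: per-topic counts and share of negative comments."""
    coded = [code_comment(c) for c in comments]
    out: dict[str, dict[str, float]] = {}
    for topic in {c.topic for c in coded}:
        group = [c for c in coded if c.topic == topic]
        out[topic] = {
            "count": len(group),
            "pct_negative": 100.0 * sum(c.sentiment < 0 for c in group) / len(group),
        }
    return out
```

The point the sketch preserves is the ordering: each comment is coded into personal, understandable components before any aggregation, so the aggregate (the share of negative comments per topic) stays traceable to what people actually said.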
[00:15:01] So are you able to or are you attaching this to individuals? Or is this all anonymized data? That's a great question, right? One of the things I think is really important is that this is absolutely not the same as performance monitoring or monitoring software.
[00:15:22] We do not go down to the individual level. When we do this sort of open-ended listening, we're never going to say, here's what Ryan was talking about. But rather... We need to get rid of Ryan. Ryan seems unhappy, let's be honest.
[00:15:40] And the follow-up there, as you're answering this: one, does the employee population know this is being done by the company, or is it just in the paperwork they sign as they join on? And two, have you seen a difference?
[00:15:59] Have you seen a difference in how employees communicate amongst each other once they know this is happening? Yeah. Oh, those are two great questions. Firstly, I think... Oh, you do have a good job. That's a good job. I'm out. I'm done. See ya.
[00:16:19] Yeah, you're done. You're all set. So... I'm trying to remember the first and second question. I was going to say it as we kicked off. You know, I kind of am like, man, I... Do the employees know? Yeah, do the employees know? Thank you.
[00:16:39] One of the things that we always anchor all this on is ethical principles of usage. Right. And so almost always our recommendation is: if you're going to start using this, be highly transparent with people. If this is
[00:16:55] going to start scraping information from internal chat, via, say, a webinar that happens quarterly with an open chat for employees to ask questions and comment, we're going to let you know that we're scraping information from that channel.
[00:17:09] And what we always recommend companies do is be very transparent about what they're doing and where they're using it. Right. Here's the channels we're using; maybe it's email, maybe it's internal communications, maybe it's a webinar chat, any of those things. Let them know.
[00:17:25] Be honest about it and tell them what you're going to do with that information. Give them the what's-in-it-for-them. What's in it for you? What's in it for you is, one, we're definitely not going to come and identify the individual.
[00:17:35] What we want to know is what people think. This is a conversation, and that's an important analogy I like to use here: this sort of open-ended listening that Discover powers is an employee-driven conversation. Surveys are company-driven conversations.
[00:17:55] Open-ended is an employee-driven conversation. If you frame it that way and you say, this is your opportunity to tell us what's important, how you feel about it, what your intention is, we can take that back and improve things at work.
[00:18:09] We can improve the way that you're working. There's a what's-in-it-for-you. So I'm a big fan. We are big fans of being very honest, not just in the fine print of the employment agreement, but be honest and transparent.
[00:18:21] Hey, we're turning it on for this use case, but here's the benefit it could have for you. And by the way, it can even be a self-service tool. So it's not just that we're collecting information and distilling it; we can close the loop.
[00:18:34] And I assume there's an understanding at some point where the company, or whoever in the company is turning this on, would have some type of communication out to the employees to let them know: one, yes, we're listening, we're scraping all this information, but we
[00:18:57] historically could not take action on it, because we couldn't process all of these 400 comments during the quarterly call and make sense of them. Right. So now this is how we're doing it, and this is going to be the next step for the outcome.
[00:19:13] You know, Ryan, as you speak, it's like consent and disclosure and all that stuff. It's common sense to me, whether or not a company does it. You know, bad actors with bad intentions do bad things. News at 11.
[00:19:28] It seems more of a: here's what's in it for you, and here's how you grow. I think the heavy emphasis for the company, the communication, should be: we want your experience to be better. In order for it to be better,
[00:19:46] we have to know more. We just have to know more of the ebb and flow of what's going on today. The day-to-day business, the grind that you go through. We're not aware of what you go through. Right. We know what you do.
[00:19:58] We know the outputs that you create for the company, and we're thankful, etc., etc. To me, the communication layer and consent layer are like, yeah, check, you do it. But I think really, if I'm over-indexing, I'm
[00:20:14] over-indexing on the we're-trying-to-make-a-better-experience-for-you. Right. And in order to do that, we need data. So how do we get data? This is what we do to get data. We're going to survey you. We're going to ask for your responses, etc.
[00:20:30] We're going to listen, etc. We're going to have all this data. And then the output to you is: you have a better time, you have better performance, you have just a better experience overall. So I want the audience,
[00:20:44] when they're thinking about this, pondering this, to go: yes, disclosure is important and yes, consent is important. But actually, I think if we just talk about the experience and talk about it being better, and how do we get there,
[00:20:59] I think most people, at least rational and logical people, will go, yeah, that tracks. You would think so. Well, we have some data on this, as you might have suspected. What? We have data. So we did this study.
[00:21:15] We do this study at the end of every year, our Employee Experience Trends research. And one of the things we asked about specifically this year was how people feel about open-ended comment analysis, and more specifically, we were asking about their comfort with it.
[00:21:33] How comfortable are they with their company monitoring, basically listening to them, via certain channels, and we broke it down, things like work email... You almost said the word monitoring. And I think people have made that a bad word. It's almost like a curse word.
[00:21:49] You almost cursed. Now, you didn't mean to, but you almost cursed. You almost said monitoring. I think for most people, they hear that and they think, monitoring, you mean they're looking at all my stuff? It's listening. Listening.
[00:22:04] We're just trying to make a better experience. We don't really care. We just want to make sure about the output, having a great experience. That's exactly right. It's a conversation. And so when you're... you're essentially...
[00:22:18] I lost my train of thought, actually, sorry about that. Monitoring, monitoring. Right. Monitoring is a four-letter word. Could be a great webinar. Monitoring is a four-letter word, and actually getting people to understand the difference between monitoring and listening...
[00:22:35] Yeah, yeah. Sorry, I'm rambling. I'm running on like four hours of sleep tonight. My brain. Well, hey, all right, I got three things for you. Oh no, go ahead. Finish your thought. Yeah. Do y'all remember what train I was on?
[00:22:47] No, not at all. Sorry. Hey, you lost your train. All right. It's going to come back to me. It'll come back like a boomerang. Oh my goodness. Bad touching, harassment, sex, violence, fraud, threats. All things that could have been avoided
[00:23:11] if you had Fama. Stop hiring dangerous people. Fama.io. All right, I want to talk to you for a moment about retaining and developing your workforce. It's hard. Recruiting is hard. Retaining top employees is hard. Then you've got onboarding, payroll, benefits, time and labor management.
[00:23:33] You need to take care of your workforce, and you can only do this successfully if you commit to transforming your employee experience. This is where isolved comes in. They empower you to be successful. We've seen it with a number of companies that we've worked with.
[00:23:48] And this is why we partner with them here at WRKdefined. We trust them and you should too. Check them out at isolvedhcm.com. All right, so pulse surveys, survey exhaustion, and anonymous surveys. What's your take on those three things?
[00:24:06] I think survey exhaustion is better characterized as survey fatigue; that's how we talk about it. Right. And I think inaction fatigue trumps survey fatigue. Right. We also know in the research that we've done on this concept of survey
[00:24:27] fatigue, when and where does it happen? It's more likely to happen within a single long survey than across shorter pulses. And so we in our community will say within-survey fatigue is more powerful than between-survey fatigue. So you're better off.
[00:24:47] I think fatigue comes from the lack of value. So I'm on the other side of this argument, or not argument, but I'm on the other side, where I don't think those things actually exist.
[00:24:59] If you add value, if you're telling them in advance, here's what we're doing, here's what we're trying to figure out, and here's what you get, I mean, we're providing value back to them, then I don't think the respondent has exhaustion or fatigue. I agree. I agree with this.
[00:25:13] I think that if they see the action occurring. Correct. And that actually was the train I was on a minute ago. Oh, see, it came back. Yeah, we're right back. So if you are following through, and again, I'm going to use that conversation metaphor.
[00:25:29] If we're treating this like a conversation that we're having at scale via technology with employees, and we're identifying what those blind spots are, what do you see that we don't see, we're acknowledging that. Now you have open dialogue in the organization that's crossing this
[00:25:44] big chasm from senior leadership to the frontline, which is very difficult to do. And so that is a critical type of conversation to have. And you open it up from both ways. Leadership is able to drive the conversation.
[00:25:58] The frontline is able to drive the conversation. That balance of having organization-driven listening and employee-driven listening, I think, is a critical principle for any modern employee listening program. But the listening, the active listening,
[00:26:13] means it doesn't just stop at I heard what you said. Right. It continues, and it's: I'm going to do something about this to improve your work experience and your experience as a human. And then you have to connect the dots for employees.
[00:26:27] If leadership doesn't connect the dots for employees, they still don't understand. Exactly. Exactly. And by the way, it's often a communication issue. The vast majority of the organizations that I work with do, in good faith, act on the feedback they get through open-ended or
[00:26:45] structured listening. The problem is not that they're not acting on it. It's that they're not communicating clearly enough to draw the connection: Ryan, William, your feedback led to this action. So, Ben, let's go a little deeper on this.
[00:26:59] So we go through the listening process and we're going through the feedback and all of that stuff. On the leadership side, what do I see? What am I getting out of that? Am I just getting a bunch of data? Am I getting recommendations? How does that look? Yeah.
[00:27:15] So we have some really powerful dashboarding that can show a variety of different things, because it's a highly configurable tool, so we can really tailor-make it to the specific audience and their needs. Because sometimes a
[00:27:30] very senior leader, a decision maker, has to be able to make decisions fast. So we're going to display that differently for that person than for a program administrator, you know, a nerd like me. I'm going to want to dive much more into that.
[00:27:43] So it really will depend on the audience. But let's use your example, Ryan: we have a senior leader. What we're really going to tee up is, hey, firstly, what are the topics people are talking about?
[00:27:53] Being able to then break it down if people are talking about leadership. Leadership was one of the most talked-about topics via this channel or multiple channels. But when you click into it, you can start to see, well, what more specifically about leadership?
[00:28:07] And then you can also see the sentiment. You see essentially a dial: when they talk about leadership, over 95% of it was negative, 2% was neutral, 3% was positive. Lack of diversity or something like that. You can see the topic.
[00:28:22] You can start to really key in on what the topic is and whether it's important to them. And you can drill down, you can look at the breakdown, but then you can even go further. I was just looking at one of our customers who uses this in the financial and insurance industry.
[00:28:35] And they were just using Discover to look at a lot of the open-ended comments that they get through their big survey of the year. And what we were able to do is show things like: if somebody responded positively to this particular
[00:28:50] survey item, so this again is structured listening, but they responded positively to the survey item, or they were really engaged, here are all the topics they talked about and how they felt about them. So now you can start to hone in on very specific personas of employees.
[00:29:06] Again, we're not getting down to Ryan and William's level, but what we're saying is there are people like that in the organization. They're high performers. They tend to be, let's see, an equal distribution of male and female. They tend to be of this race.
[00:29:24] This group of people feels this particular way about this particular topic. That segmentation is crucial, because looking at that top-line result and saying, oh, this is what people are talking about, tells you very little. It's when you get down to say, this group of people,
[00:29:39] this is what they're talking about, this is the experience that they need improved. And so you can get very specific and targeted in your actions. So, first of all, I love listening tools, but I have a kind of strong belief that a dashboard is
[00:29:56] wonderful up to the point where the dashboard doesn't tell you what to do, meaning the recommendation engine is, if not now, the future of what those things do. Right. So again, here's a dashboard: they talk about leadership specifically, click into it, or talk about the lack of
[00:30:15] diversity in the leadership team as it relates to trust. OK, so you've got something really, really specific that pops up on a dashboard. I'm a leader. I go in and look at it. Great. It's red. Fantastic. I know that it's red.
[00:30:32] What do I do about that? Right. And I think the technology over time has to tell that person, you know, here's three things to reconcile those things. But it's very specific stuff. Here are the three things that you should consider.
[00:30:49] So what's your take on how you tie data dashboards into action? The framework we use, and I like to use, is that there are three different types of analytics you can apply to data. There's descriptive analytics: here's what's happening.
[00:31:08] Yeah, and descriptive is looking in the back window, the rear-view mirror. Yeah, yeah. Right, that's another analogy. So, you know, you get the descriptive: here's what's going on. Predictive: based on what we know is going on,
[00:31:25] here's what we think is going to happen. And then the third, where you're going, is prescriptive: what do you do? And the prescriptive piece is where, frankly, we've invested a tremendous amount in artificial intelligence. That's really where we're going with it:
[00:31:44] looking at how we can get better and faster at teeing up, back to Ryan's example, for that senior executive sitting there saying, OK, here's what's going on. But there's a lot of signal there. What do I focus on?
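The descriptive, predictive, prescriptive progression described above can be sketched as a toy example. The scores, the naive trend extrapolation, the threshold, and the recommendation rule are all invented for illustration; they are not Qualtrics output or a real model.

```python
# Quarterly engagement scores on a 0-100 scale (invented numbers).
scores = [72, 70, 67, 63]

# Descriptive: summarize what has happened.
latest = scores[-1]               # most recent score: 63
change = scores[-1] - scores[0]   # -9 over the period

# Predictive: naive linear extrapolation of the average trend.
trend = change / (len(scores) - 1)   # -3.0 points per quarter
forecast = latest + trend            # projected next score: 60.0

# Prescriptive: a simple rule that turns the signal into a recommendation.
if forecast < 65:
    action = "escalate: review the top negative topics with leadership"
else:
    action = "monitor: no intervention needed"

print(latest, forecast, action)
```

In practice the prescriptive layer is far richer than a single threshold, as described next, but the shape is the same: data in, a projected state, and a concrete recommended action out.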
[00:31:57] And now we're going to not only give you recommendations for what to do about this particular topic, we're going to tee those up based on expert guidance from folks like myself and my team who do this for a living.
[00:32:09] But we're going to start to train the AI with our material, with our thinking, and we're going to iterate over time. So, hey, we're going to track the actions that happen afterward, what this company did in response, and then we're going to measure the effectiveness of it.
[00:32:23] And then we're going to refine those recommendations over time. We're going to let the AI do a lot of the work, but we're going to incorporate the human intelligence and experience into training it. So that's how we're trying to solve
[00:32:33] for that. And I firmly agree with your point: it does not stop at the dashboard. It's not just telling you it's red. It's saying it's red, you need to
[00:32:46] go get that fixed right now, and here's where to go to get it fixed. I could see the path on how this leads to hiring, right? I'm curious to get your take on this. How does this information you're gathering
[00:33:01] on the current employee population affect the hiring process, or does it not? Are you not thinking about that? We are, from maybe two different angles. One is, and William, I think you even teed it up earlier, that the candidate experience might be
[00:33:18] part of it. And that is absolutely true. So can we listen organically to what job candidates say? A lot of the external sources of data that our clients scrape from are sites that people use during the recruiting process, right? There are those
[00:33:37] websites where people go and ask, hey, what kind of place is this? What am I getting into? Essentially, it's an expectation-setting step that employees go through before they decide to apply, or when they're in the hiring process. So in a way, we do
[00:33:52] candidate listening, listening to what those people are saying in the experience. But I think the more common application is grabbing that information that's out there externally and looking at, well, for people who have been an employee here,
[00:34:07] people who have been a candidate here, what has their experience been like? And then let's match that up with what we actually observe. We've seen some companies run Six Sigma processes through their hiring. I've actually been involved in a lot of those projects, in a past
[00:34:26] life before Qualtrics. And that's the sort of thing where we would look at each step in the journey of the hiring process and look at the data we're collecting, in this case organically from people outside, and say, OK, people are saying
[00:34:40] that our application takes way too long to complete. Let's gut-check that. Let's actually look at it and measure it. How does that compare to what we're seeing externally? It gives us some good signal as to which rock to overturn next.
[00:34:54] And it helps us identify things. Actually, in the natural language understanding and processing that happens, we're looking for design signals. So it pulls out things like, again, what are people talking about? How do they feel about it? But it also pulls out things like how much effort
[00:35:09] something took. Was it more effortful than they expected? That's a design principle in an experience like a hiring process. Ryan: you could tie this to all the interviewing technology, so video interviews, yeah, all those types of things. So every time we make
[00:35:25] an interaction as sourcers or recruiters, there's an interaction and a recording, some form of gathered data. We can run that through to find out what's important to them, which could feed learning and development. Exactly. Right, right. And the other thing we
[00:35:42] haven't talked about, but it's probably as important, is the alumni experience. Once someone leaves the company, how do we continue to listen to what they say about the company? So, you know, I don't believe in exit interviews, personal bias aside,
[00:35:59] because I just think at that point, what are you really going to tell the person that's meaningful? But once they're gone, once they're actually gone, how do we create a better experience with them so that we learn where we could have made changes?
[00:36:15] Where we could have had a better employee experience? For the audience, Benjamin, I think it's worth explaining what can and can't be turned on or off in terms of listening applications. Of course, Ryan and I use Google Chat, Slack, you know, all of these
[00:36:35] types of tools, email, etc. And I'm assuming that, company by company, you can turn different tools on and off depending on their needs. But what is the array of things they could use as listening
[00:36:50] tools? Yeah, good question. You mentioned a bunch of them. I would say it starts on a spectrum, with the most basic being: you're running a big survey, a pulse survey, you have a pulse survey strategy or an annual survey, and you have open-ended
[00:37:04] comments in there. You can mine those open-ended comments using the natural language understanding and processing powered by Discover. So that's probably the most basic, and then it progresses to internal forums, where it might be always-on, right?
[00:37:21] Maybe it's sitting on the company intranet as a digital suggestion box. Anybody, at any point in time, can go and raise any topic, or you could have it sub-organized: if you're on the benefits page, there's a place where you can go, and
[00:37:34] we're naturally narrowing it down to benefits, and so on. So that progresses a little bit. Then you get to the work channels, and this goes back to the research I was alluding to, where we conducted that big annual
[00:37:46] study at the end of every year and looked at the work channels people feel comfortable with. Well, in the data, what we found was that people were generally very comfortable with work-related channels. If I'm on my work computer, using work mail and work
[00:38:01] communication: hey, if you're listening to me to help my experience, I don't mind all that much. Now, if you start snooping around my social media, that's different, and people feel far more uncomfortable with that. So that tells you the progression. Looking at email:
[00:38:15] what's going on via email? What are the topics and sentiment? Again, being careful that we're not identifying individuals. Right. And then where Discover becomes really powerful is when you go omnichannel, when you have comments coming from external sources of data that
[00:38:31] might be publicly available to the company, plus internal sources of data: internal communication, chat, whatever tool they're using. If they have access to that data, they own that data, we can feed it in. It could be email, depending on the application, and survey data.
[00:38:48] And then you can start to really look across different sources of data. What are the themes? What are the topics? What's the intention, etc.? So, with your brethren, the good folks at SIOP, etc.: how much of a response to a survey is based on
[00:39:07] that person at that moment, like the timing in their life? Say I've just had a hellish day, and you send me a survey, with all the best intentions, to find out something. And I've just had a hell of a day.
[00:39:21] I go through it and you get a zero, zero, zero, zero on every answer. Versus: I just had a performance review, they said I was amazing, I'm going to get a new job and a raise. You send me the exact same survey, you get 10, 10, 10, 10.
[00:39:36] So how much of this is contextual to that person and what they're going through on that particular day? And I'll give you an example, if you've ever studied the catalog business at any point. All right. So in the catalog
[00:39:54] business... I was a big fan of the Sears catalog during Christmastime, though, you know, that's a different issue. We don't need to know any more about that. No, just talking about toys, man. We'd be with you on that.
[00:40:09] No worries. We're not going to go down that road. No, the thing in catalogs: take a traditional catalog, and they put the exact same sweater in six different places. And when you call a catalog company,
[00:40:24] or when you respond to a catalog, the first thing they ask you is: what page was that on? You say 22. They're trying to figure out what you responded to, and where. Because it's the same item, the same sweater, just used in different ways.
[00:40:38] It was folded up, it was on somebody, you know, this, that and the other. It was presented in different ways. And I think of surveys the same way: it depends on when that survey landed in their world, and
[00:40:50] how do you separate their world from their responses to those surveys? So the question for you and your colleagues is: how do y'all know that's how they really feel, versus how they feel because of the day they've had? Right.
[00:41:07] Right. Well, yes, the short answer is that the state the person is in at the time absolutely has an effect on how they respond to a survey. And it's important to dissect the purpose of a big survey like that, because we talked
[00:41:30] about how, in a holistic employee listening program, you have this range of organizationally driven listening and employee-driven listening, but there's more nuance within it. On the extreme organizational end of that continuum, you have what we call relational listening.
[00:41:48] And there, what we want to know is: overall, how do you feel about the work experience? We're trying to get to the general state of William in his work life, and of Ryan in his work life. But you can also move on
[00:42:05] and say, well, we want to get very transactional: you just used this brand-new application that we're rolling out, and we want feedback on that application specifically. How was your experience there? And that's an important backdrop to this question, because sometimes
[00:42:25] the recent experience is exactly what you are asking about. That's a good point. But when you're doing relational listening, in a way, you don't want that noise. However, who's to say that the current environment you're in today isn't
[00:42:44] highly relevant to your work experience? So you can really get into an argument, and we've been in this argument at SIOP, in fact: is that signal or noise? Does it have an effect? Absolutely. But is it signal or noise? That's a tough argument.
[00:43:02] I want to ask them at the beginning: are you in a good mood? Are you having a good day? Yeah, kind of like the little green, yellow, and red circles you get at the airport. Mood: are you having a good day?
[00:43:15] I don't know who touches those things. I'm just saying. I do. I do. But I just keep touching all the different ones, like I just stand in front of it. It's dynamic, tell me that. Just tracking my mood: I'm in a good mood.
[00:43:30] I'm in a bad mood. I'm in a terrible mood. If you hit the green one, you get a drop-down survey. You hit red, and, yeah. Yeah. Well, we can actually look for that. And then you asked that question about pulsing, right?
[00:43:44] Where you're getting that regular signal. When you do that sort of thing, you can look at it. We're not identifying individuals; we're pulling that identifying information out. But you can look at an individual level, look at the baseline of the individual, and
[00:43:57] see where that person fluctuates over time. You can do that when you're surveying very frequently about momentary states, right? When we think about state and trait in psychology: your trait is like climate, right?
[00:44:12] This is your climate. You're always going to be in this range. It's going to vary a little; the weather is going to vary a little bit, but you're within your climate. State is more like the weather, and that can change. But when we're doing
[00:44:25] relational listening, we're not looking for daily fluctuations in weather. We're looking at seasonal fluctuations, right? In the winter, it's going to be colder; it might be a little warmer some days, but you get the analogy here. Well, when we're looking at those states,
[00:44:39] what we can do is increase the frequency with which we're surveying, look at what that baseline looks like, and see what those fluctuations are like. And what you'll notice is that some people naturally have more within-person fluctuation.
[00:44:54] Maybe I tend to be a little more moody, so when I'm in a bad mood, it really affects my scores. Some people might be a little more emotionally stable; they're not as affected by those day-to-day annoyances, and so their scores stay relatively
[00:45:06] stable. So you can look at that individual effect and pull it out of the quote-unquote error variance when we're doing the statistics on the back end. Right, it's normalized. Ryan, you're going to ask about work. Ah, yes, work. The big old question.
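The baseline idea just described, separating a person's stable trait-like level from day-to-day state fluctuation by centering each score on that person's own mean, can be sketched as follows. The respondents and scores are invented for illustration; this is a statistical sketch, not Qualtrics code.

```python
from statistics import mean

# Hypothetical pulse-survey responses per (de-identified) person, 0-10 scale.
pulses = {
    "moody":  [2, 9, 3, 8],   # large swings around a mid-level baseline
    "stable": [6, 7, 6, 7],   # small swings around a similar baseline
}

def center_on_baseline(responses):
    """Per person: baseline (trait-like level) and deviations (state)."""
    out = {}
    for person, scores in responses.items():
        baseline = mean(scores)
        out[person] = {
            "baseline": baseline,
            "deviations": [round(s - baseline, 2) for s in scores],
        }
    return out

centered = center_on_baseline(pulses)
print(centered["moody"]["baseline"], centered["stable"]["baseline"])
```

The deviations are the within-person signal: the "moody" respondent swings by several points around the same baseline where the "stable" respondent barely moves, which is exactly the variance being pulled out of the error term.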
[00:45:21] So, the name of the network, Work Defined: we're going to throw you on the spot here. We've talked about this before, gosh, probably months ago, back in the fall. That's right. So you've had some time to prepare. Too much time. And I'm thinking you might
[00:45:37] have prepared. I'm feeling a little bit of judgment here. Work: what does it mean? How do you define it? Well, I have thought about this, and I think it's a fantastic question. I love the way you're crowdsourcing this. One of the most productive ways
[00:45:57] I think to think about it is as a currency transaction: work as an exchange of currencies, sort of like when you go to the airport, getting ready to fly internationally, and you go to the little currency exchange. But specifically, what you're exchanging,
[00:46:12] at the most basic level, is time for the fungible currency of money. You're making a currency exchange. Now, I'm going to make this real bleak for a second, right? So let's say, you know, we're going to test the age of your audience here
[00:46:28] with a Jim Croce reference. So Jim Croce walks in with his bottle, Time in a Bottle, right? Yeah. And it's opaque; he has no idea how much time is in there. But you go to the currency exchange every day and say, I'm going to take
[00:46:41] eight hours out of here. I have no idea how much is left. That might be it, or there might be tons of time left; you don't know. And you take that time, that finite currency, and trade it in for the fungible currency of money.
[00:46:56] Now, if you think about it from that perspective, and sometimes I do, I like to think of the exchange of currencies, it immediately dawns on me: that's a terrible exchange, if that's all you're exchanging. Right. That's a terrible exchange, and it's not in your favor.
[00:47:15] So that forces you, at least for me it forces me, to say, well, OK, let me think about this a little deeper. Let's say I was in a position where I had to work to survive. My family's lives are on the line,
[00:47:34] and so I'm going to work every day. I'm making that exchange of my time, a finite resource that I have no idea how much I have left of, for a fungible resource. But that fungible resource then turns into life. I'm trading time for life.
[00:47:47] Is that worth it? Hell yeah, it's worth it. Right. Because it's not just my life now, it's my family's. OK, now that's a meaningful transaction. Now let's move on to people with a little more privilege, and I'll count
[00:48:02] myself right off, because I don't have to do the job I do. I could do a number of different things, get paid less or more, and we'd still be able to survive. So I'm in a privileged position, and for people like that, you run
[00:48:15] into the same question: OK, is that a good trade for me? What it forces me to think about is: well, hold on a second, it's not just time that I'm trading in for money. What I get from work is that I get
[00:48:29] to work with some of the smartest people on earth every day. I get to build relationships. I'm a social person; we're social beings. I get to build those relationships. And look at you two, right? Through work, you've built this amazing relationship and bond with each other.
[00:48:46] You find meaning in your work, right? You're adding something valuable to the collective, and our brains and our bodies, from an evolutionary standpoint, reward us for that, when we add value to the group. That's what we do at work.
[00:49:00] I like this idea of a transaction, a currency exchange, because it forces you to say: well, that choice of whether it's just a time-for-money transaction is kind of up to me, and it's up to my mindset. Right. This is where I think the
[00:49:14] personal responsibility comes in. People who are waiting to get swept off their feet and fall in love with a job: good luck. You take the first step. Love it. Just drops the mic and walks off stage, right? Not sure how many mics we have, but just drop the mic.
[00:49:32] Benjamin thank you so much for coming on the show. We appreciate you and have a wonderful day. Thanks gentlemen. It was a pleasure.


