In this episode, we look at microlearning, bite-sized content, AI, learning management systems, engagement, customization, and data analytics. By delivering personalized learning experiences where employees already spend time, Arist revolutionizes how organizations approach employee development, making L&D more agile and impactful.
Key Takeaways:
Arist increases engagement by delivering bite-sized content in channels people check every six minutes, instead of an LMS they check every six weeks.
AI customizes learning experiences, with over 85% of recommendations accepted by users.
Arist targets companies with complex training needs and large employee bases.
Data analytics and AI are essential for improving L&D and personalizing learning.
The role of L&D professionals is evolving, with learning being integrated into other roles.
Speed and agility in learning are key to staying competitive in today's fast-paced environment.
Chapters
00:00 Introduction and Overview
01:11 Micro-Learning and Bite-Sized Content
04:54 The Power of Shorter, Engaging Content
06:58 AI in Content Creation
09:20 The Advancements in AI
16:13 Building Trust in AI-Generated Content
18:59 Deploying Content and Testing
21:51 Target Buyers and Industries
22:59 The Changing Role of Learning and Development Professionals
24:17 The Importance of Data and Analytics in Learning and Development
25:08 The Potential of AI in Personalized Learning Experiences
27:06 Overcoming Resistance to New Learning Methods
29:45 The Need for Speed and Agility in Learning
Connect with Ryan Laverty here: https://www.linkedin.com/in/ryan-laverty/ and learn more about Arist.co here: https://www.arist.co/
William Tincup LinkedIn: https://www.linkedin.com/in/tincup/
Ryan Leary LinkedIn: https://www.linkedin.com/in/ryanleary/
Connect with WRKdefined on your favorite social network
The Site | Substack | LinkedIn | Instagram | X | Facebook | TikTok
Share your brand across the WRKdefined Podcast Network
Learn more about your ad choices. Visit megaphone.fm/adchoices
Powered by the WRKdefined Podcast Network.
[00:00:00] Hey, what's going on everyone? Ryan Leary here from Work Defined. You know, if there was one thing that I could change about recruiting, it would probably be the amazingly awful candidate experience that job seekers have to endure at one of the most stressful times in their life. Hiring teams, it is time to step up.
[00:00:23] You've got to create an experience that is memorable, fast and efficient. And you can do that with Indeed Smart Sourcing. Check them out online at Indeed.com or just Google Indeed Smart Sourcing.
[00:00:37] Deel has helped over 35,000 businesses simplify global hiring, onboarding, payroll and compliance. Visit Deel.com to learn more. That's D-E-E-L.com.
[00:01:02] Hey, this is William Tincup and Ryan Leary and you are listening, hopefully watching, the Use Case Podcast. We have Ryan on today. We're going to be learning all about his business. Ryan Leary. I got to do it because there's two Ryan L's.
[00:01:15] I was going to say, you know what? Knowing that Ryan, knowing he was on the call already, surprise, when you said we're going to learn about Ryan's business, I thought, shit, what do I got to talk about?
[00:01:26] Yeah, where are we going to take this? So how are you doing today?
[00:01:30] I'm doing great, Ryan. I was hoping I could just tap you in, but I guess I'm too common. So yeah, really excited to be here, Will. Thanks so much.
[00:01:39] Good, good, good, good. Well, all right, let's do an introduction. Introduce yourself and your company.
[00:01:45] Yeah, so my name is Ryan Laverty. I'm the co-founder and president of a company called Arist.
[00:01:49] The super short version is we help people learn on all the places they spend all their time. So that's tools like SMS, Microsoft Teams, WhatsApp, tools the average person checks every six minutes.
[00:02:01] I love it. So micro content?
[00:02:04] Yeah, micro learning, micro content, bite size spaced out over time.
[00:02:07] And so it's in the theme of go where they are and train where they are rather than to get them to go to a different place.
[00:02:17] Yeah, exactly. And so the average person checks a learning management system like once every six weeks if you don't go nudge them or kind of twist their arm or get a marketing feed behind it.
[00:02:27] So when we switch that to every six minutes, there's actually a – it's a 1600x attention difference, if you can believe that, Will, in how much time we spend in Microsoft Teams, SMS.
[00:02:36] And that's actually across every generation, every type of role.
[00:02:39] So let's go back to every six weeks first off. That's pretty insane. So every six weeks, an employee checks into the LMS.
[00:02:51] Yeah. On average, if unprompted, just – hey, the thing I always tell people is we'll go to like strategy sessions with CLOs and they'll say, oh, we spent millions of dollars on this.
[00:03:02] And we'll go, okay, when was the last time you just decided to go hop into a master new one on the podcast and enter a name of –
[00:03:09] I think that's generous. I think that's actually generous.
[00:03:12] I think for most people, Ryan, let's take compliance-related content out because that has a different track.
[00:03:21] But if it's regular content, I think six weeks is probably – I think people probably log in like twice a year.
[00:03:30] I don't remember logging in very much.
[00:03:33] Oh, no. No, no. Compliance is different because that's a – yeah, you have to.
[00:03:38] That's a different bit. I agree with you on if you have a bunch of marketing behind it.
[00:03:43] But even then, it's still –
[00:03:45] It's forced learning.
[00:03:47] Yeah, they got to go to this place and do this thing.
[00:03:50] I've always felt when I was in corporate, forced learning never made me want to learn.
[00:03:56] Although, I loved what I did, right?
[00:03:59] And I loved learning about what I did.
[00:04:02] And I would have moments of inspiration where I would say, let's go learn today, right?
[00:04:07] So usually – well, you always – he always catches me, Ryan, when I'm going to the gym.
[00:04:13] And I take pre-workout or something.
[00:04:15] So I'm like revved up.
[00:04:16] And I'm like – and I'm super creative at that moment.
[00:04:19] And then he gets me on the phone and I sit in the car and I talk to him about all this great stuff.
[00:04:26] That's how I felt about learning in corporate.
[00:04:30] But then it was bad.
[00:04:33] It was 60 minutes, 90 minutes or six weeks of sessions.
[00:04:37] It's awful.
[00:04:38] So love what you're doing, microlearning.
[00:04:41] All that to get to, I love microlearning.
[00:04:43] Tell us more about that.
[00:04:44] Yeah.
[00:04:45] So I think – and you're 100 percent right.
[00:04:48] A few kind of comments on what you just mentioned there.
[00:04:50] But microlearning on a high level, the technical definition is just learning that's under 15 minutes.
[00:04:58] The average Arist lesson is about five to seven minutes.
[00:04:58] We found that that's kind of a sweet spot between like I can teach you enough but you don't get annoyed and you don't stop engaging.
[00:05:04] Right.
[00:05:05] The biggest thing usually though that I think people are always asking about is there's a few kind of big myths in learning.
[00:05:11] And two of the biggest myths are one, more content equals I learn more.
[00:05:15] And two, there's different types of learners like audio learners or visual learners or learning styles.
[00:05:21] Right?
[00:05:21] And that more engaging animated content actually creates better learning outcomes.
[00:05:25] And a lot of our data has found the opposite is that the shorter you can keep it, the more bite size.
[00:05:30] Like what I always remind a lot of our teams is brevity at the end of the day controls what people remember.
[00:05:34] Right?
[00:05:34] If I talk to you for an hour versus if I talk to you for 10 minutes and I say, okay, give me three bullet points of what we just went through.
[00:05:41] Right?
[00:05:41] The 10 minutes one is going to be way more targeted obviously.
[00:05:44] And so that's one big component.
[00:05:46] The other component being the formats that I mentioned.
[00:05:48] Like right now we're spending millions and millions of dollars to make things animated and to have a celebrity talk to you on a video and to go through all this stuff.
[00:05:56] And the reality is that, again, it's the same as the one hour lecture.
[00:05:59] You're going to forget most of that stuff.
[00:06:01] And most of that stuff is, you know, it's passive versus active.
[00:06:04] You don't have to respond.
[00:06:05] You don't have to engage.
[00:06:06] You can just sit there and let it run.
[00:06:07] And so I think that, you know, it puts a huge burden on these teams that have to build all this stuff.
[00:06:12] But at the same time, we always go back to the outcomes.
[00:06:14] We're an outcomes first company.
[00:06:15] Like it's not actually creating any better outcomes to do all of this fancy stuff.
[00:06:18] It's interesting because it's have to versus want to.
[00:06:21] And if you put content, especially bite-sized content, where they're already working, I can see people wanting to do it, wanting to consume versus having to consume and having to consume in kind of a forced way, whatever that may be.
[00:06:37] Are you creating the content for your clients or are they creating the content for themselves?
[00:06:42] Like let's dig into content creation.
[00:06:45] Yeah, sure thing.
[00:06:46] So there are a few primary ways that content is created.
[00:06:49] So clients are creating content themselves, to answer your question directly.
[00:06:54] The way we think about content, pretty much all of it is either I go buy a library and it's not customized, or I go have to build it in PowerPoint, Articulate, or something of the like, and it takes me forever.
[00:07:05] We like to sit in the middle.
[00:07:07] So we have a full content library built by subject matter experts.
[00:07:10] But most of where people are creating content today is actually through AI.
[00:07:13] So because it's so image-based and text-based rather than like long-form videos, we actually, you know, the medium is really conducive to having high-accuracy AI.
[00:07:23] And so we're now at a place where I think it's like 98% of our courses are built by AI.
[00:07:29] And the average course, you know, over 95% of the source content that AI outputs, which you can then go edit or version or translate or adjust, is pretty much unedited.
[00:07:40] And so in layman's terms, most people are plugging in prompts, having AI build the whole thing.
[00:07:45] Hey, Ryan, that was great.
[00:07:46] Now I want this for my salespeople, my healthcare professionals, whoever that might be.
[00:07:50] Here's a PowerPoint, a PDF, a video, a SCORM file.
[00:07:53] Go just build all this stuff for me.
[00:07:55] And then learning designers become editors and reviewers and fact-checkers rather than, you know, kind of from-scratch creators.
[00:08:03] And that was my next question is how are they verifying?
[00:08:06] And you just said, you said fact-checkers.
[00:08:09] So maybe walk us through some of your, you don't have to mention client names or anything.
[00:08:14] But companies that are using your platform, how are they building through AI?
[00:08:20] Got it.
[00:08:21] But what is that process, right?
[00:08:23] So they're building, they're prompting.
[00:08:25] Yeah, I want to know the prompts.
[00:08:26] Yeah, what are they doing here?
[00:08:27] Give us something like that.
[00:08:29] Because I actually think that's where most people get stuck, right?
[00:08:33] They know they need a platform.
[00:08:34] They want a cool platform.
[00:08:37] They want something quick and easy that it integrates and is where everyone is and all that great stuff.
[00:08:41] I want to take a break real quick just to let you know about a new show we've just added to the network.
[00:08:48] Up Next at Work, hosted by Jeanne and Kate Achille of The Devon Group.
[00:08:54] Fantastic show.
[00:08:55] If you're looking for something that pushes the norm, pushes the boundaries, has some really spirited conversations, Google Up Next at Work, Jeanne and Kate Achille from The Devon Group.
[00:09:09] But then they get it and they're like, what do I do?
[00:09:12] Right, right.
[00:09:13] And I think this is great.
[00:09:14] I'm glad we have this conversation because something I always remind people to kind of set this up is that AI right now gets about 150 times more powerful every six months.
[00:09:24] Right.
[00:09:24] Which, like, for the human brain to even comprehend what 150x is, it's like it's too hard for us to do, right?
[00:09:30] Right.
[00:09:30] And so for most people, they say, hey, AI is really inaccurate and likes to make a bunch of stuff up because they use ChatGPT.
[00:09:37] And that's what their experience with ChatGPT has been.
[00:09:40] Right, right.
[00:09:40] So there's a few components of this, you know, a few parts to your question.
[00:09:43] One is that we've actually built the new model for any AI nerds kind of listening on the Claude 3.5 Sonnet model by Anthropic.
[00:09:51] And we have found it to be – you can feel the 150x.
[00:09:54] I'll put it that way.
[00:09:55] Is it pulling several models into one?
[00:09:59] So it's its own model, but it's using, you know, for sourcing, just the way that it processes and outputs, the context windows, how much it can take in at once, and then the sources it pulls from.
[00:10:10] Got it.
[00:10:11] It's just on all fronts a much higher level of accuracy.
[00:10:13] So that's one part of it.
[00:10:14] To the question on clients and accuracy.
[00:10:16] So our largest customer segment actually, without getting too much into specific clients, is the life sciences industry because if you think about, you know, what is AI plus SMS like most powerful for?
[00:10:29] Hey, I have to teach a large scale of people not at a laptop all day on really complex subjects.
[00:10:34] And, you know, kind of by accident we sort of fell into this.
[00:10:37] You know, the pharma industry meets a lot of that criteria.
[00:10:40] And so we went on site at a major pharmaceutical company in New York who made some vaccines that everyone is very familiar with.
[00:10:48] And we said, hey, let's take, you know, 700 pages of complex medical documentation, feed it through this.
[00:10:56] We'll have your medical SMEs in the room, and you can basically roast us if things are just wrong and inaccurate and it's making things up.
[00:11:03] And so they set it all off.
[00:11:04] They sat there and went through it, and you could hear a pin drop.
[00:11:07] And they basically were just like, that would have taken us seven months, and none of us found a single thing that was incorrect in the output.
[00:11:14] This is like 99.9% of where we want it to be, which, again, six months ago they would have said, oh, this is okay.
[00:11:20] It's 40%, and it's right sometimes, right?
[00:11:23] And so it's – again, it's the past few months that really create that magic in terms of technical development.
[00:11:27] Right, right.
[00:11:27] And I'm guessing the reaction is in, okay, now you've taken 50% of my job away.
[00:11:34] I would guess they're happy you're taking away the creation portion and letting them be the researchers and experts that they are by fact-checking and then making it better.
[00:11:46] 100%.
[00:11:47] And when we – the way that we look at the space broadly, right, is that Hollywood is a great example.
[00:11:53] When visual effects came out, everyone said movies are going to get cheaper and video editors are going to lose their job.
[00:11:58] And Mark Andreessen coined this term, the Hollywood effect, and said, you know, I'm a little too young to remember this all happening.
[00:12:04] But basically, you know, as a huge Star Wars fan, I remember all the visual effects that that had on the world.
[00:12:10] That's a shame.
[00:12:11] The average cost of a movie went up, you know, what, 50, 100x, along with the staff you needed to make a movie, because consumer expectations went up.
[00:12:18] And I think that learning AI, we're going to see the exact same trend.
[00:12:21] You're going to have more learning designers.
[00:12:23] You're going to have more people involved.
[00:12:24] And the average learning team today, the biggest complaint teams have is, hey, my learning team is like, you know, has a nine-month backlog.
[00:12:30] And so I'm going to try to create a lot of this learning myself.
[00:12:33] And that's actually what we're trying to get out of is like, look, when your learning team can turn something around to you tomorrow, you're not going to try to go around them, right?
[00:12:40] And people kind of treat procurement departments like this now.
[00:12:43] Like, hey, it's going to take six months.
[00:12:44] So let's see if we can not go through that.
[00:12:46] But we know we have to, right?
[00:12:48] And so that's really what you're solving is that massive backlog.
[00:12:51] And you're also, again, along the way, you're raising consumer expectations in terms of things like customizability to the individual.
[00:12:58] And so you're going to end up having a lot more learning teams.
[00:13:00] They're just going to have these tools at their disposal.
[00:13:03] Love it.
[00:13:04] Love it on a lot of levels.
[00:13:06] So this is going to seem a little meta.
[00:13:08] Do we have to teach the people that are going to be doing this how to do prompt engineering?
[00:13:14] So we thought so too.
[00:13:15] Like, I've used ChatGPT in the past, and it takes some, like, get to know you, like, really go back to you.
[00:13:22] With the new model, we have not onboarded anyone into prompt engineering.
[00:13:26] And I think the reason why is because so with the AI, the way that we've constructed it, it basically asks you, you know, five questions.
[00:13:34] What's the topic?
[00:13:35] What's the audience?
[00:13:36] What's the kind of beginner intermediate expertise, like level of complexity?
[00:13:41] What's the learning objective?
[00:13:42] And then what are the knowledge components that you want as a result of this?
[00:13:45] And you can go feed it, you know, documents, videos, et cetera, PowerPoints, a certain language you want it in, like filter criteria.
[00:13:52] But it's basically just asking you in plain language, all right, Will, what do you want to teach people?
[00:13:57] If this is successful, what do they know?
[00:13:59] What audience is this for?
[00:14:00] And describe the subject to me in one sentence.
[00:14:02] And for most people, if we give it those guardrails, there's not, like, a special way you can describe it.
[00:14:06] I think the other thing that's really important, though, is after topic and audience.
[00:14:10] So, again, I'll use the life sciences example.
[00:14:12] We put in a life sciences company and the name of a drug they have coming on the market.
[00:14:18] And it auto-recommends, because, again, it's been trained on millions of courses.
[00:14:21] It auto-recommends the learning objectives and the knowledge components.
[00:14:24] And people sit there and they're like, while they're building, yeah, okay, this is exactly what I want it to be, right?
[00:14:29] And so you don't even have to think through those pieces.
[00:14:32] Over 85% of our users just take all the AI-recommended learning objectives and knowledge components, which is insane because you don't even have to do that part of it now.
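The intake flow described here, five plain-language questions (topic, audience, level, objective, knowledge components) plus optional source files, with the AI filling in recommended objectives, can be pictured as a simple structured request. This is a hypothetical sketch; the field names and the `suggest_components` helper are illustrative, not Arist's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class CourseBrief:
    # The five guided-intake questions from the conversation above.
    topic: str
    audience: str
    level: str                # "beginner" | "intermediate" | "expert"
    learning_objective: str
    knowledge_components: list[str] = field(default_factory=list)
    source_files: list[str] = field(default_factory=list)  # PDFs, decks, videos

def suggest_components(brief: CourseBrief) -> list[str]:
    # Stand-in for the model's auto-recommended knowledge components.
    # Per the episode, most users accept the AI's suggestions as-is;
    # here we just fabricate a placeholder when none were provided.
    if brief.knowledge_components:
        return brief.knowledge_components
    return [f"{brief.topic}: overview for {brief.audience}"]

brief = CourseBrief(
    topic="New-drug mechanism of action",
    audience="Field sales reps",
    level="intermediate",
    learning_objective="Reps can explain the drug accurately to clinicians",
)
print(suggest_components(brief))
```

The point of the structure is the one Ryan makes: the builder asks for plain-language answers, so no prompt engineering is required of the user.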
[00:14:40] Wow.
[00:14:41] When you say take it without fact-checking.
[00:14:44] No, these are recommendations.
[00:14:46] Accept the recommendation.
[00:14:47] Accept the recommendation.
[00:14:48] Yeah.
[00:14:49] Yeah.
[00:14:49] AI just spits back the recommendation based on those other two inputs and says, this is probably what you would recommend that they learn, right?
[00:14:56] Yeah.
[00:14:57] Yes.
[00:14:58] What was the timeframe it took them to build that trust?
[00:15:02] To build the trust with knowing that the AI – with trusting the knowledge that they gave it.
[00:15:08] I mean pretty immediate because, again, I'll give you another example here.
[00:15:10] Let's say I'm someone at one of these companies and I put in the audience and the name.
[00:15:15] And in my head, it takes about four seconds to give that.
[00:15:19] In my head, I'm formulating, okay, here's what I think I want to teach you.
[00:15:23] And then it gives you three or four different options and you look at it and go, actually, number three is not only right, that includes stuff I forgot about and is more accurate.
[00:15:31] I think that – you only have to have that happen once for you to understand, okay, this got to where I wanted to go in a better place and faster.
[00:15:39] I trust this at least to do this part of it.
[00:15:41] And then that keeps happening at every stage.
[00:15:43] So when you say ingesting, you can upload obviously documentation that you want.
[00:15:48] But it's also searching other forms of data, right?
[00:15:55] Yeah, spot on.
[00:15:56] And so the way that this AI model is constructed, it can use – it uses the internet and other sources.
[00:16:01] Like a question I get all the time is, hey, Ryan, how much is it going to rely on like the internet versus what I gave it because what if stuff on the internet is wrong?
[00:16:08] And I think that's where a few components.
[00:16:10] One is that you can ask for sourcing and things and source references, right?
[00:16:15] That's where the Claude model being really good comes in.
[00:16:17] Like it pulls from what it considers reputable sources.
[00:16:20] If I could publish a blog, it's not going to just pull from that, right?
[00:16:23] And then the second component of this is that for a lot of these tools, you basically now have – like for us at least, we give an output that is fully editable.
[00:16:32] And you can go in and you can kind of verify that.
[00:16:34] But again, for most folks, they're not really verifying it because the output of that, it's relying on your documentation except for the pieces in your documentation that it doesn't think are relevant to.
[00:16:46] To the objectives you gave it.
[00:16:48] And so it'll look – first it looks at the objectives.
[00:16:50] Then it looks at everything you gave it.
[00:16:52] Then if it says, hey, you asked me for this but this wasn't mentioned in anything you said, then I'm going to go pull from one of my reputable sources.
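The lookup order just described, objectives first, then the customer's own documents, with vetted outside sources used only for gaps, can be sketched in a few lines. Everything below is a hypothetical illustration of that fallback logic, not Arist's implementation; the matching is deliberately naive.

```python
def resolve_sources(objectives, customer_docs, reputable_sources):
    """For each objective, prefer the customer's own documents;
    fall back to a vetted external source only when nothing covers it."""
    plan = {}
    for obj in objectives:
        # Naive coverage check: does any supplied document mention the objective?
        covered = [d for d in customer_docs if obj.lower() in d.lower()]
        plan[obj] = covered if covered else ["external:" + reputable_sources[0]]
    return plan

plan = resolve_sources(
    objectives=["dosage guidelines"],
    customer_docs=["Internal memo on dosage guidelines", "Sales FAQ"],
    reputable_sources=["vetted reference database"],
)
print(plan)  # the objective is covered by the internal memo, so no external pull
```

The design choice mirrors the trade-off in the manufacturing anecdote that follows: trusting internal documents first keeps companies comfortable, even though those documents can be out of date.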
[00:16:59] So, okay, Ryan, so this maybe isn't a question fully directed at your product.
[00:17:07] But where are we at?
[00:17:09] If you had to kind of put a pin on it, where are we at in the world of training and AI content creation with companies trusting your data – not your data but the AI training created data?
[00:17:25] The result over their own documents.
[00:17:29] And are you seeing that companies are – when they put their own documents in, realizing out of this 400-page document, we've got about 25 pages that are just really wrong and bad.
[00:17:42] Are we seeing that?
[00:17:44] I haven't – I can't say I've seen companies look at their own documentation and say like, hey, this documentation is wrong or bad.
[00:17:51] I think that would probably happen kind of early in the process.
[00:17:54] I think the opposite is true where anything that's not in their document, even if it is more correct or up-to-date or right, people are really hesitant about.
[00:18:03] And so I saw an example last week.
[00:18:05] We were working with a large manufacturing company.
[00:18:07] And without getting too into the weeds, there was some type of process.
[00:18:11] And the AI had given an overview of this process.
[00:18:13] And they were like, wait, this part of it is wrong.
[00:18:16] And they went and looked it up.
[00:18:17] And their documentation was four years old.
[00:18:19] And this process had changed like six months ago.
[00:18:21] And the AI had the most up-to-date process in there.
[00:18:24] But that's a great example of you wouldn't believe how many back and forths of, hey, the AI invented stuff.
[00:18:31] And if this goes out to people, this is so bad, right?
[00:18:34] Because in companies, we look at a PDF that exists somewhere as like this is the holy grail.
[00:18:40] This is the source of truth.
[00:18:42] And I think that people forget that in today's day and age, relevancy is an ever-and-ever ongoing battle, right, especially in certain industries.
[00:18:51] And I think that the magic of a lot of these tools is if you, instead of saying, okay, here's a snapshot of what's correct, the way that these models work is basically saying, here's sources that are reputable on the subject.
[00:19:02] You're never going to have something new or important come out in this world that's not through one of these sources at the same time.
[00:19:08] And so you basically are pulling from a database, not from a Google Drive, for lack of a better term, right?
[00:19:14] Right.
[00:19:15] So I think more broadly in terms of where we are, companies still don't trust AI.
[00:19:21] Like we still go through a crazy process.
[00:19:23] The big thing is like closed-loop systems.
[00:19:25] We've got to prove to people, look, your company info is not going to get out; Will is not going to be able to go ask for the top five competitive podcast, you know, strategy docs they put in there.
[00:19:35] And I think that's, again, because companies are made up of people and people's experiences using ChatGPT one year ago.
[00:19:41] Right, right.
[00:19:42] Right.
[00:19:42] So let's talk about deploying.
[00:19:44] And you call these courses, or at least you've called them courses in some ways.
[00:19:48] So it's going to be in the places that they're at, whether or not that's text or Teams or Slack or wherever.
[00:19:55] So wherever they're collaborating, if you will, you're putting training in there.
[00:20:02] And a person that's over that, they can put those into different channels.
[00:20:05] They can set, you know, the length to be whatever it is.
[00:20:08] Do you have clients asking you about testing?
[00:20:12] Because it was a big thing years ago, especially on the LMS side, was it's not just enough to give somebody a course.
[00:20:22] They could run that thing on another monitor.
[00:20:25] You know, you don't even know if they're listening to it or whatever.
[00:20:28] There was a whole thing about, you know, we want to make sure that they actually got the bit.
[00:20:33] So we'll do a small test at the end of it to just make sure that they consumed it and they, you know, actually understood it, et cetera.
[00:20:41] Do you have clients?
[00:20:42] Do they care about that anymore?
[00:20:44] Or is it more about getting the content in front of them and where they're at?
[00:20:51] Yeah.
[00:20:51] I mean, it's both, right?
[00:20:53] Like in terms of testing for our solution broadly, folks always really want to test.
[00:20:59] I mean, I think the more new something is, the more someone wants to test it, right?
[00:21:02] I think that's a good rule of thumb.
[00:21:04] Yeah.
[00:21:04] You know, the way that I think about this is we as a company have basically taken a stance of like, look, everyone's going to promise you the world.
[00:21:11] We're just going to let you try a bunch of stuff, but we're going to do it in a super not hands-on way, because, you got to understand, 50 people a day ask us to just try a bunch of stuff and love to waste your time.
[00:21:21] What we've basically done, people usually want to test a few things.
[00:21:25] Does the AI work?
[00:21:26] And on the AI side, does the AI work?
[00:21:28] Is the AI accurate, right?
[00:21:29] And the SMS, you know, we've been talking a lot about AI.
[00:21:31] SMS, Teams delivery, such a new component, such like a brand new thing.
[00:21:35] Scary for a lot of companies still, right?
[00:21:38] I'll literally – the most common probably like one-liner I have in these conversations is look for the AI tool.
[00:21:44] We can – you know, it's called Creator.
[00:21:45] You can go to arist.co slash creator.
[00:21:47] Go try it on the website.
[00:21:48] If it doesn't blow your mind, like we don't have to keep talking.
[00:21:52] The courses themselves –
[00:21:53] I love that pitch right there.
[00:21:54] Dude, the courses themselves, if you are worried that you want to try it with your company content,
[00:22:01] some stuff, we'll put it through the AI right now.
[00:22:03] I'll send you the course.
[00:22:04] Share that with as many people as you want.
[00:22:05] If you want a course off the website, you know, text sales, text improve,
[00:22:09] whatever that course name is to the short code and take the course.
[00:22:12] Share it with as many people as you want.
[00:22:14] Let's talk to each other in a week.
[00:22:15] Because I think for us, like if you go share it with 20 people and they all say,
[00:22:19] hey, this is terrible.
[00:22:20] We don't like this.
[00:22:21] Which, knock on wood, has not happened.
[00:22:22] But in their mind, it could, right?
[00:22:25] We're going to waste each other's time anyways.
[00:22:26] And so I've never liked that whole dance of, oh, let's create a 50-page plan and then try something.
[00:22:34] Like try it.
[00:22:35] Get bought into it.
[00:22:36] What I won't engage with someone on is we want to run like a six-month study to measure this.
[00:22:42] Got like a 5% better engagement than this because we've done that with university partners.
[00:22:46] We've got all the data.
[00:22:47] Like let's not spin our wheels.
[00:22:48] First off, Ryan, love the pitch.
[00:22:50] Right.
[00:22:50] I love very concise.
[00:22:52] You get it and you know how to tell their story, which is great.
[00:22:55] Great.
[00:22:56] Who are we selling to?
[00:22:58] Who's the buyer?
[00:23:00] That's a great question.
[00:23:01] Is it every company or are we talking certain size of company, complexities?
[00:23:05] You're talking about positions too, aren't you, Ryan?
[00:23:06] Positions.
[00:23:07] Yeah.
[00:23:07] Like who are we talking about?
[00:23:08] Yeah, absolutely.
[00:23:09] So our sort of center of the target buyer is a head of learning and development, chief learning officer,
[00:23:16] head of commercial learning and development at a Fortune 500 company.
[00:23:20] Arist is a tool.
[00:23:21] Again, it works for every type of company.
[00:23:23] But the reality is that the more industry complexity you have in what you're trying to train, e.g. life sciences, manufacturing, and just the more people you have, for a tool like this, it works better with scale.
[00:23:35] Our customers are everyone from, you know, two, three hundred person tech companies who don't want to stand up a learning team in an LMS yet.
[00:23:43] And so they just use all this all the way up to, you know, 250,000 person, million person like retail, logistics, pharmaceutical company, et cetera.
[00:23:52] But again, our sweet spot is really: do you have a lot of people, and is your job running training? Bonus points if it is something that is highly complex, because you're just going to feel the benefit of the simplicity of this medium more.
[00:24:04] Do CLOs still exist?
[00:24:08] Oh my goodness.
[00:24:10] Bad touching, harassment, sex, violence, fraud, threats, all things that could have been avoided if you had Fama.
[00:24:23] Stop hiring dangerous people.
[00:24:26] Fama.io
[00:24:29] This complex finance talk is very exciting. Do I even have a depot of my own?
[00:24:35] But you already have a depot.
[00:24:37] No.
[00:24:38] Yes, you have the Vodafone GigaDepot.
[00:24:40] Yes, and I decide for myself how big my depot is.
[00:24:44] Now with the Vodafone GigaDepot, take your unused data volume with you into the next month. Only in the 5G network from Vodafone.
[00:24:51] Vodafone.
[00:24:52] Together we can.
[00:24:55] They do, they do exist. I think they're in the back room.
[00:25:00] First of all, I used to go to CLO conferences — okay, I'm that person — ATD, been to all of those, spoken at a couple of them. But it just seems like there aren't as many of them, or at least there's not as much discussion about them, as there was a decade ago. They've been incorporated into other roles. I'm legitimately asking: are they still the people they were a decade ago, when they ran training and development, learning and development, all the bespoke high-performer, high-performance types of programs? They ran those things, right? They're the ones that bought the LMSs.
[00:25:41] Yeah. I think, um —
[00:25:42] But my take on this more broadly is that what I've really seen is a massive bifurcation of the learning and development function. In some companies it's become incredibly relevant; in other companies it's kind of fallen by the wayside. And I think the reason that's happening — you look at a lot of these organizations that are putting out products constantly, and they'll actually nest learning and development departments under commercial teams, under a marketing or sales leader. They're on a tight schedule, they've got to train thousands of salespeople, they've got to get something out instantly. They have huge budgets, they're highly relevant, and if you talk to those teams, they've clearly recruited the smartest people you've ever met to run those things.
[00:26:17] So you've got that on one side. On the other side, what we'll call the crisis of the CLO is really just, company by company, a crisis of whose responsibility is XYZ — because if nothing else, learning gets left with, okay, compliance and general company onboarding. A really interesting parallel I see: the reason Salesforce got its big heyday over other tools was that everyone sold all these sales tools to the chief information officer, and then Salesforce started saying, hey, sales team, you don't want to wait nine months to go through the CIO — what if we just sold sales software right to you? And I think it's interesting with learning, because again, I might not want to wait nine months to go through the learning team. If they take a long time, if they're not fast and agile, I'm going to just start building my own stuff in-house, take a team under my team, and they're going to become irrelevant. So again, learning has this heyday moment: become fast, or become highly decentralized — and then fast — under those teams.
[00:27:16] Got it. Do you find a lot of companies going in department by department, picking up training like this, or do they just funnel it right through the learning teams?
[00:27:32] I've seen it as highly dependent on industry. Again, manufacturing, life sciences, and technology I've seen be a bit more decentralized on average. Other industries — industries with large frontline workforces — it's obviously harder to do that; with fewer product SKUs it's harder to do that, so they centralize it more. But I've seen really highly efficient centralized learning teams as well. I think a good litmus test — and obviously there's a lot of different complexity in the things they're training on — is: give me three or four different types of trainings you create. What is the average turnaround time for those trainings? And for whatever you're training on, has the training you're building become irrelevant in the time it took you to train on it? If it's six months for an AI course, that course isn't relevant anymore.
[00:28:17] What do the admin-side user statistics look like — what does their dashboard look like? Because I'm thinking about what Hootsuite did years ago: they'd optimize posts for when people would consume them. They had to learn some things, and some of it was a little bit of voodoo. But I could see this with consumption — I could see them learning the audience and learning how people consume: when they consume, how much they consume, et cetera, and then at some point recommending when something should be rendered or sent to somebody.
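The send-time idea William sketches — learn when each person consumes, then recommend when to deliver — can be illustrated with a toy heuristic. This is a hypothetical sketch, not Arist's actual implementation; the function name and the modal-hour rule are my assumptions:

```python
from collections import Counter
from datetime import datetime

def best_send_hour(interaction_times: list[datetime], default_hour: int = 9) -> int:
    """Pick the hour of day at which this learner has historically
    engaged most often; fall back to a default when there is no history."""
    if not interaction_times:
        return default_hour
    hours = Counter(t.hour for t in interaction_times)
    # most_common(1) returns [(hour, count)] for the modal hour
    return hours.most_common(1)[0][0]

# Example: a learner who usually opens lessons around lunchtime
history = [
    datetime(2024, 5, 1, 12, 3),
    datetime(2024, 5, 2, 12, 41),
    datetime(2024, 5, 3, 8, 15),
    datetime(2024, 5, 6, 12, 22),
]
print(best_send_hour(history))  # → 12
```

A production system would presumably weight recency and completion quality, not just raw opens, but the shape of the idea is the same.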
[00:28:57] Yeah — so tell us a little bit about the back end for the admins.
[00:29:01] Yeah, absolutely — and I think you hit the nail on the head, so I'll wrap these two together. The big piece of this is: how does learning get the data to be a strategic partner to the business? That tipping point happens when you cross the threshold of "we know something that other people in the business do not know" — which, if you've just got completion rates on some video you put out, you don't have. So for us, we look at it in terms of three tiers. Tier one is what we actually call our LMS metrics: completion rate, answer accuracy, response progress — okay, I need to check the box somewhere. Tier two is what we call spaced metrics: confidence lift over time, learner sentiment over time. I'm surveying people, I'm measuring these criteria — I had Will take this on day one, day five, day ten, et cetera. Better data than tier one — still just quantitative, but a bit more accurate. And then tier three is really open-ended, scenario-based responses, where you can actually see every single learner response.
[00:29:58] So again, think about the every-six-minutes-versus-every-six-weeks parallel I gave, and translate that into number of interactions. We collect about a hundred times as much data — that's the simplified version; it's actually something like 140 times as much — on the average person as an LMS does, because of the frequency of interaction. I'll put this all in layman's terms: I launch a sales training in an LMS, someone goes and looks at it, and I know they completed it. I launch it in our system, it happens over two weeks in five to ten minutes per day, and I know how their confidence is increasing over time. I saw how they handled the objection to this on day one, day five, day ten. Okay, on day ten Will did much better on this than on day three; everyone got this quiz right, but when we talked through the new features coming out, their responses were actually not up to par with what we thought would be good.
[00:30:50] And the next big tranche of our AI after creation — I'll get into the next one soon, but the third big tranche is an analytics and recommendations engine, to basically say: look, Will, we just put this out to 20,000 salespeople; here are the top four knowledge gaps that seem to exist within your organization — do you want the AI to go ahead and create a course on that for you? And when you have that, the CLO can go back to the CEO and say, hey, did you know that in our North America region, 82 percent of people don't like their manager and we're going to see big turnover, and 63 percent of salespeople don't understand these new capabilities we put out — that's why we're not seeing this product move in North America? Then you become a strategic part of the business, and then you get budget.
[00:31:31] I love that. Out of all the great things you've said, that might be the best piece for me right there. I don't know — the pitch was pretty good. "Go test it" was pretty good. "If it doesn't knock your socks off, don't call us, don't waste our time." You used different words, but I'm going to go back and replay it — I want to use that one.
[00:31:49] Oh yeah, use it. If it doesn't dazzle you, don't call us, don't bother us, don't waste our time.
[00:31:55] What you're really saying is, let's not waste each other's time, right?
[00:32:00] Yeah, yeah. All right, so I've got two questions.
[00:32:05] We can answer one later — that's totally fine. We had another call earlier today, another recording, where we were talking about — it actually was an AI recruiter, and so I went through an audio interview. This is hilarious, right?
[00:32:21] It's amazing. It was actually really cool. And I was a total asshole as a candidate — one-word answers. It was hilarious — like, "No." "Why are you calling me?" And it would actually respond instantaneously.
[00:32:40] Does this get to a place where the courses being created have that kind of interaction — where, as a sales professional or a customer service rep, it's not just one-way? Bi-directional, where I'm going through the course, but instead of just listening or reading, I have to respond — and it's hearing me: my inflection, my voice, how excited I am versus "all right, I'm really not paying attention." The reason I bring that up is that the feedback the AI recruiter had on me in this example said I seemed disinterested: I have the background based on my profile, but I was a little disinterested, very short answers — he may not be a good fit for your team. It gave a report to the recruiter.
[00:33:23] Yeah — it said, "This is an AI recruiter, we're going to do some screening questions," blah blah, so it did all that preface stuff, and then it came back and gave a report — he showed us the report it sent to the recruiter — and it was spot on. It was a 2.2 out of 5, and it basically said he was despondent, he wasn't really interested, "I wouldn't recommend that we move forward." It was doing all the stuff you would do in screening.
[00:33:50] Yeah. But so, does the training get to that? Do you think training gets to a point —
[00:33:56] — where it becomes more bi-directional? Yeah, I think it absolutely will. The advent of all these AI tools — what they're really doing is allowing us to have the best of both worlds: mass scale and mass customizability and personalization, which is really what you're getting at with the AI recruiter. For us as a product, broadly, we actually take the opposite stance. Strategically, we say: let's do pretty much everything right up until you sit down with someone one-on-one and need to have an interaction. There are 30 to 50 tools per week coming out for "let's sit down, let's walk you through a scenario," and I think one of those is going to win out — and when it does, we're going to plug right into it.
[00:34:36] With the voice agents, though, you bring up a really good point. I alluded earlier to some cool stuff we have coming out soon — I'll just tell you all. A lot of what we're releasing soon are needs-analysis tools. For us — again, doing everything up until that super one-to-one personalization — right now, if I said to you, hey, look, we can instantly create this learning program, we can instantly deliver it to tens of thousands, hundreds of thousands of people, because their attention is already queued in Microsoft Teams or SMS, and we can run all the analytics — we can do pretty much everything end to end. The piece of that puzzle we're missing, other than that very last step of personalizing the learning journey, is the first step: needs analysis.
[00:35:21] So the next big feature for a lot of our — it's the "I don't know what I don't know" stuff.
So true.
Correct. So when I say needs analysis, I mean: look, Ryan, that's great — I can create content in an instant —
[00:35:31] Before we move on, I need to let you know about my friend Mark Feffer and his show PeopleTech. If you're looking for the latest on product development, marketing, funding, and big deals happening in talent acquisition, HR, and HCM, that's the show you need to listen to. Go to the WRKdefined network and search up PeopleTech and Mark Feffer. You can find them anywhere.
[00:35:52] — but I don't know what to create content on, right? And so for us, a big focus with the AI over the next few months is what we can do with voice agents: basically to say, hey, look, I run learning for a massive retail company, and I need to learn what I need to train my store associates on this quarter. This AI knows my earnings report, it knows things going on in my systems, it understands my HRIS data — but hey, AI, I don't know what I don't know. Go talk to five of my retail frontline employees, three of their managers, four experts in the space outside our company, and six of our leaders who have a strong opinion on this type of thing. Go interview them tomorrow through a one-to-one interaction — it emails all those people, they click on something, go through the interaction — and then come back to me and tell me: what are the top five knowledge gaps in the company? What should I learn? I go look at that and say, okay, that looks great — click a button, and look, I just created a course on these things; click another button, and I just delivered it right to their SMS, their text messages. So at the end of the day, we can use AI to do the full end-to-end of all of these things.
[00:37:04] Are your customers asking you about retention of knowledge?
[00:37:09] Because, you know, it's one thing to learn — it has a half-life, right? I watched it, I consumed it in whatever way I consumed it, and I feel like, okay, yeah, I got the bit. A week later — I've slept since then — I might not still have the bit. Are they concerned, either on the user side or the admin side, about retention of knowledge?
[00:37:34] Yeah, absolutely. Well, my most honest answer is: some people are concerned, some people are not. I think they should all be concerned, candidly.
I appreciate that — it depends on your philosophy there.
[00:37:45] But even if we look at the Kirkpatrick model — the most recognized way to assess — at the end of the day, everyone knows the simplified version: awareness, retention, action, and then behavior change comes after. Right now the problem is that awareness is almost nothing, because you've got this massive marketing effort just to get people to show up. For us, I think we solved a lot of the awareness challenge by bringing learning right to the places and tools people already use. With retention, we've solved a lot of that by being intentional about how things space out. We did a study with the World Health Organization and Brown University a few years ago, where we compared traditional learning methods to an Arist course. With traditional methods, you forget over 75 percent of what you learned within 24 hours, and over 90 percent within 30 days. In our study, people remembered 88 percent of the information three months later. So that's a massive shift — from "I forgot almost everything in a day" to "I remembered almost everything three months later." And again, if we spread it out over ten days, you're going to remember that stuff.
[00:38:48] The next piece of it is actions taken. It's not just "do I remember things"; it's, okay, is this tool constantly nudging and re-nudging me to go take action on these things? And we saw a 33 percent lift, on average, in reported and measured actions taken. I think that's where you get to the sweet spot of it: I'm much more aware because everyone's taking it; that opens up retention, so now we're remembering a lot more; now let's nudge on what you've remembered, so now we're taking a lot more actions — and now behavior change has actually occurred.
[00:39:16] I love that. I'm loving this more and more — I want to use it. I want to create my own course. I want to create something.
[00:39:28] You've converted somebody — he's not a learner, and now he is. Yeah, I know.
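The massed-versus-spaced contrast Ryan cites — forgetting most of a one-shot course within a day, versus high recall months after a drip course — is commonly modeled with an exponential forgetting curve whose memory stability grows with each review. A toy sketch with made-up constants; it illustrates the shape of the spacing effect, not the actual WHO/Brown study numbers:

```python
import math

def retention(days_since_review: float, stability: float) -> float:
    """Ebbinghaus-style exponential forgetting curve: fraction recalled."""
    return math.exp(-days_since_review / stability)

def spaced_stability(initial: float, reviews: int, boost: float) -> float:
    """Toy spacing model: each successive review multiplies memory
    stability by a fixed boost factor."""
    return initial * boost ** reviews

# Illustrative constants only -- not fitted to any study.
# One sitting on day 0, recall checked on day 90:
one_shot = retention(days_since_review=90, stability=8.0)
# Ten daily touches (days 0-9), so the last review is 80 days before day 90:
spaced = retention(days_since_review=80,
                   stability=spaced_stability(8.0, reviews=9, boost=1.8))

print(f"one sitting, 90 days later: {one_shot:.0%}")
print(f"spread over 10 days, 90 days later: {spaced:.0%}")
```

Even with crude constants, the model reproduces the direction of the effect: near-total forgetting for the single exposure, near-total recall for the spaced one.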
[00:39:34] Totally. I love this. Where — I think I know the answer, because you've answered this in, like, ten different ways, but I'm going to ask you the question and then maybe give an opinion.
[00:39:45] This is where I have to watch what I say.
A lot of preface. Yeah, watch what you say.
[00:40:00] All right, so let it fly. One: where are clients pushing back? You've answered that in a number of ways, but in the back of your mind — and I know you can't just dog people, clients, prospects, that are pushing back — what goes through your mind when you hear these excuses, when they're resisting bringing learning like this to their company, holding everybody back?
[00:40:26] It's a fantastic question. A few things —
He's thought about this.
See: adapt or die.
Oh yeah, I had this conversation today, you know what I mean? People buy for the same reasons they vote: it's based on emotion and sentiment, not on data and logic, right?
[00:40:45] A hundred percent. So here are the most common reasons we get. "I don't think people are going to like this" — and that's after 100 percent of their people took it and 93 percent said it was their favorite. "I don't think people are actually going to be able to learn something measurable this way" — and again, if you think about it, take anyone in a role: we all went through the traditional learning method in school, and how we learned most of what we know — well, that's not true — how we *perceive* we learned most of what we know is sitting in a classroom, when in reality it's mostly from peers and others.
It's mostly from TikTok and Instagram, right?
[00:41:26] Right. And the most common response I get is: "This looks really good for reinforcement and for follow-on learning after real learning." For me — "after real learning"...
Oh, wow. They just called your baby ugly.
Very polite people. Very nice. They're like, "Yeah, you know, after a real learning experience, then we'd have this happen."
[00:41:49] And for us, we've actually started to say: look, it works well for reinforcement too, but we won't even approve a first use case — we won't work with someone — if the first use case is reinforcement, because we want to take something apples-to-apples, as a standalone, to prove to you how much we can cover.
[00:42:10] So we did a study with the University of Washington years ago, where we took a bunch of OB-GYN interns — super complex medical subject matter — and had them learn material for a comprehensive medical exam. Five hundred people learned through Arist, and five hundred people learned through the online videos they'd been using — and the online videos, par for par, were hours and hours more content. Most people, when they say "real learning," just mean more hours and more time and more content, right?
Yeah — something that feels more substantial.
[00:42:44] Right. And the reality is — the study came out, and I can share it with you — there was a 19 percent exam-score lift in the people who took it through the Arist course. And the reality is that they probably spent far less time learning. That's what I mean: if I said to you, look, I spent ten hours studying, and Ryan over here only spent three hours studying, you'd be like, okay — well, bad example, we're both named Ryan — but me, Ryan, you're far more prepared. The reality is that the other Ryan may have spread those three hours out over fifteen days, was very active, and really challenged himself, and I could have just been sitting there holding the textbook, texting my friends with the textbook open — hey, I did say ten hours. That's what "learning" is today, and that's the comparison.
[00:43:30] I love it. Okay, let's do one or two buy-side questions. If you could script questions from prospects to ask you, what would you like them to be asking?
[00:43:38] I think the best question I've ever gotten —
[00:43:44] — from a prospect was: give me the super simple version of how this is going to change everything. Everyone always asks for the small peanuts — "oh, what's something this will do for the company?" — but this gentleman was really innovative. He was like, give me the big pitch: if this all works exactly as you intend it to, what does this actually do? And this was for a company that was rolling out tons of products and using us to train a massive sales force on those products. We said, look — for this company, every extra month it took them to roll out a product cost them, I think it was around 45 million dollars. A crazy amount of money; massive global company. And we said, look, if we can cut your training time in half — just in half, the amount of time it takes you to actually roll out a new product, the training side of that equation, not manufacturing — that would save you tens of millions of dollars.
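The arithmetic behind that answer is worth making explicit. The $45M-per-month delay cost is the figure from the transcript; the two-month training window is a made-up assumption for illustration:

```python
# Back-of-the-envelope version of the pitch: if every extra month of
# product rollout costs delay_cost_per_month, halving the training
# portion of the rollout timeline saves half that many months of delay.
delay_cost_per_month = 45_000_000      # dollars per extra month (from the transcript)
training_months = 2                    # assumed training share of the rollout
months_saved = training_months / 2     # "cut your training time in half"
savings = months_saved * delay_cost_per_month
print(f"${savings / 1e6:.0f}M saved per product launch")  # → $45M saved per product launch
```

The point of the exercise isn't the exact number — it's that speed-to-competence converts directly into rollout dollars.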
[00:44:37] And I think that's the big pitch for all of this: people undersell the importance of speed and agility in learning — not just for the relevance of the learning department, but for the relevancy of the organization. Picture an organization where you can snap your fingers and everybody knows a new skill. That organization would be the most valuable company in the world. But people don't think of learning agility, of speed to competence across an organization, as something valuable enough to measure. They think of competence as this general thing of "oh, we need to get everyone up to speed on AI," not "we need a company with an adaptive learning function, where we have the systems in place to train people instantly." And I think that's really the big pitch of what we're going for here.
[00:45:20] Drops mic, walks off stage.
Well, he did that 20 minutes ago.
[00:45:25] My god, it's a great podcast. Ryan, thank you so much for coming on the show and sharing your wisdom. It's been fantastic. I can speak for Ryan — we love what you're doing.
[00:45:38] Yes, absolutely — love what you're doing, so keep doing it.


