Tim Sackett is an HR Technology Analyst, a Top 10 Global HR Influencer, and the President of HRU Technical Resources. In this episode, Tim discusses the state of recruiting and applicant tracking systems, and how he thinks AI could help organizations not just scale their recruiting efforts but also eliminate biases. He also addresses some security concerns organizations have with integrating AI into their recruitment systems.
This conversation took place at the HR Tech 2024 conference in Las Vegas.
[0:00] Introduction
- Welcome, Tim!
- Today’s Topic: The Evolution of Recruitment and Applicant Tracking Systems
[4:50] How is recruiting technology evolving?
- How some HR tech startups are looking to solve recruitment issues
- Why many HR tech startups seem to offer similar products
[12:01] Have there been noteworthy developments in recruiting technology?
- Why AI is not as biased as people think
- How voice AIs could humanize the recruitment process
[20:03] How can organizations avoid AI safety and security pitfalls?
- The limits of public LLMs vs. the possibilities of private LLMs
- Best practices for ensuring security and data safety when adopting AI
[32:26] Closing
- Thanks for listening!
Quick Quote
“We had this concept a year ago—maybe two years ago—where [we thought] ‘the technology is more biased than humans,’ and it was wrong . . . I truly believe the closest we’ll come to bias-free hiring is the use of AI in sourcing and screening.”
Resources:
Tim's website
Contact:
Tim's LinkedIn
David's LinkedIn
Dwight's LinkedIn
Podcast Manager: Karissa Harris
Email us!
Production by Affogato Media
To schedule a meeting with us: https://salary.com/hrdlconsulting
For more HR Data Labs®, enjoy the HR Data Labs Brown Bag Lunch Hours every Friday from 2:00 to 2:30 PM EST. Check it out here: https://hrdatalabs.com/brown-bag-lunch/
Powered by the WRKdefined Podcast Network.
[00:00:00] The world of business is more complex than ever. The world of human resources and compensation is also getting more complex. Welcome to the HR Data Labs podcast, your direct source for the latest trends from experts inside and outside the world of human resources. Listen as we explore the impact that compensation strategy, data, and people analytics can have on your organization.
[00:00:25] This podcast is sponsored by salary.com, your source for data, technology, and consulting for compensation and beyond. Now, here are your hosts, David Turetsky and Dwight Brown.
[00:00:38] Hello and welcome to the HR Data Labs podcast. I am your host, David Turetsky. We're recording live from the 2024 HR Technology Conference in beautiful Mandalay Bay Exposition Center in Las Vegas, Nevada. And I'm here with my friend, Tim Sackett.
[00:00:53] Tim, how are you?
[00:00:55] I'm great, David. How are you?
[00:00:56] I'm tired.
[00:00:57] I know we both got like the raspy voice going from two days of nonstop conversations.
[00:01:02] Oh my God, I can't shut the hell up. Eventually I will, especially when I'm getting food later.
[00:01:08] Yeah.
[00:01:09] But we have the pleasure of talking to you and there's so much really cool stuff that's happening all around us in the world of recruiting and applicant tracking systems.
[00:01:19] And there's a lot of buzz.
[00:01:20] Yeah.
[00:01:21] You've heard the buzz.
[00:01:22] I've heard the buzz.
[00:01:24] AI.
[00:01:24] Well, before we get to the world of AI, what's one fun thing that's happened to my friend, Tim, in the last year?
[00:01:32] Oh, wow. I actually, so we are, we are, before we got on air, I talked about, we golf a lot. I had my first hole in one.
[00:01:38] Dude, no way. That's awesome.
[00:01:41] That was exciting. I actually had a good friend of mine and my, one of my cousins golfing with me. So I had like a big crowd of people that saw it. Like we got to watch.
[00:01:50] That's awesome.
[00:01:50] It was a good shot too. It wasn't one of those like off a tree, it bounces out of the green.
[00:01:56] It wasn't squirrelly.
[00:01:57] It was like, yeah, it was like, I hit this perfect shot and I'm like, I thought it was going to be a little short. It bounced right in front of the hole about six feet and rolled in and we watched the whole thing. It was pretty cool.
[00:02:06] What was the yardage on that?
[00:02:08] 140 yards into the wind with an eight iron.
[00:02:11] Wow. Oh my God. No way.
[00:02:13] Yeah.
[00:02:14] That seems perfect.
[00:02:15] Yeah. Like at least, I mean, I mean, you can't get better than a hole in one.
[00:02:19] I think my group in this, I've actually like, so my, my son has two holes in one. I actually was golfing with him on his first. And like that time I was more excited than he was.
[00:02:28] Sure.
[00:02:29] And my playing group was more excited than I was. Like, it's just a weird thing. Like, you're just like, okay, like I just, you know what?
[00:02:36] What do you do? Like, yeah.
[00:02:38] Right. You know, we just pick up the ball from the hole and everybody else is done.
[00:02:42] I know. They're all excited because I think you're like, you know, tradition is you have to buy drinks in the clubhouse afterwards.
[00:02:47] So they're like, yes, free drinks.
[00:02:49] Oh my God.
[00:02:50] Well, I mean, but you, this is a badge of honor for you for a while now. I mean, get to say I hit a real hole in one.
[00:02:57] Yeah.
[00:02:57] It's not a mini golf hole.
[00:02:59] And then six days later I hit one within 12 inches and like, same thing. I was with one of the guys was with me again.
[00:03:04] He's like, I can't believe you just did it again. And we had four guys were standing on the, like waved us up on the green and they were all cheering.
[00:03:10] And they thought it was going to go in and just stop short.
[00:03:13] So is it because you've been playing so much that this is getting to be where you're getting so good or is it just luck?
[00:03:19] I don't think so. Like you see like Tiger Woods or somebody has like 18 or whatever. That dude's played millions of holes. Some of it's just like the amount, right? Of golf.
[00:03:28] Um, I still think, and there's also professional PGA golfers that have never had one. Like it's such a lucky thing.
[00:03:32] Right.
[00:03:33] Like you look at the stats of a hole in one and it really is like a, like a bolt of lightning kind of striking. Right.
[00:03:39] Well, everything has to be perfect. Like you had to have the right stroke. You had to have the right club. You had the wind conditions had to be perfect.
[00:03:46] The grass on the green needs to be perfect.
[00:03:48] Yeah.
[00:03:49] Yeah. Yeah.
[00:03:50] So it's, it is weird. Cause like, you'd like, I would rather shoot a 79, like shoot under 80, than have a hole in one and shoot 89.
[00:03:59] Right. Yeah. Yeah. Well, but, but it's, nobody cares about your round. They care about that story though.
[00:04:07] They do. Yeah.
[00:04:07] And that's why I say it's a badge of honor because you're going to be able to carry that with you forever.
[00:04:11] Yeah.
[00:04:12] Well, congratulations.
[00:04:14] Yeah. Thank you.
[00:04:14] In the hockey world, you know, especially for a goaltender, we kind of think of, you know,
[00:04:18] what's the big deal, like maybe a shutout or something, but, but that's not just you. That's
[00:04:24] the entire team.
[00:04:25] Yeah.
[00:04:26] Yeah. And I play, because I play hockey on Monday nights really late with a bunch of guys that I know
really well. My badge of honor is, you know, they scored less than 10 goals on me.
[00:04:37] Yes. Thank you for that cheer. That was a cheer from the crowd, but, uh, but that's really great.
[00:04:50] So Tim, let's now get to the topic at hand. So let's talk about recruiting and let's talk about
[00:04:58] recruiting technology. There's been a lot of evolution that's happened and really evolution
[00:05:03] has been happening for quite some time. Yeah.
[00:05:05] But to you, what have you seen lately? What is like really remarkable to you about the evolution
[00:05:11] of recruiting technology?
[00:05:13] You know, it's, it's an interesting thing. Cause you see it here. I mean, you see it like a lot
[00:05:17] of the trend stuff, like you're here when we were here last year at the pitch fest, which is a
[00:05:21] startup competition, right?
[00:05:22] Right.
[00:05:22] We were like, at that point, like hiring maybe started to slow down a little bit. You didn't really
[00:05:28] hear it. It was still like hot as can be. Right.
[00:05:30] Right.
[00:05:31] Right. And there was no recruiting tech in the startups. Really?
[00:05:34] We're like, this is like, it was odd. Like you just, I mean, there was just none really. And we
[00:05:40] just had gotten like, you know, big chat GPT stuff. And so we thought, okay, yeah, it's going to
[00:05:45] happen. But like, it just wasn't there in the startup world. And then this year we come and like every
[00:05:49] other one is a tech one. So yeah, it's, there's this trailing thing where you go, great. You built it,
[00:05:54] but now like the market's changed again.
[00:05:57] Yeah.
[00:05:57] So then next year we'll see like, okay, what are people like wishing they had this year
[00:06:01] that's not there? The other thing I've seen is because of like the generative AI stuff.
[00:06:07] Yeah.
[00:06:08] Everybody came out with, I could literally like, it's like the same five features, right? I can
[00:06:14] help you write a better job description. I can help you do better communication to, you
[00:06:17] know, like personalized, blah, blah, blah. And it didn't matter if you were a CRM and ATS
[00:06:22] or sourcing tech or whatever, interview tech, they all had the same features. And then the buyer goes,
[00:06:27] so you guys all are the same? You're like, well, no, but the marketing is so, so much so that it feels
[00:06:33] like everybody's turned into the same thing when they're really different, but they're all marketing
[00:06:37] the same features.
[00:06:39] Right.
[00:06:39] Even though that's only like a tiny part of their feature set, they want to, everyone wants to get
[00:06:42] the new AI stuff out there.
[00:06:44] So are they all competing against each other or are they all cooperating together?
[00:06:47] That's crazy. They're not really, they, I mean, they shouldn't be competing against each
[00:06:51] other, but I think the buyer actually believes they're doing the same thing. So now the buyer
[00:06:55] may be putting them in a same bucket where they're actually in different verticals. So
[00:06:59] there's, I mean, again, it's, we have this issue. It just becomes more confusing.
[00:07:03] Right.
[00:07:03] And I think, I don't know if this is really the technology that's confusing. It's more of
[00:07:07] the marketing of the technology. It becomes very confusing. You know, I always ask like,
[00:07:11] especially like we get this on the pitch fest where someone comes in, they have three
[00:07:14] minutes, tell you what they do. And after three minutes, I, my first question is, so what
[00:07:18] do you do? Like, I don't, you, you have 11 words to tell me what you do. Can you just
[00:07:23] tell me what you do?
[00:07:24] Right.
[00:07:25] And like, oh my God, like if you can't just like put that in one sentence, there's a problem.
[00:07:29] Right.
[00:07:29] You know, well, that should be what the pitch fest is about, right?
[00:07:34] We actually, we actually, cause you know, George LaRocque, a friend of mine that is the MC
[00:07:38] there and does like stuff with here in the investor summit.
[00:07:41] Right.
[00:07:41] We, we actually changed the rubric and actually asked them. We gave them two sentences. I
[00:07:45] actually only wanted to give them 11 words. George like was nicer. And he said like, give
[00:07:49] us two sentences of what you do. And again, it's two sentences of marketing speak to make them
[00:07:54] sound super sexy. And I'm going to go, okay, I still don't know what you do.
[00:07:58] Well, shouldn't that disqualify him right away? I mean, I'm not trying to be offensive to
[00:08:02] people, but like, if you can't be just freaking honest and just, yeah, you can have some
[00:08:07] marketing words in there, but stop the bull and get right to the point.
[00:08:10] It does. It does. Like, you do question if it's real tech or is it vapor, you know,
[00:08:14] is it really a service, not a technology? It hurts them. I think the people that come
[00:08:18] and say, here, this is exactly what we do. And they show it and they talk about the real
[00:08:24] life HR use case for it. Right.
[00:08:26] Right.
[00:08:26] And they can give multiple examples of that. Like immediately they're going to be raised
[00:08:30] up higher. They're going to get higher scores. Right.
[00:08:32] Isn't that like a, don't just say it, prove it kind of thing.
[00:08:36] Yeah.
[00:08:36] Yeah.
[00:08:37] Cause I mean, I'm in sales and you know, I get that all the time. Listen, I heard what
[00:08:42] you have to say. Show me the use cases. Show me the client stories. Show me the references
[00:08:47] that have done what you say.
[00:08:50] Yeah. And as a judge, you try to help them. Like I try to ask the question that says, look,
[00:08:54] look, I know what you do because I'm in this space and I look at a million things and I
[00:08:58] like, but like you have all these other people in the audience that are voting for you that they,
[00:09:02] I can tell you right now they have no idea what you do. I could go around with a mic and
[00:09:05] they'd be like, I don't know.
[00:09:07] Right. But it's a person who has the best presentation or the best style or the best gimmicky
[00:09:11] bullshit.
[00:09:11] And the marketing is way better on these startups. Like they're spending money. I mean, the product
[00:09:16] looks really good.
[00:09:17] Yeah.
[00:09:17] It looks like professional product. Right. So then you get, so sometimes you get like, oh,
[00:09:21] gosh, it looks like a, like a really good piece of stuff I want to use. But just because
[00:09:25] it looks great doesn't mean it necessarily does what it should be doing. But.
[00:09:28] But is that architecture or is that real product at that point?
[00:09:32] It could be both. Some have real products. Some are still like in the idea design phase,
[00:09:37] like this is what it's going to look like, but they, but they can't show me a real product.
[00:09:40] Like, oh, so when they're in pitch fest, it's not a real product yet.
[00:09:43] Could be. Some are pre-revenue, some are post-revenue, some have already raised 3 million.
[00:09:47] Like there's, there's some rules, but the rules are kind of, it allows for a big range to come
[00:09:52] in. Okay.
[00:09:52] So you do have people coming in going like, look, I'm looking for, you know, $500,000 to actually
[00:09:57] build out an MVP. And then you have some going, we're already doing a million in ARR.
[00:10:01] Right. You know?
[00:10:02] Yeah. Well, yeah. I mean, pre-revenue people to me, that's brilliant. But I mean, if it were
[00:10:09] me, I'd wait until I actually had the money.
[00:10:12] That's what I'm like. Yeah. If you can't, you know, kind of bootstrap together an MVP,
[00:10:16] like why even, why are you coming, you know?
[00:10:18] Right.
[00:10:18] Because you're competing against people who've already solved that problem. They probably
[00:10:22] already have clients and whatnot, and are telling client stories.
[00:10:26] Yeah.
[00:10:26] Yeah. So it's just so hard.
[00:10:28] You know, the great thing about that though, David, is like when you see these entrepreneurs
[00:10:31] coming and building product, like it's, there's a little bit of like, you have to disassociate
[00:10:36] yourself from reality a little bit to be an entrepreneur and think you're going to do like,
[00:10:39] you know, because again, it's the 99% of these things are going to fail.
[00:10:42] Right.
[00:10:43] But when you like listen to every single one of them, they all believe they're the one that's
[00:10:47] going to make it.
[00:10:47] Absolutely.
[00:10:48] There's something really motivating and inspiring about that. Right. That's the core of like what
[00:10:52] you do and come here because you don't know at some point, all these giant booths that
[00:10:57] are here were that glimmer of someone's eye that think I have an idea. Right. And they,
[00:11:02] you know, so you can't say that it doesn't work, you know?
[00:11:04] Right. Right. Right. Right. But, but then again, there might be people on the floor that are
[00:11:08] actually solving the same problem. They just not as good as marketers or they're not as
[00:11:13] good as pitch people as those people.
[00:11:15] Oh, for sure. Yeah.
[00:11:16] There was a Japanese company that came in and like you could tell, I mean, obviously the founder,
[00:11:21] like CEO, first language, Japanese, second language, English. He actually brought, you know,
[00:11:27] one of his employees that was a native English speaker and she led most of it. He like she,
[00:11:31] and then she would, you know, would help with some of the questioning. I think that's smart
[00:11:35] because I've had other people that haven't. Yeah. And again, there's no offense. Like,
[00:11:38] I mean, that's great that you come, but you're not, but it's hard in three minutes. Right.
[00:11:42] If you're, if you're struggling with English to get that pitch across. Absolutely. Absolutely.
[00:11:48] Like what you hear so far, make sure you never miss a show by clicking subscribe.
[00:11:53] This podcast is made possible by salary.com. Now back to the show.
[00:11:59] So the evolution of recruiting technology. Yeah. It's ongoing. It's going to continue.
[00:12:06] What's the biggest thing you saw here that kind of wowed you?
[00:12:12] I think if we go back last year and take a look at, I think everyone felt like you got to be careful
[00:12:17] with AI. The technology can be biased, you know, and it can, it can learn to do bad things,
[00:12:23] blah, blah, blah. I think we're starting to see the tipping point where people are going, Oh, wait,
[00:12:30] actually the technology can't be biased. It could be learned bias, but I can also control for that.
[00:12:36] Right. Unlike a human.
[00:12:37] Right. And so we had this concept maybe a year ago, two years ago where it was like the technology is
[00:12:42] more biased than humans and it was wrong. Right. But like, but if you went to a crowd of a hundred
[00:12:47] HR leaders, they would all believe that. Oh yeah, you're yeah, definitely. Right. Because the media
[00:12:51] portrayed it as like the evil, like technology. Right. And I kept saying like, no, like I'm in a
[00:12:57] minority, but I truly believe the closest we'll come to bias-free hiring is the use of AI in sourcing and
[00:13:03] screening. And so, you know, I, you know, now like, I think the coolest thing is, is like,
[00:13:07] like let's go traditional. And by the way, this is still most hiring. Recruiter puts out a job.
[00:13:12] They get a hundred people apply. Recruiter goes in, they take a look at 20 or 25. They reach out to 10 or
[00:13:18] 15, three or four call back. They screen, they send those on to the hiring manager. Hey,
[00:13:22] here's the best of who applied. Right.
[00:13:25] Right. 75 people didn't even get in the, didn't even get a sniff. Right. They didn't get part of
[00:13:29] the process. Right. And you turn AI into that and the AI starts to screen or AI at least starts to
[00:13:34] like show you who the best are. You, for the first time in history, a hundred percent of the people
[00:13:38] can be a part of the process. Right. Which is going to make us more diverse, more inclusive,
[00:13:42] better talent, higher quality. All of that stuff is because of AI, because the humans just didn't
[00:13:47] have the capacity. But again, we could have had the capacity to do it, but we couldn't put that much
[00:13:51] resources to it. And like in the company would lose, like we lose money. Yeah.
[00:13:54] At scale, it doesn't make any sense. Yeah. Yeah. The thing that bothers me, and I think we've
[00:13:59] talked about this in the past about that process though, is when they get rejected eight seconds
[00:14:04] after. Yeah. Yeah. Yeah. Again, I still think like, again, I, I can, you know, I've, I've seen four or
[00:14:12] five different live voice screening AIs and it's amazing. It's literally like, if you think about
[00:14:18] who's the best recruiter I ever talked with, the energy, the voice, the, they were interested in me
[00:14:24] and they were, they love their company. Passion. Yeah. That's those AI screeners now. Like it
[00:14:29] doesn't sound like some computer voice. It sounds like a real person. There's a little latency,
[00:14:33] right? But so I have a feeling that we could actually deliver really good feedback at scale
[00:14:40] with AI that actually is real feedback. And having that person even get that call is exciting. Yeah.
[00:14:48] When, if you just turn them, if you just turn them over to the disposition pile, they're going to be
[00:14:54] pissed. Yeah. And they're never going to come back to you again. They may be best, the best for the,
[00:14:58] the next job that comes available. Yeah. But now you've destroyed your pipeline. Here's the,
[00:15:03] like when we talk about, and like, I know Bersin did this in his keynote this morning,
[00:15:05] they talk about the agent stuff, right? The AI agents. Right. Like there's, there is a real use case here
[00:15:10] where you start to take all the data that somebody did like through like, Hey, we didn't do this
[00:15:15] assessment and we don't need this interview. And AI is going to be able to go and take a look at all
[00:15:19] of that and say, Hey, you didn't get chosen, but we appreciate everything you did. And here's like
[00:15:25] where we saw your strengths. And here's what we like, we'd love to see you do like, because it's
[00:15:28] a value add. It's a value add. And like, and it's going to be able to do it with the, with all the
[00:15:33] bumpers in place to make sure they never say anything that's going to get you in legal trouble.
[00:15:37] Wouldn't it be really cool to that point? If it would say back, listen, you applied for this job,
[00:15:43] but here's a better job for you. And by the way, we put you in for that role because your resume was
[00:15:49] really strong in these areas and we think you'd be a better fit there. Yeah. Oh, for sure. You're
[00:15:53] going to get a call. You know, the, the one, um, one of the companies that, you know, had me do it.
[00:15:58] They, they said, Hey, go ahead and tell them, tell us whatever job you want. We don't care. Just give
[00:16:01] us any job. Be as crazy as you want. And I said, I want to be the head coach of the Los Angeles
[00:16:05] Lakers. So the AI agent called me, right. And immediately started digging into my coaching
[00:16:10] experience, my career, what I do in these situations. Of course, I'm not a, I'm not a
[00:16:13] pro basketball coach. I never played pro basketball. It got rid of me so fast in the nicest way.
[00:16:20] And the technology company called back and they go, on a scale, like they have a scale of
[00:16:23] like one to five, one being the worst, five being the best in terms of like your rating for a candidate.
[00:16:27] They go, of all of the testing we've done, the worst we've ever gotten anybody was a two.
[00:16:31] You got a one. It knew right away. Wow. And by the way, it got rid of me very quickly. I'm like that to
[00:16:37] me. That's actually great recruiting, to know that, Hey, we have somebody here that's just,
[00:16:40] I mean, come on, like this is a joke. Like there's no, they didn't treat me that way.
[00:16:44] Right. Treated me well, but immediately after like three or four questions knew it was really
[00:16:50] gracious. You know, asked me if I had anything, we'll be in touch, blah, blah, blah. You know?
[00:16:54] And now they treated you like a freaking human and it was a bot that treated you like a human.
[00:16:59] Yes. Yes. But that to me, that's next generation stuff. It's next generation thinking because
[00:17:05] now I feel appreciated. Yeah. We had one of them in the pitch fest that actually did a live demo
[00:17:10] of it. And that's very dangerous here because you know how it is like with like wifi and everything
[00:17:14] else. And so we were like, Oh, he's going to try it. And he did it live and it was really good.
[00:17:19] And he said like, there was like, it's only a three minute pitch, like a minute into the interview.
[00:17:24] He's like, I have to go. Sorry. And it was like, Oh, no problem. We'll like, we'll connect back.
[00:17:28] Let us know. Really? Like immediately could like respond to that. And you're just like, Oh my gosh.
[00:17:34] Well, to me, that's the promise of artificial intelligence where it's giving HR a reason
[00:17:40] for its existence. It's helping provide those ways in which HR can't freaking scale to do all
[00:17:48] those things to call back the hundred people to make sure that they felt, but also now what
[00:17:54] it's also going to do is it's going to provide feedback. As you said, not only to the person
[00:17:59] who got the interview, but also to the hiring manager to say, listen, we interviewed a hundred
[00:18:03] people out of a hundred, you know, here are the best 10, the rest of the 90, we're going
[00:18:11] to disposition them this way. Don't worry about it. We got that.
[00:18:15] I like to think like, I, so often I like, and I'm sure you do, you talk to like with CEOs
[00:18:20] and I'll just go, Oh my gosh, if you just, if I could just find people that want to work
[00:18:23] for me, they want to work for our company. Like I would teach them everything that you know,
[00:18:27] like, I just need like passionate people that want to be a part of us. And like, again,
[00:18:31] I think when you take a look at all the people that apply that maybe didn't have the stuff,
[00:18:35] so they never got a sniff, they never got an interview, but all of a sudden you could
[00:18:39] turn that, you could go into the algorithm and say, you know what, if something comes
[00:18:42] across where they're super passionate about working for us and it comes across in the interview,
[00:18:47] we might not like level them up to the, to that job, but we want to make sure we star them
[00:18:51] and put them in a pile where somebody like in person is communicating with them because
[00:18:55] we're going to find something for those people. Right. And like, all of a sudden now you're
[00:18:59] building on culture with the people who really want to work for you and like turns into like
[00:19:03] great stuff. Like you just never know. But before those people would literally, they would be,
[00:19:07] they would be a black hole. They never heard from you and they got this crappy disposition email.
[00:19:11] Yeah. Yeah. Yeah. And all of a sudden you turn like maybe one of your biggest fans to your brand
[00:19:15] into a negative or one of your biggest customers, right? Like, Hey, I'm a top five.
[00:19:20] I buy more purses from you than anybody else. Well, now they become a detractor and social
[00:19:25] media is what it is. So they're going to tell their experience to the entire world
[00:19:29] and everybody's going to know about it. Yeah. And nobody wants to be treated like that.
[00:19:34] Hey, are you listening to this and thinking to yourself, man, I wish I could talk to David about
[00:19:39] this. Well, you're in luck. We have a special offer for listeners of the HR Data Labs podcast,
[00:19:44] a free half hour call with me about any of the topics we cover on the podcast or whatever is on
[00:19:51] your mind. Go to salary.com forward slash HRDL consulting to schedule your free 30 minute call
[00:19:59] today. If only our world was that perfect where all companies could do that. Now,
[00:20:05] you mentioned something before about it's not about the technology. Is it about the configuration
[00:20:11] of the technology? Is it about the data and the training of the technology? I mean,
[00:20:15] and the reason I'm going there is because of the workday example and you know how it was seen as
[00:20:22] being biased and there's the lawsuits about it. And I don't, I will tell you, I don't know the exact
[00:20:26] details, but to me that speaks of that's a technology that wasn't configured correctly. That's just my.
[00:20:34] Yeah. I mean, part of it is understanding what AI is and what AI isn't right. Like everyone wants to
[00:20:38] say the AI is biased, but like a, like AI at its core is a young toddler being trained by something
[00:20:45] and you go, wait a minute. Like we were all, we shouldn't have been shocked when OpenAI did
[00:20:50] or Gemini or any of these giant open, like, you know, language models like did really bad stuff.
[00:20:55] Yeah. Because you're like, it was being trained by the internet. Right. Exactly.
[00:20:59] You know how the bad stuff on an internet, like, would you ever let a toddler loose on the,
[00:21:02] like you just wouldn't. Right. Well, so we shouldn't have been like, so I think like people figured out
[00:21:07] and I think the, like the best companies in our space are building private language models. Right.
[00:21:11] And we think, well, well, so it still could just come and do something. No, no, no. This is a
[00:21:15] software program. It can't do whatever it wants. Right. If it is designed specifically to do one
[00:21:19] thing, it can do only that thing. It can't just make up its own rules and do whatever it wants.
[00:21:24] I think people don't understand that. And so we had some early examples of some bad things happening
[00:21:29] because people were testing probably like public language models where they shouldn't have been.
[00:21:33] Right. Right. Or they, they let, you know, the training, um,
[00:21:36] in terms of machine learning go too far before being checked. Yeah. And so it learned to be biased
[00:21:42] based on what your behaviors were, not itself. Right. Exactly. Yeah. And so again, I think no
[00:21:47] company wants to be on the front page of the New York Times having biased technology. No. And so I
[00:21:52] think we see now that they're being very cautious, very careful. A lot of it is like, Hey, it doesn't
[00:21:58] actually respond by itself. It will respond, but it still forces you as a person to actually send it.
[00:22:03] Right. Right. And so you're the double checker. Right. Um, there's a lot of that going on again.
[00:22:08] I think eventually two years down the road, that won't be the case. We'll be, we'll understand that.
[00:22:13] Oh, this is a private model. The best ones I've seen are like, they built a private model to do
[00:22:16] something specific. They actually build a separate model that says, Hey, whatever this model says,
[00:22:22] if there's anything that could be considered offensive, we're going to go right into a pre-approved
[00:22:27] response loop. Right. Right. Right. Where it says like, Hey, you know what? That's a great question.
[00:22:31] We're going to have a human reach out to you. Right. Right. Exactly. Yeah. And so it's incapable
[00:22:36] of actually being offensive. Yeah. And like, I think that's all we really care about. Right. For
[00:22:40] our brands. In the, in the old days, we used to call that QA. We used to call it unit tests. Yeah. I
[00:22:45] think. And we used to try and test for all of those negative things from happening. In fact,
[00:22:50] at WorkScape, I had built a legal language dictionary, I called it, but it had nothing to do with legal
[00:22:55] language. It was all the bad words that you couldn't say. I, I, like I demo so many things and like,
[00:23:00] you know, work with so many of these companies, and I've tested a couple where they said, like,
[00:23:03] try to break it. Right. Try to get it to do something. Try to get it to write code. Try
[00:23:07] to get it to do anything. Try to get it to be mad at you. Yeah. And you, it's just, it's impossible.
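The architecture being described, a private model whose output passes through a separate checker before anything reaches a candidate, can be sketched roughly like this. A tiny word blocklist stands in for the second checker model, and all names here are illustrative assumptions, not a real product's API.

```python
# Pre-approved fallback, as described: hand off to a human rather than
# risk an offensive reply going out under the brand.
PRE_APPROVED = "That's a great question. We're going to have a human reach out to you."

# A tiny blocklist stands in for the separate checker model.
BLOCKLIST = {"ugly", "stupid"}

def checker_flags(text: str) -> bool:
    """Return True if the draft contains anything that could be offensive."""
    return bool(set(text.lower().split()) & BLOCKLIST)

def respond(draft: str) -> str:
    """Route flagged drafts into the pre-approved response loop."""
    return PRE_APPROVED if checker_flags(draft) else draft
```

With this shape, the bot is structurally incapable of sending a flagged reply, which is the property being described: it either passes the check or falls back to the canned hand-off.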
[00:23:14] Like it just, you know, and again, those are the ones that are building private. Right. I have some
[00:23:18] that I'll go and I'll ask them like, well, what are you using for your backbone? And they'll go, OpenAI.
[00:23:22] And I'm like, Oh, and they're like, Oh, but it's enterprise. I'm like, still, it's a public
[00:23:26] language model. Exactly. You can't control that. You can bumper it. But again,
[00:23:29] I can also then create rules in my querying that, you know, that can change the bumpers,
[00:23:34] you know? So it's like, if I'm smart enough, you know, and again, that we're, the software
[00:23:39] companies are getting smarter and smarter, smarter on how to stop people trying to change it. Right.
[00:23:42] Right. So it becomes more difficult every single day. Yeah. Um, but like, again, it's still risky
[00:23:48] when you're using public models. Yeah. And that's the thing for a lot of the people in these
[00:23:55] enterprises today: there are no guardrails on them being able to use OpenAI,
[00:24:00] Gemini, Copilot, or whatever, and some of the others that I don't know, and be able to do that
[00:24:06] from a consumer perspective and type in whatever query they want, which may actually be giving out
[00:24:12] intellectual property that they don't realize. Yeah.
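The risk being described, employees pasting intellectual property into public models, is commonly mitigated by screening prompts before they leave the company. A minimal sketch; the patterns are illustrative assumptions, not a complete data-loss-prevention tool:

```python
import re

# Illustrative patterns for material that should never reach a public model.
SENSITIVE_PATTERNS = [
    re.compile(r"(?i)\bconfidential\b"),
    re.compile(r"(?i)\binternal use only\b"),
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),  # US SSN-shaped numbers
]

def safe_to_send(prompt: str) -> bool:
    """Return False if the prompt matches any sensitive pattern."""
    return not any(p.search(prompt) for p in SENSITIVE_PATTERNS)
```

In practice a check like this would sit in a proxy between employees and the public model, blocking or redacting flagged prompts rather than silently forwarding them.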
[00:24:14] So where do you think that the companies need to stand? Is this an IT locking this stuff down or?
[00:24:21] I don't think so. I mean, here's the ironic part: we have job postings out
[00:24:25] there saying, Hey, we want you to come in and use AI and develop AI and do all this stuff. But,
[00:24:31] by the way, we're not going to allow you to use AI in the recruiting process, or we're not
[00:24:37] going to, you know, do all these things. I think we have to know, like right now we let people just
[00:24:42] do whatever they want. And we're like, well, isn't there a lot of risk in that? I always
[00:24:48] look at it like, Hey, if I'm working with somebody that I think is a poor performer and I ask them
[00:24:52] to do some work for me and they send me something, I'm double, triple checking. I'm not going to let
[00:24:57] that go without me saying, wait a minute. I don't trust what Tim's putting out. I'm going to check
[00:25:02] it. I'm doing the same thing with my AI. Yeah. Like I'm not going to just go, Oh, it's AI did it. It
[00:25:08] must be perfect, you know, and just send it and all of a sudden realize, Oh my gosh, it said something
[00:25:11] not true or wrong or whatever. Right. Well, there's a warning there. There's a cautionary
[00:25:16] tale of people using these models and just saying, Oh, well, it's good enough. But there's also really
[00:25:23] great use cases for this stuff that actually worked really well. Like it's amazing. You know,
[00:25:29] you can go in there and be like, Hey, I want, like, an 1,800-calorie-a-day diet with 150 grams of protein
[00:25:36] and this many carbs, and I want to buy everything at, you know, Trader Joe's, and immediately it
[00:25:41] gives you this list. You're like, Holy crap, this is amazing. And I'm gluten-free or I'm
[00:25:44] shellfish intolerant or whatever. There's so many cool things you can do. So I think like, first you
[00:25:48] have to get out there, play with it, understand like what it can do, be comfortable with what,
[00:25:52] you know, what you're going to, and the hallucinations. Like, I think I told you, like
[00:25:56] when I first started doing it, I was like, you know, what's the most controversial quote Tim
[00:25:59] Sackett's ever said? You know? And so it came back with, Don't,
[00:26:06] don't hire ugly people. It will decrease the value of your company. And I'm
[00:26:11] like, wow, that is controversial. I never said that. So it took a quote
[00:26:16] that I actually said, which was, Only hire pretty people. It increases the value of your company,
[00:26:19] and it turned it around to a negative and made it more controversial. Wow. No way.
[00:26:25] It completely hallucinated it. Because I said, I'm like, that's a great quote. Where did you get
[00:26:28] it from? And it took me to the blog post where it claimed I wrote it, and I said,
[00:26:33] well, find me that quote in this. And it's like, you know, I can't find it. It's not there. So did
[00:26:37] you make this up? You know? Well, and in that case, was the AI lying?
[00:26:42] Well, they called it hallucination. It's a fancy way to say it lied. Yeah.
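What Tim did, asking the model where the quote came from and then checking whether it actually appears in the cited post, is essentially a grounding check, and it can be automated. A minimal sketch, where the normalization rules are assumptions:

```python
import re

def normalize(text: str) -> str:
    """Lowercase and strip punctuation so quotes match despite small edits."""
    collapsed = re.sub(r"\s+", " ", text.lower())
    return re.sub(r"[^a-z0-9 ]", "", collapsed).strip()

def quote_is_grounded(quote: str, source_text: str) -> bool:
    """True only if the claimed quote actually occurs in the cited source."""
    return normalize(quote) in normalize(source_text)
```

Run against the text of the blog post the model cited, a check like this would have flagged the "don't hire ugly people" quote as fabricated.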
[00:26:46] Really? So there's less and less hallucination that
[00:26:50] happens within the AI. Um, again, that's a public model, you know.
[00:26:54] but is that the AI going rogue or is that the AI trying to give you back what you,
[00:26:58] what it thought you wanted? Exactly. That's a lot of that's what it is. Right. Cause I
[00:27:02] said like a controversial, right. Um, you know, so it knew to go in like, you know,
[00:27:07] all of the millions of words I've written online, it was going to go out and like see all of those
[00:27:12] and then make something controversial, you know, dude. And you said it on the HR Data Labs podcast.
[00:27:20] Wow. But that's so fascinating because that does get to the dystopian concerns that people have about,
[00:27:27] you know, is the AI going to start doing things that we aren't telling it to?
[00:27:32] And you see this on social all the time, especially with the image generation stuff,
[00:27:35] right. Where you can basically tell it to do anything. And for the most part,
[00:27:38] it'll give it to you. And then all of a sudden it gets out there and someone's like,
[00:27:41] Oh my God, did you see this picture? And you're like, it's AI. It's not real.
[00:27:45] But it looks so real. I know. You know, well, I've used, um, Adobe Illustrator. Yeah.
[00:27:50] AI, which is also the initials for Adobe Illustrator, and it can do some fascinating things,
[00:27:57] but you have to know how to prompt it. You have to know how to write it the right way.
[00:28:02] Yeah. And it could take you eight hours. And then after the eight hours you go,
[00:28:06] Oh shit, I could have just drawn this myself.
[00:28:09] You know, I think though, like another trend we're seeing, like, is that,
[00:28:12] cause it used to be we thought everyone's going to become a prompt engineer, and in HR,
[00:28:15] you got to learn how to prompt. They're building this stuff now where you can just, in natural
[00:28:19] language, ask for what you want. You don't have to learn how to prompt;
[00:28:22] it'll write the prompts for you. Right. And you can just kind of keep going back and reiterating
[00:28:26] based on real language. And I think that's ultimately, I mean, eventually you'll just be
[00:28:29] talking to your computer, right? You know, Hey, I need this data. Like we had, I actually talked
[00:28:33] with some data science people today and they're like, you know, the part of that is if you're not
[00:28:37] really good at data anyways, you ask for something and it just gives it to you. Great.
[00:28:40] But like, right. Maybe what you got is not what you think you got. What they're trying to
[00:28:45] recreate is really teaching you how to do BI, where you go, Hey,
[00:28:48] I need this data. And it would go, why are you looking for that? What do you need from it?
[00:28:53] Why? You know, Oh, because my CEO is upset because we can't open things, because
[00:28:57] they think it's this. And it's like, Oh, well, maybe we should deliver this data story around all
[00:29:02] these aspects of what's causing the problem, like teaching them what that's really
[00:29:06] all about. And I think, again, all of a sudden now, as an HR leader, you're like, I'm now a
[00:29:10] data professional. Like, I'm really good at this. Like, I'm going to be an expert based
[00:29:14] on having this agent that's great at BI. And that to me is scary and beautiful at the
[00:29:20] same time because now it's, it's trying to train you on the better way of asking it the
[00:29:26] right thing. Yeah. And that's beautiful. Have you seen how the
[00:29:32] latest model out, the paid model for ChatGPT, will actually
[00:29:37] show you how it's thinking? It actually says words like, Hmm, you know,
[00:29:42] it'll type it out as it's thinking, and it will show you the process. They call it,
[00:29:45] I can't remember what they actually call it, but what they're trying to show
[00:29:48] you is, like, this is how it's actually coming up with the answer it gets you. Wow. Because
[00:29:53] you can challenge it often. Like I'll go back and say, are you sure these are the top five
[00:29:57] things, you know, blah, blah, blah. And it'll come back and go, you know, you're right.
[00:30:00] I know there's two other ones I think I would add and replace this one with. Like, but like,
[00:30:05] if it just gives you the answer, it doesn't, you don't see where like, Oh, well, this is where I
[00:30:09] got that. But then I found this one over here. And like, and so like now you can start to dig
[00:30:13] into the black box and, like, it will show you its thinking. And that's fascinating.
[00:30:17] It is. I mean, if you ask somebody, if you're sitting there with like, if I'm sitting there
[00:30:23] with you and I'm asking you, well, how did you come up with that? Yeah. First of all,
[00:30:26] you're going to get really annoyed at me. But if we did the exercise to brainstorm
[00:30:30] it out, I'd want to know everything you're thinking and we're going to write it down. That, again,
[00:30:35] is how we formulate the answer as well. Absolutely. But then again, it's also the
[00:30:39] secret sauce that makes Tim Sackett who he is, right? For sure. Yeah. And I don't want to
[00:30:43] replicate that. I'd love to know your thinking, but that's the reason why I'm talking to you is
[00:30:48] because your brilliant mind, I'm not going to become you, but it's really cool to understand
[00:30:54] how you came up with that. Yeah. Yeah. I just want to know what you think, dude. I know. Yeah.
[00:30:58] I do. I still think, like, to me what all this comes back to is where do you stick the human
[00:31:02] back in the loop? Right. Where is it? Where do we need uniquely human
[00:31:07] experiences, whether that's in recruiting and HR and employee development and wherever we do stuff.
[00:31:12] Right. I think what, you know, to me, if the AI gives you a capacity to deliver a more human
[00:31:18] experience and like, I always go back and like, people are like, Oh, eventually you'll have like
[00:31:22] these AI friends and AI coworkers and all this other stuff. And I'm like, I still believe like the,
[00:31:27] the unique human thing we have is we don't want to be alone. Yeah. And we, we want to be
[00:31:31] with other humans. And so, like, can AI help create those experiences where I feel like,
[00:31:38] Hey, I'm engaging with people I've never talked with before, but the AI knew
[00:31:44] we were thinking the same way about something and we could come up with a great solution for it.
[00:31:48] Right. And that's not dystopian. No, I know. I think that's a hopeful part of it. Right. Like,
[00:31:55] that's the best thing. Yeah. But it becomes dystopian in like a couple degrees difference,
[00:32:01] right. Where, you know, that person does start talking to the computer instead of
[00:32:06] talking to people, or stops talking to people, stops relating. And I think this
[00:32:12] is a movie we've all seen, but hopefully that's not anytime soon. Yeah.
[00:32:16] Yeah. You've just gone through 30 minutes in one question. Like it was nothing, right? Like,
[00:32:29] exactly. And we could keep doing this like all afternoon, but I want to be respectful of your
[00:32:35] time. Thank you so much. Thank you for having me. You know, I look forward to the HR technology
[00:32:39] show to talk to you and to have these conversations. I always enjoy it. Thank you so much. Take care and
[00:32:46] stay safe. That was the HR Data Labs podcast. If you liked the episode, please subscribe. And if you
[00:32:54] know anyone that might like to hear it, please send it their way. Thank you for joining us this week
[00:32:59] and stay tuned for our next episode. Stay safe.