Part 2 of 2
In this episode of Career Club Live, Bob Goodwin interviews Emmy Award-winning journalist Hilke Schellmann about her new book "The Algorithm" and how AI is transforming hiring and the workplace. They discuss how AI is used in applicant tracking systems, resume screening, and video interviews, as well as the potential for bias. Hilke also shares insights into how companies are using AI to monitor employees and what regulations may be needed to ensure fairness and transparency. This thought-provoking conversation provides valuable perspectives on both the opportunities and risks of AI in our working lives.
Powered by the WRKdefined Podcast Network.
[00:00:00] So let's say we move past the candidates. I know I work at company X that's using AI. What are the kinds of ways that I didn't even dream a company might be using AI, or some of the things that are
[00:00:24] going on behind the scenes that people probably don't know about? Yeah, so in hiring we see the one-way video interviews that can be analyzed by AI. We've seen gamified assessments, where you play sort of a game to understand what personality
[00:00:41] you have, and it's usually calibrated by people already in the job playing the game. So if they're all risk-taking accountants and you
[00:00:51] hire new accountants, and I'm a job seeker who also shows risk-taking behavior, I might land on the yes pile. There are a lot of questions about that: maybe I'm a nervous wreck playing
[00:01:02] video games, but that doesn't have anything to do with how I would actually behave in real life, in the job. So there are questions about that. But what we also see at work is that
[00:01:16] we see that eight out of the ten largest companies in the US surveil their workers. So we see things like keystroke recording, everything you type, everything you put into shared apps. We see sentiment analysis of Slack messages, emails, Zoom calls, right? All of that can
[00:01:35] generate text transcripts that can then be analyzed, and they can look for certain signals of bullying, toxic work environments, who speaks the most, who is maybe the bully in a Zoom meeting, down to non-compliance and inappropriate behavior. Some tools
[00:01:55] that I looked at even said that they could find incidents of self-harm in social media. So some companies also do what's called continuous employee background checking: they just continuously look at
[00:02:07] your social media and make these inferences. We don't know a whole lot about how good it is, and usually it isn't very good yet with sarcasm and humor, and it often doesn't know context. We see
[00:02:21] that with social media scans on folks who have applied for a role, because that data then becomes lawfully accessible. It's part of an official background check, so you can ask the employer or the potential
[00:02:41] employer for the data. Some people have done that, and when they looked at it they found out that maybe they liked a tweet that mentioned alcohol, and that was suddenly a
[00:02:50] red flag, like, alcohol bad. Or they cited, I don't know, a lyric of a song, and that was then interpreted as somebody who might be suicidal or prone to self-harm,
[00:03:05] when we don't actually know that. But the problem is that these signals may be coming up, and we see companies use fairly vast, broad AI recording and checking tools because they
[00:03:17] see employees as potential leakers, as a threat to their business. So we have only seen the very, very beginning of this. Well, okay, so gosh, you kind of alluded to it without
[00:03:30] saying it, but on the continuous background check thing: background checks are covered by the Fair Credit Reporting Act, which is why there is transparency there. But it seems like, you talked earlier about the need for regulation and some kind of oversight
[00:03:44] of all this stuff, it seems like a fair information reporting act would be appropriate. Like, I get it if you suspect somebody of embezzling, that's a different topic, as an example, right, or receiving bribes, or something that's just patently illegal.
[00:04:03] But maybe we're using this to create the layoff list, right? Yes, and I think that's always the danger. Yeah. Once you start pulling this data together, maybe
[00:04:16] for one thing, it's also easy to look at it in another way. I talked to an analyst at Gartner who talks to a lot of employers, and he shared, and this was pre-pandemic,
[00:04:27] that an employer wanted to promote people, and the way to do that was checking keycard entries: who sat longer at the desk in the office. That's already kind of problematic, right,
[00:04:37] because we all know I can sit at my desk for 10 hours and not actually be successful at my job. I don't know if that's a good indicator of performance, but they used it. Some people work
[00:04:47] from home and seem to do okay, I don't know. Yeah, and it's not about checking the hours that people work. You would usually suggest
[00:04:57] that you check the results, right, because somebody can sit there for 10 hours and do nothing. But the problem here was that when it came to the pandemic and the company had to
[00:05:07] do layoffs, they wanted to look at that data, because they felt like, well, we know who the most productive employees are and who the unproductive ones are, the ones with the longer absentee times. Which, of course, we know from the pandemic: people have
[00:05:20] families, they have caregiving obligations, and that often has nothing to do with whether they are committed and productive employees, and that data wouldn't actually take it into consideration. So this becomes really problematic really quickly. We also see at work flight-risk
[00:05:38] measurements that purport to predict who's going to leave in the next year, looking at some sort of signals for that: are you updating your LinkedIn very often, do you move data around,
[00:05:49] do you put USB sticks into your computer, do you print a lot. That's usually based on the behavior of past employees, so we don't actually know exactly; it's just a hypothesis. And
[00:06:02] it's really just a prediction, right, and it's often not actually accurate that people leave within a year. And the question is, what does an employer then do with that information?
[00:06:11] Maybe they're not so thrilled about you, and they just feel like, okay, the person may be leaving, that's good. But if an HR manager knows that, are they going to give
[00:06:20] you a raise? Other people might get a raise and you won't, just because there is an indication that you might leave, and we don't even know if it's true; it's just a prediction. Or maybe if there's leadership training,
[00:06:29] they're not going to put you forward, because you may be leaving the company. Once you have that information, it's incredibly difficult not to make decisions based on it. Even
[00:06:40] when I did some of these tests of these tools, even though I knew, okay, there is literally no science behind facial expression analysis, when you see that number
[00:06:51] it just feels like objective math, and it's really hard to ignore. And I think that's actually hard for companies too, because they see, oh, this person was flagged as
[00:07:02] a flight risk, and we don't actually know that it means anything, but I think it's hard for HR managers or hiring managers, people in charge, to just ignore that information. Based on any discussions you've had with policymakers or other
[00:07:19] people that might influence policy, do you think a day is coming when companies will basically be forced to provide more transparency into what they're doing in terms of data collection
[00:07:34] on employees? Yeah, I mean, we hope so. I think there should be a little bit more transparency. We see a little bit of that in the European Union, right? They have a new
[00:07:45] AI Act that's sort of in the works, and in it, hiring with AI is actually classified as a high-risk, high-stakes endeavor, which triggers a bunch of
[00:08:00] safeguards that you have to keep as a company or an organization when you use it. I think that's actually really smart. We haven't really seen that in the US at home, but my
[00:08:10] hope is, sometimes you have this in some states in the United States. For example, in California there's a very high regulatory burden for car emissions, so a car company isn't going
[00:08:21] to build 50 different cars for different states, right? They will build one car that passes the higher emission standards in California and sell it in all the other states.
[00:08:32] So we hope maybe this will happen with some of this regulation: if one state or one city makes it harder and requires more, the off-the-shelf tools will follow that one
[00:08:44] regulation, and then we may also have a little bit more insight in other jurisdictions. But we don't know this, and there isn't a whole lot of appetite in the US to genuinely regulate. We do see a
[00:08:56] little bit with Biden pushing into this, and we see the National Labor Relations Board come out saying that this sort of broad surveillance in the hands of employers could be really problematic, because
[00:09:07] workers in the United States have the right to organize and form a union, or at least to have organizing communications, and those could be swept up in some of this broad
[00:09:18] surveillance. So we see ideas; it's hard to regulate. Have we seen any actual regulation? No. But I think more and more people at least see the problem, and that's what I sort of felt
[00:09:30] with the book: I wanted to show how AI is already being used, the good ways and the bad ways, just to show it, because a lot of people don't know. So now we can act,
[00:09:42] and now we can build regulation based on that, regulation that is actually specific to how we use these tools. Because I think the problem is that broad regulation, when we don't understand AI, is actually
[00:09:55] not really helpful. We kind of have to know how it's being used, and where it goes awry, and what we know, and then we can build guardrails toward that, and not just
[00:10:09] broad, big regulation. That's my take on it. Last little topic, and then we'll start to put a bow on this. As a worker, and this is very broad, should I be freaked out today? AI
[00:10:24] is going to take my job? I mean, look, the AI that's mostly used for monitoring and surveillance is not going to take your job, because it's predictive AI, right, that looks at
[00:10:37] signals and computes and analyzes the signals inside the company. I think we see a little bit that maybe AI is going to be kind of your manager or your boss, in a way, checking, do you send
[00:10:49] enough emails compared to other workers? There are people who are very successful who do X, and we can check their behavior. So maybe the AI will remind you: hey, you should
[00:10:58] send as many emails as that person; have you talked to Tracy recently, she's really successful. We may see some sort of nudges and reports. And then there's generative
[00:11:09] AI, which I think is going to have a profound impact on a lot of our jobs and will take a little bit of the mundane, everyday stuff that we all hate to do away
[00:11:21] from us. My hope is that jobs will then evolve, that I get to do more of the creative stuff that AI is not going to be able to do, so my hope is that our jobs will be better.
[00:11:33] Sure, some jobs, like the elevator attendants who had to push the buttons maybe 80 or 100 years ago, went by the wayside, but hopefully those people found other
[00:11:46] new jobs, and hopefully that will help us as well. But that is a ways out; that is not tomorrow. You talked about eight out of ten of the biggest companies a couple of times, or whatever.
[00:11:58] You know, I don't know what the right scale is, but if one is not even paying attention and ten is this is fully deployed, where do you think we are on an AI utilization
[00:12:13] scale right now? I think we see companies starting to deploy it. It's deployed in hiring a lot, because it's a pretty easy use case, right? You have a lot
[00:12:27] of people that apply, you need a technological solution, and you don't necessarily want humans to do it because they have all this unconscious bias. So it feels like, okay, this
[00:12:36] is a good use case. I'm not saying it actually is, but I think that's why we see a lot of deployment there: it feels like a good use case. Yeah. And we have to work on
[00:12:46] what tools we actually build and use here. But we see a pretty high penetration for the surveillance. I wish I had better numbers, but we don't know, and we
[00:12:59] don't know because companies don't have to tell their employees most of the time, and case law has been in their favor: whatever happens on a work computer in the United States,
[00:13:10] there's usually no expectation of privacy; it belongs to the employer and they can do whatever they want, sometimes. And I think this is another forced-consent
[00:13:20] sort of point. I remember when I started my first job, maybe on the first day, when you log into whatever their Google Mail or whatever infrastructure is, maybe there's a
[00:13:30] little note: you know, we might monitor this space, and you have to consent to that. First of all, maybe you never read it. And are you going to say no and walk off
[00:13:39] the job? And then you forget about it, right? So some folks are very diligent and use a second computer at all times. They never use the computer issued by their
[00:13:54] employer to send even personal emails; they don't want their personal emails on that computer, because it can all be sucked in, and we don't know exactly the limits.
[00:14:04] There are some people who feel very strongly that they don't want any apps from their employer on their phone, because those could sweep up, vacuum in, other data, and we don't know the limits. But I think sometimes we see some companies now
[00:14:20] pushing into sort of wellness and healthcare. There's one that seems very benign and wants to personalize benefits, but when I listened to the demo, they were like,
[00:14:33] oh, here's Aida, and this was a synthetic example, but they were like, we see from his benefits that he dropped a spouse from his medical plan, so we think,
[00:14:46] you know, the computer will infer that this person is now going through a divorce and suggest therapists to them: here are some therapists in your network, you should call them. It seems very benevolent, and maybe it will help Aida, who knows,
[00:15:01] but I also feel like, for a lot of employees, it's: wait a second, why does a third-party company have access to this very personal benefits data, and get to make inferences on
[00:15:13] it, and know all of that? So I think we will see more and more of this, and there is no law for transparency here. So it will come from, maybe, journalists like me looking at
[00:15:26] what these companies do and who they're working with, and maybe some individual people who are very smart and can investigate this kind of stuff because they have a computer science background.
[00:15:36] But we don't know a whole lot. I mean, one thing that's actually kind of helpful: I watched a whole lot of webinars from software companies that make these videos
[00:15:47] for people who may buy these tools on the market for companies, and they're very IT-heavy and very jargony and technical, so nobody ever watches them. And I watched them,
[00:16:00] not all, but a lot of them, and so I learned. They talk very openly about the capabilities of these tools: they can find all kinds of signals, and the signals then
[00:16:10] lead to a report that maybe goes to IT, that maybe goes to HR, and that starts an investigation. So we know this is often built into these tools by very large tech companies. Do I know which companies
[00:16:22] turn them on? I don't. So if people have tips on that, I would love to know more and talk about this more, because I think we will see this more and more.
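The pipeline described here, where messages are scanned for signals that feed a report to IT or HR, can be sketched in miniature. Everything in this sketch (the signal categories, trigger words, and message format) is invented for illustration; it is not any vendor's actual method, and real tools presumably go well beyond keyword matching.

```python
# Toy sketch of a workplace-monitoring pipeline: scan messages for
# keyword "signals," flag matches, and roll them up into a report of
# the kind that might be escalated to IT or HR. Keywords and message
# format are made up for illustration only.

from dataclasses import dataclass

# Hypothetical signal categories and trigger words.
SIGNALS = {
    "toxicity": {"idiot", "useless", "shut up"},
    "data_exfiltration": {"usb", "external drive", "personal email"},
}

@dataclass
class Flag:
    author: str
    category: str
    text: str

def scan_messages(messages):
    """Return one Flag per (message, matched category) pair."""
    flags = []
    for author, text in messages:
        lowered = text.lower()
        for category, words in SIGNALS.items():
            if any(word in lowered for word in words):
                flags.append(Flag(author, category, text))
    return flags

def build_report(flags):
    """Group flagged categories by author, the summary a reviewer would see."""
    report = {}
    for flag in flags:
        report.setdefault(flag.author, []).append(flag.category)
    return report

messages = [
    ("alice", "Copying the deck to my personal email so I can print at home"),
    ("bob", "Lunch at noon?"),
]
print(build_report(scan_messages(messages)))
# {'alice': ['data_exfiltration']}
```

Note how crude the matching is: Alice's harmless note about printing at home gets flagged as data exfiltration, which illustrates the earlier point that these tools struggle with context, sarcasm, and humor.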
[00:16:33] One last thing, because it was in the news, or I saw it today, and I'll say for anybody listening, today is February the 21st: it was a deepfake. So, right, Sora is out now, right? Yeah,
[00:16:47] with everything happening in video. And so it was somebody in the finance department of apparently not a small company, where the deepfake was executives at this company, via video, and it's not like it was interactive, directing the person to wire $25 million
[00:17:09] to some bank account. Yeah, it looked like the executives, sounded like the executives, even mimicked their office backgrounds. It was very believable, apparently, and this person actually executed the transfer and did wire the $25 million. I know, and we've seen this before with
[00:17:29] audio deepfakes, right? Somebody got a call from what sounded like the CEO, because there is often their audio out there and you can train the tool on it, and now
[00:17:39] you can do that with video, which calls into question the authenticity of factual content. But I think it also speaks to what I've done, and this was a
[00:17:49] couple of years ago: I tricked some of these one-way video interviews. I actually wasn't on screen; I was sitting next to the screen typing in my answers, and I had an early deepfake
[00:18:01] say the words that I was typing, so I wasn't actually speaking. And I got results. None of the vendors that deployed these tools had any kind of
[00:18:12] security protocol. We see this in facial recognition technology: at least you have to move or something, to indicate that you're a moving, breathing human; you can't just hold up a photo
[00:18:23] to open a door, most of the time. So there's a whole lack of security processes in all kinds of AI at work. And I think there was a notice put out like six months or a year
[00:18:37] ago saying: hey, employers really need to be aware of this. There are candidates faking interviews, but it could also be a real hacking problem for companies, because if you employ somebody remote, they may have had somebody else,
[00:18:53] you know, using a deepfake, do their employment interviews, and they may get access to the computer systems on day one. So it's not only that somebody lied and
[00:19:03] the wrong employee shows up, and maybe they're not as good a coder as you thought they would be, but actually you might give them access to very sensitive
[00:19:13] data and they start vacuuming it out. So there's a whole new world there that we haven't fully figured out, either as employers or as employees and job seekers.
[00:19:27] Yeah, it's weird, Hilke, because it sounds like science fiction, right? It sounds like you're watching a movie, you know, one that's titled 2205, and you're like, well, yeah,
[00:19:37] that's a long time from now. And no, this kind of happened, like, right now, and it seems so bizarre that it can't be real. Yeah, no, it is happening. It sometimes feels
[00:19:49] very Orwellian and dark, but the reality is that a lot of these tools aren't very good. The problem is they still make decisions, and in this case they also make flawed decisions, so that
[00:19:59] gets even more problematic, right? We don't know a lot about these tools. Science fiction is usually like: these tools are always right, and they create this over-the-top surveillance.
[00:20:10] Yes, you may be surveilled, but who knows if the sentiment analysis is any good. The problem is, if those kinds of analyses are then used for employment decisions, this can be really
[00:20:23] difficult, right? We've seen from surveys that company leaders do want to use results from AI tools, along with other decision-making inputs, in
[00:20:35] layoff decisions. If you have no data, or if you have one data point that comes from a maybe-flawed productivity algorithm, some companies may want to use that. Well,
[00:20:47] in fact, they would even call it a data-driven decision, right? Yeah. No, this is a fascinating topic; I could keep you on this call for another two hours, easy. Is there anything
[00:21:01] that we didn't talk about, quickly, before I let you go? No, I think we've covered a whole lot. I want to encourage people to understand the world that
[00:21:15] we live in so we can act on it, and now is the time to act, and maybe push back a little bit and demand that there are some guardrails, that this is more transparent, that there is
[00:21:26] explainable AI. For some of these things that we know can be used for high-stakes decision-making, maybe there should be an appeals process, kind of like we have with our credit
[00:21:38] score. I have the right to check once a year what my credit score is, I roughly know the criteria for why my score is what it is, and I can appeal it and say: actually, this is
[00:21:47] the wrong Hilke, that's not me. I'm sure that system is full of flaws, but there is a system. For some of this hiring stuff, we don't know enough; we don't actually have a right to
[00:21:58] know, and we absolutely cannot push back, except when it comes to social media background screening, if it's part of an official background screening process. But if a company monitors your social media after employment, and maybe you said yes
[00:22:17] to that, or I don't even know if they have to get consent, because some of the stuff that you post on social media is technically public; it's not private data, right? They're not taking your Facebook password and looking at that, but whatever you post on X and on LinkedIn is
[00:22:31] usually publicly available. But these tools, I mean, I've examined the tools that try to find personality traits in your social media, and I can authoritatively say they
[00:22:43] do not work, because I tested them with myself and others, and we built a larger sample. I worked with a computer scientist and a sociologist to test more people on this, and they do not work.
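The kind of testing described here, feeding equivalent inputs from the same person into a personality-scoring tool and checking whether the scores agree, can be sketched abstractly. The trait, the scores, and the threshold below are all invented; the point is only the test-retest idea: if the same person gets wildly different scores from equivalent inputs, the tool is not measuring anything stable.

```python
# Toy test-retest reliability check. The scores stand in for a
# hypothetical personality tool's output for one person; only the
# spread check is the point, the numbers are made up.

from statistics import pstdev

def score_spread(scores):
    """Population standard deviation of repeated scores for one person."""
    return pstdev(scores)

def is_reliable(scores, max_spread=5.0):
    """Call the tool 'reliable' only if repeated scores stay close together."""
    return score_spread(scores) <= max_spread

# Pretend "openness" scores (0-100) that the same resume received when
# submitted in three different file formats; invented numbers inspired
# by the kind of inconsistency described above.
same_person_scores = [83, 41, 62]

print(round(score_spread(same_person_scores), 1))  # 17.1
print(is_reliable(same_person_scores))             # False
```

A real evaluation would of course use many people and a proper reliability statistic, but even this toy spread check captures why inconsistent scores mean a tool cannot be trusted for hiring decisions.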
[00:22:55] And I think it's kind of hard to know that some companies use these tools. So to wrap this up, I want to encourage people to get Hilke's book, The Algorithm.
[00:23:06] We're going to put a link in the post-production notes for how to go buy the book, and also to follow you on LinkedIn, because you write and post about this pretty frequently. Yes, LinkedIn is a perfect place to talk about this; there are so many
[00:23:19] job seekers, HR managers, people who care about the world of work who congregate there. I've sort of abandoned Twitter, more or less, as many journalists have,
[00:23:29] for obvious reasons, but I also feel like this conversation about the future of our work needs to happen on LinkedIn. So I encourage people to get in touch with me to share
[00:23:40] their experiences, or if they have any comments or feedback, or if they think I missed big things; I do want to know about that. So what I love about the book is, I mean,
[00:23:51] you're driving awareness, you're starting to pull the shroud back a little bit, and it's not this unfathomable thing that only somebody in, you know, Mountain View understands. So I think it's great, because you laid it out in a way that
[00:24:06] real people can understand, and you're an Emmy-winning investigative journalist, so it's not just one person's rant of "I hate AI" or "AI is out to get me." Yeah, I mean, in fact I started it because I think
[00:24:17] AI is a transformative technology, and I wanted to understand it. And I think that is maybe a good thing about journalism: I found people who have been
[00:24:29] affected by these technologies in profound ways. Talking to real people and understanding what is happening to them has really been eye-opening for me, to pull out some of
[00:24:40] the science behind it, which I think we should talk about, but at digestible levels, and also to understand how it affects people. It's not just AI philosophy or theoretical examples; no, there are hundreds of thousands, millions of people
[00:24:57] looking for jobs, and they have to apply to hundreds of jobs. And I think, when in doubt, if you have done some of the things that leading folks in this
[00:25:07] space recommend, you know, make your resume machine-readable and yadda yadda yadda, it's not you. We know from the data that it often takes hundreds, sometimes thousands, of applications to find a job. It's not you; it's probably that tool in the middle between employers and candidates,
[00:25:26] and who knows what it's acting on, but at one point it's just a numbers game. I so appreciate that last point, because that's really kind of
[00:25:36] getting into my "why" on Career Club: the emotional, psychological toll that the search takes, and how much people internalize it. It must be me, I must be the problem. And thank you for saying it's probably
[00:25:51] not you; it's the process. The system isn't efficient at all, and therefore it drives all this inefficiency that comes through as rejection, and it's not really
[00:26:09] you; it's just a messed-up system that hopefully, to your point, because we want to be optimistic and helpful, the technology can ultimately be a solve for. But right now,
[00:26:22] as you've done such a great job of illuminating, we're not all the way to bright yet, that's for sure. Yeah, I think we rushed to digitize processes that actually
[00:26:36] did not really work when humans did them either. So let's actually change this: what are ways to hire that are not built on old processes that we just digitize or use AI for,
[00:26:49] but how can we actually improve this whole system? I think there's a lot to be done, and it doesn't always have to be AI tools. In fact, some of the
[00:27:00] more traditional tools, like a regression analysis that uses just a few variables versus everything that's on a resume, are often actually as predictive, and you don't have the stuff coming in, like first names or locations or hobbies, that says
[00:27:16] more about your socioeconomic status and your background than about whether you actually qualify for the job. And from talking with HR folks, I know
[00:27:28] they don't want to use that either. They want tools that actually find people based on merit, on their skills and their experience, and not on their background and who our parents are or
[00:27:38] where we come from. So I think we all have to push into this space and make it better. Awesome. Well, you're doing a very big piece of making it better, so thank you for that, and thank you for taking
[00:27:49] a few minutes today. Everybody, thank you all so much for listening today. Again, please go buy The Algorithm by Hilke Schellmann, follow her on LinkedIn, comment as she said, and also please check
[00:28:01] out the resources that we have for both employers and for job seekers. We're here to help you by bringing you high-quality content with award-winning people like Hilke. So thank you again
[00:28:12] so much. Hope everybody has a great day, and I'll see you soon. Okay, thank you.


