Part 1 of 2

In this episode of Career Club Live, Bob Goodwin interviews Emmy Award-winning journalist Hilke Schellmann about her new book "The Algorithm" and how AI is transforming hiring and the workplace. They discuss how AI is used in applicant tracking systems, resume screening, and video interviews, as well as the potential for bias. Hilke also shares insights into how companies are using AI to monitor employees and what regulations may be needed to ensure fairness and transparency. This thought-provoking conversation provides valuable perspectives on both the opportunities and risks of AI in our working lives.

Powered by the WRKdefined Podcast Network. 

[00:00:00] Hello everybody this is Bob Goodwin and welcome to another episode of Career Club Live.

[00:00:15] Before we begin, we've got a couple of new things to tell you about at Career Club.

[00:00:19] If you're a job seeker we are now offering some free resources including a free weekly

[00:00:24] coaching call every Thursday at 1 o'clock Eastern. Just go to career.club, click on For Job

[00:00:30] Seekers, Free Resources, and you'll see it. And if you're a company, we're starting something new

[00:00:35] to add to the candidate experience and help build your employer brand by changing

[00:00:40] rejection letters into redirection letters, providing free career resources to people who

[00:00:45] will not be moving forward in the hiring process. It's a great way to build your brand and have a

[00:00:50] differentiated candidate experience again you can learn more about that at career.club,

[00:00:55] under For Employers and then under Candidate Experience. So with that said, commercial

[00:01:02] over. I am extremely excited to welcome our guest today, Hilke Schellmann. I'm really thrilled about this because

[00:01:08] her background is phenomenal. So Hilke is an Emmy Award-winning journalist and the insightful author of

[00:01:15] the recently published book The Algorithm how the rise of artificial intelligence will change

[00:01:20] everything and what we can do to protect our future. Her groundbreaking work has been featured in

[00:01:25] prestigious outlets like The Wall Street Journal, Time Magazine, The New York Times, The Guardian,

[00:01:30] The MIT Technology Review and it delves into the profound impact of artificial intelligence

[00:01:36] on the workforce and society. Besides her writing, Hilke is also a professor at NYU where she

[00:01:41] shapes the minds of future journalists with her deep understanding of AI's ethical implications

[00:01:46] and transformative potential. So today we're going to explore the depths of AI's influence on

[00:01:50] our world navigating the complex interplay between technology and human rights. With that,

[00:01:56] let me welcome Hilke. Oh, wonderful. Thank you so much for that kind introduction, Bob. I'm so delighted

[00:02:03] to be here. No, likewise. And look, I love all my guests, but I am particularly over the moon today because,

[00:02:10] in all seriousness, you know, Hilke, your topic, AI, is obviously dominating headlines.

[00:02:16] But our core audience is both job seekers and HR executives. And so this is like the perfect

[00:02:23] intersection of all of those things and you are a global expert on the topic. So it's going to be

[00:02:30] super, super fun. But before we dive into all that, as we are wont to do, we would love to just

[00:02:35] sort of learn a little bit more about you as a human being, not just the work.

[00:02:41] Exactly. Where do we find you today? Where are you? I'm in Brooklyn, New York.

[00:02:45] I'm in my home office. My four-year-old is home from daycare, so we're all home during spring break.

[00:02:54] Yes, all the fun. I'm sure it feels like spring break. But okay. Now, are you a

[00:03:00] born and raised New Yorker? No, I'm born and raised in Germany. I went to school there all the way up

[00:03:05] through high school. Got my bachelor's and master's degrees at the Humboldt University in Berlin.

[00:03:11] But I got a Fulbright to go to NYU in 2003, and that started all of this, you know, great

[00:03:18] American adventure for me. I started a documentary art collaborative and then I went back to Berlin

[00:03:24] and then studied investigative journalism at Columbia's journalism school, and somehow got trapped into

[00:03:29] staying. I don't know how you figured out the accent, but it's spot-on. It is absolutely spot-on.

[00:03:35] Ha! I will tell my friends who make fun of my German accent. You don't have one.

[00:03:39] So, you anticipated one of my questions. Tell me a little bit about your family.

[00:03:42] So the four-year-old? Yes, she just turned four. She's four going on 15. She thinks

[00:03:48] she's ready to move out and conquer the world. We're just, you know, there to help her along the

[00:03:52] way. So yeah, I live with my partner, and I have a young daughter who was

[00:03:58] born three weeks before the lockdown and the pandemic started. So that's something that will

[00:04:04] always be with me, every birthday: like, oh yeah, that happened. Yeah, and so I,

[00:04:10] you know, I work as a journalist most days of the week and I teach usually two classes at

[00:04:16] NYU and sort of divide my time in like the best possible way. I get to do awesome journalism

[00:04:21] and I get to teach journalists. So it's, you know, I found I found my dream job.

[00:04:26] Well, it's just awesome. And we love people finding their dream jobs at Career Club.

[00:04:31] I mentioned the Emmy Award. I mean, that's not something that you just get to kind of say all the

[00:04:35] time. Do you mind telling people just a little bit about what that was for? Yeah, totally. I'd be,

[00:04:39] I'd be delighted to. So, you know, in my previous life I was always a journalist,

[00:04:45] but I did a lot of documentary work, and for one of the documentaries, my

[00:04:52] co-director and I went to Pakistan and followed a case of sexual assault and sexual violence

[00:04:58] there, and sort of filmed for, like, three years this girl's story going through

[00:05:03] the legal system and all of the pushback you see from society and especially men. And we did a

[00:05:11] film out of that and, you know, we got lucky that Sundance showed it at the film festival

[00:05:17] and Frontline jumped at the chance to broadcast it on PBS and, you know, we got an Emmy

[00:05:25] for that. So that was really, really fantastic. That is very impressive, and obviously you didn't

[00:05:32] rest on your laurels; you moved on to a new topic of interest to you, which is what we're going

[00:05:36] to talk about. Yeah, yeah, you know, I sort of found it by chance a little bit, right? Like I'd

[00:05:42] been doing, I'd been doing, like, more or less social justice documentaries and podcasts, and then one

[00:05:47] day in November 2017, I was at a conference in Washington, DC, talking to consumer lawyers;

[00:05:53] it had nothing to do with AI or, you know, hiring. And I needed a ride

[00:05:59] from the conference to the train station, and I talked to the Lyft driver, and I just, you know,

[00:06:03] asked, like, how was your day? And he was like, you know, my day was really weird. And I was like,

[00:06:07] what, you know, how so? And he said that he was interviewed by a robot for a job as a baggage handler,

[00:06:14] and I was like, what? A robot doing interviews? I'd never heard of that. So I think, you know, what it was

[00:06:19] was, like, sort of a pre-recorded voice asking him three questions. But, you know, I was really

[00:06:26] interested in that, and then I went down the rabbit hole, you know, and learned, like, how much we

[00:06:32] use AI in hiring and at work, and, you know, now we see all kinds of AI tools

[00:06:38] being used in hiring. So that was sort of, like, the beginning. I was like, wow,

[00:06:43] I met someone who did a job interview with a pre-recorded robot, and I had never heard of that,

[00:06:48] and lo and behold, there's a whole world out there. Side note for listeners, this is like a really

[00:06:54] interesting example to me of networking, right? You meet people, you strike up a

[00:06:58] conversation, and you never know the way it's going to break. It's always kind of this

[00:07:02] weird, fractal thing; like, I wouldn't have expected that outcome, but that's cool. But you have to put yourself

[00:07:07] out there, ask questions, be interested, and you never know what you're going to find. Yeah, you

[00:07:12] never know, right? And then, you know, I went to a

[00:07:16] conference on AI and, you know, the topic came up again at a sparsely attended presentation, and I was

[00:07:22] like, oh, somebody else is talking about some of these things. This was now somebody who had just left

[00:07:27] the Equal Employment Opportunity Commission, and she said that she can't sleep at night because

[00:07:31] companies use, like, basic algorithms to check for absenteeism in their workers' calendars,

[00:07:37] and she was really worried that this might hurt mothers and people with disabilities who often

[00:07:42] have, you know, longer absences than others, but they're protected classes. You're not allowed

[00:07:47] to fire them or hold it against them if they're out more than others. So she was really

[00:07:54] worried about that. And she should be. So let's step back here a little bit, because I

[00:07:59] want to, you know, dig into it, but just to kind of completely level set, because the book just

[00:08:06] came out at the beginning of January, right? Yeah, January 2nd; like, new year, here it is.

[00:08:11] So I mean, the first question, I don't mean for it to be completely obvious, but why this book and why now?

[00:08:18] You know, I've been working on this for like five years or so now, right? Like researching this

[00:08:23] but I felt like there's really like the world is really changing here and we now have all these

[00:08:29] indications that almost all Fortune 500 companies use these kinds of AI hiring tools somewhere

[00:08:35] in their hiring funnel right from like resume screening to even finding folks to reach out to

[00:08:42] to, like, ask them to do pre-recorded one-way video interviews, right, where there's no human on the other side,

[00:08:48] or to play games. So it really felt like a sea change. And also, for workplace

[00:08:54] monitoring and surveillance, we see eight out of the ten largest companies in the US do

[00:09:00] monitor some of their workers, and we now see that, like, really, really large companies use some

[00:09:05] of these AI vendors to do sentiment analysis and check all of our writing. So it feels like,

[00:09:10] you know, like AI has taken over the world of work. And I've gone to some of these HR tech conferences,

[00:09:15] and it's like all these hundreds of vendors are all AI. And I think, you know, when I talked to

[00:09:20] job applicants, I was surprised that they sort of just didn't know that this had happened;

[00:09:26] you know, somewhere maybe they vaguely knew that maybe if they post something on LinkedIn,

[00:09:32] you know, there was maybe a screen by AI. But I think no one knew, like, how vast this

[00:09:37] industry has mushroomed, and we didn't see a lot of coverage of that. So, you know, with these conferences,

[00:09:44] there's, like, ten thousand people who go to HR Tech every year in Las Vegas.

[00:09:48] And I was like, wow, there's, like, a couple of reporters here; there isn't a whole lot of reporting going on. But

[00:09:53] it feels like we are changing so much we're trying to quantify human beings in a mathematical way

[00:10:01] much more so than we ever could. You know, obviously, the field of psychometrics

[00:10:05] has tried that for a long time, but I felt like this is a real sea change and we need

[00:10:10] to talk about it now. No, I think that's right, and again, we're going to get into a

[00:10:14] lot of this more. So, you know, as you think about how AI is being used, let's just sort

[00:10:21] of start at the start. You know, somebody does a job posting, and they've got their

[00:10:27] applicant tracking system in place. I think most listeners would know the term ATS. So there's an

[00:10:35] ATS, and its job is to read the flood of resumes coming in and at least start to kind

[00:10:43] of have an in pile and an out pile, right? Like, yes, you qualify for this job, or, this is a brain

[00:10:49] surgeon and you're a bricklayer, probably not a good fit. And in theory, you know, it shunts the right

[00:10:55] ones over to, you know, talent acquisition or the hiring manager or whatever. That's the theory. How's it working?

[00:11:05] I mean, you know, we don't know a whole lot about this world, right? Like, I talked to all

[00:11:11] the large job platforms, like LinkedIn and Indeed, and all of them use some form of AI. So if you submit your

[00:11:19] resume via them, your resume will most likely be checked by AI. And we know that companies use these

[00:11:28] applicant tracking systems, and I think they used to be sort of glorified Excel sheets that just

[00:11:33] tracked, like, where is a candidate in the hiring funnel. But now we know from all of the largest

[00:11:38] vendors that they have AI built in but you know there's no central place where companies need to

[00:11:43] tell anyone the government or anyone what they're doing so we don't actually know

[00:11:48] what features companies turn on and off, which AI tools they're using, how they're being trained, you know,

[00:11:53] what's the training data, which is really important in resume screening and in AI tools in

[00:12:00] general. So what I know is, I talked to a bunch of employment lawyers and others who are sort of

[00:12:06] there at the moment when maybe a vendor, you know, has a resume screening tool and they want

[00:12:11] to sell it to a company, and the company is doing some due diligence and may bring in employment lawyers

[00:12:16] and folks like that in that space to do the due diligence. And what they have found is, like,

[00:12:23] a little bit worrisome sometimes. So one of the people I talked to, Dr. John Scott, said that

[00:12:31] he checked five resume screeners, and all five of them had problems with

[00:12:37] biased variables. So one of them had learned that the first name Thomas was predictive of success;

[00:12:46] another one had learned that the word Syria and the word Canada on a resume were an indicator

[00:12:54] of success. So, you know, when I talked to employment lawyers, they were like, this could be

[00:12:58] discrimination based on national origin; like, you're not allowed to take that into consideration.

[00:13:06] You know, another employment lawyer found the words Africa and African American on

[00:13:13] resumes as a predictor. They also found one other resume screener that had learned that

[00:13:20] if you had the word baseball on your resume you got more points, you know, as a predictor of

[00:13:24] success, and if you had the word softball on your resume you would get fewer points, because it

[00:13:30] predicted that you wouldn't be successful at this company. So where this is probably all coming

[00:13:35] from is the training data, right? Companies maybe use resumes of the folks that are currently

[00:13:41] in the job, or who have, you know, applied and gotten an interview within the last year.

[00:13:45] And then, you know, they hand it over to an AI tool, it gets ingested into the system, and then the

[00:13:50] AI tool looks for statistical significance. So maybe, you know, there were lots of people

[00:13:55] with the word Canada on their resume that were successful at the company; good for them,

[00:14:00] but, you know, any human knows that that's actually not a predictor of success,

[00:14:04] and neither is the first name Thomas. And it really shouldn't be about hobbies, because as you can

[00:14:09] tell with, like, baseball and softball, there's probably gender discrimination that comes into play here,

[00:14:14] right? Like, most men in the United States who play tend to play baseball, and women more

[00:14:19] often play softball. So that's sort of the problem. And if you have a company that is not diverse,

[00:14:25] that maybe has, you know, some gender hiring biases or problems in the past; you know,

[00:14:33] take a company that has, like, gender disparities with more men, and, you know, this kind of

[00:14:38] stuff can easily creep into the system if it's not closely monitored. And, you know, it's kind

[00:14:44] of a little bit interesting that the vendors didn't find this out themselves; somebody

[00:14:48] from outside the company had to find these problems. And we see a lot of these tools, you

[00:14:53] know, they're bought by companies because they're overwhelmed by applications. You know, the beauty

[00:14:59] of job platforms is, like, I see so many jobs, and as a job seeker I can just apply and apply; it's

[00:15:04] wonderful. But I think on the other side it has led to companies feeling absolutely overwhelmed

[00:15:10] with millions of applications. Like, I think IBM gets five million applications a year,

[00:15:15] Google three million; I mean, the numbers are just staggering. So they're wooed by a technological

[00:15:20] solution that is efficient, you know, saves them money, saves them labor, finds the most qualified

[00:15:26] candidates, and is bias-free. So it's definitely efficient, and I think it saves a lot of companies

[00:15:31] a lot of labor and money, but we haven't really seen any evidence that it finds the most qualified

[00:15:35] candidates and that it is not biased. And that's really problematic, because now we're using these tools

[00:15:40] at scale. So, wow. So one thing is the opacity that you're talking about.
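The failure mode described here, a screener trained on past hires that latches onto statistically correlated but job-irrelevant tokens like a first name or a hobby, can be sketched with a toy model. The data and the scoring rule below are invented purely for illustration; no vendor's actual system is being reproduced:

```python
from collections import Counter

# Hypothetical historical data: resumes of past hires, labeled 1 if the hire
# was deemed "successful". A gender-skewed history makes "baseball" correlate
# with success even though the hobby is irrelevant to the job.
past_hires = [
    ("python sql baseball", 1),
    ("java baseball leadership", 1),
    ("python communication baseball", 1),
    ("sql softball leadership", 0),
    ("python softball communication", 0),
]

def token_scores(data):
    """Naive 'learning': score each token by P(success | token) - P(success)."""
    base_rate = sum(label for _, label in data) / len(data)
    seen, successes = Counter(), Counter()
    for text, label in data:
        for token in set(text.split()):
            seen[token] += 1
            successes[token] += label
    return {t: successes[t] / seen[t] - base_rate for t in seen}

scores = token_scores(past_hires)
print(scores["baseball"])  # 0.4: the hobby word looks highly "predictive"
print(scores["softball"])  # -0.6: its counterpart is penalized
print(scores["python"])    # ~0.07: an actual skill scores lower than a hobby
```

Because the skewed history happens to pair "baseball" with past success, the hobby outscores the real skill; that is the correlation-versus-causation trap behind the baseball/softball and "Thomas" examples, and why audits have to inspect what the tool actually learned, not just the training pipeline.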

[00:15:47] It's just a black box; like, I can't see what's in it, and that's not healthy. And correlation

[00:15:54] and causality are not the same. A huge problem, right? It's like, you know,

[00:15:58] Totally. Yeah, it's, you know, it's probably correlation, right? Like, it's statistically

[00:16:04] significant, but we all know that if your name is Thomas, that doesn't qualify you for a job, right?

[00:16:09] It's not causally related to the job, and any human would know that, but obviously an AI tool doesn't

[00:16:16] know that; it just looks for statistical significance. So, you know, there probably were some

[00:16:21] Thomases in the pile that were successful; good for them. But if you don't, you know, monitor these

[00:16:27] tools and, like, really critically, skeptically look at them and keep supervising them,

[00:16:32] these problems can easily come in, because, you know, usually you keep giving it training data

[00:16:39] of new people, and then this kind of stuff can keep coming in. Well, so, I mean, one is the

[00:16:47] quantity of data it has access to; company X and company Y may have very different data

[00:16:54] pools to pull from. And then there's the quality of the data: there's the historical, as you

[00:17:03] kind of pointed out earlier, the historical biases of their hiring practices, such as they might be,

[00:17:10] embedded into the machines. We're going back 40 years to, like, one of the most foundational

[00:17:20] computational principles: garbage in, garbage out. Like, if you give it bad data and then you analyze it

[00:17:26] in not a great way, you're going to get a not-great result. But the "good news" with AI is we can do it at

[00:17:31] scale. Yeah, and herein lies the problem, right? Like, I do think that, you know, one hiring

[00:17:37] manager, and, you know, I'm sure you know we have unconscious bias as humans; like, we're not very good

[00:17:42] at hiring either, and in fact we often feel like we are very objective and fair, and in reality

[00:17:48] we like people who are like us, you know, people who went to the same school, and yada yada yada.

[00:17:55] So that is a huge problem too. But, like, one biased human hiring manager can maybe discriminate against

[00:18:02] only so many people, and, I'm sorry, you know, those people are being discriminated against, in one

[00:18:07] year. But an AI tool that is used on millions or hundreds of thousands of resumes, that has a

[00:18:14] problem with, like, downgrading mostly women and upgrading mostly men; I mean, you could have gender

[00:18:19] discrimination against hundreds of thousands of people. And I think, you know, companies do

[00:18:25] like to use the technology, but they're a little bit afraid of what might happen if they come forward saying

[00:18:31] that the technology didn't work, that it was flawed. So, you know, I've talked to a lot of

[00:18:35] HR managers who said, like, oh yeah, we did this for a couple of years, we used this tool, and

[00:18:40] we found out it doesn't work, so we quietly dropped it. And, you know, the problem is, we are

[00:18:46] none the wiser, right? We don't know that there's a problem, the vendors aren't being pressured

[00:18:50] to change their tools and make them better, and so the next company then buys the tool, right? So

[00:18:56] I think that's really the problem, that companies are so afraid of class action lawsuits, because,

[00:19:01] you know, if they come forward and say, like, oh yeah, we've discriminated against people,

[00:19:05] there was, you know, a gender bias against mostly women, I mean, that's a huge problem.

[00:19:11] So I think, you know, I need to talk to these, you know, sort of whistleblowers or folks in the

[00:19:17] know to tell us what is happening, and that's just not good enough, because these are, like, you

[00:19:23] know, as you all know, high-stakes decisions. It matters if I get a job, right?

[00:19:27] I'm nervous before a job interview because it could literally change my life, right?

[00:19:32] Most of us need to work to make money and put a roof over our heads and feed our children.

[00:19:38] But also, if I have a job that, like, makes me happy, fulfills me, that is a big deal; our

[00:19:43] identities are tied up in it. You know, I spend so much time at work, you know, I hope it's, like,

[00:19:47] somewhat rewarding for people. So it does matter. I know, you know, a lot of people get

[00:19:51] rejected all the time, so it feels like, oh yeah, it's another way to get rejected. But

[00:19:55] if I get rejected for my dream job because I'm not the most qualified candidate, I'm not qualified

[00:20:00] enough, I get it, you know, whatever you say, totally. But if I were to find out that I was rejected

[00:20:05] because of the word softball on my resume, and I was discriminated against because I didn't have the

[00:20:10] word baseball on it, which has nothing to do with the job, I would be upset. Like, that

[00:20:14] is not fair. Okay, so let's talk for a minute about solutions; the problem seems pretty profound.

[00:20:21] If you were in front of, you know, SHRM or some large HR organization, and we say, guys, like,

[00:20:28] we can all agree that, like, these are some of the issues, what would be the top one, two, three things

[00:20:33] you'd recommend to them? Yeah, so, I mean, I think there's a bunch of things that we should

[00:20:38] do. I think the first is, like, radical transparency and explainability. You know, it's a problem when

[00:20:44] companies use deep neural networks, and you can immediately forget what that means, but it means

[00:20:49] that we use training data to train an AI tool, but we don't necessarily know as humans what

[00:20:56] the tool then infers upon. So we don't know, does it use these and these keywords? And not all vendors

[00:21:03] check; like, there is a complicated way to check. So I think that's a real problem: if we build tools

[00:21:10] where we don't know how the tool, um, sort of, you know, judges applicants, that is a real problem.

[00:21:19] So I think that's the first thing: transparency and explainability. How was somebody

[00:21:25] judged, not only that they were judged; you know, was there a neural network in the mix? Because

[00:21:31] I do feel that most consumers are sort of, um, and most job applicants, I call them, like,

[00:21:36] forced consumers, because if you want the job and I send you a link to do a one-way video,

[00:21:40] a video interview with AI, are you going to say no? No, you want the job, so you don't really have a way to say no

[00:21:46] to this. So, I mean, I'm glad if the company tells you that AI is used, but that alone is actually not super helpful.

[00:21:52] But explaining how you were judged by the AI, or how the results were interpreted, I think that could

[00:22:00] be really, really helpful for researchers and others. I also think we need, like, some sort of

[00:22:05] regulation here that, um, puts, like, guardrails up for some of these high-stakes decision

[00:22:10] making, so that companies need to make sure that they don't do any harm; more than what we see

[00:22:15] maybe right now in New York City, which is, like, kind of one of the only places that has a pretty strong,

[00:22:20] or strong-ish, law. I would have wanted it to be a strong law, to say that, you know, companies who use

[00:22:28] AI hiring tools need to be audited once a year. But, you know, it's a very

[00:22:34] loose law, so a lot of companies, I think, get out of it, and the audits are not very aggressive

[00:22:38] either. So that's another problem. And I think, in general, we need to have a much

[00:22:43] larger discussion, like, how should we hire people? Like, we know resumes are not very

[00:22:47] predictive; like, should we actually use them in hiring? What are some of the ways that we should

[00:22:53] hire people? Like, for example, we see a lot of personality tests; um, they're also not very predictive.

[00:22:58] They may be 5 to 10% predictive, so that means that 90 to 95% of your success on the job has nothing to do

[00:23:05] with your personality, because, you know, we are humans; like, we sometimes get to overcome, you know,

[00:23:12] our problems; maybe my personality lacks something, but, you know, I can work hard against that. But, you know,

[00:23:16] these tools can't really, um, obviously, you know, like, see that or understand that; in a way, you

[00:23:24] know, no one can take that into account. So I think we really need to think through, like, what do we

[00:23:28] need to, how can we do this better? Can we have virtual reality to test people on the job? Maybe

[00:23:33] that is a good way to test folks. Maybe have a more holistic approach, where we have different assessments:

[00:23:38] you look at the resume, you look at a job interview, um, so it's not just one screen where, like,

[00:23:44] you're either in or you're out. Like, we need to have a much larger discussion here, which I hope

[00:23:51] to push a little bit. No, I think it's great. And then, you know, as food for thought, I did an

[00:23:59] interview with the CHRO of a 130,000-person company recently. They hate resumes, right? Because

[00:24:06] they are desperately trying to find what is the next version of, you know, candidate

[00:24:14] identification and qualification, you know, where they're way more interested in things like your

[00:24:20] desire to contribute, your ability to be an agile learner, your ability to be resilient; I mean, things

[00:24:26] that really are going to impact your ability to survive, and not just survive but even thrive, through

[00:24:33] everything that's going on in the world these days, and what your job will be and how quickly things

[00:24:37] change. So I love all that stuff, but we would need to, like, really have a scientific discussion, like,

[00:24:43] how to actually test for that. That's a very hard thing to measure. It's not easy to build,

[00:24:49] and if it's any good is a completely different question. That's the problem, right? You know, the technology's

[00:24:54] promise is out in front of its ability to do what it really says it can do. Yeah, it does look nice.
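The "5 to 10% predictive" figure mentioned earlier maps onto a standard statistic: the squared correlation (r-squared) between a test score and job performance is the share of performance variance the test explains. A quick back-of-the-envelope check; the r values below are illustrative, not drawn from any particular study:

```python
# Variance explained by a single predictor is the squared correlation (r^2).
# Validities around r = 0.22-0.32 explain only ~5-10% of job performance,
# leaving roughly 90-95% unexplained -- the gap being pointed at here.
def variance_explained(r: float) -> float:
    return r ** 2

for r in (0.22, 0.30, 0.32):
    e = variance_explained(r)
    print(f"r = {r:.2f}: explains {e:.0%} of performance, leaves {1 - e:.0%} unexplained")
```

This is also why a test can be "statistically valid" and still leave almost everything about a candidate's future performance unaccounted for.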

[00:25:01] I mean, my hope is that maybe we'll see, like, a large standard for something that can do

[00:25:05] maybe some technical assessment or some skills-based assessment on sort of hard skills,

[00:25:10] because I do think teamwork and agility are actually really hard to test, and we're not there yet.

[00:25:16] But maybe, you know, if you're a software developer and you say you know Python, you put it on

[00:25:20] your resume; as a hiring manager, I have no idea, are you a beginner or a master developer? Like, where are you at,

[00:25:25] right? It doesn't tell me anything about your skill level. So maybe we can do one assessment,

[00:25:31] and then it gets, like, written into sort of a blockchain or a ledger, and you have an appeal

[00:25:36] process as a job seeker, right? And you can, like, send this assessment to many companies, so you

[00:25:41] don't have to redo all these different assessments all the time; like, it's such a drain on job

[00:25:46] seekers, and I'm sure a hiring manager is not interested in kind of facilitating all that either, if there

[00:25:52] was, like, sort of a larger system, you know. Are you familiar with some of the tools? We're going

[00:25:59] to move on from ATSs in a second, but are you familiar with some tools that are basically the

[00:26:04] ATS in reverse, like Jobscan, and Teal has one? Yeah, yeah, yeah. I think they're very helpful for

[00:26:11] some job seekers to get, like, a general sense. You know, we never know, for a given company,

[00:26:18] how their AI tool is calibrated; like, is it calibrated on folks in the company, or is it, like, an

[00:26:23] off-the-shelf solution? But it gives you a good reading of, like, okay, if you have this

[00:26:28] job description and here's the resume, how much overlap do you have, right? And I think the sort of

[00:26:33] general consensus is, like, maybe 60 to 85% is a good overlap; don't do 100%, because then some AI tools

[00:26:40] might think you just copied the job description and, like, throw you out. But I think they give you

[00:26:45] a good sense of, like, what is actually being ingested. I think another good signal is, like, if you

[00:26:49] need to upload your resume to some company's website, and you see that, you know, maybe your work

[00:26:55] experience isn't in the right column on the site; when you upload, it usually tells

[00:27:00] you, here is, like, what we think you put in these different fields, and that might

[00:27:07] be an indication that your resume isn't as machine-readable as it could be. So I think

[00:27:13] Jobscan and these other tools can be really helpful, because, you know,

[00:27:19] resumes are really semi-structured data, so computers sometimes get it right and sometimes they

[00:27:25] don't; and, I mean, some of this parsing is very basic, some of it is very sophisticated; that's where problems lie.

[00:27:32] So I think that might be one thing for job seekers to think through. Yeah, and we do encourage

[00:27:37] people to do that. One is your last point, around having an ATS-friendly resume, so it's not complicated,

[00:27:43] it's not a piece of art; it doesn't matter, you know, how beautiful it is.

[00:27:49] But then, you know, when we're using tools like Jobscan or others, we encourage people,

[00:27:55] back on the opacity problem, we don't know what drives the score, but the diagnostics are helpful,

[00:28:00] right? So you might say customer success and they say client success; it's like, all right, well,

[00:28:06] use the same vocabulary. I mean, like, if you want to be understood by somebody, speak their language.
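This vocabulary-matching advice can be made concrete with a toy token-overlap score, roughly the kind of diagnostic tools like Jobscan surface. The stopword list and sample texts below are assumptions for illustration; no vendor's actual algorithm is being reproduced:

```python
import re

STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "for", "with"}

def keyword_overlap(resume: str, job_description: str) -> float:
    """Toy score: fraction of job-description keywords that appear in the resume."""
    def tokens(text: str) -> set:
        return {w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS}
    jd_tokens = tokens(job_description)
    return len(jd_tokens & tokens(resume)) / len(jd_tokens) if jd_tokens else 0.0

jd = "Seeking customer success manager with Python and SQL experience"
resume_a = "Client success lead; Python, SQL reporting"              # synonym: "client"
resume_b = "Customer success manager experienced in Python and SQL"  # the JD's own words
print(round(keyword_overlap(resume_a, jd), 2))  # 0.43 -- "client" doesn't match "customer"
print(round(keyword_overlap(resume_b, jd), 2))  # 0.71 -- mirroring the JD scores higher
```

The synonym resume loses points not because the candidate is weaker, but because a literal matcher cannot equate "client" with "customer"; that is exactly why mirroring the job description's vocabulary, without copying it wholesale, helps.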

[00:28:10] so what's at least speak the language you make a really good point go about not being like so

[00:28:16] like yeah you did kind of copy the job description and now I mean you should definitely copy

[00:28:21] the most important keywords and it would also suggest not to use synonyms because not all AI tools are

[00:28:27] trained we see sometimes they understand sometimes they don't and I think you know if you want

[00:28:33] to have a really cool, exciting resume, you know, maybe a machine-readable one is not exciting,

[00:28:39] but if you then have an in-person interview, you can always bring

[00:28:44] the amazing one that you love, right, with the colors and the two columns, and you can give that to a

[00:28:49] human. But if a machine is your first encounter, your resume has to be machine readable;

[00:28:54] if it's not, you're kind of already rejected. And, you know, some other folks who have surveyed

[00:29:00] companies also found out that some of the ATS tools are calibrated to throw out people who have

[00:29:06] a six-month or longer gap between jobs, right? And I think that's a really insidious

[00:29:12] problem, because who knows why you had a six-month gap: maybe you moved, maybe

[00:29:16] your spouse is in the military, maybe you had a kid, you had to take care of your parents, or

[00:29:21] whatever; you're a real person. So I think, you know, knowing that,

[00:29:28] I think sometimes you just need to literally cover this gap: maybe you did

[00:29:33] do some freelance work, maybe you took care of family members, just put that in there

[00:29:38] so that a machine will put you on the yes pile if you're qualified, right? Because you could be the

[00:29:43] most qualified person, but if you have that gap and the ATS is calibrated to throw it out, it will reject you.
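The gap-filter calibration described above is mechanically simple, which is part of why it is so blunt. Here is a minimal sketch, assuming invented dates and thresholds and not any vendor's actual code, of a rule that rejects anyone whose work history contains a six-month-or-longer gap, with no human review of why the gap exists.

```python
# Sketch of a naive ATS employment-gap filter (illustrative only).
from datetime import date

def max_gap_months(jobs):
    """Largest gap, in whole months, between consecutive jobs.
    `jobs` is a list of (start, end) date pairs sorted by start date."""
    gap = 0
    for (_, prev_end), (next_start, _) in zip(jobs, jobs[1:]):
        months = (next_start.year - prev_end.year) * 12 + (next_start.month - prev_end.month)
        gap = max(gap, months)
    return gap

def passes_gap_filter(jobs, threshold_months=6):
    """True only if every gap is shorter than the threshold."""
    return max_gap_months(jobs) < threshold_months

history = [
    (date(2018, 1, 1), date(2020, 3, 31)),
    (date(2020, 11, 1), date(2023, 6, 30)),  # 8-month gap: caregiving, a move, military posting...
]
# The rule rejects this candidate outright, however qualified.
print(passes_gap_filter(history))  # False
```

Listing freelance work or caregiving as an entry in the history, as suggested above, is effectively how a candidate closes the gap so the rule never fires.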

[00:29:49] So it's like one and out; you don't get to check with a human being and explain why this may have

[00:29:55] happened. And, you know, now to think about the benefits: there are some tools for job

[00:30:03] seekers that allow you to do like a practice video interview. Oh yeah, yeah, I think it's great training.

[00:30:09] I mean, you drop in your resume, drop in the job description, and then it goes, okay, you want

[00:30:13] this director of marketing role at a consumer packaged goods company, great. And then it says, Bob, can you

[00:30:19] walk me through your experience in bringing new products to market, or whatever. And what I do

[00:30:24] like about them is, well, what we know from our clients is that there's great value in hearing the words come

[00:30:32] out of your mouth, in being able to practice. And again, whatever the exact math is on the

[00:30:39] back end of the virtual tool to say, hey, you really crushed that interview, or that was a great answer,

[00:30:44] the ability to practice and get some feedback, you know, we do find helpful. So,

[00:30:51] are there other tools, maybe that one or other things that you've seen, where AI is being used for

[00:30:58] good, whether from the company side or from the candidate side? Yeah, I mean, I think

[00:31:04] so. I mean, I think that some job seekers are using generative AI, right, ChatGPT and

[00:31:10] others, to help them, you know, maybe help them with writing a cover letter, polishing the resume,

[00:31:15] right? Especially if English isn't their first language, I think it could be absolutely helpful. You know,

[00:31:20] it's sort of like Grammarly on steroids; you just always have to check that it doesn't

[00:31:24] invent something, right? So you have to be very closely monitoring ChatGPT. But I think it actually

[00:31:32] helps also with interview questions, right? You can ask ChatGPT what are the most commonly

[00:31:37] asked interview questions, and maybe also prompt it for some answers, and, sort of, you

[00:31:43] know, maybe you'll find great answers to, like, what are your strengths and weaknesses, and, you know,

[00:31:48] sort of those cliche questions, and you can prep for that with these video interview tools. So I think

[00:31:53] that could be really, really helpful to get over this maybe pretty unnatural setting where you

[00:31:58] talk into your camera, there's no one there, but you want to sound excited, you know,

[00:32:02] excited about the job, and it's kind of like, okay. And I think, you know, another thing

[00:32:08] is that if there's an AI involved, actually longer answers are better than shorter ones

[00:32:15] for the AI to calibrate a result. So I think it's helpful to

[00:32:22] maybe tell a little bit of a story. We know that also from behavioral questions, like, you know,

[00:32:26] what is an obstacle you overcame; I think you want to talk about that in depth

[00:32:33] for the tool to work with you. But we know very little about how they pull personality traits like teamwork out

[00:32:39] of these questions. I'm not 100% sure what the science behind that is, and I think that's

[00:32:44] a place for me to dig a little deeper for the next journalism series. But yeah,

[00:32:52] I think those tools are really great, and I do think that there are also some

[00:32:57] things on the company side that are really exciting. So we see some companies,

[00:33:03] vendors, that help companies with sort of a career lattice or career ladder, yeah, more like a

[00:33:09] lattice, not a ladder anymore, because people change jobs and functions so many times. But we see how

[00:33:15] AI tools, you know, kind of like LinkedIn, could actually help you: you apply for

[00:33:19] one job, but then the company's AI tool tells you, hey, you know, you have

[00:33:24] these other five key skills, would you also be interested in this job or this job? So, you

[00:33:30] know, it could open up your possibilities. And also inside the company, for employees,

[00:33:37] you often see sort of this career lattice that tells you, okay, you are a director now,

[00:33:41] people in your position in five years have done this and this, and here are the skills that you

[00:33:45] need to learn. And I think that can be really helpful. I think the problem is a little bit

[00:33:50] that all of this is based, again, on resume data, which we know is very limiting. Yeah, you know,

[00:33:55] we don't know how good you actually are at this particular skill, but

[00:34:00] I think it's a good indication, so I think it's actually a good use case for big data.
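The career-lattice suggestion described above boils down to a set difference: compare an employee's skills (however imperfectly parsed from resume data) against the skills of target roles and surface what is missing. The role catalog and skill names below are invented for illustration; no real vendor's data model is implied.

```python
# Hypothetical sketch of a career-lattice skill-gap suggestion.
TARGET_ROLES = {
    "VP Marketing": {"brand strategy", "budgeting", "team leadership", "analytics", "public speaking"},
    "Product Director": {"roadmapping", "analytics", "user research", "team leadership"},
}

def skill_gaps(employee_skills, roles=TARGET_ROLES):
    """For each role, the skills the employee would still need to learn."""
    return {role: sorted(needed - employee_skills) for role, needed in roles.items()}

me = {"brand strategy", "analytics", "team leadership"}
for role, missing in skill_gaps(me).items():
    print(role, "->", missing)
```

The caveat from the conversation applies directly: a skill's mere presence on a resume says nothing about proficiency, so the output is a prompt for a human conversation, not a verdict.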