Ethical AI: Doing No Harm with Ivneet Kaur, Chief Information and Technology Officer at Sterling
You Should Know | August 20, 2024 | 00:46:47

Ivneet Kaur, Sterling’s Chief Information and Technology Officer, shares her insights on AI, emphasizing the need for a holistic approach that includes strategy, governance, and upskilling. We dive deep into ethical AI, centralization for proper governance, and the necessity of removing bias and protecting privacy and security. We look at the importance of AI standards, interoperability, and third-party auditing to ensure fairness and accuracy. Additionally, we touch on the potential of AI in enhancing customer service and the need for robust security against AI-related attacks.

In this episode, we look at AI, ethical AI, AI security attacks, governance, standards, interoperability, and privacy, and explore how these key areas shape the future of work and technology.

Key Takeaways

  1. Adopt a holistic approach to AI, integrating strategy, governance, and upskilling to navigate AI's complexities.
  2. Centralizing AI is essential for proper auditing, governance, and driving innovation without compromising ethics.
  3. Ethical AI prioritizes the removal of bias, safeguarding privacy, and ensuring security.
  4. Government regulation is crucial to prevent AI-driven harm and protect vulnerable populations.
  5. Establishing base standards for AI models is vital for interoperability across different providers.
  6. Third-party auditing of AI models is necessary to ensure fairness and accuracy and to mitigate risk.


Chapters

00:00 Introduction and Setting the Stage

02:27 The State of AI: Separating Hype from Reality

08:12 Centralizing AI: Balancing Innovation and Governance

11:36 Ethical AI: Doing No Harm

13:42 The Role of Government in Regulating AI

18:58 The Importance of Learning about AI

21:25 Standards in AI

23:08 Interoperability and Portability

26:10 Data Portability and Model Governance

30:09 Trust and Verification in AI

36:05 AI Security Attacks

38:29 Improving Customer Service Experiences


Connect with Ivneet Kaur on LinkedIn here: https://www.linkedin.com/in/ivneetkaur-exec/

Learn more about Sterling here: https://www.sterlingcheck.com/


Connect with WRKdefined on your favorite social network

The Site | Substack | LinkedIn | Instagram | X | Facebook | TikTok

Share your brand across the WRKdefined Podcast Network

Learn more about your ad choices. Visit megaphone.fm/adchoices

Powered by the WRKdefined Podcast Network. 

[00:00:00] [SPEAKER_01]: Oh my goodness, bad touching, harassment, sex, violence, fraud, threats, all things that could

[00:00:10] [SPEAKER_01]: have been avoided.

[00:00:13] [SPEAKER_01]: If you had Fama, stop hiring dangerous people.

[00:00:19] [SPEAKER_01]: Fama.io

[00:00:20] [SPEAKER_02]: Hey, this is William Tincup and Ryan Leary and you are listening and hopefully watching

[00:00:36] [SPEAKER_02]: the You Should Know podcast.

[00:00:39] [SPEAKER_02]: Ryan, how are you doing today?

[00:00:40] [SPEAKER_01]: Oh, I am wonderful.

[00:00:43] [SPEAKER_01]: I've never had a bad day in my life.

[00:00:45] [SPEAKER_01]: It's beautiful outside and...

[00:00:47] [SPEAKER_01]: Not true.

[00:00:47] [SPEAKER_01]: How do you know it's not true?

[00:00:49] [SPEAKER_01]: You don't live in my head.

[00:00:51] [SPEAKER_01]: I have evidence.

[00:00:52] [SPEAKER_01]: Well, that's kind of not true.

[00:00:53] [SPEAKER_01]: You do live in my head because I hear you at night.

[00:00:55] [SPEAKER_01]: I hear you in the morning.

[00:00:57] [SPEAKER_01]: Oh, that's terrible.

[00:00:58] [SPEAKER_02]: I don't even hear me.

[00:00:59] [SPEAKER_02]: I get tired of my own voice.

[00:01:02] [SPEAKER_02]: You know, Ivneet, how are you doing today?

[00:01:07] [SPEAKER_00]: I am doing very well, William and Ryan.

[00:01:10] [SPEAKER_00]: Yes, I have definitely had a bad day, Ryan, so I will not...

[00:01:14] [SPEAKER_02]: Although you're in Miami, you're not having too many bad days.

[00:01:19] [SPEAKER_00]: I'm not.

[00:01:21] [SPEAKER_00]: I can stare at the blue ocean right there.

[00:01:24] [SPEAKER_02]: That's not bad at all.

[00:01:27] [SPEAKER_02]: So while you're inside.

[00:01:29] [SPEAKER_02]: Yeah.

[00:01:30] [SPEAKER_02]: While we do some introduction, of course, tell us a little bit about yourself.

[00:01:39] [SPEAKER_00]: So from a professional perspective, I right now serve as the chief information and technology

[00:01:45] [SPEAKER_00]: officer for Sterling.

[00:01:48] [SPEAKER_00]: For any of you who don't know Sterling, we are a global provider of background and identity

[00:01:53] [SPEAKER_00]: services.

[00:01:54] [SPEAKER_00]: We serve clients in about 214 countries and hopefully you are also using us for hiring

[00:02:01] [SPEAKER_00]: people.

[00:02:01] [SPEAKER_00]: And I have responsibility for all of their internal and external technology, data, and

[00:02:09] [SPEAKER_00]: security ecosystem.

[00:02:11] [SPEAKER_00]: And I'm also the founder and sponsor of Sterling's AI program.

[00:02:15] [SPEAKER_02]: So, Ryan, before we get into questions, we've got a topic that we want to...

[00:02:19] [SPEAKER_02]: We're going to go down a lot of different rabbit holes, but with your expertise and

[00:02:24] [SPEAKER_02]: breadth of knowledge, we want to keep it simple and do the state of AI.

[00:02:28] [SPEAKER_02]: I just can't get your take on a lot of things that we're seeing in the marketplace.

[00:02:33] [SPEAKER_02]: So with that being said, let's just start basically because, like you said,

[00:02:39] [SPEAKER_02]: you're the founder of kind of...

[00:02:41] [SPEAKER_02]: I would assume what you're trying to do is build ethical AI.

[00:02:47] [SPEAKER_02]: So how do you all...

[00:02:49] [SPEAKER_02]: What's your approach right now on auditing AI so that it's doing what it's

[00:02:56] [SPEAKER_02]: supposed to be doing?

[00:02:57] [SPEAKER_02]: So what's your take on that right now?

[00:03:03] [SPEAKER_00]: Yeah, so I think of it as three different monikers, I would say: ethical,

[00:03:08] [SPEAKER_00]: responsible, secure.

[00:03:12] [SPEAKER_00]: So when we started our journey, we knew that there is a lot of hype around

[00:03:17] [SPEAKER_00]: things, but we didn't want to just jump into it and just start using it

[00:03:22] [SPEAKER_00]: and making it widely available and all those kind of things.

[00:03:26] [SPEAKER_00]: So we took a step back and we said, we are going to take a thoughtful

[00:03:30] [SPEAKER_00]: and holistic approach which takes into account what is our strategy?

[00:03:37] [SPEAKER_00]: What is the governance model?

[00:03:39] [SPEAKER_00]: How are we going to handle the pace of change?

[00:03:42] [SPEAKER_00]: How are we going to handle the upskilling?

[00:03:44] [SPEAKER_00]: And more importantly, how are we going to separate the hype from reality?

[00:03:51] [SPEAKER_00]: And there were a lot of dos and don'ts which come into play there.

[00:03:55] [SPEAKER_00]: And part of that is also auditing and making sure that you're not

[00:04:00] [SPEAKER_00]: providing incorrect results and answers.

[00:04:04] [SPEAKER_00]: And that's where I think the thing which comes into play really is having

[00:04:07] [SPEAKER_00]: a very strong governance model, which is not built on just

[00:04:13] [SPEAKER_00]: having engineers in it.

[00:04:14] [SPEAKER_00]: In this governance model, we have our HR partners, we have marketing,

[00:04:19] [SPEAKER_00]: we have privacy, we have compliance, we have security, we have product

[00:04:23] [SPEAKER_00]: and engineering.

[00:04:24] [SPEAKER_00]: So what we did was we set up this cross functional group who is going

[00:04:28] [SPEAKER_00]: to be able to be our steering committee and our governance group

[00:04:33] [SPEAKER_00]: to guide us as we bring use cases.

[00:04:36] [SPEAKER_00]: And we evaluate every single use case through multiple dimensions

[00:04:42] [SPEAKER_00]: and security, privacy and compliance are a huge part of it

[00:04:46] [SPEAKER_00]: to make sure: is it something we should really be doing,

[00:04:49] [SPEAKER_00]: should not be doing, or can we do a portion of it?

[00:04:52] [SPEAKER_00]: So all of those things we take into consideration, even before we

[00:04:56] [SPEAKER_00]: start developing that use case.

[00:04:59] [SPEAKER_00]: Doesn't mean we are stopping experimentation, doesn't mean we

[00:05:01] [SPEAKER_00]: are not giving people sandboxes to do things.

[00:05:05] [SPEAKER_00]: But when it comes to real use case evaluation, we are being

[00:05:08] [SPEAKER_00]: very careful and thoughtful, especially because our business

[00:05:12] [SPEAKER_00]: is also heavily regulated.

[00:05:14] [SPEAKER_00]: People depend on us to hire their most important asset, people.

[00:05:19] [SPEAKER_00]: So for us, it's even more critical to do that and

[00:05:24] [SPEAKER_00]: you know, jumping in and just doing things without thinking

[00:05:28] [SPEAKER_00]: through all of those things is not always...
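
[Editorial sketch for readers: the use-case intake Ivneet describes, a cross-functional group signing off on each AI use case across several dimensions before development, can be pictured in a few lines of Python. The dimension names and decision rules below are illustrative assumptions, not Sterling's actual criteria.]

```python
from dataclasses import dataclass, field

# Review dimensions mirroring the cross-functional group described above:
# security, privacy, compliance, plus HR, marketing, and product sign-off.
DIMENSIONS = ["security", "privacy", "compliance", "hr", "marketing", "product"]

@dataclass
class UseCase:
    name: str
    # dimension -> "approve" | "partial" | "reject", filled in by each reviewer
    reviews: dict = field(default_factory=dict)

    def decision(self) -> str:
        """Build only with unanimous sign-off; 'partial' means scope it down."""
        if any(self.reviews.get(d) == "reject" for d in DIMENSIONS):
            return "do not build"
        if all(self.reviews.get(d) == "approve" for d in DIMENSIONS):
            return "build"
        return "build a portion / keep it in the sandbox"

case = UseCase("gen-ai-document-summary",
               reviews={"security": "approve", "privacy": "partial",
                        "compliance": "approve", "hr": "approve",
                        "marketing": "approve", "product": "approve"})
print(case.decision())  # -> build a portion / keep it in the sandbox
```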

[00:05:30] [SPEAKER_01]: Yeah, how do we?

[00:05:32] [SPEAKER_01]: So we'll get into a number of these scenarios as we

[00:05:37] [SPEAKER_01]: continue on. But how does one... I want to take a break real

[00:05:43] [SPEAKER_01]: quick just to let you know about a new show.

[00:05:46] [SPEAKER_01]: We've just added to the network up next at work.

[00:05:50] [SPEAKER_01]: Hosted by Jeanne and Kate Achille of the Devon Group.

[00:05:54] [SPEAKER_01]: Fantastic show.

[00:05:56] [SPEAKER_01]: If you're looking for something that pushes the norm, pushes

[00:05:58] [SPEAKER_01]: the boundaries, has some really spirited conversations.

[00:06:02] [SPEAKER_01]: Google 'Up Next at Work.'

[00:06:05] [SPEAKER_01]: Jeanne and Kate Achille from the Devon Group.

[00:06:10] [SPEAKER_01]: Delineate between hype and reality.

[00:06:14] [SPEAKER_01]: I know that's a big question.

[00:06:16] [SPEAKER_01]: I get it.

[00:06:17] [SPEAKER_01]: She's going to have a take.

[00:06:19] [SPEAKER_01]: I know she's going to have a take on this.

[00:06:20] [SPEAKER_01]: But it was one of the first things you mentioned.

[00:06:24] [SPEAKER_01]: And, you know, it's kind of like, do I wait for this?

[00:06:26] [SPEAKER_01]: Let's just get into how do we delineate between hype and

[00:06:29] [SPEAKER_01]: reality because there's pressure from clients that want

[00:06:33] [SPEAKER_01]: you to be more innovative.

[00:06:36] [SPEAKER_02]: They're on the bleeding edge.

[00:06:38] [SPEAKER_01]: Right. Yeah.

[00:06:39] [SPEAKER_01]: On the bleeding edge. There's pressure from internal people

[00:06:41] [SPEAKER_01]: that say, why can't we go and do this?

[00:06:43] [SPEAKER_01]: How do we go faster?

[00:06:45] [SPEAKER_01]: How do we go further?

[00:06:46] [SPEAKER_01]: But how do we hone in on that?

[00:06:49] [SPEAKER_01]: And really, how do you look at that?

[00:06:54] [SPEAKER_00]: So, and this is going to sound very, very basic,

[00:06:58] [SPEAKER_00]: but it's... these things, to do them,

[00:07:03] [SPEAKER_00]: they are simple, but they're always hard.

[00:07:06] [SPEAKER_00]: So the North Star, the way

[00:07:09] [SPEAKER_00]: we bring it back to people, is: OK, what is

[00:07:12] [SPEAKER_00]: the problem we're trying to solve for our customers?

[00:07:16] [SPEAKER_00]: Whether it's our internal employees or our external

[00:07:18] [SPEAKER_00]: customers, what is the problem?

[00:07:20] [SPEAKER_00]: Because I remember going through that even at

[00:07:23] [SPEAKER_00]: Sterling where the use cases were coming out of

[00:07:26] [SPEAKER_00]: the woodwork, even though half of them did not need

[00:07:31] [SPEAKER_00]: AI to solve them.

[00:07:33] [SPEAKER_00]: But it just came to people's minds.

[00:07:35] [SPEAKER_00]: Oh, there's this new Gen AI capability

[00:07:37] [SPEAKER_00]: available. Can I just use it to solve this

[00:07:40] [SPEAKER_00]: problem? The answer is no, not really.

[00:07:42] [SPEAKER_00]: There are cheaper, better ways to do it.

[00:07:45] [SPEAKER_00]: So, coming back to your question on hype

[00:07:48] [SPEAKER_00]: versus reality, first is your North Star is what

[00:07:51] [SPEAKER_00]: is the real business problem you're trying to solve?

[00:07:53] [SPEAKER_00]: Whether it's related to experience, efficiency,

[00:07:57] [SPEAKER_00]: optimization, productivity, whatever that is.

[00:07:59] [SPEAKER_00]: You have to have a very good sense of that.

[00:08:01] [SPEAKER_00]: Second, you have to have a really good evaluation

[00:08:04] [SPEAKER_00]: process and a group of smart people who are

[00:08:07] [SPEAKER_00]: able to look at these capabilities.

[00:08:09] [SPEAKER_00]: Oh, yeah, because there's a lot of AI washing

[00:08:11] [SPEAKER_00]: going on. I'm sure you guys have heard

[00:08:13] [SPEAKER_00]: of greenwashing. There is AI washing going

[00:08:16] [SPEAKER_00]: on where every single vendor has now added

[00:08:19] [SPEAKER_00]: AI as a buzzword to the offerings they do,

[00:08:22] [SPEAKER_00]: whether they use AI or not, or use it

[00:08:24] [SPEAKER_00]: in a very limited capacity.

[00:08:26] [SPEAKER_00]: So having the right talent in house

[00:08:29] [SPEAKER_00]: is very important also where somebody can dig

[00:08:32] [SPEAKER_00]: deep to test the

[00:08:35] [SPEAKER_00]: capability. So that testing and learning capability

[00:08:38] [SPEAKER_00]: from a technical perspective and from a privacy

[00:08:41] [SPEAKER_00]: and compliance perspective is important.

[00:08:44] [SPEAKER_00]: And that's sort of what we did. We put

[00:08:47] [SPEAKER_00]: a cross functional group of people together

[00:08:50] [SPEAKER_00]: to do that before we

[00:08:53] [SPEAKER_00]: before we go much further into it.

[00:08:56] [SPEAKER_00]: So those two things I would definitely

[00:08:59] [SPEAKER_00]: suggest, and then there's

[00:09:01] [SPEAKER_00]: a continuous education.

[00:09:02] [SPEAKER_00]: So this is not a one and done thing.

[00:09:04] [SPEAKER_00]: The space is evolving incredibly fast.

[00:09:08] [SPEAKER_00]: It has... I would say the honeymoon period has

[00:09:10] [SPEAKER_00]: worn off a little bit in '24, and people

[00:09:12] [SPEAKER_00]: are realizing that there is a lot more

[00:09:15] [SPEAKER_00]: to it than just launching some

[00:09:17] [SPEAKER_00]: chatbots if you will. And it requires

[00:09:20] [SPEAKER_00]: a, you know, much bigger holistic approach.

[00:09:23] [SPEAKER_00]: But it's still a very cool innovative technology

[00:09:26] [SPEAKER_00]: and I am very, I would say, bullish:

[00:09:29] [SPEAKER_00]: there is a lot of potential to do a lot

[00:09:31] [SPEAKER_00]: of good for the organization

[00:09:34] [SPEAKER_00]: and for people. So keep your North Star

[00:09:36] [SPEAKER_00]: in mind: what business problems you're solving.

[00:09:38] [SPEAKER_00]: Have the right set of skill sets in house

[00:09:41] [SPEAKER_00]: who can help you separate this hype from

[00:09:43] [SPEAKER_00]: reality and truly test.

[00:09:45] [SPEAKER_02]: Your customers probably have given you

[00:09:47] [SPEAKER_02]: this feedback as well because some things

[00:09:49] [SPEAKER_02]: that we hear in terms of centralizing

[00:09:52] [SPEAKER_02]: AI versus decentralizing

[00:09:55] [SPEAKER_02]: AI. So

[00:09:56] [SPEAKER_02]: some companies will have marketing

[00:09:59] [SPEAKER_02]: using Gen AI to do X, Y, and Z,

[00:10:01] [SPEAKER_02]: finance using it to do something else

[00:10:05] [SPEAKER_02]: and they wake up

[00:10:07] [SPEAKER_02]: and it's like AI is throughout the organization.

[00:10:09] [SPEAKER_02]: It's just that no one's got a finger

[00:10:12] [SPEAKER_02]: on the pulse of what's going on with AI,

[00:10:14] [SPEAKER_02]: which I think to your point creates kind

[00:10:16] [SPEAKER_02]: of security and privacy issues that could happen.

[00:10:20] [SPEAKER_02]: And it seems like you all have centralized it

[00:10:23] [SPEAKER_02]: to where if someone in marketing is going

[00:10:25] [SPEAKER_02]: to use AI, then they've got to go

[00:10:28] [SPEAKER_02]: through an approval process or some type of process.

[00:10:31] [SPEAKER_02]: Did I get that right?

[00:10:34] [SPEAKER_00]: Yes, yes, yes.

[00:10:36] [SPEAKER_00]: So I think of it

[00:10:38] [SPEAKER_00]: as you have to centralize to decentralize.

[00:10:41] [SPEAKER_00]: So when you're starting on these things

[00:10:43] [SPEAKER_00]: from a journey perspective, you have to

[00:10:45] [SPEAKER_00]: make sure you're giving people the right

[00:10:47] [SPEAKER_00]: guardrails, the right set of tools,

[00:10:49] [SPEAKER_00]: the right set of policies, the right set

[00:10:51] [SPEAKER_00]: of processes, a combination of all these

[00:10:54] [SPEAKER_00]: things and not to mention the organizational

[00:10:56] [SPEAKER_00]: change management.

[00:10:58] [SPEAKER_00]: But then you're also giving them

[00:10:59] [SPEAKER_00]: enough freedom within the guardrails

[00:11:02] [SPEAKER_00]: to experiment and launch

[00:11:05] [SPEAKER_00]: their own capabilities.

[00:11:07] [SPEAKER_00]: So I think of it a little bit more

[00:11:09] [SPEAKER_00]: as a hybrid model, if you will,

[00:11:12] [SPEAKER_00]: not fully centralized and not fully decentralized,

[00:11:15] [SPEAKER_00]: especially when you're starting, the

[00:11:16] [SPEAKER_00]: centralization is very critical

[00:11:18] [SPEAKER_00]: in order to set the right guardrails.
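
[Editorial sketch: the centralize-to-decentralize idea, central guardrails with freedom to experiment inside them, can be pictured as a simple policy gate. The model names and data categories below are hypothetical, purely for illustration.]

```python
# Hypothetical central guardrails: teams self-serve inside approved bounds;
# anything outside the bounds escalates to the central governance group.
APPROVED_MODELS = {"internal-llm", "approved-vendor-llm"}   # illustrative names
RESTRICTED_DATA = {"candidate_pii", "criminal_records"}     # never self-serve

def within_guardrails(model: str, data_categories: set) -> bool:
    """True means the team can experiment freely; False means escalate."""
    return model in APPROVED_MODELS and not (data_categories & RESTRICTED_DATA)

print(within_guardrails("approved-vendor-llm", {"marketing_copy"}))  # True
print(within_guardrails("random-saas-llm", {"candidate_pii"}))       # False
```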

[00:11:20] [SPEAKER_02]: Especially again, letting people innovate,

[00:11:23] [SPEAKER_02]: you don't want to squish

[00:11:24] [SPEAKER_02]: that out of any department or any area

[00:11:26] [SPEAKER_02]: in the company.

[00:11:27] [SPEAKER_02]: Everyone should be able to innovate.

[00:11:29] [SPEAKER_02]: But at the same time, if the innovation

[00:11:31] [SPEAKER_02]: opens you up to a security breach

[00:11:34] [SPEAKER_02]: or privacy breach, the innovation's

[00:11:36] [SPEAKER_02]: not worth it.

[00:11:38] [SPEAKER_02]: So I mean, there is a

[00:11:40] [SPEAKER_02]: delicate balancing act that

[00:11:41] [SPEAKER_02]: professionals like you are trying to

[00:11:43] [SPEAKER_02]: figure out like, OK,

[00:11:45] [SPEAKER_02]: we need innovation at the same time.

[00:11:47] [SPEAKER_02]: Data's got to be secure, it's got to be private.

[00:11:50] [SPEAKER_02]: I mean, yeah.

[00:11:51] [SPEAKER_01]: One of the questions I have

[00:11:53] [SPEAKER_01]: is we hear a lot about ethical

[00:11:55] [SPEAKER_01]: AI and ethical everything.

[00:11:59] [SPEAKER_01]: But from the horse's mouth, someone

[00:12:01] [SPEAKER_01]: who's doing this at a large company

[00:12:04] [SPEAKER_01]: and managing and overseeing a lot

[00:12:07] [SPEAKER_01]: of innovation.

[00:12:08] [SPEAKER_01]: What does ethical AI mean to you?

[00:12:12] [SPEAKER_02]: Yeah, it's a great question.

[00:12:14] [SPEAKER_02]: Because is ethical subjective?

[00:12:19] [SPEAKER_00]: It could be. But.

[00:12:22] [SPEAKER_00]: Ethical to me is do no harm.

[00:12:24] [SPEAKER_00]: OK. That's do no harm.

[00:12:27] [SPEAKER_00]: That's the definition I go with.

[00:12:29] [SPEAKER_00]: Well, you know, overall in general,

[00:12:31] [SPEAKER_00]: not just in context of using AI

[00:12:34] [SPEAKER_00]: or any technology or building any

[00:12:36] [SPEAKER_00]: product. So we don't want to

[00:12:38] [SPEAKER_00]: be biased, let's say towards

[00:12:41] [SPEAKER_00]: candidates, whether we are hiring them

[00:12:42] [SPEAKER_00]: internally or helping our clients hire

[00:12:44] [SPEAKER_00]: them externally.

[00:12:46] [SPEAKER_00]: We don't want to give our clients

[00:12:48] [SPEAKER_00]: something which can make them

[00:12:50] [SPEAKER_00]: biased towards a certain set

[00:12:52] [SPEAKER_00]: of candidate pool they're trying

[00:12:54] [SPEAKER_00]: to hire. So that's the lens

[00:12:56] [SPEAKER_00]: to use: how do we remove

[00:12:59] [SPEAKER_00]: bias and how do we not harm

[00:13:01] [SPEAKER_00]: our clients or our candidates

[00:13:04] [SPEAKER_00]: as we are developing any products

[00:13:06] [SPEAKER_00]: or solutions. AI

[00:13:08] [SPEAKER_00]: just exacerbates that

[00:13:10] [SPEAKER_00]: more because of, you know,

[00:13:12] [SPEAKER_00]: the capabilities that it brings to

[00:13:15] [SPEAKER_00]: the table.

[00:13:16] [SPEAKER_00]: But yeah, that's how I think about it.

[00:13:18] [SPEAKER_02]: We'll steal that if you don't mind.

[00:13:19] [SPEAKER_02]: We're going to steal that and use

[00:13:21] [SPEAKER_02]: that for ourselves.

[00:13:23] [SPEAKER_02]: All right. So Ryan and I do a new

[00:13:25] [SPEAKER_02]: show on Sundays and we cover

[00:13:27] [SPEAKER_02]: the market.

[00:13:28] [SPEAKER_02]: A couple of weeks ago, Andreessen

[00:13:31] [SPEAKER_02]: Horowitz came out and

[00:13:32] [SPEAKER_02]: said they were going to support Trump.

[00:13:35] [SPEAKER_02]: But one of the reasons that

[00:13:37] [SPEAKER_02]: Marc said he's going to support

[00:13:39] [SPEAKER_02]: Trump was

[00:13:41] [SPEAKER_02]: he doesn't want government interference

[00:13:43] [SPEAKER_02]: with AI, which I found fascinating.

[00:13:46] [SPEAKER_02]: There's just a whole thing that's

[00:13:47] [SPEAKER_02]: fascinating on some some level.

[00:13:49] [SPEAKER_02]: Right. So what's your take

[00:13:51] [SPEAKER_02]: on this, because y'all are in a heavily

[00:13:53] [SPEAKER_02]: regulated industry, right?

[00:13:56] [SPEAKER_02]: Almost like finance or like

[00:13:58] [SPEAKER_02]: insurance, health care, etc.

[00:14:00] [SPEAKER_02]: Like, OK, got it.

[00:14:03] [SPEAKER_02]: What do you think about government

[00:14:05] [SPEAKER_02]: jumping into this particular

[00:14:07] [SPEAKER_02]: foray into this particular area

[00:14:11] [SPEAKER_02]: and setting regs and setting

[00:14:13] [SPEAKER_02]: different things that you can do,

[00:14:14] [SPEAKER_02]: can't do, etc.

[00:14:15] [SPEAKER_02]: Like is that,

[00:14:18] [SPEAKER_02]: is that a positive thing for you, net net?

[00:14:19] [SPEAKER_02]: Or is it that you'd rather,

[00:14:21] [SPEAKER_02]: like Mr. Horowitz?

[00:14:24] [SPEAKER_00]: I think it's a positive thing.

[00:14:26] [SPEAKER_00]: I personally think government has

[00:14:28] [SPEAKER_00]: a role to play.

[00:14:30] [SPEAKER_00]: I do think government has a role to play.

[00:14:32] [SPEAKER_00]: And here is here is how I think about

[00:14:35] [SPEAKER_00]: it. And again, I'm going to

[00:14:37] [SPEAKER_00]: see it in broad terms.

[00:14:39] [SPEAKER_00]: The way

[00:14:41] [SPEAKER_00]: I think about it is that you're

[00:14:43] [SPEAKER_00]: getting two different lenses.

[00:14:45] [SPEAKER_00]: Government has a different lens

[00:14:46] [SPEAKER_00]: to this problem than Mark does.

[00:14:50] [SPEAKER_00]: Mark is trying to drive the innovation

[00:14:52] [SPEAKER_00]: really fast and solve very,

[00:14:54] [SPEAKER_00]: very big problems.

[00:14:56] [SPEAKER_00]: Yes, got it.

[00:14:57] [SPEAKER_00]: That that is critical to do.

[00:14:59] [SPEAKER_00]: And you know, what has happened

[00:15:01] [SPEAKER_00]: in Silicon Valley has changed all our

[00:15:03] [SPEAKER_00]: lives and worlds.

[00:15:04] [SPEAKER_00]: So big fan of being able to

[00:15:06] [SPEAKER_00]: move fast.

[00:15:08] [SPEAKER_00]: But I'm not a big believer

[00:15:10] [SPEAKER_00]: in moving fast and breaking so much

[00:15:12] [SPEAKER_00]: that we start harming people.

[00:15:14] [SPEAKER_00]: And that's where I start to

[00:15:15] [SPEAKER_00]: see the government's role coming

[00:15:17] [SPEAKER_00]: in where you're looking at a holistic

[00:15:19] [SPEAKER_00]: ecosystem and you're trying to

[00:15:21] [SPEAKER_00]: prevent harm to

[00:15:23] [SPEAKER_00]: the most vulnerable people.

[00:15:25] [SPEAKER_00]: So look at the use of AI,

[00:15:26] [SPEAKER_00]: which can happen in people

[00:15:28] [SPEAKER_00]: getting scammed.

[00:15:30] [SPEAKER_00]: And the scams are becoming more

[00:15:31] [SPEAKER_00]: realistic, and it's not just the

[00:15:32] [SPEAKER_00]: Nigerian prince email,

[00:15:34] [SPEAKER_00]: those things that we do have

[00:15:35] [SPEAKER_02]: regulation for. The Nigerian prince

[00:15:36] [SPEAKER_02]: email that we used to get.

[00:15:39] [SPEAKER_02]: Much more sophisticated now.

[00:15:41] [SPEAKER_00]: Now it's like

[00:15:42] [SPEAKER_00]: my mom will send me an email and I

[00:15:45] [SPEAKER_00]: will not be able to recognize that

[00:15:46] [SPEAKER_00]: it's a fake.

[00:15:48] [SPEAKER_00]: So those kind of things,

[00:15:50] [SPEAKER_00]: that's where I think the government

[00:15:51] [SPEAKER_00]: plays a role because it looks at

[00:15:53] [SPEAKER_00]: a much wider spectrum

[00:15:56] [SPEAKER_00]: of people

[00:15:58] [SPEAKER_00]: to make sure that we have

[00:16:00] [SPEAKER_00]: the regulations and protections

[00:16:01] [SPEAKER_00]: in place while the innovation

[00:16:03] [SPEAKER_00]: is happening. The problem is

[00:16:05] [SPEAKER_00]: government is always playing

[00:16:06] [SPEAKER_00]: catch-up.

[00:16:07] [SPEAKER_00]: That's what frustrates the

[00:16:09] [SPEAKER_00]: innovators because government

[00:16:10] [SPEAKER_00]: is like five years behind

[00:16:12] [SPEAKER_00]: in catching up.

[00:16:14] [SPEAKER_00]: So to me, the problem

[00:16:17] [SPEAKER_00]: to solve is how do you bring

[00:16:18] [SPEAKER_00]: government along quicker?

[00:16:20] [SPEAKER_00]: How do you have that think

[00:16:22] [SPEAKER_00]: tank, that group of people

[00:16:24] [SPEAKER_00]: who are

[00:16:26] [SPEAKER_00]: working together from the very

[00:16:28] [SPEAKER_00]: beginning when a new tech is

[00:16:30] [SPEAKER_00]: getting developed and what its

[00:16:31] [SPEAKER_00]: impact can be versus

[00:16:33] [SPEAKER_00]: trying to be at odds with each

[00:16:35] [SPEAKER_00]: other because I believe they

[00:16:36] [SPEAKER_00]: can be really great partners

[00:16:37] [SPEAKER_00]: together to really drive

[00:16:39] [SPEAKER_00]: this innovation in a responsible

[00:16:41] [SPEAKER_01]: way. Then we get into, and this gets way

[00:16:43] [SPEAKER_01]: too political for me, but

[00:16:44] [SPEAKER_01]: that's when we get into Congress

[00:16:45] [SPEAKER_01]: and the Senate questioning

[00:16:48] [SPEAKER_01]: tech innovators.

[00:16:49] [SPEAKER_01]: Right. And I think

[00:16:51] [SPEAKER_01]: they questioned Zuckerberg and

[00:16:53] [SPEAKER_01]: some of these others. And I

[00:16:54] [SPEAKER_01]: just I listen to it.

[00:16:56] [SPEAKER_01]: Man, this is awkward.

[00:16:58] [SPEAKER_01]: Like listening to, 'Tell me about

[00:17:00] [SPEAKER_02]: this, my book.'

[00:17:01] [SPEAKER_01]: Yeah. Yeah. 'What is my blog?'

[00:17:03] [SPEAKER_01]: Right. Yeah.

[00:17:04] [SPEAKER_01]: So.

[00:17:06] [SPEAKER_01]: So, 'you have FaceSpace.

[00:17:09] [SPEAKER_01]: What do you do on the FaceSpace?'

[00:17:10] [SPEAKER_03]: Yeah.

[00:17:11] [SPEAKER_01]: Yeah. So it's awkward.

[00:17:13] [SPEAKER_01]: Yeah.

[00:17:14] [SPEAKER_01]: But I like your

[00:17:17] [SPEAKER_01]: take on that. I think that's...

[00:17:18] [SPEAKER_01]: I didn't expect it.

[00:17:20] [SPEAKER_02]: I did not expect that actually.

[00:17:21] [SPEAKER_02]: It's very easy to understand.

[00:17:23] [SPEAKER_02]: I have to admit I thought

[00:17:24] [SPEAKER_02]: you're going to go the other

[00:17:25] [SPEAKER_02]: direction.

[00:17:26] [SPEAKER_02]: Indeed.

[00:17:27] [SPEAKER_02]: I thought you were going to go

[00:17:28] [SPEAKER_02]: the 'hey, keep them out of our

[00:17:29] [SPEAKER_02]: business' route. Just keep them out.

[00:17:31] [SPEAKER_02]: Let us do our bit.

[00:17:32] [SPEAKER_02]: We'll manage it.

[00:17:33] [SPEAKER_01]: But it does make sense, right?

[00:17:34] [SPEAKER_01]: I mean, technology moves

[00:17:36] [SPEAKER_01]: incredibly fast, right?

[00:17:38] [SPEAKER_01]: And these

[00:17:39] [SPEAKER_01]: big companies who have

[00:17:41] [SPEAKER_01]: leaders that otherwise

[00:17:43] [SPEAKER_01]: we would be significantly behind

[00:17:45] [SPEAKER_01]: the rest of the world are

[00:17:47] [SPEAKER_01]: pushing at

[00:17:48] [SPEAKER_01]: that type of speed.

[00:17:50] [SPEAKER_01]: They don't want to be held

[00:17:51] [SPEAKER_01]: back.

[00:17:52] [SPEAKER_01]: But then we're left to

[00:17:55] [SPEAKER_01]: the feelings of that person and

[00:17:57] [SPEAKER_01]: what they want to do to

[00:17:58] [SPEAKER_01]: accomplish their goal.

[00:17:59] [SPEAKER_01]: So the oversight, how you explain

[00:18:01] [SPEAKER_01]: that I think makes

[00:18:02] [SPEAKER_01]: complete sense.

[00:18:04] [SPEAKER_01]: Question I have

[00:18:06] [SPEAKER_01]: is around the state

[00:18:08] [SPEAKER_01]: of AI.

[00:18:11] [SPEAKER_01]: If you had to and I know

[00:18:13] [SPEAKER_01]: this is maybe a really bad

[00:18:15] [SPEAKER_01]: question because you can't

[00:18:17] [SPEAKER_01]: sum it up in a couple of

[00:18:18] [SPEAKER_01]: sentences, although I think

[00:18:19] [SPEAKER_01]: you might be able to do this.

[00:18:20] [SPEAKER_01]: I feel like you have

[00:18:22] [SPEAKER_01]: this ability.

[00:18:23] [SPEAKER_01]: How would you describe

[00:18:25] [SPEAKER_01]: or explain the state of AI to

[00:18:27] [SPEAKER_01]: somebody who's

[00:18:29] [SPEAKER_01]: just casually interested

[00:18:31] [SPEAKER_01]: in what's happening?

[00:18:37] [SPEAKER_01]: Oh, my goodness.

[00:18:39] [SPEAKER_01]: Bad touching, harassment,

[00:18:41] [SPEAKER_01]: sex, violence,

[00:18:43] [SPEAKER_01]: fraud, threats,

[00:18:45] [SPEAKER_01]: all things that could

[00:18:47] [SPEAKER_01]: have been avoided

[00:18:50] [SPEAKER_01]: if you had Fama

[00:18:52] [SPEAKER_01]: stop hiring dangerous

[00:18:55] [SPEAKER_01]: people.

[00:18:56] [SPEAKER_01]: Fama.io.

[00:18:59] [SPEAKER_02]: While you're thinking, I'll

[00:19:01] [SPEAKER_02]: tell you what I read the

[00:19:03] [SPEAKER_02]: other day and Ryan, I've told

[00:19:04] [SPEAKER_02]: you this.

[00:19:06] [SPEAKER_02]: A guy who's an analyst, and

[00:19:09] [SPEAKER_02]: he said we're in the first

[00:19:11] [SPEAKER_02]: inning of a 10,000

[00:19:13] [SPEAKER_02]: inning game.

[00:19:14] [SPEAKER_02]: So what

[00:19:16] [SPEAKER_02]: we know of AI,

[00:19:18] [SPEAKER_02]: again, the difference between

[00:19:19] [SPEAKER_02]: machine learning and NLP and

[00:19:22] [SPEAKER_02]: large language models and all

[00:19:23] [SPEAKER_02]: these different things that we

[00:19:26] [SPEAKER_02]: kind of understand.

[00:19:29] [SPEAKER_02]: We have no... in our lifetimes

[00:19:31] [SPEAKER_02]: we'll never know.

[00:19:32] [SPEAKER_02]: Well, we won't see a fraction of

[00:19:34] [SPEAKER_02]: what's to be developed.

[00:19:35] [SPEAKER_02]: That's the way that he kind of,

[00:19:37] [SPEAKER_02]: the way he packaged it up.

[00:19:40] [SPEAKER_00]: I agree.

[00:19:42] [SPEAKER_00]: I agree. We won't.

[00:19:43] [SPEAKER_00]: We won't. And that's what makes

[00:19:45] [SPEAKER_00]: it exciting to me.

[00:19:46] [SPEAKER_00]: So to me, if somebody is

[00:19:47] [SPEAKER_00]: casually browsing, I would say

[00:19:49] [SPEAKER_00]: yeah, um, learn

[00:19:51] [SPEAKER_00]: about it. It's exciting.

[00:19:53] [SPEAKER_00]: This is the time to learn about

[00:19:55] [SPEAKER_00]: it and get yourself familiar with

[00:19:56] [SPEAKER_00]: it. You are already using it.

[00:19:58] [SPEAKER_00]: So if I'm a casual person like

[00:19:59] [SPEAKER_00]: if I use Amazon, everybody uses

[00:20:01] [SPEAKER_00]: Amazon.

[00:20:02] [SPEAKER_00]: I am using AI at this time

[00:20:04] [SPEAKER_00]: because that's what they're

[00:20:05] [SPEAKER_00]: using to give me recommendations

[00:20:07] [SPEAKER_00]: and even their own

[00:20:09] [SPEAKER_00]: operation.

[00:20:10] [SPEAKER_00]: So learn about it.

[00:20:12] [SPEAKER_00]: Be curious about it.

[00:20:14] [SPEAKER_00]: Because, and I think of it,

[00:20:16] [SPEAKER_00]: you know, from the other

[00:20:18] [SPEAKER_00]: dimensions, if you're a

[00:20:19] [SPEAKER_00]: parent, you want to think about

[00:20:20] [SPEAKER_00]: it. How can it impact your kids?

[00:20:23] [SPEAKER_00]: If you're a student, how is it

[00:20:24] [SPEAKER_00]: going to impact your future job?

[00:20:26] [SPEAKER_00]: If you are a professional like

[00:20:27] [SPEAKER_00]: me, you have no choice but to

[00:20:29] [SPEAKER_00]: continue to learn about

[00:20:32] [SPEAKER_00]: it. So

[00:20:33] [SPEAKER_00]: everybody has something to gain

[00:20:35] [SPEAKER_00]: by learning about it.

[00:20:37] [SPEAKER_00]: So how do we simplify that?

[00:20:39] [SPEAKER_00]: How do we simplify it for

[00:20:40] [SPEAKER_00]: different personas

[00:20:42] [SPEAKER_00]: with different age groups

[00:20:43] [SPEAKER_00]: and make it more targeted on

[00:20:45] [SPEAKER_00]: how it impacts them is what

[00:20:47] [SPEAKER_00]: I would love

[00:20:49] [SPEAKER_00]: to see more focus on.

[00:20:51] [SPEAKER_00]: Because the information

[00:20:52] [SPEAKER_00]: is there if you're curious.

[00:20:54] [SPEAKER_00]: There is so much information

[00:20:55] [SPEAKER_02]: out there. Ryan uses

[00:20:58] [SPEAKER_02]: ChatGPT.

[00:21:00] [SPEAKER_02]: What version are you on, on your own,

[00:21:01] [SPEAKER_02]: four? Yeah.

[00:21:03] [SPEAKER_02]: Yeah. And it's funny

[00:21:05] [SPEAKER_02]: because like every day he'll

[00:21:06] [SPEAKER_02]: bring some new prompt or a new

[00:21:07] [SPEAKER_02]: thing that he's learning.

[00:21:09] [SPEAKER_02]: And it reminds me very much of

[00:21:11] [SPEAKER_02]: the internet in like say

[00:21:13] [SPEAKER_02]: 1998

[00:21:15] [SPEAKER_02]: where it was just fun.

[00:21:17] [SPEAKER_02]: Like you get on the web and

[00:21:19] [SPEAKER_02]: you know the message boards and

[00:21:20] [SPEAKER_02]: like it was just a fun bit.

[00:21:22] [SPEAKER_02]: Like you didn't think about all

[00:21:23] [SPEAKER_02]: the downsides and all the

[00:21:24] [SPEAKER_02]: darkness and all the other stuff

[00:21:26] [SPEAKER_02]: that would come, or that was

[00:21:27] [SPEAKER_02]: probably even already there.

[00:21:29] [SPEAKER_02]: But you just had fun.

[00:21:30] [SPEAKER_02]: You're like, OK, let me see if I

[00:21:31] [SPEAKER_02]: can find a pet store in

[00:21:32] [SPEAKER_02]: Pittsburgh.

[00:21:33] [SPEAKER_02]: Like you just do this random

[00:21:35] [SPEAKER_02]: stuff just to see.

[00:21:37] [SPEAKER_02]: It was playful.

[00:21:39] [SPEAKER_02]: And again, nobody got hurt.

[00:21:40] [SPEAKER_02]: It was just it was fun.

[00:21:43] [SPEAKER_02]: I see the parallels.

[00:21:45] [SPEAKER_02]: I think it's going to move far

[00:21:47] [SPEAKER_02]: faster than the internet did.

[00:21:50] [SPEAKER_02]: So I think that Moore's law

[00:21:51] [SPEAKER_02]: applied to AI versus

[00:21:54] [SPEAKER_02]: the internet. I think that's going

[00:21:56] [SPEAKER_02]: to happen.

[00:21:59] [SPEAKER_00]: I do think the more people who learn

[00:22:01] [SPEAKER_00]: about it,

[00:22:03] [SPEAKER_00]: that will

[00:22:05] [SPEAKER_00]: make innovation faster

[00:22:07] [SPEAKER_00]: but also safer because

[00:22:09] [SPEAKER_00]: the more people who know about it,

[00:22:10] [SPEAKER_00]: the more perspectives you have

[00:22:12] [SPEAKER_00]: coming in both positive and

[00:22:14] [SPEAKER_00]: negative. And that's what

[00:22:16] [SPEAKER_00]: all the innovators in Silicon

[00:22:18] [SPEAKER_00]: Valley and worldwide need to

[00:22:20] [SPEAKER_00]: hear, rather than,

[00:22:21] [SPEAKER_00]: you know, a very closed group of

[00:22:23] [SPEAKER_00]: people

[00:22:25] [SPEAKER_00]: who are at the cutting edge of

[00:22:27] [SPEAKER_00]: it. But that's where the bias

[00:22:29] [SPEAKER_00]: also comes in because you're not

[00:22:30] [SPEAKER_00]: really hearing from everybody

[00:22:32] [SPEAKER_00]: you need to hear.

[00:22:32] [SPEAKER_02]: OK, so you've probably seen this

[00:22:34] [SPEAKER_02]: in different cycles, but

[00:22:35] [SPEAKER_02]: I've always been curious about

[00:22:37] [SPEAKER_02]: how standards get made

[00:22:39] [SPEAKER_02]: like, like, you know, like

[00:22:41] [SPEAKER_02]: simple stuff like an iPhone

[00:22:44] [SPEAKER_02]: plug versus a droid plug,

[00:22:46] [SPEAKER_02]: right? Like who made the decision

[00:22:47] [SPEAKER_02]: and why didn't they just make the

[00:22:49] [SPEAKER_02]: same damn plug?

[00:22:50] [SPEAKER_02]: My house would be so much smoother.

[00:22:53] [SPEAKER_02]: But you know, like it's like big

[00:22:54] [SPEAKER_02]: companies

[00:22:56] [SPEAKER_02]: they'll make decisions on standards

[00:22:58] [SPEAKER_02]: that are their standards

[00:23:00] [SPEAKER_02]: and do you see

[00:23:02] [SPEAKER_02]: that play out?

[00:23:04] [SPEAKER_02]: I mean, obviously the big ones with

[00:23:06] [SPEAKER_02]: Microsoft and Google and Facebook

[00:23:08] [SPEAKER_02]: and pick all your other ones

[00:23:10] [SPEAKER_02]: that you like.

[00:23:11] [SPEAKER_02]: Is it going to become kind of

[00:23:14] [SPEAKER_02]: another game of them creating

[00:23:15] [SPEAKER_02]: standards and adhering

[00:23:18] [SPEAKER_02]: to

[00:23:20] [SPEAKER_02]: the standard of a company

[00:23:22] [SPEAKER_02]: versus a general

[00:23:24] [SPEAKER_02]: standard of, you know, what AI

[00:23:27] [SPEAKER_02]: is, versus what AI

[00:23:29] [SPEAKER_02]: with Microsoft is, AI

[00:23:31] [SPEAKER_02]: with Google is, AI with Amazon

[00:23:33] [SPEAKER_02]: is, etc.

[00:23:34] [SPEAKER_02]: So question is about standards.

[00:23:37] [SPEAKER_00]: I don't have a...

[00:23:39] [SPEAKER_00]: Yeah, there will be some of that.

[00:23:41] [SPEAKER_00]: I will tell you and maybe I'm

[00:23:42] [SPEAKER_00]: glass half full

[00:23:44] [SPEAKER_00]: on this one. But I do believe

[00:23:47] [SPEAKER_00]: there will be a lot more

[00:23:48] [SPEAKER_00]: interoperability.

[00:23:49] [SPEAKER_02]: Well, you are an optimist.

[00:23:50] [SPEAKER_02]: Wow.

[00:23:51] [SPEAKER_02]: I like this.

[00:23:52] [SPEAKER_02]: Wow. All righty, we found one.

[00:23:54] [SPEAKER_02]: Right? We were looking.

[00:23:56] [SPEAKER_01]: I can see them all next week.

[00:23:57] [SPEAKER_01]: They're all around a table.

[00:23:59] [SPEAKER_01]: Like, let's get standards for

[00:24:01] [SPEAKER_01]: everybody.

[00:24:03] [SPEAKER_00]: Maybe not.

[00:24:04] [SPEAKER_00]: Maybe not.

[00:24:05] [SPEAKER_00]: They all want to push their

[00:24:07] [SPEAKER_00]: products. But

[00:24:09] [SPEAKER_00]: however, but I do believe they

[00:24:10] [SPEAKER_00]: also want interoperability

[00:24:12] [SPEAKER_00]: because they also know that one

[00:24:13] [SPEAKER_00]: of them is not going to,

[00:24:16] [SPEAKER_00]: what, win this race, if

[00:24:17] [SPEAKER_00]: you will. OK. So I'll give you a

[00:24:18] [SPEAKER_00]: parallel example, which

[00:24:20] [SPEAKER_00]: hopefully they all learn from.

[00:24:21] [SPEAKER_00]: We have three

[00:24:24] [SPEAKER_00]: main cloud providers.

[00:24:25] [SPEAKER_00]: You have GCP, you have AWS,

[00:24:27] [SPEAKER_00]: you have Azure.

[00:24:29] [SPEAKER_00]: OK, it's not that easy

[00:24:30] [SPEAKER_00]: to move from one to another.

[00:24:33] [SPEAKER_00]: But more and more companies

[00:24:35] [SPEAKER_00]: want to do that from a

[00:24:37] [SPEAKER_00]: DR perspective, BCP perspective,

[00:24:40] [SPEAKER_00]: capabilities perspective.

[00:24:42] [SPEAKER_00]: So to me that increases

[00:24:44] [SPEAKER_00]: the usage of cloud and the cloud

[00:24:46] [SPEAKER_00]: adoption and all those things.

[00:24:47] [SPEAKER_00]: If you actually make that happen,

[00:24:49] [SPEAKER_00]: so it's good business.

[00:24:51] [SPEAKER_00]: So it's not, yeah, yeah, yeah,

[00:24:52] [SPEAKER_00]: out of the goodness of the heart, but it's

[00:24:54] [SPEAKER_00]: good business.

[00:24:55] [SPEAKER_00]: That's how I think about the AI

[00:24:57] [SPEAKER_00]: journey as well, that hopefully

[00:24:59] [SPEAKER_00]: learning from those parallels,

[00:25:01] [SPEAKER_00]: these companies are going to be

[00:25:02] [SPEAKER_00]: more open

[00:25:06] [SPEAKER_00]: to having at least a set of

[00:25:08] [SPEAKER_00]: base standards where they are

[00:25:09] [SPEAKER_00]: interoperable and they're able

[00:25:11] [SPEAKER_00]: to use each other's stuff or allow

[00:25:13] [SPEAKER_00]: clients like us to build things

[00:25:16] [SPEAKER_00]: on top of two providers

[00:25:17] [SPEAKER_00]: or have, you know, a stronger

[00:25:20] [SPEAKER_00]: DR plan.

[00:25:21] [SPEAKER_00]: So that's at least the parallel

[00:25:23] [SPEAKER_00]: I draw in my head when I think

[00:25:24] [SPEAKER_00]: about it.

[00:25:25] [SPEAKER_00]: Of course, they all get very,

[00:25:26] [SPEAKER_00]: very big bucks to make those

[00:25:28] [SPEAKER_00]: decisions.

[00:25:29] [SPEAKER_01]: Yeah, it's like a frictionless

[00:25:30] [SPEAKER_01]: way. We need a frictionless

[00:25:32] [SPEAKER_01]: way to work together.

[00:25:34] [SPEAKER_01]: We'll see.

[00:25:36] [SPEAKER_01]: I think of Amazon

[00:25:37] [SPEAKER_01]: here.

[00:25:37] [SPEAKER_01]: Yeah, go ahead.

[00:25:39] [SPEAKER_01]: I think of Amazon and the return

[00:25:41] [SPEAKER_01]: process, which has changed a

[00:25:43] [SPEAKER_01]: little bit. They have changed a

[00:25:44] [SPEAKER_01]: little bit. You have to click a

[00:25:45] [SPEAKER_01]: few more buttons, but it is so

[00:25:48] [SPEAKER_01]: frictionless and easy.

[00:25:49] [SPEAKER_01]: You don't like it.

[00:25:50] [SPEAKER_01]: Return. Goodbye.

[00:25:52] [SPEAKER_01]: No questions asked.

[00:25:53] [SPEAKER_01]: I don't need to receive anything.

[00:25:54] [SPEAKER_01]: Well, obviously, this being online,

[00:25:55] [SPEAKER_01]: but it's easy

[00:25:57] [SPEAKER_01]: and that makes me go back to

[00:25:59] [SPEAKER_01]: Amazon instead of going to

[00:26:00] [SPEAKER_01]: the store because

[00:26:01] [SPEAKER_02]: You see what I'm,

[00:26:03] [SPEAKER_02]: see what I'm struggling with, Ryan, is

[00:26:05] [SPEAKER_02]: that's centralized.

[00:26:07] [SPEAKER_02]: Amazon can represent a

[00:26:08] [SPEAKER_02]: thousand, sure.

[00:26:09] [SPEAKER_02]: Yeah, but they still have it

[00:26:11] [SPEAKER_02]: centralized.

[00:26:12] [SPEAKER_02]: ChatGPT-4

[00:26:14] [SPEAKER_02]: is going to get to know you

[00:26:15] [SPEAKER_02]: really well.

[00:26:16] [SPEAKER_02]: Yeah, and then all of a sudden

[00:26:18] [SPEAKER_02]: you're going to move over to

[00:26:19] [SPEAKER_02]: Gemini, or you're going to

[00:26:20] [SPEAKER_02]: roll over to another

[00:26:21] [SPEAKER_02]: large language model,

[00:26:23] [SPEAKER_02]: right? And

[00:26:25] [SPEAKER_02]: will you have to,

[00:26:27] [SPEAKER_02]: I'm assuming, start...

[00:26:28] [SPEAKER_02]: will you have to start

[00:26:30] [SPEAKER_02]: over? Yes.

[00:26:31] [SPEAKER_02]: And I'm thinking about the

[00:26:33] [SPEAKER_02]: portability of OK,

[00:26:35] [SPEAKER_02]: using ChatGPT.

[00:26:36] [SPEAKER_02]: And again, these are dumb

[00:26:37] [SPEAKER_02]: examples, but

[00:26:38] [SPEAKER_02]: using that and all of a

[00:26:40] [SPEAKER_02]: sudden

[00:26:41] [SPEAKER_02]: having to use something new

[00:26:42] [SPEAKER_02]: and starting all the way over.

[00:26:44] [SPEAKER_02]: It's like, how...

[00:26:45] [SPEAKER_02]: remember when we went voice to text

[00:26:48] [SPEAKER_02]: and you'd use a voice to text

[00:26:49] [SPEAKER_02]: thing.

[00:26:50] [SPEAKER_02]: And if you used one of them,

[00:26:52] [SPEAKER_02]: like I use Google

[00:26:53] [SPEAKER_02]: and if you used it

[00:26:54] [SPEAKER_02]: and you stuck with it,

[00:26:55] [SPEAKER_02]: it got

[00:26:57] [SPEAKER_02]: OK.

[00:26:58] [SPEAKER_02]: But if you tried a different one,

[00:27:00] [SPEAKER_02]: man, it was horrible.

[00:27:02] [SPEAKER_02]: So that's one of my

[00:27:04] [SPEAKER_02]: worries is we get siloed

[00:27:06] [SPEAKER_02]: into one of these products.

[00:27:08] [SPEAKER_02]: And then now,

[00:27:10] [SPEAKER_02]: now it's a battle of products.

[00:27:14] [SPEAKER_00]: Exactly, exactly.

[00:27:16] [SPEAKER_00]: And that poses another very

[00:27:17] [SPEAKER_00]: interesting question.

[00:27:18] [SPEAKER_00]: I think which we all

[00:27:19] [SPEAKER_00]: wrestle with, with all of these

[00:27:21] [SPEAKER_00]: companies and always have

[00:27:23] [SPEAKER_00]: what right do I have

[00:27:25] [SPEAKER_00]: on my data?

[00:27:26] [SPEAKER_00]: The question of consent,

[00:27:28] [SPEAKER_00]: why should I not be able to

[00:27:30] [SPEAKER_00]: hold it?

[00:27:30] [SPEAKER_00]: I can port my phone number.

[00:27:32] [SPEAKER_00]: So I should be able to port

[00:27:34] [SPEAKER_00]: my data, on which

[00:27:36] [SPEAKER_00]: this model got trained. That would be...

[00:27:38] [SPEAKER_02]: And that's a great company to create.

[00:27:40] [SPEAKER_02]: Seriously, I think we just

[00:27:42] [SPEAKER_02]: created a billion dollar company.

[00:27:43] [SPEAKER_02]: Just that alone is that

[00:27:45] [SPEAKER_01]: I'm going to hit stop recording.

[00:27:48] [SPEAKER_01]: Let's get into the back room.

[00:27:49] [SPEAKER_01]: Never get published.

[00:27:50] [SPEAKER_01]: No.

[00:27:51] [SPEAKER_02]: But just think of that technology,

[00:27:53] [SPEAKER_02]: the portability of being able

[00:27:54] [SPEAKER_02]: to have a suitcase

[00:27:55] [SPEAKER_02]: of that large language model

[00:27:57] [SPEAKER_02]: and taking that

[00:27:58] [SPEAKER_02]: and being able to import that

[00:27:59] [SPEAKER_02]: into a new model

[00:28:01] [SPEAKER_02]: and getting it up to date.

[00:28:04] [SPEAKER_02]: And then you can use a new model.

[00:28:05] [SPEAKER_02]: Yeah, it's fine.

[00:28:07] [SPEAKER_02]: Well, we've seen somebody's

[00:28:08] [SPEAKER_02]: going to create that company.

[00:28:09] [SPEAKER_01]: We've seen that with...

[00:28:11] [SPEAKER_01]: we just had a conversation

[00:28:13] [SPEAKER_01]: recently with Gage.

[00:28:14] [SPEAKER_01]: Same idea.

[00:28:15] [SPEAKER_01]: You get a score

[00:28:16] [SPEAKER_01]: and the candidate can take that

[00:28:18] [SPEAKER_01]: from employer to employer.

[00:28:19] [SPEAKER_01]: Portable.

[00:28:20] [SPEAKER_01]: And so it's your skill set.

[00:28:22] [SPEAKER_01]: It's your references.

[00:28:23] [SPEAKER_01]: It's your performance reviews.

[00:28:24] [SPEAKER_01]: All of that.

[00:28:25] [SPEAKER_01]: Like at some point

[00:28:27] [SPEAKER_01]: will get there.

[00:28:28] [SPEAKER_02]: That'll be easier on the blockchain

[00:28:29] [SPEAKER_02]: because you can have

[00:28:31] [SPEAKER_02]: it third party validated.

[00:28:32] [SPEAKER_02]: Right.

[00:28:34] [SPEAKER_02]: What do you think?

[00:28:35] [SPEAKER_02]: Because we do have

[00:28:37] [SPEAKER_02]: I think there's one

[00:28:38] [SPEAKER_02]: maybe two companies in HR tech

[00:28:40] [SPEAKER_02]: or work tech

[00:28:41] [SPEAKER_02]: that have third-party

[00:28:44] [SPEAKER_02]: audited AI.

[00:28:46] [SPEAKER_02]: Thinking of a company in Australia

[00:28:48] [SPEAKER_02]: called Reejig

[00:28:49] [SPEAKER_02]: and we're talking to the CEO

[00:28:52] [SPEAKER_02]: a while back and she said

[00:28:53] [SPEAKER_02]: when we started,

[00:28:55] [SPEAKER_02]: you know, we didn't know what we didn't know.

[00:28:57] [SPEAKER_02]: So we started down the path

[00:28:58] [SPEAKER_02]: and then we just like,

[00:28:59] [SPEAKER_02]: OK, internally

[00:29:00] [SPEAKER_02]: we're only going to know so much.

[00:29:02] [SPEAKER_02]: So they hired

[00:29:04] [SPEAKER_02]: a university

[00:29:07] [SPEAKER_02]: and they paid them really well

[00:29:09] [SPEAKER_02]: to then just take all their algorithms

[00:29:12] [SPEAKER_02]: and then say, OK,

[00:29:14] [SPEAKER_02]: tell us what needs to change

[00:29:16] [SPEAKER_02]: because we could do it internally

[00:29:18] [SPEAKER_02]: and we do it internally.

[00:29:19] [SPEAKER_02]: However,

[00:29:20] [SPEAKER_02]: having someone

[00:29:21] [SPEAKER_02]: look at what you're trying to do

[00:29:23] [SPEAKER_02]: again in your case

[00:29:25] [SPEAKER_02]: to no harm

[00:29:26] [SPEAKER_02]: does it actually do that?

[00:29:28] [SPEAKER_02]: Have you all given that any thought?

[00:29:30] [SPEAKER_02]: The idea of hiring a consulting firm

[00:29:33] [SPEAKER_02]: or just somebody that does this for a living

[00:29:35] [SPEAKER_02]: to third party audit

[00:29:37] [SPEAKER_02]: your AI?

[00:29:40] [SPEAKER_00]: Absolutely. Absolutely.

[00:29:42] [SPEAKER_00]: We are not,

[00:29:44] [SPEAKER_00]: we're not doing,

[00:29:46] [SPEAKER_00]: right,

[00:29:47] [SPEAKER_00]: AI use cases where we need

[00:29:48] [SPEAKER_02]: Right.

[00:29:49] [SPEAKER_00]: that auditing,

[00:29:50] [SPEAKER_00]: but that will definitely be at the top of

[00:29:54] [SPEAKER_00]: my list, at least,

[00:29:56] [SPEAKER_00]: to put a very strong model governance process

[00:29:58] [SPEAKER_00]: in place.

[00:29:59] [SPEAKER_00]: Right.

[00:30:00] [SPEAKER_00]: And it's not one-time auditing, right?

[00:30:01] [SPEAKER_00]: It's continuous auditing,

[00:30:03] [SPEAKER_00]: just like you do your,

[00:30:04] [SPEAKER_00]: you use,

[00:30:05] [SPEAKER_00]: SOC 2 audits, SOX audits.

[00:30:08] [SPEAKER_00]: It's,

[00:30:09] [SPEAKER_00]: it should be, in my mind,

[00:30:11] [SPEAKER_00]: it should become a similar rigor

[00:30:13] [SPEAKER_00]: when it comes to the AI models as well

[00:30:15] [SPEAKER_00]: where somebody can

[00:30:17] [SPEAKER_00]: certify those things for you

[00:30:19] [SPEAKER_00]: on an ongoing basis.

[00:30:19] [SPEAKER_00]: And yes, you have to give them data

[00:30:21] [SPEAKER_00]: and you have to

[00:30:21] [SPEAKER_00]: do all the work.

[00:30:23] [SPEAKER_00]: It adds extra work

[00:30:25] [SPEAKER_00]: but given the challenges we have

[00:30:28] [SPEAKER_00]: with AI and Gen AI in general

[00:30:31] [SPEAKER_00]: about the explainability

[00:30:34] [SPEAKER_00]: of the

[00:30:35] [SPEAKER_00]: answers which are being given,

[00:30:38] [SPEAKER_00]: or how you're using it

[00:30:39] [SPEAKER_00]: in a narrow set,

[00:30:41] [SPEAKER_00]: there is absolutely a

[00:30:43] [SPEAKER_00]: case for a third party to come in and audit.
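
[Editorial sketch: one concrete check a recurring third-party audit might run is the "four-fifths" disparate impact ratio commonly used in hiring contexts. The data and the 0.8 threshold below are illustrative; a real SOC 2-style audit regime for models would cover accuracy, drift, and much more.]

```python
# Compare selection rates the model produces for two groups; flag for human
# review when the ratio falls below the conventional four-fifths threshold.
def selection_rate(outcomes):
    return sum(outcomes) / len(outcomes)

def impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher one (1.0 = parity)."""
    ra, rb = selection_rate(group_a), selection_rate(group_b)
    return min(ra, rb) / max(ra, rb)

ratio = impact_ratio(
    group_a=[True, True, False, True],    # model pass/fail flags, group A
    group_b=[True, False, False, False],  # model pass/fail flags, group B
)
print(f"impact ratio: {ratio:.2f}, flag for review: {ratio < 0.8}")
```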

[00:30:46] [SPEAKER_01]: Indeed. We've got,

[00:30:47] [SPEAKER_01]: we've gotten questions recently

[00:30:49] [SPEAKER_01]: around technology's role

[00:30:53] [SPEAKER_01]: in

[00:30:54] [SPEAKER_01]: how a company acts.

[00:30:57] [SPEAKER_01]: I'm thinking specifically around talent acquisition.

[00:31:00] [SPEAKER_01]: And so maybe the software

[00:31:03] [SPEAKER_01]: is leveraging AI to

[00:31:06] [SPEAKER_01]: create rankings or recommendations.

[00:31:09] [SPEAKER_01]: And the company goes forward

[00:31:11] [SPEAKER_01]: and they make a decision

[00:31:12] [SPEAKER_01]: based off of that recommendation, right?

[00:31:15] [SPEAKER_01]: I know you know where this is going.

[00:31:16] [SPEAKER_01]: So the talent leaders,

[00:31:17] [SPEAKER_01]: their question to us has been

[00:31:20] [SPEAKER_01]: I'm trusting the software.

[00:31:22] [SPEAKER_01]: What if that data is bad?

[00:31:25] [SPEAKER_01]: How do I then cover myself

[00:31:28] [SPEAKER_01]: and is my company liable?

[00:31:30] [SPEAKER_01]: What's your kind of take there?

[00:31:35] [SPEAKER_00]: Yep, the company's absolutely liable.

[00:31:37] [SPEAKER_00]: If you're using a software,

[00:31:38] [SPEAKER_00]: if you're using a third-party software,

[00:31:41] [SPEAKER_00]: that third-party risk management,

[00:31:43] [SPEAKER_00]: and now even more so

[00:31:45] [SPEAKER_00]: with the lens of what it's going to provide you,

[00:31:47] [SPEAKER_00]: is your responsibility

[00:31:50] [SPEAKER_00]: to make sure.

[00:31:52] [SPEAKER_00]: So I do think that

[00:31:53] [SPEAKER_00]: we, if you're going to use it for hiring,

[00:31:56] [SPEAKER_00]: if it's Workday or whatever,

[00:31:57] [SPEAKER_00]: we all know the story which happened there

[00:32:00] [SPEAKER_00]: we are liable if you're going to use it.

[00:32:03] [SPEAKER_00]: And honestly I personally

[00:32:07] [SPEAKER_00]: I believe there are a lot more use cases

[00:32:10] [SPEAKER_00]: for AI which are

[00:32:14] [SPEAKER_00]: quote unquote less risky

[00:32:18] [SPEAKER_00]: to bring into organizations

[00:32:19] [SPEAKER_00]: than using it to rank and stack

[00:32:23] [SPEAKER_00]: resumes and make hiring decisions.

[00:32:25] [SPEAKER_00]: I personally would not feel comfortable with that

[00:32:27] [SPEAKER_00]: because I don't believe that

[00:32:30] [SPEAKER_00]: the models are quite there yet

[00:32:32] [SPEAKER_00]: where they can pick up the intangibles

[00:32:34] [SPEAKER_00]: and that they're trained on historically biased data.

[00:32:37] [SPEAKER_03]: That's right.

[00:32:38] [SPEAKER_00]: That is a big problem.

[00:32:39] [SPEAKER_00]: So to me that use case

[00:32:42] [SPEAKER_00]: you have to be very very careful.

[00:32:45] [SPEAKER_00]: Maybe it's a narrow use case

[00:32:46] [SPEAKER_00]: where you're hiring just interns

[00:32:47] [SPEAKER_00]: or you're hiring gig workers

[00:32:50] [SPEAKER_00]: who have a very narrow set of skill sets

[00:32:53] [SPEAKER_00]: but not in a wider sense

[00:32:57] [SPEAKER_00]: without a human in the loop.

[00:33:00] [SPEAKER_01]: So our best case is still to train a human

[00:33:03] [SPEAKER_01]: to remove bias.

[00:33:05] [SPEAKER_02]: Well I think some of the cases that we've seen Ryan

[00:33:08] [SPEAKER_02]: is, like, say, in

[00:33:10] [SPEAKER_02]: Sterling's case, let's say it's monitoring,

[00:33:12] [SPEAKER_02]: so it's not a screen.

[00:33:14] [SPEAKER_02]: It's a computer monitoring employees.

[00:33:17] [SPEAKER_02]: AI can do a whole hell of a lot to monitor

[00:33:21] [SPEAKER_02]: but then kick it to an employee

[00:33:24] [SPEAKER_02]: and say this has been flagged.

[00:33:27] [SPEAKER_02]: Of course the AI is not saying anything.

[00:33:29] [SPEAKER_02]: This has been flagged

[00:33:30] [SPEAKER_02]: and then the employee can then do the investigation

[00:33:33] [SPEAKER_02]: and make sure that that is true not true

[00:33:35] [SPEAKER_02]: etc something that actually needs to be flagged or not

[00:33:38] [SPEAKER_02]: but I think the models right now

[00:33:40] [SPEAKER_02]: that we're working with are more augmentation,

[00:33:42] [SPEAKER_02]: where AI goes and does a bunch of things

[00:33:45] [SPEAKER_02]: and doesn't make a decision.

[00:33:47] [SPEAKER_02]: It just gives information to humans

[00:33:51] [SPEAKER_02]: so that they make decisions.

[00:33:52] [SPEAKER_02]: Make the decision.
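
[Editorial sketch: the augmentation pattern William describes, where AI only flags and a person investigates and decides, reduces to a simple shape in code. The field names and score threshold below are hypothetical.]

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Flag:
    subject: str
    reason: str
    source: str                       # where the signal came from, for verification
    confirmed: Optional[bool] = None  # set only by a human reviewer

def ai_monitor(events):
    """Model pass: surface items for review above a threshold; never decide."""
    return [Flag(e["who"], e["why"], e["source"]) for e in events if e["score"] > 0.9]

def human_review(flag, is_true):
    flag.confirmed = is_true          # the decision stays with the person
    return flag

flags = ai_monitor([{"who": "case-17", "why": "policy match",
                     "source": "doc 42", "score": 0.95}])
print(human_review(flags[0], is_true=False))
```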

[00:33:54] [SPEAKER_01]: So question I have then that comes up

[00:33:56] [SPEAKER_01]: and this is I guess for both of you

[00:33:57] [SPEAKER_01]: but if you really...

[00:34:01] [SPEAKER_01]: A lot of people say, and you say it this way,

[00:34:03] [SPEAKER_01]: you say trust but verify.

[00:34:05] [SPEAKER_01]: Right?

[00:34:06] [SPEAKER_01]: So don't spit your water.

[00:34:08] [SPEAKER_01]: Trust but verify.

[00:34:08] [SPEAKER_02]: Reagan actually said that.

[00:34:10] [SPEAKER_02]: Well yes, yes, yes.

[00:34:12] [SPEAKER_01]: So he did, yes, you're correct.

[00:34:14] [SPEAKER_01]: So okay so scenario

[00:34:16] [SPEAKER_01]: and I'm curious to get your take here.

[00:34:19] [SPEAKER_01]: I'm using an AI.

[00:34:21] [SPEAKER_01]: I'm using a platform to do something, right? Maybe

[00:34:24] [SPEAKER_01]: it's giving me... it's not ranking a candidate

[00:34:27] [SPEAKER_01]: or anything like that

[00:34:28] [SPEAKER_01]: but it's giving me information.

[00:34:29] [SPEAKER_01]: It says, here, Ryan, now take this

[00:34:32] [SPEAKER_01]: and you go make your decision.

[00:34:36] [SPEAKER_01]: So now I'm taking that information.

[00:34:38] [SPEAKER_01]: I'm trusting it because my company said yes

[00:34:42] [SPEAKER_01]: this is the software that we trust.

[00:34:44] [SPEAKER_01]: Now I need to go verify.

[00:34:46] [SPEAKER_01]: How do I verify that?

[00:34:48] [SPEAKER_01]: Am I going to another AI to verify it?

[00:34:50] [SPEAKER_01]: Am I going to Google to verify that?

[00:34:53] [SPEAKER_01]: Again, I know it depends on what information

[00:34:55] [SPEAKER_01]: but at the fingertips of a recruiter

[00:34:58] [SPEAKER_01]: or an HR manager or a talent director

[00:35:01] [SPEAKER_01]: how do they verify this properly?

[00:35:06] [SPEAKER_00]: Verify what though

[00:35:07] [SPEAKER_00]: because it's very contextual Ryan.

[00:35:11] [SPEAKER_00]: What are we going to do?

[00:35:12] [SPEAKER_00]: In the case of a background check

[00:35:13] [SPEAKER_02]: it could be a criminal conviction.

[00:35:17] [SPEAKER_02]: So something like that that gets flagged

[00:35:20] [SPEAKER_02]: right, then that person can verify.

[00:35:24] [SPEAKER_02]: first of all it should have sources

[00:35:25] [SPEAKER_02]: of where it found that

[00:35:27] [SPEAKER_02]: to where you could go back

[00:35:28] [SPEAKER_02]: and see the source material

[00:35:30] [SPEAKER_02]: to then say okay it got that wrong

[00:35:32] [SPEAKER_02]: or it got that right.

[00:35:34] [SPEAKER_02]: So a person was convicted

[00:35:35] [SPEAKER_02]: but then the case was thrown out on a technicality.

[00:35:39] [SPEAKER_02]: Go to your local municipality, whatever it is,

[00:35:41] [SPEAKER_02]: right, whatever the source material is.
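One way to make trust-but-verify concrete, sketched under the assumption that every model finding is stored with references to the records it came from; the Finding shape and the county-docket source format are hypothetical, not a real background-check schema.

```python
# Hedged sketch: attach source references to every finding so a reviewer
# can trace it back to the original record. Finding is an illustrative
# assumption, not any real product's schema.
from dataclasses import dataclass

@dataclass
class Finding:
    subject: str
    claim: str          # e.g. "criminal conviction"
    sources: list[str]  # docket IDs or URLs where the claim was found
    verified: bool = False

def verify(finding: Finding, check_source) -> Finding:
    """Mark verified only if at least one source exists and all still support the claim."""
    finding.verified = bool(finding.sources) and all(
        check_source(src, finding.claim) for src in finding.sources
    )
    return finding

# Usage: a real checker would consult the actual court record; stubbed here.
stub_checker = lambda src, claim: src.startswith("county-docket:")
f = Finding("candidate-42", "conviction, later vacated",
            sources=["county-docket:2017-00123"])
print(verify(f, stub_checker).verified)  # True
```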

[00:35:44] [SPEAKER_00]: Right so the multiple layers are needed

[00:35:46] [SPEAKER_00]: and that's what we do.

[00:35:48] [SPEAKER_00]: So we don't just rely on one source.

[00:35:49] [SPEAKER_00]: In the entire process chain that we have,

[00:35:54] [SPEAKER_00]: whether we use AI or not,

[00:35:55] [SPEAKER_00]: you get something from one place

[00:35:58] [SPEAKER_00]: but then you go to the actual source,

[00:35:59] [SPEAKER_00]: because this is a real case:

[00:36:01] [SPEAKER_00]: there could be a conviction in one court

[00:36:04] [SPEAKER_00]: but then it was thrown out seven years later

[00:36:06] [SPEAKER_00]: and all those kind of cases

[00:36:08] [SPEAKER_00]: come up with or without AI

[00:36:10] [SPEAKER_00]: so you have to go to the source

[00:36:11] [SPEAKER_00]: to do that verification

[00:36:12] [SPEAKER_00]: whether it's manual

[00:36:14] [SPEAKER_00]: whether it's automated

[00:36:16] [SPEAKER_00]: What AI really helps you do in those cases is that

[00:36:20] [SPEAKER_00]: you don't have to read

[00:36:22] [SPEAKER_00]: 100 documents.

[00:36:24] [SPEAKER_00]: Actually, you can make more mistakes

[00:36:27] [SPEAKER_00]: reading 100 documents

[00:36:28] [SPEAKER_00]: versus giving it to a machine to say

[00:36:31] [SPEAKER_00]: could you pull this out for me

[00:36:32] [SPEAKER_00]: and find this information

[00:36:35] [SPEAKER_00]: so that I actually don't miss information.

[00:36:37] [SPEAKER_00]: That's where,

[00:36:38] [SPEAKER_00]: and that is something we have used at Sterling,

[00:36:41] [SPEAKER_00]: the power of reading from documents

[00:36:44] [SPEAKER_00]: and summarizing them

[00:36:46] [SPEAKER_00]: and actually pulling all the relevant data

[00:36:48] [SPEAKER_00]: versus training humans to go through

[00:36:52] [SPEAKER_00]: 20 court documents or 30 court documents.

[00:36:56] [SPEAKER_00]: Yeah so it can be used for that power of good

[00:37:00] [SPEAKER_00]: if you will in this case

[00:37:01] [SPEAKER_00]: versus just relying on our brains

[00:37:02] [SPEAKER_00]: to read a ton of information.
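As a rough illustration of the hundred-documents point: a pass that surfaces only the relevant passages before anyone reads a page. The keyword scoring below is a stand-in assumption for whatever extraction or summarization model is actually used.

```python
# Hedged sketch: pull the few relevant snippets out of many documents so a
# human reviews pages, not piles. The keyword scorer is a stand-in for a
# real extraction/summarization model.
def relevant_passages(documents: dict[str, str], terms: list[str], top_n: int = 5):
    """Score each document by query-term hits and return the best snippets."""
    scored = []
    for doc_id, text in documents.items():
        lowered = text.lower()
        hits = sum(lowered.count(term.lower()) for term in terms)
        if hits:
            scored.append((hits, doc_id, text[:200]))  # short snippet for review
    scored.sort(reverse=True)  # highest-hit documents first
    return scored[:top_n]

# Usage with two toy court records.
docs = {
    "docket-001": "Conviction entered 2015; vacated on appeal 2022.",
    "docket-002": "Traffic citation, dismissed.",
}
for hits, doc_id, snippet in relevant_passages(docs, ["conviction", "vacated"]):
    print(doc_id, "->", snippet)
```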

[00:37:04] [SPEAKER_02]: Are you building your own LLM for Sterling?

[00:37:08] [SPEAKER_02]: You're using some, okay.

[00:37:09] [SPEAKER_02]: I don't want to get into proprietary stuff

[00:37:11] [SPEAKER_02]: I just was curious about that.

[00:37:14] [SPEAKER_02]: The question I have is

[00:37:17] [SPEAKER_02]: what's keeping you up at night

[00:37:19] [SPEAKER_02]: as it relates to AI?

[00:37:22] [SPEAKER_02]: I'm sure you sleep well so.

[00:37:26] [SPEAKER_00]: What keeps me up at night is

[00:37:29] [SPEAKER_00]: AI attacks,

[00:37:32] [SPEAKER_00]: AI security attacks

[00:37:33] [SPEAKER_00]: because I believe

[00:37:35] [SPEAKER_00]: the frequency of security attacks

[00:37:37] [SPEAKER_00]: is going to increase

[00:37:39] [SPEAKER_00]: it's already increasing

[00:37:42] [SPEAKER_00]: because of the AI

[00:37:45] [SPEAKER_00]: capabilities and

[00:37:47] [SPEAKER_00]: whether it's a phishing scam

[00:37:48] [SPEAKER_00]: or whether it's

[00:37:49] [SPEAKER_00]: the ability of AI to

[00:37:53] [SPEAKER_00]: run scripts

[00:37:54] [SPEAKER_00]: you know all different kind of things

[00:37:56] [SPEAKER_00]: which can happen.

[00:37:57] [SPEAKER_00]: That's what keeps me up

[00:37:59] [SPEAKER_00]: at night

[00:38:00] [SPEAKER_00]: and I know

[00:38:01] [SPEAKER_00]: the security tooling is trying to catch up

[00:38:04] [SPEAKER_00]: with it

[00:38:05] [SPEAKER_00]: but the

[00:38:07] [SPEAKER_00]: hacker community is moving very, very fast

[00:38:10] [SPEAKER_00]: and exploiting

[00:38:11] [SPEAKER_00]: yeah that's what keeps me up at night.

[00:38:13] [SPEAKER_02]: If I were in your position

[00:38:15] [SPEAKER_02]: that's exactly what would keep me up at night

[00:38:17] [SPEAKER_02]: is privacy and security.

[00:38:19] [SPEAKER_02]: Just how do I keep people safe?

[00:38:22] [SPEAKER_02]: Getting back to that do no harm,

[00:38:24] [SPEAKER_02]: that's both your clients

[00:38:25] [SPEAKER_02]: as well as Sterling proper.

[00:38:27] [SPEAKER_02]: Yep I got it.

[00:38:28] [SPEAKER_02]: Ryan?

[00:38:30] [SPEAKER_01]: You know as you're saying that

[00:38:33] [SPEAKER_01]: I was thinking this is exactly the reason

[00:38:35] [SPEAKER_01]: other than I'm just

[00:38:36] [SPEAKER_01]: not smart enough to do what you do

[00:38:38] [SPEAKER_01]: 100%

[00:38:39] [SPEAKER_01]: not even close

[00:38:40] [SPEAKER_01]: but that is a reason why

[00:38:42] [SPEAKER_01]: I would never be able to do

[00:38:44] [SPEAKER_01]: what you do because I just

[00:38:46] [SPEAKER_01]: I don't think corporate,

[00:38:48] [SPEAKER_01]: or say software,

[00:38:49] [SPEAKER_01]: however you want, platforms

[00:38:50] [SPEAKER_01]: are ever going to outpace the hackers.

[00:38:53] [SPEAKER_01]: It is a constant battle, it's never going to

[00:38:57] [SPEAKER_02]: Bad actors, because there's no rules.

[00:39:00] [SPEAKER_02]: Bad actors can innovate faster.

[00:39:01] [SPEAKER_01]: Yeah and it's not a question of

[00:39:03] [SPEAKER_01]: why can't you think ahead of them

[00:39:06] [SPEAKER_01]: it's, I mean, why not go hire them,

[00:39:08] [SPEAKER_01]: well, because then they're already in your system,

[00:39:10] [SPEAKER_01]: right like I mean there's a lot of

[00:39:12] [SPEAKER_01]: like that

[00:39:13] [SPEAKER_01]: that not just would keep me up

[00:39:15] [SPEAKER_01]: but would prevent me from ever holding

[00:39:17] [SPEAKER_01]: your job so

[00:39:19] [SPEAKER_01]: I give you a lot of kudos

[00:39:21] [SPEAKER_01]: for that so there's no question in that it was just

[00:39:24] [SPEAKER_00]: thank you thank you

[00:39:25] [SPEAKER_01]: it was just an observation of

[00:39:28] [SPEAKER_01]: you've confirmed that I would never have your role.

[00:39:31] [SPEAKER_01]: What's

[00:39:33] [SPEAKER_00]: I will never be able to do what you're doing.

[00:39:36] [SPEAKER_02]: There you go I feel a little more important.

[00:39:37] [SPEAKER_02]: Last thing for me is

[00:39:39] [SPEAKER_02]: okay so we talked about the dark side

[00:39:40] [SPEAKER_02]: let's talk about the light side

[00:39:41] [SPEAKER_02]: what excites you

[00:39:45] [SPEAKER_02]: just immediately,

[00:39:47] [SPEAKER_02]: just right around the corner, about AI,

[00:39:48] [SPEAKER_02]: like what's this year

[00:39:51] [SPEAKER_02]: that you're just giddy about,

[00:39:53] [SPEAKER_02]: this phase that we're in?

[00:40:00] [SPEAKER_00]: you know, one thing which I am really hopeful

[00:40:03] [SPEAKER_00]: can happen,

[00:40:05] [SPEAKER_00]: in both a personal and professional capacity, is

[00:40:10] [SPEAKER_00]: if I don't have to be on the phone

[00:40:12] [SPEAKER_00]: with a customer service rep for 40 minutes

[00:40:15] [SPEAKER_00]: let's say in a healthcare setting, or

[00:40:18] [SPEAKER_00]: if I think of Sterling proper as a candidate

[00:40:21] [SPEAKER_00]: I don't have to be on the phone

[00:40:23] [SPEAKER_00]: and I have

[00:40:25] [SPEAKER_00]: I don't know an AI clone

[00:40:26] [SPEAKER_00]: on either side or both sides

[00:40:29] [SPEAKER_00]: who can do that question answer

[00:40:31] [SPEAKER_00]: and then bring me in at the right time

[00:40:35] [SPEAKER_00]: the amount of time it will save

[00:40:36] [SPEAKER_00]: the frustration it will take away,

[00:40:40] [SPEAKER_00]: in just

[00:40:41] [SPEAKER_00]: personal and professional, both sides of life,

[00:40:44] [SPEAKER_00]: wouldn't it be amazing

[00:40:45] [SPEAKER_01]: you know when you're in queue

[00:40:47] [SPEAKER_01]: and it says

[00:40:48] [SPEAKER_01]: if you want to receive a call back

[00:40:51] [SPEAKER_01]: yes I do and then the phone rings

[00:40:53] [SPEAKER_01]: and I get back on the phone and it says

[00:40:55] [SPEAKER_01]: please wait as we find an operator

[00:40:57] [SPEAKER_01]: and just 10 minutes later

[00:40:58] [SPEAKER_01]: wouldn't it be great

[00:40:59] [SPEAKER_01]: for them to get the same message from you

[00:41:01] [SPEAKER_01]: like they're asking your clone

[00:41:03] [SPEAKER_01]: and then your clone says

[00:41:05] [SPEAKER_01]: please hold for Ivneet,

[00:41:06] [SPEAKER_01]: she'll be with you momentarily

[00:41:08] [SPEAKER_01]: to answer that question

[00:41:09] [SPEAKER_01]: and 10 minutes later you get on the call

[00:41:11] [SPEAKER_01]: like that would be amazing
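In code terms, a minimal sketch of the "AI clone" hand-off Ryan is describing: a bot fields routine questions and brings the human in at the right moment. The FAQ table and the escalation rule are illustrative assumptions, not a real Sterling system.

```python
# Hedged sketch of the "AI clone" hand-off idea: a bot answers routine
# questions and escalates to a human when it can't. The FAQ table and
# escalation rule are illustrative assumptions, not a real product.
FAQ = {
    "status": "Your background check is in progress; most complete in 2-3 days.",
    "documents": "You can upload documents through the candidate portal.",
}

def clone_reply(question: str) -> tuple[str, bool]:
    """Return (answer, escalate); escalate=True means queue a human callback."""
    q = question.lower()
    for topic, answer in FAQ.items():
        if topic in q:
            return answer, False
    # Anything the clone can't answer gets routed to a person.
    return "Let me bring in a teammate who can help with that.", True

# Usage: one routine question, one that needs a human.
for q in ["What's my status?", "Why was my record flagged?"]:
    answer, escalate = clone_reply(q)
    print(q, "->", answer, "| human callback:", escalate)
```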

[00:41:12] [SPEAKER_02]: I think syncing all that, I love that,

[00:41:14] [SPEAKER_02]: but the optimism,

[00:41:16] [SPEAKER_02]: because that's such an inefficient

[00:41:19] [SPEAKER_02]: process

[00:41:20] [SPEAKER_02]: and it sucks up our time

[00:41:22] [SPEAKER_02]: Ryan chats with a lot of the technologies

[00:41:25] [SPEAKER_02]: that we use for the show

[00:41:26] [SPEAKER_02]: and it's

[00:41:28] [SPEAKER_01]: but those are humans

[00:41:29] [SPEAKER_01]: they're actual humans

[00:41:30] [SPEAKER_01]: they're just bad at their job

[00:41:32] [SPEAKER_01]: I'd rather have an AI at this point

[00:41:35] [SPEAKER_01]: there are some where the AI

[00:41:37] [SPEAKER_01]: is actually better than the humans

[00:41:38] [SPEAKER_01]: all of the platforms that we use

[00:41:40] [SPEAKER_02]: yes

[00:41:42] [SPEAKER_02]: but

[00:41:42] [SPEAKER_02]: you know again it's like

[00:41:45] [SPEAKER_02]: they'll respond in email

[00:41:46] [SPEAKER_02]: this is how inefficient it is

[00:41:48] [SPEAKER_02]: he'll start a chat

[00:41:50] [SPEAKER_02]: you do this too I'm sure

[00:41:51] [SPEAKER_02]: you'll start a chat

[00:41:52] [SPEAKER_02]: and all of a sudden he's like,

[00:41:54] [SPEAKER_02]: okay, enough of this,

[00:41:54] [SPEAKER_02]: I've got to get

[00:41:55] [SPEAKER_02]: I've got to move on to the next thing

[00:41:56] [SPEAKER_02]: and then they'll send an email and say

[00:41:58] [SPEAKER_02]: it looks like

[00:41:59] [SPEAKER_02]: you left

[00:42:02] [SPEAKER_02]: and the chat's over

[00:42:03] [SPEAKER_02]: it's like

[00:42:04] [SPEAKER_02]: I left because no one was responding

[00:42:06] [SPEAKER_02]: any way

[00:42:07] [SPEAKER_02]: this has been so educational

[00:42:10] [SPEAKER_02]: thank you so much for carving out time

[00:42:12] [SPEAKER_02]: we can't even imagine how busy you are

[00:42:14] [SPEAKER_02]: but thanks for coming on the show

[00:42:15] [SPEAKER_02]: breaking things down for us