Mark Stelzner is the Founder and Managing Principal of IA, an HR consulting firm. In this episode, Mark discusses what people aren’t talking about when it comes to AI; why organizations should let their employees safely explore AI in the workplace; and how basic AI guidelines and data governance can be transformational.
This conversation took place at the HR Tech 2024 conference in Las Vegas.
[0:00] Introduction
- Welcome, Mark!
- Today’s Topic: Practical AI Integration in the Workplace
[2:47] What are we missing when it comes to AI?
- Insights on client readiness and typical gaps in AI understanding
- Discussion on who should own AI responsibilities in an organization
[11:41] How can organizations safely explore AI?
- Why employees need a safe playground to experiment with AI
- The capabilities AI can unlock for employees
[22:49] What role does data governance play in an organization’s AI rollout?
- Differentiating AI use within and outside the workplace
- Emphasizing intentional exploration, clear guidelines, and ethical AI use
[32:26] Closing
- Thanks for listening!
Quick Quote
“If an organization has not invested in content that is accurate [and refreshed], an AI model will hallucinate . . . because the large language model is sourcing from content which is not truly authoritative.”
Resource:
IA HR
Contact:
Mark's LinkedIn
David's LinkedIn
Dwight's LinkedIn
Podcast Manager: Karissa Harris
Email us!
Production by Affogato Media
To schedule a meeting with us: https://salary.com/hrdlconsulting
For more HR Data Labs®, enjoy the HR Data Labs Brown Bag Lunch Hours every Friday at 2:00PM-2:30PM EST. Check it out here: https://hrdatalabs.com/brown-bag-lunch/
Powered by the WRKdefined Podcast Network.
[00:00:00] The world of business is more complex than ever. The world of human resources and compensation is also getting more complex. Welcome to the HR Data Labs podcast, your direct source for the latest trends from experts inside and outside the world of human resources. Listen as we explore the impact that compensation strategy, data, and people analytics can have on your organization.
[00:00:25] This podcast is sponsored by salary.com, your source for data, technology, and consulting for compensation and beyond. Now, here are your hosts, David Turetsky and Dwight Brown.
[00:00:38] Hello, and welcome to the HR Data Labs podcast. I'm your host, David Turetsky. We are live recording from the HR Technology Show here in Las Vegas, Nevada at the beautiful Mandalay Bay Exposition Center. Today, I have the absolute pleasure again to talk to
[00:00:55] Mark Stelzner of AI. Sorry, IA.
[00:01:00] Well, that was prescient.
[00:01:01] Yeah.
[00:01:02] Exactly. Well, really, I mean, could you rebrand and just change the letters around?
[00:01:07] I'm intelligently artificial, so I'll own that brand, David.
[00:01:11] There you go. Hey, that's great.
[00:01:12] Good to see you, my friend. How you doing?
[00:01:13] I'm okay. How are you?
[00:01:14] I'm well, I'm well.
[00:01:15] You look well.
[00:01:15] Thank you. Well, you know, looks can deceive us sometimes, so.
[00:01:19] Well, hopefully, hopefully it does. Everything's okay.
[00:01:22] Thank you. I appreciate it.
[00:01:23] So, Mark, you know how this works. What's one fun thing that no one knows about you?
[00:01:28] Boy, what's one fun thing?
[00:01:31] I have torn both my rotator cuffs in the last two years, and actually, maybe a more fun story than my klutziness. Well, this is klutziness. I broke my toe six months ago moving a Peloton.
[00:01:44] Oh, no.
[00:01:45] So, little known fact, if you've ever had a Peloton or used one, there's little tiny weights that are hidden under the seat.
[00:01:51] Oh, no.
[00:01:51] And when you kick the Peloton up to a 70-degree angle to get the little janky wheels to start to move, those weights have a gravitational pull.
[00:01:59] Oh, my God.
[00:02:00] And a well-placed toe can catch those weights and break into pieces.
[00:02:05] So, I'm just getting more klutzy.
[00:02:08] I need to be covered in bubble wrap. I'm not sure what's going on with me right now.
[00:02:12] Well, you should go play hockey, and then you have all the equipment on.
[00:02:14] I should wear that all the time.
[00:02:16] Yeah.
[00:02:16] But trying to thrive and survive despite my own self-interest.
[00:02:20] Well, as we get older, I find that I'm doing klutzy things as well.
[00:02:25] And it's, you know, I used to be really coordinated. I guess I'm not anymore.
[00:02:29] I used to think I was, and now I'm questioning my own memory.
[00:02:32] But, hey, I'm trying to stay upright and glad to be back in action.
[00:02:36] Well, and glad that you're here at the HR Technology Show.
[00:02:39] Thank you.
[00:03:45] And I think the cynicism that's starting to grow is how do we apply these tools to real use cases, for real people, for real value, and in a way where it can be trusted.
[00:03:58] Right.
[00:03:58] And it can learn, and it can grow, and it can be trusted.
[00:04:02] But part of how HR is organized is the fact that really there's nobody that owns AI, as it were, as a capability. Our function owns end-to-end processes. So part of what we spend a lot of time on is how do you deconstruct and reconstruct journeys? How do we apply it to a multitude of personas? And our temptation in our industry right now is to talk about employees, but what about candidates? What about pre-hire? What about onboarding? What about family members, however that's defined? What about alumni? And everybody moves through that journey, and has moved through that journey, multiple times in their career. And then we apply the cyclical, or point-in-time, or moments-that-matter processes, however you define it. So where does AI thread in? And one of the biggest issues, one of the biggest opportunities we see, for example, is what I'll call authoritative content. If an organization has not invested in content that is accurate, content that is refreshed, content that is in the language that one needs to consume, an AI model will hallucinate, as AI people like to say, and it will infer the wrong information or present you with the wrong information, because the large language model is sourcing from content which is not truly authoritative.
[00:05:14] And we work with really large, complex global organizations, and part of the problem is nobody really owns content either.
[00:05:23] Well, I mean, there are pieces of content that Learning owns, there's a piece that OD owns, there's a piece that Comp owns, but those are disparate.
[00:05:31] Yes, and so what's the repository? Is it even possible to imagine a repository where content would live? Do we have enough time and protection to update our policies and our programs? And I'd say even the marketing and sales pitch associated with performance and merit. The answer is typically no.
[00:05:52] So even an HRMS doesn't usually hold documentation of process. It gets instantiated and implemented with process.
[00:06:02] Exactly, exactly. So if we could solve authoritative content, then we can put some of these AI tools against it. But then we have this notion of permissioning and consent.
[00:06:13] Yeah.
[00:06:13] And a lot of the debates in the organizations we work with is, is AI about opting in or is it about opting out?
[00:06:20] Does that matter where they are? Is it California versus London versus Paris?
[00:06:25] I think it could. I think based on privacy laws, GDPR, of course, any of the protections that are here in the States and hyper-localized around the world, it should matter. But I think philosophically, organizations haven't aligned on where do we believe we know better than you? Rightfully so, perhaps, that you're not taking advantage of these amazing programs, or you're not consuming the information in the way that we would expect. And we want to use AI to push information. We want to use AI to activate other modalities. But how do you decide which modality you prefer?
[00:06:55] Right.
[00:06:55] I might like text right now, but depending on the use case, I still might need to talk to a live person.
[00:07:00] Right.
[00:07:01] I might want to open a case through the case management tool. I might want to have a live chat, or I might want to just engage with a virtual agent.
[00:07:08] Right.
[00:07:08] But it depends on the content and the context in which I'm actually leveraging the tool in my employee journey.
[00:07:13] Can I ask you a question that may foundationally change what you just said?
[00:07:17] Sure.
[00:07:17] You had mentioned who the owner is of all of this. Shouldn't the owner really be IT, since this is a technology play?
[00:07:24] It's a really good question. I think IT should own the policies associated with AI.
[00:07:32] Okay.
[00:07:33] So we work with Microsoft. Microsoft has the Office of Responsible AI.
[00:07:38] Sure.
[00:07:39] ORA, as they refer to it, right, is not only for their first-party tools, but it's also for the ingestion of Copilot, in this instance, for their own employee journeys. Without defined parameters through which AI can be employed, it's a little bit of the Wild West. So we need our technology partners to establish governing principles and policies and practices that give us the guardrails through which we can then activate.
[00:08:03] Right.
[00:08:03] But that in and of itself is evolving so quickly that they, they being our technology partners, need to have dedicated resources that wake up every day thinking about this on behalf of all the functions.
[00:08:15] We have a client we work with who, when they sent us our contract, it had AI in the contract: you are not allowed to use anything that you learned from us for the purposes of a model.
[00:08:27] Well, guess what the project was about? AI. Training models.
[00:08:30] Yeah, yeah. So yeah, the irony here, the cognitive dissonance, isn't lost on me.
[00:08:35] But it's happening so fast. We all have various belief systems around the value proposition.
[00:08:42] Right.
[00:08:42] It is being pushed into our inboxes and our brains through every medium that one can imagine. And it does have a ton of practical value.
[00:08:51] Right.
[00:08:52] But organizations need to pause for a moment and get themselves ready to ingest, deploy, learn; ingest, deploy, learn. And that's one of the barriers I'm seeing at this moment, yeah.
[00:09:02] But that requires what you were talking about before, like ORA. That requires an organizational response that doesn't take into consideration that we're all consumers.
[00:09:13] That's right.
[00:09:13] And we all hear the hype cycle about GPT-4o and Copilot and Gemini and everything else.
[00:09:19] That's right, yeah.
[00:09:19] And there are a lot of people going rogue, quote unquote, and doing it on their own.
[00:09:23] That's right.
[00:09:24] Even asking questions that may actually cause risk management to either be furious or to ask, do you know what you're doing?
[00:09:33] A hundred percent. A hundred percent.
[00:09:34] But like everything in the consumer market, everything that you see in this hall, this vast hall that we're in, there are consumer expectations. We all have consumer expectations. Nothing is more frustrating than using a consumer-like experience to bring me into an organization, only for me to slip into a wormhole from 40 years ago because we don't have device enablements. And I'm talking about sort of basic tenets of bringing technology into the hands of populations.
[00:10:01] We have a lot of frontline workers that we work with, and they don't have... yes, they're on Active Directory for badging, but they don't have email addresses. They're not allowed to bring devices for wage and hour concerns, as you certainly know.
[00:10:14] And so, all this wonderful tech... I talk to our clients about the addressable market.
[00:10:19] Yeah.
[00:10:19] Like what is the addressable market? Who is the actual consumer for these tools? If it's more for HR, we're actually increasing the dependency on HR.
[00:10:28] That's right.
[00:10:28] Versus necessarily bringing a different level of activation, with our guidance, with our ethos, with our culture, through bringing these tools to the frontline, to the best of our ability, where people should own their career.
[00:10:38] Right.
[00:10:38] You're stressing out HR by trying to learn a new trade, really, by trying to train a new colleague, quote unquote, or a new productive worker, which might be an agent.
[00:10:49] That's right.
[00:10:49] And are you giving them something that is offloading something that's useful to them to get their job done, but giving them the tools to enable them to really see that? Or are you just stressing them the hell out?
[00:11:02] Well, and with all the talk, I know you've talked to Station and other wonderful minds here about skills. With all the talk about skills and skills taxonomy, what are the skills that we believe AI should bring to us as that complementary agent?
[00:11:15] Right.
[00:11:16] And therefore, what are the skills we no longer require that should be redacted from the job description? And as AI gets smarter, we maybe don't need those skills, or the same application of those skills, at scale.
[00:11:30] Like what you hear so far? Make sure you never miss a show by clicking subscribe. This podcast is made possible by Salary.com. Now, back to the show.
[00:11:41] But I'll argue with one thing. I don't think that... I think the person's job description changes. I think that skill and those duties then need to get added to the agent's description. And I know there's a religious war going on online at LinkedIn right now about one company, that I'm not going to name, that tried to portray AI as being a worker being hired. And then literally people came out of the woodwork screaming at this company, you know, what kind of hype bull crap is this? Are you going to pay it, or is it going to fall in line with FLSA and wage and hour regulation, blah, blah.
[00:12:25] Oh my God.
[00:12:27] Like this is the future. And the future is here.
[00:12:30] My wife is a professor of fashion and design at SCAD in Atlanta.
[00:12:35] Oh, cool.
[00:12:37] She uses AI every day for what she does. Now, think of what she's doing. She's looking at it for inspiration. She's looking at it for creative. She can actually build patterns. She can actually create a runway and see how the products flow. She can change the garment type.
[00:12:51] Right.
[00:12:51] She can change the sizing. She can play with the lighting.
[00:12:54] Right.
[00:12:54] She can put jewelry on it. She can emulate the entire production life cycle using a variety of tools.
[00:13:01] Now, they're not perfect, as tools are imperfect. But at the same time, she sat down with her seniors, she just told me this last night, her seniors, and said, listen, you're getting ready for your senior collection, which is the culmination of your experience here at our university. Why aren't you using AI? Why aren't you applying it?
[00:13:18] Which is fascinating, because you would think it would be the other way around. A lot of creatives say, well, the creative part is what I should be doing. Right? It's the non-creative piece.
[00:13:29] But it's everywhere. It has endless applications. And once you play with it... and I do think people need to play.
[00:13:36] Yeah.
[00:13:37] So to your point about InfoSec losing their minds, or risk management losing their minds, how do we create a safe playground, give parameters in which we want to encourage our employees to play and experiment, and learn from that experimentation, if for nothing else to give them another data point that they could apply to their work?
[00:13:54] I think there are serious guardrails that need to be put in place, because the intellectual property that gets created by the AI...
[00:14:05] That's right.
[00:14:06] ...is a very big gray area right now.
[00:14:08] That's right, yeah.
[00:14:08] Or it's actually... like if you use an Adobe product, they actually show you what the licensing terms are right away when you start using them.
[00:14:15] That's right.
[00:14:15] Yes, exactly.
[00:14:16] And like I've used AI to create artwork in Illustrator.
[00:14:22] Yeah.
[00:14:22] And I worry about the intellectual property that that creates, and whether or not I really own it, or whether someone else, to your point, someone else using the exact same terminology...
[00:14:34] That's right.
[00:14:35] ...can make that exact same image, and then there be an issue about who owns that.
[00:14:42] One of the skills we talked about two years ago that we want the people function to develop is prompt engineering.
[00:14:48] Oh my God, yeah.
[00:14:50] And it's funny, we have someone on our team who's an expert in this, and a few years ago, the one thing that he told me that I never forgot, that I hadn't realized when I was just starting to play with these tools, is starting with the prompt of imagine.
[00:15:04] Wow.
[00:15:05] Imagine you are a blank. The first prompt should be you telling the tool who you are emulating and how we want the tool to actually be.
[00:15:14] Imagine you are a people leader...
[00:15:16] Right.
[00:15:16] ...in a manufacturing firm, in this vertical market.
[00:15:21] Yeah.
[00:15:22] We're having an issue with employee retention. And if you don't tell it to imagine, it will imagine whatever it wants to imagine.
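The persona-first pattern Mark describes here can be sketched as a tiny helper. The function name and the example wording below are illustrative only, not something from the episode:

```python
# A minimal sketch of the "imagine you are..." prompting pattern:
# the prompt opens by telling the model who to emulate before the
# actual question is asked. Persona and question text are placeholders.

def build_persona_prompt(persona: str, question: str) -> str:
    """Prefix a question with an 'Imagine you are...' persona framing."""
    return f"Imagine you are {persona}.\n{question}"

prompt = build_persona_prompt(
    persona="a people leader in a manufacturing firm in this vertical market",
    question=(
        "We're having an issue with employee retention. "
        "What should we look at first?"
    ),
)
print(prompt)
```

The point of the pattern, as Mark notes, is that an unprimed model will pick its own frame; stating the persona up front constrains it before any question is asked.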
[00:15:30] And so, the order in which one asks questions... we do a lot of interviews; stakeholder interviews is another example of how we apply AI. We used to have two people on every interview. I invite an AI agent to every one of my interviews.
[00:15:43] I ask if people are comfortable, but we use it to record transcript and sentiment.
[00:15:48] And you know what the tool tells us? It tells us if we've used inappropriate language.
[00:15:53] Really?
[00:15:53] If we have gender bias.
[00:15:55] Oh it does bias testing.
[00:15:55] It does.
[00:15:56] That's brilliant.
[00:15:57] It'll tell us if we actually have visual engagement, where we've lost engagement or where we've gained engagement.
[00:16:02] Wow.
[00:16:03] And we can take that transcript, and I can save it off into a document, and I can load that into ChatGPT.
[00:16:08] Yeah.
[00:16:09] And I can ask a million questions...
[00:16:11] Wow.
[00:16:12] ...about what happened in that interview. And then I can take all the interviews and I can load those and ask for themes, and it's really good.
[00:16:20] But what it freed me up to... yes, you would think, okay, great, that's more efficient, Mark didn't need another worker.
[00:16:26] Right.
[00:16:26] But what I really get value out of: I am fully engaged with the person across from me.
[00:16:30] Yes.
[00:16:30] I am 100% locked in to what we're discussing. I am not distracted with, wait a minute, how did they exactly phrase that? I've got someone catching that for me.
[00:16:40] Not bulletproof, but gets me maybe 99% of what I'm looking for.
[00:16:43] Right.
[00:16:44] Invaluable for the work that I do every day.
[00:16:47] That's brilliant. I mean, I know Zoom and Teams are actually doing something very similar in terms of having that AI agent listening. You have to enable it, of course.
[00:16:55] Yeah.
[00:16:56] And it does tell everybody that this is being recorded and AI is transcribing, but it's just brilliant, because then that enables you to be so much more present.
[00:17:05] Well, and that's what I wish we could start talking about: what capability is this unlocking?
[00:17:13] What that unlocked for me, as I'm doing very senior stakeholder interviews, is, one, transparency.
[00:17:19] Right.
[00:17:20] I'm capturing... I'm intending to capture everything we discuss.
[00:17:24] Right.
[00:17:24] Now, it'll be de-identified and anonymized, but what you say, and I would say how you say it, is super important for the work that we do in transforming these organizations.
[00:17:33] Number two, I want your consent that you're okay with that.
[00:17:37] I had a CTO interview the other day, and he said, I appreciate you asking. I said, wait a minute, do people not ask you? 90% of the time it's just running in the background.
[00:17:47] Wow.
[00:17:48] So ask. This is a great example of the type of consent we're talking about. Are people comfortable? Everyone's on a different journey or has a different point of view.
[00:17:54] Yeah that's true.
[00:17:54] But my presence and my engagement and the way I would connect with people...
[00:17:59] But Mark, that actually brings up a good point, which is: could those people who aren't asking the question, or the person who says, I'm okay with you doing it... are they potentially giving away intellectual property? And maybe, again, talking about risk management, they may not agree with that, and they may say, you shouldn't have given permission there.
[00:18:20] That's sort of... it came up in the same organization I was describing, and that was the CTO's perspective. And he asked me, what did other people do?
[00:18:29] Exactly.
[00:18:30] He said, what did my team do?
[00:18:31] Yeah.
[00:18:32] I said, half your team said no, and half your team thought it was really cool. And he laughed at that, because they're experimenting too. But it's the perceived lack of control. And I would say, as we read the same publications and we're studying our market, of course, as one would, the reality is even the originators of these tools don't know these answers, don't know where that information really could be derived for another purpose, hopefully a positive purpose. It should be protected, it should be localized, it should be isolated. But frankly, the incentive for the tools is to get smarter through usage.
[00:19:11] Yes.
[00:19:11] And that's what bothers me: they... especially the ones that aren't firewalled are trying to get data from everywhere, as much as they can, to be better at giving the right... or giving an answer.
[00:19:24] But I would say, like, we as humans: how many apps do you have on your phone? 50? 100?
[00:19:31] At least.
[00:19:32] Yeah.
[00:19:32] And every time I get an update, and every time the terms and conditions come up, I am in a race with myself to see how quickly I can scroll. What do you mean? Oh, this is one of those where I have to scroll before I hit the button.
[00:19:43] We give away... and therefore, when I'm talking to my wife about something, I'm like, why did that just pop up? Shocker: I just gave all these permissions away.
[00:19:52] So we as consumers are giving away our personal information constantly. But an organization is different.
[00:20:02] And what we want to incentivize is: we want our employees to tell us more. We want a higher level of engagement, of transparency. We want to know what people are thinking about, in whatever form or format they want to do it. So we want to encourage them to provide this information and sentiment.
[00:20:18] But people are rightfully concerned about how will it be used and where could it be applied. But as humans who live in the modern era, most of us are freely giving away the most personal information we could imagine, every day.
[00:20:32] Well, I'm not an ethics lawyer, and I don't know if you are as well, but when we go into these meetings with that bot, for example, taking the transcription, we don't know what's happening with that particular transcription, other than the fact that it's maybe going to be played back to us or provided to us later. But we don't know what else is going on with that.
[00:20:55] And I like how Microsoft has called this responsible AI.
[00:21:02] Everything's evolving at an unprecedented speed. The investment... look at the valuation of NVIDIA. It's going to get smarter and faster and smarter and faster. And there are two sides of the big market: one which is, we need to stop, it's almost too late; and there are others which is, boy, we're about to unlock something we can't even possibly imagine. And both can absolutely be true and coexist.
[00:21:24] and that's
[00:21:25] where the
[00:21:25] legislation
[00:21:26] is trying
[00:21:26] to head
[00:21:27] off
[00:21:27] or at least
[00:21:28] understand
[00:21:28] it first
[00:21:29] and try
[00:21:30] and head
[00:21:31] off
[00:21:31] the apocalypse
[00:21:32] right
[00:21:32] yeah
[00:21:33] But what is responsible? And to your point about governance, internal governance: organizations being intentional about the responsible application of these tools in a way that's foundational. And I don't know about you, but my data gets stolen every third week. I think I have credit monitoring until I'm 150 years old, with the number of letters I'm getting. But I'm upset with it, and I certainly don't want my employer to give away my information without my consent. So we have to come up with mechanisms and methods, to the best of our ability, and we have to have governing bodies that provide prescriptive guardrails that determine the appropriate application. Some organizations are pulling way back, and some are saying this is the future, we've just got to find a way to bind it.
[00:22:23] Hey, are you listening to this and thinking to yourself, man, I wish I could talk to David about this? Well, you're in luck. We have a special offer for listeners of the HR Data Labs podcast: a free half-hour call with me about any of the topics we cover on the podcast, or whatever is on your mind. Go to salary.com/hrdlconsulting to schedule your free 30-minute call today.
[00:22:49] And I think one of the really key areas that's tangential to this, but key to it, is data governance.

[00:22:57] Oh my God, yes.

[00:22:58] What data are we using? What data do we have? And especially, as we've talked about in the past, HR data by the very nature of its existence is always in some kind of flux. How do we provide the right information?
[00:23:15] And what's fascinating about that is we have some very forward-looking employers that are even thinking about data governance in the context of: when have you left us? Meaning the auspices of what it means in the employee and employer relationship. When do we have to be intentional about the fact that we will enable these capabilities, but you are now outside the walls of this relationship?
[00:23:38] A great example of this could be an HSA account or your 401(k) account.

[00:23:45] That's a banking relationship that you have. You own that banking relationship; it transcends the employment relationship.

[00:23:51] Exactly. And I want to be really clear: I may move you from my bot to their bot, right, if it exists. But in the moment, I want to make clear that, congratulations, you have now left our land and you're entering theirs, and they may have other requirements and consents that you need.
[00:24:08] Well, there's a transom you pass, right, from being a current employee to being a former employee slash alumnus, or never hire back, or yeah, we'd love to get them back. But there's that transom you pass where the ownership of that person, the responsibility for that person, on some things is gone, but in other ways it transforms.

[00:24:29] Global legislation for after hours notwithstanding, there aren't really clear lines of demarcation. It's really fuzzy in a lot of areas.
[00:24:39] I think it would take a lot of work, frankly, for many organizations to say: when has this relationship ended? When have you timed out, as it were, of what we would expect of you? Right. And then the classification, of course, of what type of employee I am plays into that too. But without that, we don't know how to govern the application of some of these amazing tools and technologies.
[00:25:01] I want to dive into something you said there, which I'm going to take in a different direction.

[00:25:04] Sure, please.

[00:25:05] You said, when the time runs out. Think about the current employee relationship. And I'm not talking about just the more exempt roles or the more professional roles, even though that's a really important topic. I'm talking about the two-thirds of employees in the US which are non-exempt or hourly workers. When they leave the office, they're off the clock. If the bot reaches out to them after hours, does that constitute work?

[00:25:33] And there's a lot of interesting conversations about whether this is a compensatory event.
[00:25:39] And then it goes to the point, and this is sort of the opt-in, opt-out catalyst as well: we have offered you a series of capabilities, and you have attested a desire to learn more. You want to do professional development and move from handling material in the distribution center to being a shift leader in the distribution center, and that requires professional development. It's optional; we haven't put you on that trajectory. We've provided the learning tools and the interventions for you to do so. Is this you developing yourself outside the auspices of work, or is this us paying you to develop yourself? Is this about us driving you toward internal mobility, which, again, we would like for the sake of growth and retention?
[00:26:25] We've chosen to... Does the law make a distinction of that? Does it?

[00:26:34] No, not to my understanding.

[00:26:36] And so those distinctions get really...
[00:26:37] What about benefits? So, you're eligible for benefits. We want to provide you all the tools necessary to consume and be aware of, maintaining financial, physical, and emotional well-being. We want you to thrive; we want your loved ones to thrive. But if you engage with a tool outside of hours, which is mostly when you're likely to engage with a benefits experience...

[00:26:56] Of course, yeah.

[00:26:57] ...is that a compensatory event? Probably not. This is where the law hasn't kept up with the times.
[00:27:04] Yes. Whether it's a non-exempt or exempt, an hourly worker or a salaried worker, right, it just hasn't caught up.

[00:27:11] Right. And we... I'm going to say something really rude: all of us that work in professional jobs, where we're at least middle management or senior management, we work all the freaking time. There's literally not a time that I don't check my phone.
[00:27:28] And some days I'm really mad about that. Because, not that I should get paid for that (you could say it's built into my compensation, right), but I'm losing my life. I'm not living then.

[00:27:40] Right. I know you love your life outside of work...
[00:27:44] ...and how you post.

[00:27:45] I'm clawing at it every single day, my friend.

[00:27:48] But yeah, and then people will try geofencing. But you can't geofence and then not enable, right? So we've got to pick lanes.
[00:27:58] And I guess I would say the big nexus of what we discussed today is intentionality. Experimentation is great. I think we should experiment; I think we need to. But if we're not intentional about the guidelines, the use, the capabilities, the activation, the interpretive applications, the journeys, the personas that we're bringing to life, it's a bit of a free-for-all. And I spend every day trying to unwind that hairball that is the free-for-all of trying to knit all this amazing stuff together.
[00:28:32] Everybody needs to coexist with everybody else. There's no one person or provider in this hall that does everything. And so...

[00:28:42] And if there were, it would be very boring.

[00:28:45] Exactly, yeah. What would we have to talk about, God forbid? So, what are they coming out with next? It's like listening to an Apple announcement.
[00:28:53] But seriously, though, one of the beautiful things about being in this hall is the differences in the thought processes, what... what people are attacking, right?
[00:29:06] And, you know, last year we knew that this year was going to be all about AI, because last year was the starting point. And as I said, beyond the hype cycle, what I'm hoping for next year is real use cases, real-world examples of where people used it, and what the outcomes were, and how it transformed things.
[00:29:24] 100%
[00:29:24] we
[00:29:25] relentlessly
[00:29:26] hit this
[00:29:27] refrain
[00:29:27] everywhere
[00:29:28] process
[00:29:28] led
[00:29:29] tech
[00:29:29] enabled
[00:29:29] when we
[00:29:31] we run
[00:29:32] a lot
[00:29:32] of RFPs
[00:29:33] I don't care
[00:29:34] about the
[00:29:35] 8000
[00:29:35] cut and
[00:29:36] paste
[00:29:36] ridiculous
[00:29:37] questions
[00:29:37] that no
[00:29:38] one
[00:29:38] reads
[00:29:38] yes
[00:29:38] I need
[00:29:39] some
[00:29:39] information
[00:29:39] for the
[00:29:40] purpose
[00:29:40] of
[00:29:40] protecting
[00:29:41] the
[00:29:41] procurement
[00:29:42] process
[00:29:42] but I
[00:29:43] want high
[00:29:44] value
[00:29:44] use cases
[00:29:45] like I
[00:29:45] want you
[00:29:46] to come
[00:29:46] in
[00:29:46] and I
[00:29:47] want you
[00:29:47] to tell
[00:29:47] me
[00:29:47] how my
[00:29:48] hypothesis
[00:29:49] an RFP
[00:29:50] is a
[00:29:50] hypothesis
[00:29:51] I want you
[00:29:52] to activate
[00:29:52] against real
[00:29:53] use cases
[00:29:54] I want you
[00:29:54] to show
[00:29:55] me and then
[00:29:55] I can tell
[00:29:55] you yes
[00:29:56] yes no
[00:29:56] or tell
[00:29:57] me more
[00:29:57] about this
[00:29:58] and how
[00:29:58] would you
[00:29:59] connect
[00:29:59] with all
[00:30:00] these other
[00:30:00] experiences
[00:30:01] and the
[00:30:01] other
[00:30:02] hundred
[00:30:02] tools
[00:30:02] that we've
[00:30:03] already
[00:30:03] bought
[00:30:03] how does
[00:30:03] everything
[00:30:04] come together
[00:30:05] because people
[00:30:06] are lost
[00:30:07] in this
[00:30:07] ecosystem
[00:30:08] They don't... We work with some really fascinating organizations, and the utilization of these tools is shockingly low. And then new capability comes out; people aren't ingesting even what they have.

[00:30:19] In my conversation with Stacia this morning, it was amazing. Her research basically says that a lot of companies say that they do AI... they don't realize those benefits, because it was oversold and under-delivered. And then at the end of it, they look at it and they go, well, this was really disappointing.
[00:30:43] Yeah, of course. Because when you listen to all the keynotes, and, you know, I love my peer group, but we are creating an expectation that thus far has rarely delivered.
[00:30:52] Now, we're not too far away from that. I think there's a lot of investment, as I talk to the providers. And, you know, I think I've had 40 breakfasts in two days; I don't know if I can consume anymore.

[00:31:03] I'm going to go to my fourth lunch, probably right after this.
[00:31:07] But when we talk to them, it's about the fact that you really have to get use-case centered, you really have to get journey centered. Because once people get it, it can be game-changing. But if they can't apply it... The stories matter.
[00:31:21] And the stories that we can tell about the implementation of it, and how it worked, and how it transformed: that's the next... it's got to be the next wave of this. We don't want to...
[00:31:48] I've learned so much in my life from failing at things.

[00:31:50] That's why I'm here, exactly.

[00:31:52] I've failed more than I've succeeded, and thank God for...

[00:31:55] How boring would life be?

[00:31:56] Exactly. "I went out and I put a dime in and I won a million dollars."

[00:32:02] I'm paying for light fixtures and... lots.
[00:32:26] Mark, as always, it's a pleasure and an honor to spend time with you. I learn so much talking to you.

[00:32:31] As do I from you. And I... thank you. I wait for next year.

[00:32:35] We could do an earlier one next year, coming back.
[00:32:53] And if you know anyone that might like to hear it, please send it their way. Thank you for joining us this week, and stay tuned for our next episode. Stay safe.


