Bob Pulver speaks with John Hansen, Chairman and CEO of Atana, about his extensive background in technology, the importance of diversity in the tech industry, and the transformative potential of AI. John shares his journey through various tech companies (this is his 8th, in fact), his passion for creating inclusive workplaces, and his insights on the latest advancements in generative AI. Bob and John discuss the user experience with AI, the adoption challenges it faces, and the enduring need for human touch in technology. They also explore the overarching challenges and opportunities presented by modern technology, particularly in the context of learning and development. John shares stories and insights from his experiences in Africa, highlighting how leapfrogging technology can lead to significant advancements. John talks about the unique characteristics of Gen Z as digital natives and the importance of adapting educational methods to better serve future generations. The conversation emphasizes the need to maximize human potential through technology while addressing the challenges of integrating new tools responsibly. It’s an enlightening discussion exploring the intersection of technology, culture, and education.
Keywords
AI, technology, diversity, generative AI, workplace transformation, human-AI interaction, John Hansen, Atana, leadership, innovation, learning, development, Gen Z, digital natives, leapfrogging, education, infrastructure, cultural change
Takeaways
- John Hansen has founded eight companies and has a strong background in technology.
- Diversity in decision-making bodies leads to better outcomes.
- Generative AI represents a significant shift in user interaction with technology.
- AI should support human endeavors rather than replace them.
- The limitations of AI include issues with authenticity and hallucinations.
- Human touch and physical interaction remain important despite technological advancements.
- AI can enhance content creation but still requires human input.
- The future of technology will involve a blend of AI and human creativity.
- Sticky notes may persist as a preferred method of note-taking for many.
- Legacy infrastructure often hinders technological advancement.
- Cultural traditions can coexist with modern technology.
- Leapfrogging technology can lead to unexpected breakthroughs.
- AI's evolution presents both opportunities and challenges.
- Different learning modalities serve different educational needs.
- In-classroom instruction should focus on engagement, not just knowledge transfer.
- Gen Z's digital skills are unmatched, but they face social challenges.
- Technology can help eliminate intellectual waste in learning.
- Companies must adapt to the strengths of new generations.
- Responsible use of technology is crucial for future success.
Chapters
00:00 Introduction to John Hansen and Atana
06:05 The Importance of Diversity in Tech
12:01 User Interaction with AI
18:00 Challenges and Limitations of AI
26:23 Legacy Infrastructure vs. Modern Technology
32:10 The Evolution of AI: Breakthroughs and Challenges
45:04 Gen Z: The Digital Natives and Their Impact
51:30 Maximizing Human Potential with Technology
John Hansen: https://www.linkedin.com/in/johnjhansen
Atana: https://www.atana.com
For advisory work and marketing inquiries:
Bob Pulver: https://linkedin.com/in/bobpulver
Elevate Your AIQ: https://elevateyouraiq.com
Powered by the WRKdefined Podcast Network.
[00:00:00] Welcome to Elevate Your AIQ, the podcast focused on the AI-powered yet human-centric future of work.
[00:00:04] Are you and your organization prepared? If not, let's get there together. The show is open to sponsorships from forward-thinking brands who are fellow advocates for responsible AI literacy and AI skills development to help ensure no individuals or organizations are left behind. I also facilitate expert panels, interviews, and offer advisory services to help shape your responsible AI journey. Go to ElevateYourAIQ.com to find out more.
[00:00:28] Hey everyone, thanks for listening to another episode of Elevate Your AIQ. I'm excited to share this conversation with John Hansen, Chairman and CEO of Atana, who has an incredible background in technology and leadership. John has founded eight companies, yes, eight, and is passionate about building diverse and inclusive workplaces while leveraging technology to maximize human potential. John's a super smart, fascinating guy, and I loved talking about technology's evolution with him. We cover a range of topics, including the transformative potential of generative AI,
[00:01:09] challenges of integrating new technologies responsibly, and how cognitive diversity in decision-making leads to better outcomes. This is really important to think about as we adapt to more humans-plus-AI scenarios and balance responsibility with innovation.
[00:01:23] John also shares powerful insights from his experiences working across the globe, including how leapfrogging technology can create breakthroughs and what it takes to adapt education for a generation of digital natives, like Gen Z and even Gen Alpha, the latter of which is starting to enter the workforce.
[00:01:38] This is a thoughtful and inspiring discussion about the intersection of technology, culture and education. I know you'll enjoy it.
[00:01:46] Hello everyone. Welcome back to another episode of Elevate Your AIQ. I'm your host, Bob Pulver. With me today, I have the pleasure of speaking with the Chairman and CEO of Atana, John Hansen. How are you, John?
[00:01:57] Doing well. Thank you for having me, Bob. This is exciting.
[00:02:00] It's my pleasure. I'm sorry we didn't get to do this at HR Tech when we bumped into each other.
[00:02:04] Less background noise. Oh, for sure. For sure. Yeah, that was tricky trying to get the mic, you know, set up with all that noise for sure.
[00:02:13] It's great to sit down and talk to you. There's so much I want to ask you about, but why don't we just start with you, you know, giving my listeners just a quick overview of your background.
[00:02:23] Yeah. Perfect. This is my eighth company. Between my previous seven, I had a great opportunity to be the Secretary of Technology for the State of Colorado back in the early aughts.
[00:02:36] I got to see some really cool technology. Matter of fact, some pretty cool AI stuff that went on in Colorado.
[00:02:44] Hmm. Wow. 20 some years ago, Bob. I mean, it was pretty exciting time.
[00:02:49] So this is my eighth company. And, you know, I started my first one way back when I was a young pup, 30 years old, and actually didn't know anything at that time.
[00:02:59] I'm a computer science undergrad, an MBA graduate. And so I'm a technologist. I'm a computer science person. I'm one of those geeky guys that's become CEO.
[00:03:10] And so I love technology. But my previous companies were all what I call hard tech. Right.
[00:03:19] Matter of fact, I tell friends, you know, they'd ask me, what do you do, John? I go like, yeah, you won't understand.
[00:03:25] So now I thought it's pretty cool when I could tell them what we're doing here about helping transform the workplace.
[00:03:33] So I understand that point of view. I've been a technologist my whole life. But, you know, as a young CEO I had some great mentors, but I was really, I don't know.
[00:03:47] It's kind of weird. It was I was frustrated by how homogeneous the space was.
[00:03:53] Now, I say this not to brag, because I didn't know what I was doing back then.
[00:03:58] I mean, I had senior women as executives in my company.
[00:04:03] I only figured out, a couple of years ago, that I could pick very competent, high-powered women because nobody else would hire them.
[00:04:13] In the tech space, the hard tech space, anyway. And so this always bothered me.
[00:04:18] Matter of fact, I had a VP of marketing, a brilliant woman, still a woman.
[00:04:23] And we had a relationship going on in Japan. So I sent her to Japan for a few weeks.
[00:04:28] Well, that's kind of cruel. Right. If you go to a very patriarchal society like Japan was back then. Yeah.
[00:04:34] But I don't know, I just always struggled with it, Bob. And so here, as my swan song at Atana, I had an opportunity to buy this company back in 2017.
[00:04:42] And I bought the company passionate about: can we do anything to change the workplace, the respect, you know, just the openness, you know, to include people.
[00:04:57] And of course, that was a little ahead of its time, given a lot of things that happened over the last five, six years.
[00:05:04] And so to me, I'm passionate. This is my swan song. You know, I know your audience probably doesn't see me visually, but, you know, I'm an old guy.
[00:05:13] And so but man, I have never been this passionate about really trying to make a difference. Right. This is my legacy. This is like my swan song. It's like, you know, play the violin.
[00:05:21] So I think it's fantastic. You're definitely more technical than I am, but I always still had trouble explaining to friends and family exactly what I did.
[00:05:33] So at some point, I think my mother made me actually write it down so she could explain to her friends what exactly I'm doing.
[00:05:41] But now that I have the podcast, she at least can listen.
[00:05:43] My mom, my dear mom, who I love, she never had a clue what I did. And I just pull out. I was in the wireless industry from the early days, from the early 90s.
[00:05:52] I pull out, you know, the cell phone and say, yeah, I make that work. Oh, OK.
[00:05:59] She never, never understood what I did.
[00:06:02] So from that point on, she now calls you anytime there's a problem with the phone.
[00:06:05] Exactly. Yeah.
[00:06:08] She does actually.
[00:06:09] Yeah.
[00:06:10] Yeah.
[00:06:11] We become tech support, no matter what the actual problem is.
[00:06:14] We become tech support for our parents.
[00:06:16] Yeah.
[00:06:17] So I love your swan song, as you call it.
[00:06:21] I mean, I feel like I am on a similar mission.
[00:06:24] You know, I'm no spring chicken myself.
[00:06:27] I'm on the back nine.
[00:06:28] And so I think a lot about those things.
[00:06:30] I think a lot about the impact that I can have, you know, speaking to organizations and now, obviously, you know, hopefully my listeners come from a broad swath of, you know, demographic and geographic backgrounds.
[00:06:44] And because I do think that, you know, everything coming with with AI, it's coming fast.
[00:06:50] It will come at a different pace for different roles and industries, as you know, as you've sort of traversed different industries.
[00:06:57] But I think some of those fundamental problems prevail in terms of equality and fairness and, you know, the distribution of opportunity for a lot of folks.
[00:07:08] I also think just, you know, in your example, we've known forever that, you know, having women as part of a decision making body helps you make better decisions.
[00:07:18] Like it's been proven time and time again.
[00:07:21] I did a lot of work at IBM around, like, collective intelligence and collaborative decision making and things like that.
[00:07:27] And so it's just like, guys, if you think a bunch of old white guys are the best, you know, decision making body, you are sorely mistaken.
[00:07:36] I was asked by actually a couple students in my class where this came from, this mindset, because, you know, I was a hard techie, right?
[00:07:45] I mean, I was like, heads down, you know, built my first computer.
[00:07:49] You know, I went to school up here in Seattle area.
[00:07:53] And of course, back in the seventies, huh?
[00:07:55] Well, yeah, there's a little bit of buzz going on about computers and Bill Gates.
[00:07:59] You know, nobody knew who Bill Gates was back then, but, you know, there's a lot of work going on at the Lakeside school that he was at.
[00:08:04] And I was, you know, public high school out here and had an amazing professor, not professor, teacher in my senior year.
[00:08:12] And I'll never forget him.
[00:08:14] And he was an older guy and same thing.
[00:08:16] He had come back into teaching after Ben.
[00:08:19] And anyways, he had this special advanced science and math class.
[00:08:23] I'd taken all the calculus stuff, all that stuff already in high school.
[00:08:26] You know, talk about individuals who have profound impact on your lives, right?
[00:08:30] Almost like inflection points.
[00:08:32] And he was one of those.
[00:08:33] So we, as a class, there were, you know, eight or nine of us, and we built a computer, and it was like such a novel thing.
[00:08:41] Well, same time, Gates is over at Lakeside building a computer himself.
[00:08:46] And so, you know, so it's kind of interesting how you can be informed at such a young age about technology that grows up.
[00:08:56] Now, when I was in school, you know, studying computer science.
[00:09:00] So this is now late 70s, early 80s.
[00:09:02] And I was like, oh, AI was a big thing.
[00:09:05] And I just chuckled.
[00:09:06] Here we are.
[00:09:07] 2024.
[00:09:09] Wow, AI is a big thing.
[00:09:11] Well, it's, I don't know, it's like the fifth wave of AI, right?
[00:09:14] This one's a little bit different because it's called generative AI.
[00:09:17] I mean, there's generative AI, but it has all the problems that AI had from the beginning.
[00:09:23] And so I've been through a few of these AI waves where it's going to, you know, transform.
[00:09:29] And obviously we've all seen Terminator with Skynet taking over, you know, but there's a lot of fear about AI as well.
[00:09:36] But there's also not an appreciation for its limitations.
[00:09:41] And there's a lot of hype on AI.
[00:09:45] And I don't know, those of us who are technologists study the space, watch the space. And we'll talk about Atana.
[00:09:52] Yes, we have AI, obviously, in the background.
[00:09:55] Yeah.
[00:09:56] But, you know, for some of us, it's like, yeah, okay, it's got a little different color of the rose, but it's still got the thorns on the side.
[00:10:05] That's a good way to put it.
[00:10:07] So for me, this is because I didn't have that sort of super techie, you know, background.
[00:10:14] I didn't love the regular, you know, sort of computer science that everyone took in undergrad, you know, B school.
[00:10:20] But I was at IBM watching, you know, the advent of what we came to call big data and, you know, advanced analytics and working on an IBM research campus.
[00:10:32] I won't claim to be a researcher, but working on that campus connecting C-suite, you know, folks like yourself to what's coming with research.
[00:10:41] I mean, a lot of them absolutely did not have the background that you have.
[00:10:46] And so this was eye-opening to them about what's coming and what the implications of that for, you know, how they operate, how they go to market, you know, understanding more about, you know, their target market, whether that's B2B or B2C, etc.
[00:11:01] And so I literally had the Jeopardy studio in my office.
[00:11:06] Wow, nice.
[00:11:07] So we would show everyone, like, behind the scenes: this is how it's sort of thinking.
[00:11:14] This is how.
[00:11:14] The early days of Watson.
[00:11:16] Yeah, exactly.
[00:11:17] So, so it was a research project back in 2012 or whatever; it was called DeepQA, Deep Question Answering.
[00:11:25] And they didn't, hadn't given it a name yet, but they knew they needed to make a big splash and they hadn't really done anything like that since, you know, Deep Blue with the chess playing computer.
[00:11:36] But it was fascinating to me, not just to learn about the inner workings of it, but to see firsthand the early sort of, you know, response to what it was doing and how it was thinking about, you know, its probabilistic determination of potential answers that it would literally show you its known unknowns.
[00:11:58] Yes.
[00:11:59] Like if only you could give, if you could give me these other pieces of information, I can increase my confidence score, perhaps above a certain threshold at which in Jeopardy, you would actually, you know, buzz in.
[00:12:11] So it wouldn't answer if it didn't hit that threshold.
[00:12:13] It didn't meet a threshold of confidence of the answer.
[00:12:16] And then of course they had to train it through different models to say, well, when, what about betting?
[00:12:21] How do I know when to bet?
[00:12:22] How much to bet?
[00:12:23] How much to bet?
[00:12:24] If I get the double Jeopardy or you can.
[00:12:25] Double Jeopardy.
[00:12:26] Yep.
[00:12:26] Yeah.
[00:12:27] Anyway, so I think one of the things that I try to, you know, explain to folks, whether they're a client or a neighbor or whatever is like you mentioned before, generative AI is like the latest wave.
[00:12:39] And that's the one that people are becoming most familiar with, because they actually get to interact with it, as opposed to if you weren't deep in the trenches of, you know, the computer lab.
[00:12:49] Right.
[00:12:49] You may not have had a chance to interact with it or you weren't a data scientist or what have you.
[00:12:54] That was one of the user interface breakthroughs, as you know, right?
[00:12:59] Yeah.
[00:12:59] Is that it isn't, you know, the large language model and all that.
[00:13:03] I mean, I gotta believe that 95% of the people that, you know, started playing with ChatGPT.
[00:13:08] I mean, they never even heard of what a large language model is or what the algorithms are with it.
[00:13:13] They just saw this.
[00:13:15] I can type in a prompt question and I get this answer.
[00:13:19] And of course, that's raised the whole thing about prompt engineering.
[00:13:24] And, and they're like, Oh my gosh, I can interact with it.
[00:13:27] Like I'm talking to my neighbor.
[00:13:29] And that was of everything.
[00:13:31] And by the way, that's not AI, but I mean, that's this brilliant user interface.
[00:13:36] Yeah.
[00:13:37] I mean, I gotta hand it to, you know, Sam and the OpenAI team; that little, and it is actually a little, change to actually open it up so people can type in questions and ask it.
[00:13:50] And I mean, yes, ChatGPT one, you know, I was just kind of like, hey, this is pretty interesting.
[00:13:57] And so, the courage to open up the frailty of the large language models to the world.
[00:14:05] Of course, a lot of sophisticated people, especially reporters, right?
[00:14:08] They got to go into some wild hallucination.
[00:14:11] And it was pretty fun to watch those, those days.
[00:14:14] But anyways, that really was, like you said, Bob; the AI was in the lab.
[00:14:18] And actually not in the lab.
[00:14:20] I mean, there's companies that were using it in the back end, especially machine learning and neural networks and all that stuff has been around for a long time.
[00:14:27] It was all in the back.
[00:14:29] Now, all of a sudden, what ChatGPT did was it pushed it to the front and pushed it out so that people could interact.
[00:14:36] And that was the major breakthrough that happened with AI.
[00:14:39] Yeah.
[00:14:39] A few years.
[00:14:41] Absolutely.
[00:14:41] Yeah.
[00:14:41] The AI became part of the UI.
[00:14:43] And so you got to see and interact and use natural language.
[00:14:49] And, you know, we saw some of that with Watson, whether that was, you know, it could understand, you know, vocal cues.
[00:14:59] It could do, you know, speech to text, text to speech, et cetera.
[00:15:01] So that's why it's been around, right?
[00:15:04] This is you and I, as old geezers, we're like, this has been around.
[00:15:09] Okay.
[00:15:09] You came out with a UI and you made it publicly available and okay, that's, that's it.
[00:15:14] But all this stuff.
[00:15:16] And of course, I gotta say, natural language processing, audio, video, right.
[00:15:22] Those are huge breakthroughs in the last couple of years, as far as its authenticity, as far as its reliability in that area.
[00:15:31] And so, you know, part of that is the fact that there's the compute power now; no computer in the world would have had this, you know, back in the nineties.
[00:15:40] So, right.
[00:15:41] And, well, ironically, it's of course consuming all the electrical power in the world.
[00:15:46] Yeah.
[00:15:47] I think, I think we're going to solve that.
[00:15:49] Yeah.
[00:15:49] Yeah.
[00:15:50] I think we're going to solve that, but it'll be some creative, open-mindedness about that.
[00:15:56] Yeah.
[00:15:56] We have the technology solutions.
[00:15:57] They've just never been deployed.
[00:15:59] Yeah.
[00:15:59] I mean, I think we had some, I know this is at a whole nother scale, but we did have efficiency discussions when cloud computing came about and when the internet came about and things like that.
[00:16:09] So.
[00:16:10] I remember that.
[00:16:11] But, you know, I think it's interesting, because folks like you and I, who have, you know, seen this movie, we recognize, like,
[00:16:18] some of the capabilities that did exist, you know, a decade ago.
[00:16:22] And so sometimes I'm fascinated and amazed at what it can do.
[00:16:26] What I'd say, you know, in a general sense about some of the generative AI tools, but it's also sometimes frustrating when I see, you know, if I'm interacting. I was just chatting with, I forget if it was Claude or ChatGPT, the other day.
[00:16:41] And.
[00:16:42] Claude's my favorite.
[00:16:43] But anyways, I like Claude.
[00:16:45] I like Claude a lot.
[00:16:46] Yeah.
[00:16:46] I was trying to get to my post-its.
[00:16:48] I've got these piles of post-its and I'm like, this is enough of this.
[00:16:51] I got to break this habit.
[00:16:52] I got to digitize all this stuff.
[00:16:54] So I was asking it like, what's the most efficient way to do this?
[00:16:57] You know, I don't have the ability to, oh, I asked if I could just read it to
[00:17:02] Claude.
[00:17:03] And so.
[00:17:03] And then it was recorded and processed and organized it for you.
[00:17:07] That's what I thought.
[00:17:07] I said, I said, can I just, can I just read this?
[00:17:09] Yeah.
[00:17:09] Go ahead and get started.
[00:17:11] And I'm like, how, how exactly are you hearing me?
[00:17:15] Because I'm talking, I have my headset on, I'm talking, I'm connected and it's not doing anything.
[00:17:22] I said, can you hear me?
[00:17:23] And it's, Oh, I'm sorry.
[00:17:24] I think, you know, you misunderstood me.
[00:17:26] Like you have to type in.
[00:17:28] Yes.
[00:17:29] What I mean.
[00:17:29] I'm like, okay, well that's, that was kind of the nature of my request.
[00:17:33] Can you actually listen through my voice?
[00:17:36] So anyway, I know on the mobile devices some of the generative AI tools allow you to do that, but it was like this whole back and forth about, well, you gotta use an app that has OCR that can, you know, take your chicken scratch and, you know, convert it to text.
[00:17:50] And then it was, oh, here's five ways you can do that.
[00:17:53] But none of them were, were great.
[00:17:54] Right.
[00:17:54] I still have to.
[00:17:55] Not yet.
[00:17:56] Yeah.
[00:17:57] So it was a whole convoluted thing, but it made me think about going back to some of the Watson, what they used to call cognitive services, which were their whole library of APIs.
[00:18:07] And it's like, wait a minute, 10 years ago, I could combine speech to text, text to speech to do that.
[00:18:13] And so you're telling me.
[00:18:15] You're telling me.
[00:18:16] At the end of 2024.
[00:18:18] You still can't do this.
[00:18:20] You cannot do that.
[00:18:21] That's what you're telling me.
[00:18:22] I'm still using a computer keyboard interface.
[00:18:29] Yeah.
[00:18:29] Come on.
[00:18:30] It's 2024.
[00:18:31] 2024.
[00:18:32] So it's like, Oh, well, if you just scan them and get them into a PDF, you know, I could do it.
[00:18:37] And then it's like, yeah.
[00:18:39] But it had, but then it said, no, let me, let me clarify that as well.
[00:18:43] It has to be a text document.
[00:18:45] Text based large language model.
[00:18:47] It's text based large language.
[00:18:49] Now, of course, there's a lot of work going on both in audio as well as, you know, videos and pictures.
[00:18:57] And of course it's causing some problems, and, you know, there's what we're still missing, and why people get a little nervous about the future of AI. But still, everybody talks about AGI, right?
[00:19:10] Artificial general intelligence.
[00:19:12] And yeah, we're a long ways from AGI because really it's quite simple.
[00:19:20] If you really understand what's happening with the large language models, it's just brute strength that they're applying now.
[00:19:28] Just get a massive amount of data, and computers are excellent at processing massive amounts of data, improving the predictability and being able to predict, you know, how this sentence should end, except sometimes it goes off track.
[00:19:42] And that's called hallucination. But it doesn't have authenticity.
[00:19:46] And we know that it's not human, right?
[00:19:48] Everybody goes like, oh my gosh, at some point you're going to be talking to somebody and you're not going to even know that it's, you know, an AI bot.
[00:19:57] And I'm like, well, humans have this gift of being able to detect authenticity.
[00:20:05] And, you know, that's the uncanny valley, right?
[00:20:08] We can just kind of detect something a little weird here.
[00:20:10] And so we have a philosophy, our approach at Atana, that, you know, we're human first and AI supported.
[00:20:21] I mean, that's what we're doing.
[00:20:22] It's like, you know, we create content.
[00:20:26] Learning content is one of the ways, one of the things, that we are using in order to collect behavioral and attitudinal information through the content.
[00:20:34] And are we using AI? Of course we are. I mean, the AI that processes video now and clears up things; the translations are more accurate now than human translators.
[00:20:46] So when we create a piece of content and run it through, we can, we can do 20 languages in the course of about a half hour to translate those into those languages.
[00:20:56] And they can do it, you know, with voiceovers that, boy, a lot of people go like, you can't detect that it's AI generated.
[00:21:07] And so we're using all that in the backend, but the actual script writing, the actual story, we, you know, we still use actors per se.
[00:21:19] Yeah.
[00:21:20] And, you know, AI can clean it up.
[00:21:23] It can kind of refine.
[00:21:24] It can take stuff out of the picture, make it clear.
[00:21:26] It can do all these amazing things, but there's still a human element that I think is going to be around for a long time.
[00:21:33] And I laugh about, you know, Bob, you know, we, we've seen fashion trends, right?
[00:21:40] And so it's always funny how like, man, I wish I kept my bell bottoms from back in the seventies, but, but even, you know, the, the desire, my, my daughter loves reading books.
[00:21:52] I said, well, aren't you using Kindle on your iPad?
[00:21:56] I mean, and she goes, you know, I just love something about the physical book, people taking pens on notepads.
[00:22:04] Right.
[00:22:05] And so there's an element of humanity that still likes the analog, what I call the analog world.
[00:22:12] Yeah.
[00:22:13] We still like that human and we are humans that like touch as well.
[00:22:17] So I do not panic or get upset about what's coming.
[00:22:23] Like I didn't back in '98, the last big AI wave, or, I mean, the second big AI wave, or when Watson came out, right?
[00:22:32] And, you know, and everyone's like, Oh my gosh, Watson's going to take over the world.
[00:22:40] Yeah, no, it actually didn't.
[00:22:40] And so I think that AI's best role is as a thoughtful support of human endeavors.
[00:22:48] And it can do things that we don't need to waste our time on those things.
[00:22:52] I still, by the way, Bob, have not figured a solution to get rid of my sticky notes.
[00:22:57] So when you do, please share that.
[00:23:00] And I've looked at so much technology like you have, I still have not cracked that code.
[00:23:07] So if you do, you know, or do you develop something?
[00:23:11] Let me know.
[00:23:11] I'll be a procurer of that.
[00:23:13] I want to take a break real quick, just to let you know about a new show.
[00:23:18] We've just added to the network: Up Next at Work, hosted by Gene and Kate Akil of the Devon Group.
[00:23:25] Fantastic show.
[00:23:27] If you're looking for something that pushes the norm, pushes the boundaries, has some really spirited conversations, Google Up Next at Work, Gene and Kate Akil from the Devon Group.
[00:23:42] I think what I would like, honestly, now that you mention it, is I do still want to write stuff down.
[00:23:51] And I do have a digital, like, note taker.
[00:23:55] Of course.
[00:23:55] If I'm on a call, right?
[00:23:56] Of course.
[00:23:56] And so it'll record it.
[00:23:57] It'll give me action items.
[00:23:59] And yeah, I have the new iPad, you know, on the new pen.
[00:24:03] And, you know, I got that little screen, that little film that goes over the screen.
[00:24:07] So it's trying to mimic it.
[00:24:09] And I have chicken scratch, you know, sadly.
[00:24:12] Yeah.
[00:24:13] That's something I paid attention to in grade school.
[00:24:15] But chicken scratch.
[00:24:17] And it's better.
[00:24:19] But boy, sometimes if I really need to write a note, I whip out my pen and put it on a sticky note.
[00:24:26] And I think what I'd like is, and I think maybe that device called Remarkable.
[00:24:33] Yeah.
[00:24:33] I think it's called, like that might be the closest.
[00:24:36] Yeah, a colleague of mine's got a Remarkable.
[00:24:36] Okay.
[00:24:37] So that might be good for me.
[00:24:38] That might be the best sort of compromise.
[00:24:41] So it does take good notes.
[00:24:43] Yeah.
[00:24:45] It has great translation.
[00:24:45] It only supports PDF.
[00:24:47] Okay.
[00:24:47] It doesn't link to any of your apps.
[00:24:49] It doesn't.
[00:24:50] And I'm like, come on, you know?
[00:24:55] Okay.
[00:24:55] Take a Remarkable, merge it with the iPad.
[00:24:58] You know, the two of them should make, and there we go.
[00:25:01] And so I'm waiting for that, Bob.
[00:25:03] If I could just keep writing on my post-its, and then just peel it off and
[00:25:11] stick it into like a little-
[00:25:13] A little scanner.
[00:25:14] A little slot, a little scanner.
[00:25:16] And then, so I can literally immediately throw the piece of paper away, that would help
[00:25:20] me because just the piles and the clutter and the curling of the notes and the, it's
[00:25:26] just, it adds to the rest of the clutter.
[00:25:29] Sure.
[00:25:29] Now, you know, Bob, that somebody that's listening to your show that's maybe 20 years younger than
[00:25:36] us is rolling in laughter.
[00:25:39] Like these two old codgers, man.
[00:25:42] But then again, like I said, my daughter, they like physical books.
[00:25:46] As a matter of fact, in one of my classes, because I teach MBAs as well on angel investing.
[00:25:53] And I asked, you know, so these are 20-somethings, a few are early 30s, you know, so that age demographic.
[00:26:00] And man, to me, it's like this great research lab, Bob.
[00:26:02] I mean, I can just, I got my own captive audience to ask questions like this.
[00:26:07] And I asked them how many of you like to read, you know, books, you know, physical books.
[00:26:13] They all put their hands up.
[00:26:15] And I went, well, why?
[00:26:17] I mean, I love my, my Kindle app on my iPad.
[00:26:21] And, and they said, you know, I, there's just something physical feeling.
[00:26:27] So I, I don't know, you know, maybe we're going to stick with good old 3M.
[00:26:32] I think we're going to have, you know, Post-it notes, you know, well into the 2100s, right?
[00:26:40] The 22nd century.
[00:26:42] There are certain things that just don't die.
[00:26:45] They don't go away.
[00:26:46] Right.
[00:26:46] Yeah.
[00:26:47] I mean, I saw somebody post about this on LinkedIn the other day.
[00:26:51] It's like as fast as some of us, you know, in the, in the trenches, in the, in the middle
[00:26:56] of all this, you know, these AI advancements as fast as we think it's moving, the rest of
[00:27:02] the world is not moving nearly that fast.
[00:27:06] People still have legacy, you know, infrastructure.
[00:27:09] I won't say they're still using, like, Lotus 1-2-3, but I
[00:27:13] mean, you know, they're not necessarily using the most advanced things to run
[00:27:19] their business or, you know, the things that they use on a daily basis in their personal
[00:27:24] lives.
[00:27:24] Well, one of the coolest things I saw a few years back. You got a sense that my previous
[00:27:30] companies were all in the cellular industry.
[00:27:31] And, and I just had the great fortune of being at the right place at the right time in the
[00:27:37] early nineties when, you know, all the cell carriers, you know, grew frenetically, a whole
[00:27:43] bunch of them came up and we had a piece of technology that was unique that nobody else
[00:27:48] had, but they all had to have it.
[00:27:50] And so that was a fun, fun ride for me.
[00:27:54] This was my second company.
[00:27:56] First one, I don't talk about much, but my second company that had a great, a great exit,
[00:28:01] you know, called Metapath.
[00:28:03] But anyways, so a few years ago I was in Africa climbing Kilimanjaro.
[00:28:08] And so I'm in East Africa.
[00:28:10] We have the Maasai warriors.
[00:28:12] That was one of my first times I'd been to East Africa.
[00:28:16] And so we're driving and you know, the Land Rover, you know, driving along and I look out
[00:28:22] and there's a Maasai person tending, you know, their goats.
[00:28:28] And this is in Tanzania and in Tanzania, the Maasai, they have open border rights.
[00:28:34] In other words, they can go where they want with their flocks.
[00:28:36] And so here, I mean, literally it's like National Geographic.
[00:28:39] Here's a person like, well, this is cool.
[00:28:41] I guess National Geographic didn't dress people up.
[00:28:44] They actually live like this.
[00:28:45] Okay, well, this is cool.
[00:28:47] And so I'm watching that we're driving and I'm watching that.
[00:28:50] And all of a sudden, he reaches in, pulls out a cell phone, a smartphone.
[00:28:57] And he's talking on it.
[00:29:00] And my whole time in Tanzania, everybody had a smartphone and they use it for their primary form of commerce.
[00:29:07] Right.
[00:29:08] All the payments, everything is paid on their smartphones.
[00:29:11] Matter of fact, Africa, as you may or may not know, was one of the cutting edges of payments over cell phones.
[00:29:18] I had a company back in the early 2000s.
[00:29:23] And we were one of the primary drivers of that.
[00:29:25] But I mean, they, because they didn't have anything else.
[00:29:28] They didn't have a wallet.
[00:29:29] They didn't have credit cards.
[00:29:30] They didn't have whatever, but they all had phones.
[00:29:31] And so anyways, to me, that was such a juxtaposition of the traditional ways that they've been doing for hundreds, thousands of years, combined with new technology, state of the art technology, far in advance of even what we in the West were using.
[00:29:48] And it was just so fascinating to see that how those folks could embrace technology because it solved so many fundamental problems that they had.
[00:29:58] But to see a Maasai warrior with a smartphone, I just, I was like, man, this, I should have videoed it.
[00:30:06] Right.
[00:30:06] To me, it's just like you couldn't have created an ad for Verizon better than that.
[00:30:13] Right.
[00:30:14] Yeah.
[00:30:14] I remember because IBM had set up a research lab somewhere in Africa, I'm blanking on where exactly, but that was one of the primary industries that they were focused on.
[00:30:26] And it's really, I mean, it would have been a trip to see that firsthand.
[00:30:32] But it wasn't just the fact that they embraced that technology; it's that they kind of leapfrogged technology, right?
[00:30:39] They didn't need to.
[00:30:40] Where are they going to set up a desktop computer?
[00:30:43] Right.
[00:30:44] Like, why would you even do that?
[00:30:46] It's nowhere.
[00:30:46] I mean, it was such a leapfrog.
[00:30:49] And one of the pioneers of this was Celtel.
[00:30:52] When I sold Metapath, we merged with a British company called Mobile Systems International.
[00:30:59] And they had a sister division called Celtel, which was, you know, primarily providing cellular service in Africa.
[00:31:07] And they rolled out this latest technology.
[00:31:12] And, you know, CTO was a brilliant man and he and I were quite close.
[00:31:17] And so anyway, so I, so they rolled this out.
[00:31:19] They said, I think we could do this.
[00:31:21] And of course they could because they didn't have any of this legacy that we had in the West.
[00:31:26] That we had that, well, we got to make sure we're, you know, adapting to the past and whatever.
[00:31:32] No, they just were able to go plunk and drop it in.
[00:31:36] And it was something to behold, which taught me a lesson that even in the case of AI, sometimes the breakthroughs are leapfrogs.
[00:31:46] And that is hard to predict when those leapfrogs are happening.
[00:31:50] Is generative AI a leapfrog?
[00:31:52] I don't know.
[00:31:54] I will know in another year.
[00:31:56] Yeah.
[00:31:57] Of course, you know, the rumblings out of OpenAI are that their next version is not doing well.
[00:32:04] It's not the leapfrog that they had hoped it would be, and they're struggling with that; they're reaching the end of some capacity.
[00:32:12] Which those of us in the computer science world, we know the limitations of large language models.
[00:32:18] Yeah.
[00:32:19] Yeah.
[00:32:19] That problem, I agree, is coming up.
[00:32:22] We've seen the speed of the advancements slowing down.
[00:32:27] That's not a problem unique to OpenAI by any stretch.
[00:32:30] But yeah, that's why I'm fascinated by some of the alternative, you know, models that people are coming up with.
[00:32:36] There's a startup out of, spun out of MIT, I think, called Liquid AI.
[00:32:41] Yeah, Liquid AI.
[00:32:43] Taking a totally different approach to how we do this, much more efficient.
[00:32:47] Totally different approach.
[00:32:48] And of course, that's the beauty of technology.
[00:32:51] More importantly, that's the beauty of science, right?
[00:32:53] Is that, you know, everything is built upon the predecessors, right?
[00:32:57] I mean, you can see the work that IBM did sprinkled throughout everything that's happening now.
[00:33:03] And that's the beauty of technology when, you know, it's sad that companies like IBM, and I know they're scrambling, trying, you know, why isn't IBM one of the leaders in generative AI?
[00:33:14] It doesn't normally happen that way.
[00:33:15] Normally, it's a small bunch of very smart people that can remove some boundaries.
[00:33:21] They remove some limitations that may exist at some of these companies.
[00:33:25] Yeah.
[00:33:26] I give Satya Nadella a lot of credit at Microsoft for truly reinventing Microsoft.
[00:33:32] Of course, that's back in the old cloud days, you know, all of five years ago.
[00:33:37] Right, right.
[00:33:38] And of course, then they, you know, dumped $10 billion into OpenAI in order to get a leap on AI, you know, large language models.
[00:33:46] So, yeah, I think he's been an amazing leader, but I never count IBM out because there's some really brilliant people there.
[00:33:53] Oh, I never count them out.
[00:33:55] I would go back to the Lou Gerstner days, right?
[00:33:57] So, the dark days of IBM and then they came charging back.
[00:34:01] So, IBM actually bought one of my companies, Bob.
[00:34:04] Oh, really?
[00:34:04] I have a near and dear feeling towards IBM.
[00:34:08] Interesting.
[00:34:08] I have a nice home courtesy of IBM.
[00:34:12] Yeah, Lou and I started right about the same time actually.
[00:34:16] Is that right?
[00:34:17] Oh my goodness.
[00:34:18] Yeah, back in the mid 90s.
[00:34:19] I think we started just a couple months apart.
[00:34:22] So, you know, certainly I occasionally joke that, you know, their revival was, you know, who knows?
[00:34:28] Was it Lou?
[00:34:29] Was it me?
[00:34:29] I don't know.
[00:34:30] I don't know, yeah.
[00:34:32] I wanted to ask you about some of the ways that you think about how AI, you know, we're talking about leapfrogging, and if you think about that in the context of learning and development, there are people, just like your students at the University of Washington, or your daughter, who still like to read, you know, have that tactile feeling of reading a physical book, or maybe they're still clinging to a BlackBerry because they like the tactile feel of a real keyboard.
[00:35:00] But in learning and development, I mean, you may still have people that think they learn better, you know, in a classroom, an instructor-led, you know, classroom.
[00:35:08] And that's why maybe that was one of your motivations for going back to get your MBA, you know, later in life.
[00:35:14] But I mean, I guess how do you think about advancements in L&D and like in this sort of age of AI and how it's used?
[00:35:25] So, every modality of learning has advantages and disadvantages.
[00:35:29] And I don't think there's a winner or a loser.
[00:35:32] I think it has to do with the learning.
[00:35:34] We believe, I believe that it has to do with what modality is best served by the content or the topic you're trying to address.
[00:35:43] If you're trying to address leadership, for instance, I don't know about you, but my leadership over my career has always been delivered by great mentors, one-on-one relationships.
[00:35:58] I've done a number of leadership courses led by some amazing people.
[00:36:04] And you can ask questions.
[00:36:06] It's impromptu.
[00:36:07] There's still, as humans, we still like a little physical touch on very critical areas.
[00:36:13] But if I'm trying to learn, you know, the Python language, I don't need a professor to teach me that.
[00:36:22] So, it's interesting when I, so I've taught at the University of Washington part-time for eight years, a few years full-time when I was between my gigs.
[00:36:33] And it's interesting that I, my delivery has changed.
[00:36:40] And I'm one of those instructors that welcomes, you can imagine, welcomes technology into the room.
[00:36:47] Why not?
[00:36:48] They're going to do it anyways.
[00:36:50] They're either going to hide it underneath the desk.
[00:36:51] And, I mean, this is a big debate at all universities.
[00:36:54] And especially now with AI coming in, which is hilarious.
[00:36:57] The fear among the professoriate about, you know, AI.
[00:37:03] And, you know, they're using AI to detect the fake AI.
[00:37:08] And so, it's a battle of the AIs, you know, to see who's cheating.
[00:37:12] I am a case study professor.
[00:37:15] And as a result, well, hey, generative AI, man, that's written for, you know, answering cases.
[00:37:22] Yeah.
[00:37:23] To a limit.
[00:37:24] Because then if you write a brilliant case, oh, when I'm meeting with them in person, you know, I've read these.
[00:37:33] And I flag those where I sense that they may have used a lot of AI.
[00:37:39] And so, I'll just call them out in class.
[00:37:41] I mean, I just, we have a very interactive class.
[00:37:43] And so, then I said, now, now, Joey, you said this was one of the points that you made.
[00:37:51] And could you talk about that?
[00:37:52] And can you defend that?
[00:37:54] And can you, you know, I have some rebuttal against that.
[00:37:56] And, you know, so anyways, my point there was that I use a lot of technology for them to prepare to come into class.
[00:38:10] In the past, a lot of classes, and you've been there, were informational dumps.
[00:38:17] In other words, come to a class, you know, in-person instruction.
[00:38:23] And it was just a dump of information.
[00:38:27] Knowledge.
[00:38:27] I'm dumping my knowledge to you.
[00:38:30] Boy, what a waste of time.
[00:38:32] Now let's use as much technology as we can out of class.
[00:38:37] And this is called flipping the classroom.
[00:38:39] Which is like, you know what?
[00:38:40] I'm going to help you learn all that garbage.
[00:38:43] Not garbage.
[00:38:44] It's valuable stuff.
[00:38:45] All that knowledge.
[00:38:48] Outside, on your own, at your convenience, in whatever technology you want to use.
[00:38:56] But in the classroom, what we're going to do is we're going to elevate it, right?
[00:38:59] Kirkpatrick's model.
[00:39:01] We're going to have some deep conversations about what you learned from it.
[00:39:06] Of course, in the old days, you know, if you weren't prepared, you didn't want to come to my class.
[00:39:13] And I don't call them out.
[00:39:17] I'm not that Harvard type.
[00:39:18] I've been a guest lecturer at the Harvard Business School.
[00:39:21] I actually have two cases from my previous companies at Harvard.
[00:39:24] Not bragging.
[00:39:25] I'm just saying like, wow, this is amazing.
[00:39:27] Amazing experience.
[00:39:28] I told my MBA students at the University of Washington, I would say it to anybody that's in higher education.
[00:39:35] There are students at public schools and private schools that are as good as, if not better than, the elite at Stanford and Harvard or MIT.
[00:39:45] I mean, there's some smart people there.
[00:39:47] And, but there's smart people everywhere.
[00:39:50] I do not discount the human ingenuity around the world.
[00:39:54] And I love that about teaching at the University of Washington.
[00:39:56] But anyways, the purpose, to your point, if you elevate the conversation in the classroom, then everybody wins.
[00:40:05] It just boggles my mind that we're still doing in-classroom instruction where it's nothing but a knowledge dump.
[00:40:14] Those days are gone.
[00:40:15] So, look, one of the things that we offer, and we have a couple of customers, you know, we do e-learning.
[00:40:23] You know, so, and what we call it is building a foundation.
[00:40:27] And in our e-learning, we're doing in-the-moment, in-context, video-rich vignettes, great stories.
[00:40:34] And in those moments, we ask behavioral and attitudinal questions.
[00:40:37] And that's the great offering that we're doing: instead of just doing assessments or pulse surveys or stuff like that,
[00:40:45] I mean, we're actually in the moment.
[00:40:46] We got a captive audience, and we're doing some pretty exciting behavioral stuff.
[00:40:50] Well, what we do then is we share that with the company.
[00:40:53] They will then hold a facilitation, in-classroom or over Zoom, but in-classroom is best, with the managers and leaders to talk about the results that we are now able to share with them.
[00:41:07] About their company and their people.
[00:41:10] And it's like, and they've all taken the content as well, and we have the results, and we can say, here are some issues that your company has in regards to behaviors and attitudes and cultural mismatches.
[00:41:23] But now let's have a conversation about that.
[00:41:27] Let's do some higher-level processing and figure out what we're going to do to address that.
[00:41:33] That's the purpose of in-classroom training, in my view.
[00:41:38] It's in-classroom learning and discussion, not just a knowledge dump.
[00:41:46] You know what you should know?
[00:41:48] You should know the You Should Know podcast.
[00:41:51] That's what you should know.
[00:41:52] Because then you'd be in the know on all things that are timely and topical.
[00:41:57] Subscribe to the You Should Know podcast.
[00:42:00] Thanks.
[00:42:00] Yeah, no, I think that's fantastic.
[00:42:03] I'm certainly sensitive to a lot of this.
[00:42:06] I mean, certainly looking back on my formal education, but also because I've got a teenage daughter applying to, well, not yet applying, but she's a junior in high school.
[00:42:17] Yeah, but we're already sort of in this process, partly because she's got friends who are seniors or even freshmen in college now.
[00:42:25] Well, she's taken the PSATs, she's been talking about her AP classes and all this other stuff.
[00:42:30] But one of the reasons I wanted to meet with the Board of Education and the school administrators personally is because I want to make sure that they are paying attention to what's going on around them.
[00:42:42] That even if they're not like the very early adopters, because it's going to take a while, right?
[00:42:49] It's a process, right?
[00:42:50] You've got to get the teachers comfortable.
[00:42:51] You've got to get perhaps some of the parents comfortable.
[00:42:54] But this is an absolutely net positive thing.
[00:42:57] But it is a behavioral change for everyone.
[00:43:02] But you've got to really think about what is best for these kids to actually learn and gain the skills that are going to be needed in the future, the innately human skills, the durable skills that are going to be needed to succeed.
[00:43:16] Because to your point, just memorizing a bunch of stuff is not really useful; it might be good if you get on Jeopardy.
[00:43:23] Yes, yes.
[00:43:25] But otherwise, you're going to have a co-pilot.
[00:43:28] This technology is not going to go away.
[00:43:30] It's only going to get better and smarter and more personalized and more contextual, which is one of the things I love about your example with getting sort of in the moment.
[00:43:40] This is exactly what we're seeing.
[00:43:42] I mean, surveys are just point in time, emotional responses.
[00:43:48] You took months to draft it.
[00:43:50] You're going to take months to analyze the results.
[00:43:52] This is in the moment.
[00:43:54] This is in the moment, real time.
[00:43:55] Yeah, here's the insight.
[00:43:56] Here are the patterns that we see.
[00:43:58] Here are some anomalies maybe that we want to account for and do some one-off stuff.
[00:44:04] But that's how people, I don't want to say evolve, but it is how you are going to learn more effectively and increase your potential to succeed in all kinds of different roles that we can't even fathom yet.
[00:44:20] That's right.
[00:44:20] Could any of us have imagined four years ago, coming out of COVID, ChatGPT?
[00:44:28] It wasn't even in our consciousness.
[00:44:32] And then look at how rapidly it became adopted around the world.
[00:44:37] It's just mind-boggling.
[00:44:38] I get asked about my thoughts on the Gen Zs.
[00:44:46] And now they are coming into college.
[00:44:49] And I've taught a number of Gen Zs.
[00:44:54] And people are like, you know, they're the first real digital generation.
[00:45:00] Right?
[00:45:01] So they've been digital since birth.
[00:45:04] Right?
[00:45:04] To see my – I have a grandson that's three years old.
[00:45:08] And to see his digital adeptness is stunning.
[00:45:12] It is stunning.
[00:45:13] I just can't fathom that.
[00:45:17] And so Gen Zs are the first digital ones.
[00:45:20] And they're used to, with all of its positives and negatives, you know, having a phone that's got real-time, instant access.
[00:45:28] So I've been asked, how is it to teach them?
[00:45:31] And I said, you know, every generation says this.
[00:45:35] You know, matter of fact, most people critical of Gen Zs are actually millennials.
[00:45:38] But anyways, I can relate to this, right?
[00:45:42] So when my generation was young, they have the same passions, the same desires, the same excitement, the same – I love teaching Gen Zs.
[00:45:55] Now, do they have some peculiarities?
[00:45:57] Yes.
[00:45:58] But guess what?
[00:45:59] We have to adapt to those peculiarities and help them become strengths.
[00:46:04] Now, do they have great social skills? That's a question a lot of companies are facing as they're coming into the workforce.
[00:46:11] They're suffering from a lack of social awareness.
[00:46:15] And they have some behaviors because of their lack of social interactions, you know, to the extent that we had growing up or even millennials had.
[00:46:28] So there's a social element that companies are dealing with.
[00:46:31] It's impacting teams and how teams work.
[00:46:34] And I see that because in my classes, I do teams a lot.
[00:46:38] And there's not a lot of compromise.
[00:46:42] They don't know how to compromise naturally.
[00:46:45] And so I do find myself teaching them some basic team building that I didn't have to do in the past.
[00:46:52] And so there's some adaptation that has to go on.
[00:46:56] But guess what?
[00:46:57] Corporate, the world, the organizations that they're going to work for, they've got to adapt, too, and take on their strengths.
[00:47:04] But they have the same passions.
[00:47:05] They want to succeed.
[00:47:06] They want to make a difference.
[00:47:07] They care.
[00:47:08] They care about the society.
[00:47:11] They care about the net.
[00:47:12] So they have these same passions like I did when I was a 21-year-old and had these great dreams.
[00:47:18] So do they.
[00:47:19] And so I have a lot of faith and confidence in our future generations.
[00:47:24] And they now have digital skills that Bob, you and I would have dreamed to have.
[00:47:29] And they actually just don't even think about having to worry about some knowledge stuff that we had to memorize in U.S. history class in high school.
[00:47:43] So, you know, they don't even have to waste those brain cells to process that.
[00:47:49] So I'm excited for you and for your daughter going into college.
[00:47:54] Yeah, no, it's going to be an exciting time.
[00:47:57] I'm psyched to go visit some campuses.
[00:48:00] She's got some older cousins in college now.
[00:48:04] So we're going to try to see them in the springtime.
[00:48:07] And so that'll be.
[00:48:11] Yes.
[00:48:12] And it's changed, Bob.
[00:48:14] Heads up.
[00:48:14] It's changed.
[00:48:18] I have no doubt.
[00:48:20] I have no doubt.
[00:48:20] But I also, to your point, I mean, I think they are digital natives and they're comfortable with technology, which is all the more reason.
[00:48:26] I mean, I don't know when you got your first computer or built your first computer.
[00:48:31] But '76 was the first time I built a computer, and I got my computer science degree in the early 80s.
[00:48:36] And so it's like, we had a Commodore 64 at home, probably early 80s.
[00:48:43] And we also had Apple IIcs or something in class in middle school.
[00:48:48] So I was, what, 11, 12 years old or whatever.
[00:48:52] So early 80s.
[00:48:53] So I just think, you know, it's even more sort of approachable and accessible now than ever.
[00:48:59] And they're taking this in stride.
[00:49:01] They're probably using it, whether it's a voice virtual assistant like Siri or Google or whatever, or it's just technology that's embedded in the social apps that they use all the time.
[00:49:12] I mean, they can handle this, right?
[00:49:15] So let's not pretend like they can't or it's too dangerous or whatever.
[00:49:20] We went through this with social media.
[00:49:21] We went through this with the internet.
[00:49:23] You've got to teach them how to use it properly.
[00:49:26] Responsibly.
[00:49:27] Yeah.
[00:49:27] You know, I am a little nervous, obviously.
[00:49:29] And there needs to be some safeguards about that.
[00:49:32] The algorithms now are so advanced that they can create addiction so easily.
[00:49:37] And, you know, there's the tribal nature of it that tends to just reinforce.
[00:49:42] And so there is some safeguards that need to be put on.
[00:49:45] Oh, you know what?
[00:49:47] I think we had this conversation in 95 with the internet.
[00:49:50] Right.
[00:49:52] When Mosaic came out and the World Wide Web took off.
[00:49:55] I, you know, we had these same conversations, same song, you know, sixth verse.
[00:50:02] And so, it's funny how casually they process digital information, where,
[00:50:12] you know, I don't know about you, like you said about converting our post-it notes, sticky notes.
[00:50:18] They don't even think about those sorts of things.
[00:50:20] Yeah.
[00:50:21] They're just natural to them.
[00:50:23] They're digital natives.
[00:50:24] And I, that's why I'm so excited about what's happening.
[00:50:28] I'm excited what's happening in learning and development.
[00:50:32] Because it's changing rapidly to, and like I was saying, that the stuff, the normal knowledge-based stuff,
[00:50:41] that can be done digitally.
[00:50:43] It can be done online.
[00:50:44] It needs to be engaging, but all those good things.
[00:50:47] And that we can spend our precious time face-to-face, or in a classroom, or around water
[00:50:54] coolers, or over Zoom, or whatever, on higher-order issues that we've been kicking down the road for so many years.
[00:51:03] And I like to say that our goal is to eliminate the
[00:51:08] administrivia out of our lives.
[00:51:12] And how much more productive we would become if we're maximizing our own human strengths and letting the digital tools,
[00:51:21] including AI,
[00:51:23] eliminate so much of this,
[00:51:25] what I call intellectual waste out of our lives.
[00:51:30] I welcome that with open arms.
[00:51:33] Right.
[00:51:34] Well, I think for you, I mean, as a leader and being in the L&D space, it's, you've got two things.
[00:51:42] You've got the efficiency gains of automation, intelligent, you know, cognitive automation.
[00:51:47] And then you've also got the augmentation.
[00:51:50] Like, how are we helping people actually learn the things that they need to learn to be better?
[00:51:55] And with behaviors and attitudes that they really need to work on and how can we build civility and respect in the workplace?
[00:52:03] And man, those are hard things, you know, that we've been working on for 30, 40 years, and we've barely made a dent.
[00:52:11] Those, those are the issues that impact companies' bottom lines when you have these issues inside the company that we've now seen over the last couple of years
[00:52:20] that just can tear entire culture and company to pieces.
[00:52:24] I mean, I love Nike.
[00:52:27] Man, talk about, you know, self-destruction and being knocked off that pedestal, which is sad.
[00:52:36] It's sad.
[00:52:36] And I wish the best that they can get around this.
[00:52:40] But what was it?
[00:52:41] Oh, it was human behaviors and attitudes that destroyed the culture at Nike that Phil Knight built.
[00:52:49] And so, you know, this, this stuff, this stuff is where we can really make an impact on helping companies grow and be more productive and more successful.
[00:53:01] Absolutely.
[00:53:02] John, I could probably talk to you for a couple more hours, but I want to be respectful of your time.
[00:53:07] This has been fantastic.
[00:53:09] Thank you again for spending so much time with me.
[00:53:11] I think there's a lot for my listeners to take away from this.
[00:53:15] And yeah, thank you again for, for all that you do.
[00:53:18] And I hope we get to talk again soon.
[00:53:20] You could tell them I may be a little passionate about the topic.
[00:53:24] And so when I found out about the opportunity of being on, you know, Elevate Your AIQ,
[00:53:31] I was just like, I was like a kid in a candy store.
[00:53:35] Like, oh, I can't wait to talk to Bob.
[00:53:37] I love it.
[00:53:38] So I, this is a real service that you're doing, Bob.
[00:53:41] And I wish you the very best.
[00:53:43] This is so much fun and I respect you so much.
[00:53:47] And I wish you the best.
[00:53:48] Thank you, John.
[00:53:49] I don't even know what to say.
[00:53:50] Thank you so much for that.
[00:53:52] That makes me feel like it's all worth it.
[00:53:54] So thank you.
[00:53:55] All right.
[00:53:56] All right.
[00:53:57] Thanks again, John.
[00:53:58] And thank you everyone for listening.
[00:53:59] We'll see you next time.