L&D In Action: Winning Strategies from Learning Leaders - The Technology Question: Balancing AI with Strong Leadership and Learning Content
Episode Date: May 30, 2023. In this episode of L&D in Action, we’re joined by Christopher Lind, VP and Chief Learning Officer at ChenMed. As a career investigator of the intersection among technology, people and business, he ultimately asks one question, which he and host Tyler interrogate at length: What’s the healthy center for adopting and utilizing new technology in the workplace? When it comes to artificial intelligence, Christopher identifies an uncertain path ahead, though he agrees there is potential to enhance interactions between leaders and their teams by making them more profound, personalized and meaningful.
Transcript
You're listening to L&D in Action, winning strategies from learning leaders.
This podcast, presented by GetAbstract, brings together the brightest minds in learning and development
to discuss the best strategies for fostering employee engagement, maximizing potential,
and building a culture of learning in your organization.
With an eye on the future and a preference for the practical,
we address the most important developments in edtech, leadership strategy, and workflow learning.
Let's dive in. Hello and welcome to L&D in Action. I'm your host, Tyler Lay, and today I'm speaking with Christopher Lind. Christopher is the VP and Chief Learning Officer at ChenMed. He is the host
of Learning Tech Talks,
and he's also a husband and father of seven. Christopher, thank you so much for joining me
today. Thanks for having me on. Looking forward to chatting. I always start off just asking my
guests if you can explain a little bit about how you got into learning and development,
about your background, and how you got to ChenMed, and what it is that you do today.
Yeah, so I've got a bit of an unorthodox background. I was originally going to be in the tech space. I was big into coding, decided I didn't like coding, was more fascinated by people.
So I thought I was going to be a teacher then, decided I didn't like teaching in a traditional
sense. So was lost in the ways of corporate America and then found my way into corporate
learning, not in the traditional HR sense, but I was always that person
that business leaders called on to help get their people strategy aligned, and it just grew into bigger things and bigger companies, bigger responsibilities. And now I'm sitting in my first HR stint as the
chief learning officer, which has been an interesting run. What was it that you didn't
like about teaching, would you say? I'm curious. So, the way we treat traditional education to me was just always broken. I was always that
student that went, we spend so much time doing this thing that is actually dysfunctional. It
just was so much about filling people's heads with information instead of actually teaching
them how to do things, to actually put skills into place. And so, the idea of standing in front
of a room and filling people's heads with knowledge was not something that I could do. Plus the parents, honestly, the kids were great,
but the parents just drove me bonkers. I was going to say, I hope it wasn't the kids that
drove you crazy because you produced seven of them yourself subsequently.
I know parents that came in and were like, you got to do it this way and that way. And I'm like,
why don't you stay out of my business? They all know what's best for their kids. And they're going to make sure that you do what
they want. Of course. Right. Even though I'm the one dealing with them all day long.
Let's start off hot here. I want to jump right into what's on everybody's mind right now. Make
the people happy. I would like to talk about AI, generative AI, and where we're at with that right
now. I've seen a few of your talks recently, a few lives that
you've been on, a handful of conversations that you've had where you address this topic.
One way that really intrigued me that you described what we're dealing with right now,
and this is, I think, generally true of the progress of technology, but what we're dealing
with is new innovations that ideally lead to greater efficiency. But there's also this tension
between: are we getting more efficient here, or are we getting more anxious? Are we increasing our own anxiety because now things are supposed to be done faster, and that kind of alters all the processes around whatever it is that we're doing? So this is like a spectrum between anxiety and efficiency.
I just was looking at some articles a long time ago where it was just one of these things where, by the time technology advances, we'll only have to work four
hours a week because we'll be so efficient with everything we do. And you look at
it now. And it's like people are burning the midnight oil working 100 hours a week, they're
more stressed out than they've ever been in their whole life. And you just look at it and go, we
obviously did something wrong here, because all this stuff is supposed to be creating greater
efficiency. But somehow we missed the mark. And instead of saying, wow, how is this
improving the quality of what we do? How is this making us distinctly better? Instead, it's just
like more and faster. Let's just keep charging forward. And I think that's probably one of the
greatest disservices we could do to what technology is capable of adding to humanity right now.
Is that the path that we're on inevitably? Or the way that I think about this
is ultimately, these innovations have to have some sort of social good aligned with them at the end.
Ideally, you know, something in the way of equity or in the way of health or in the way of general
happiness has to be the greater outcome of all of this. But what it really seems like right now,
especially with AI, and I would say with
past few sort of waves of tech innovation and web 3.0 type stuff, going back to like crypto and
NFTs or whatever it is, it really just seems like it's sort of a wave of hype. And it's a wave of
secondary and tertiary businesses coming off sort of main technology that are really just focused on
optimizing whatever that thing is and churning out more and more of it, without a real, true, genuine focus on a greater social good, like I was saying. Would you agree with that?
To some extent. I mean, human history has shown us that this is what we have a horrible habit of doing, even going back, you know, pre-computer age. It's like, we come up with innovations and we just charge at them with everything and think anything we do with them is good. And then we realize we scorched the earth, decades later, and then look back and go, maybe we should have approached it a little
bit differently and we wouldn't have had such a fallout. And I think inevitably we're going to see
a lot of that now. And I think that's one of the things where I'm trying to be an outspoken voice
in this to go, it doesn't have to be that way. We could take a different path than we have before,
but I think right now we are at this precipice of choice where we go, so what are you going to do?
Are you going to make the right choice? Maybe you don't even realize you made the wrong choice till five, six, 10 years later. And then you look back and go, and I think that's
where right now we're seeing a lot of this stuff with technology. There's a ton of stuff spinning
up that, honestly, it's going to take us a while to realize is garbage. And there's going to be people that are going to get hurt, and bad things are going to happen. And then we're going to, hopefully at some point, realize what we're doing.
Obviously, it's not 100% bad. And there's plenty of good that's coming from this. And
there is greater efficiency in certain areas. And some people are being greatly helped in many ways.
But you see that the letter that was signed by the thousand something tech leaders that said,
hey, let's kind of slow this down more or less. Are you on the side of that?
Do you think maybe that we need to slow down a little bit?
Do we need to reconsider and just have a more thoughtful conversation?
So here's the thing, going back to your earlier point, because sometimes people hear
me talking about the risks and they think I'm opposed to it.
And I'm like, I'm actually not.
I see huge benefit to mankind and people.
And I mean, it's fantastic what we can do with it.
I just also see it's like this 50-50 scale. And it's like, there's gonna be a lot of people that go the wrong way.
The challenge I have with the question on do we slow things down or not is: is that even a question we can ask? Because even when the announcement came out that they were saying we should slow it down, I just kind of chuckled at it going, who's gonna slow it down? The train is out of the station. Is anybody going to actually slow it down, at least on that end? I think where we have an opportunity to slow
it down is if you look at our industry or in organizations, even, I mean, I'm a dad and I'm
looking at like my kids and going, okay, how is this going to affect them? This is where we almost
have to slow it down on a personal level and an organizational level, instead of worrying about
are the big tech giants
going to slow down their training of advanced AI algorithms when it doesn't matter how many
thousands of people sign a petition saying that they should. Well, ultimately, I mean, that is how
you slow something like that down, the democratic way, I would say: as individuals and as
organizations, if you decide not to adopt things super rapidly and you take more care and you're
more discerning,
the money flows in more slowly.
I mean, there's always the crazy capital investment.
But theoretically, you know, it's the behavior of the market.
As individuals and as organizations, we slow things down.
You know, that will force the larger tech companies to slow down.
Is that the sort of future that you would call for?
They'd have no choice but to, because there's no consumer market for it. You compare generative AI to immersive tech, right?
And you can just see the difference, which one has exploded and in a matter of weeks had hundreds of
millions of users using it. Well, the one that everybody went out and thought it was the greatest
thing since sliced bread and signed up and did all this other stuff. What's the one that's taking a
slow walk and kind of figuring out what does it want to be when it grows up? The one that people
are kind of going, I don't know. And like, are we really going to dive into this? So we have a lot more control over it than we realize.
So from a learning perspective and an educational, even like a leadership perspective,
what are the sorts of things that you think that we should be looking to adopt? Or what
sorts of tools do you think make the most sense for us to at least start looking into right now?
I think the ability for us to do more active learning, and I would say the market's not quite there yet.
You know, a lot of the technology we're seeing
is around content generation.
We can produce a lot of the stuff we have a lot faster
and a lot more efficiently, which isn't all bad,
but I don't know that anybody's going,
I really wish our LMS had more crap in it right now.
And so I think then that asks the question of like,
okay, so what else
can it do? And I think its ability to adapt and exchange in a human way. It's like, well, what
better way to tailor that to actually help people grow and develop in their human skills? Now, again,
it also runs the risk of making people dumber. And I think that's one of the biggest risks we
have to watch as learning leaders is are we grooming an organization to not think for themselves through this technology? Or
are we using it to actually engage active learning by putting people through? I mean, just one
example that is growing in a lot of companies right now is the ability to assess conversation
dynamics in meetings or in situations and give people real-time feedback.
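A rough sketch of what such conversation-dynamics feedback might look like under the hood, assuming the meeting has already been diarized into timed utterances. The heuristics here (talk-time share, question rate, a crude interruption check) are illustrative, not any particular vendor's method:

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str
    text: str
    start: float  # seconds into the meeting
    end: float

def meeting_feedback(utterances: list[Utterance]) -> dict[str, dict]:
    """Per-speaker conversation-dynamics stats from a diarized transcript."""
    stats: dict[str, dict] = {}
    prev = None
    for u in utterances:
        s = stats.setdefault(u.speaker, {"talk_seconds": 0.0, "questions": 0, "interruptions": 0})
        s["talk_seconds"] += u.end - u.start
        s["questions"] += u.text.count("?")
        # Crude heuristic: starting to speak before the previous speaker finished.
        if prev is not None and prev.speaker != u.speaker and u.start < prev.end:
            s["interruptions"] += 1
        prev = u
    total = sum(s["talk_seconds"] for s in stats.values()) or 1.0
    for s in stats.values():
        s["talk_share"] = round(s["talk_seconds"] / total, 2)
    return stats

# Hypothetical two-person check-in
meeting = [
    Utterance("Christopher", "How did the rollout go?", 0.0, 4.0),
    Utterance("Tyler", "Pretty well, though adoption lagged in one region.", 4.5, 11.0),
    Utterance("Christopher", "What would you try differently next time?", 10.5, 14.0),  # overlaps the prior turn
]
print(meeting_feedback(meeting))
```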
That's just one example of how that can become active learning. You can actually start to go, how were my conversations with Tyler? How was it objectively? What areas could I have done better at? How did I communicate? Things like that. That's something that can actually help people get
smarter and think more critically. I've had a few folks tell me about AI programs. I haven't seen this myself yet, but some sort of AI application that works within a learning
marketplace of sorts at larger companies where it makes very clear what you can do with that
company, as in how you can grow, where you can be promoted, what you maybe could pivot your skill
set to achieve or like reskill slash upskill yourself in a certain
direction. Because these companies are so vast and because skills are so vast, it takes an AI application
to really be able to determine, okay, this is your specific skill set. Maybe it analyzes the
kinds of tasks and assignments that you've done or the projects that you've worked on.
And it says, based on these things, you could maybe go in this direction, or this might make
sense for you to look into and learn this. Or if you really want to improve yourself, you know, then it gives you like
specific learning recommendations. But the ultimate goal, I think, is, as you were saying,
kind of like to improve the human individual, like in terms of their professional career.
Have you seen that sort of thing as well? Yeah, I mean, I think that's where sometimes I think
organizations are getting ahead of themselves: they're trying to use it to align people to,
like you said, opportunities in the company and all this other stuff. And it's failing to recognize
that most people don't even understand their own skills. And most organizations don't understand
the skills of their people yet. And so it's trying to jump to this, like, how do we align people to
jobs? And you're like, don't you think you might need to know what the people are capable of doing
and what the jobs really are first, so that then it can do a better job matching? Because I think the risk you run if you skip that step is, I mean, AI used well is great; used poorly, it just accelerates your dumpster fire.
I mean, it just turns your dumpster fire into a forest fire. And so I think that's one of the
risks is you throw this thing at something and go great, start telling people what jobs they have.
And all of a sudden, it's like going down a path with the wrong information type of a thing. But I think that's where we can get to. And I
think we can start with the skills first. Well, why do you think we don't fully understand our
own skills? I see that too. And to me, it's like, it's a limit of language almost. It's like,
how do you describe all the granular tiny things that you do? Like, I really feel like there is
some sort of limitation there that's just nearly impossible to achieve to really understand what it is that you know. But I think that's kind of what some people are seeking
through AI is like, well, maybe, you know, this machine that is able to do things that I can't
with my brain, like, maybe there's some way for it to come up with a better system of understanding
than what we as humans have. I guess you're kind of arguing that the answer is that's probably not
true. And I think the thing is, it can, in better ways now than we could before. So I've done some work with some companies who are doing this in really powerful and impactful ways, where what AI is really good at is starting to identify patterns and match those patterns to things. Which really, when you think about the things you do at work, and how those tie to skills and how those tie to different jobs, it's all about patterns. And so you can actually layer AI on top of all the activity. The beautiful thing about so much of our work being digital now is we're literally creating data around this stuff constantly. So you can actually throw an algorithm at something and go, what does Christopher do every single day? What are the different things he does? And what are the kind of patterns that's creating? And where else do we see those patterns in the organization? And now we can create a consistent set of language so that you can start to go, oh, even though these are two distinctly different functions, when we actually look at the patterns of behavior that are happening on a day-to-day basis, they are eerily similar. And you can start to say, okay, well, then I think that's really what we're talking about when we talk about a skill.
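A toy sketch of the pattern-matching idea described here, assuming day-to-day activity has already been flattened into keyword counts per role. Real systems would use far richer signals, but cosine similarity over activity profiles is enough to show how two differently titled jobs can surface as eerily similar; all role names and activity labels are made up:

```python
from collections import Counter
from math import sqrt

def similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two activity-frequency profiles."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical activity logs, flattened into weekly keyword counts.
profiles = {
    "L&D program manager": Counter(stakeholder_meetings=12, writing=8, data_review=5, scheduling=6),
    "Marketing coordinator": Counter(stakeholder_meetings=10, writing=9, data_review=4, design=3),
    "Data analyst": Counter(data_review=15, writing=2, dashboards=9),
}

target = "L&D program manager"
for role, profile in profiles.items():
    if role != target:
        print(f"{target} vs {role}: {similarity(profiles[target], profile):.2f}")
```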
This is an interesting question to me, because I think historically, we would be looking at leadership to do that sort of thing. Ideally, leaders would have a general awareness of their organizations and be able to say, okay, this skill set is very conducive to something else that's going on over here, or we could apply this pattern of capabilities over here. And that to me would be really cool to have: leaders, you know, at the sort of middle level, but also at the upper levels, able to identify those things, maneuver people and push them in certain directions and help them advance in that way. So my question ultimately here is,
what is your sort of ideal learning architecture between this option of new tech, even just your
standard sort of learning tech, LMSs and LXPs, and also AI, versus or in combination with the interpersonal
communication component of learning, which is your leadership encouraging you to learn this
and that and to move in certain directions. Based on all the data out there consistently,
we see when AI is paired with humans, it always does better than humans alone.
And it always does better than AI alone. You put the two together, it's the sweet spot. So when I
start to look at this stuff, I don't necessarily see AI replacing that human component. So even
let's go back to your use case. So would that mean that, okay, well, we'll just let AI do this workforce thing? And now managers just disconnect, they don't even have to worry about what their people
do, because the AI is telling them what to do and where they should go.
No, I don't think that's the case at all. But what it can do is when you talk to most people
managers, while they may have a desire to want to know the intricacies of their teams and what they
do, they don't have the capacity. They just don't have the capacity to actually know what's actually going on
underneath the curtain type of a thing. And so it's not so much
you go, well, just let AI take that, but how can it analyze that and then present you with different
things that you now have the ability to make choices and decisions on and have dialogue with
your people. And I think that's the really empowering part of this tech is that it can
start to see what your people are doing, combine that with options of where might that match with learning, and then present that to both people and leaders,
which the important part is
there needs to be a dialogue on that then.
Because there's so much nuance and contextual stuff,
you can't just be like,
well, the algorithm told me I should apply for this job.
Well, your boss probably knows the politics
and the dynamics of the organizational thing
and your personality and go,
listen, like you may think you'd really like it over there. I mean, I can think of a recent example of this where somebody's like,
I really want to go work in this group. And I'm like, actually, no, you don't. And here's why,
right? Like, here's why. Let me give you these other contextual pieces of information that yes,
on paper, that looks good. And I think that's why you can't ever take people out of the loop. If you do, it's a mess. And I think this is one of the biggest risks with AI if we're not careful: when you really think about what AI does best, it presents you with what
you want. That's not always a good thing. I mean, I don't know about you, but I know that sometimes
the things I want aren't always what's best for me. And I think that's one of the risks that we're
even already seeing arise as AI starts to take off. It's presenting you with more of what
you think you want. Well, that is not always a good thing. And you think about, even, I don't know about your journey, but for mine, some of the best pivotal moments in my career were where someone told me
the thing I didn't want to hear or pushed me in the direction I didn't want to go.
I want to pick on one thing that you said, which is that you think that managers and leaders,
they don't have the capacity to really keep track of everything. And like we were saying,
understand fully all the skills and all the minutiae of what somebody is
doing every day. And I think that is to be expected to some extent, but there's still a duty for leaders to do better than they're doing to understand; that capacity gap should at least be mentioned first.
Yes. And so that's where I would say, and this again is the paradox we're in with this: can you use AI to hit cruise control? You can. And I think
that would be a terrible decision to make. So to your point on that, should you go, well, I don't
need to know what my people do now because AI is going to serve this up for me? No. But is it going
to be able to capture and highlight more of those things for you? And again, should you be using it
in conversation? I mean, I do leadership development stuff within our organization all the time. And one of the things I hit on more and more is,
if you are not spending time having conversations with your team, you should not be in your position.
Like if you are not sitting down and talking to your people, you don't belong in a people leader
position. But I think what AI can do is then augment that so your conversations are more
impactful, more meaningful, more personalized, instead of the generic, how's this project going?
Sounds fantastic type of a thing.
So I want to talk about content now.
You mentioned content earlier.
Obviously, with generative AI, there's content everywhere.
There was already content everywhere. But now those that were churning it out at, you know, a thousand miles an hour are doing
it at 10,000 miles an hour. And I can't say that I'm too familiar with any examples of AI generated,
like corporate learning content, like L&D content, that sort of thing. I'm curious if you have seen
anything like that, where you know, it's coming from AI. Have you seen that? Yeah. And the thing
is, that's not new. Like this is a new trend on the market right now. But even five, seven years ago, I was working with some companies and they were playing around with things where you could do two things. It could ingest your PowerPoint and you could ask it, you could set the parameters for what kind of training you wanted, the length and the audience, and it would spit out like an actual learning pathway for it. Or you could even ask it to come up with content based on publicly available information,
you could type in a topic and say, I want training on this. And can you make it like
shoot the stuff out? It's much better at it now than it was, I was gonna say, I imagine it didn't
start off super great, but it's getting better. I do think that it's good enough that it should be used in any sort of mainstream way. I am suggesting it, and we're in the process of
going through our operations to
figure out where generative AI can give us back more time to focus on the important stuff. So I
do think it plays a very critical role, even just going back to efficiency. You get handed a subject matter expert deck that's 120 slides, and you're like, I don't even know what this thing is. Using generative AI to be able to summarize and highlight key points for you then to react to
and start to build questions on okay, so how might I want to structure this? Or what might be the
important parts? Or what other information might I need to bring in that's not included in this to
provide greater context? It's extremely powerful for that. And I encourage my teams regularly,
like you should be using it. Now, where I see it go wrong is people are like, hey, ChatGPT, write a 30-minute course script
that we can put into e-learning
and then we'll dump it into our AI generated voiceover tool.
And then we'll also put it into our AI image generator thing
and we'll just hit easy button.
Right, like let's just put an easy button on.
We can have an AI generated avatar
that comes and talks to us too.
And like, boop. And then we just hit done.
I'll tell you right now, what that creates is garbage. And it's actually setting up some real risk. So can you use it for good? Yeah. You can also use it to create some really crappy stuff.
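For the "good" use described above, summarizing a 120-slide subject-matter-expert deck into key points a designer then reacts to, a minimal sketch might look like the following. It assumes the python-pptx and openai packages and an OPENAI_API_KEY in the environment; the prompt wording and model name are placeholders, not a recommendation:

```python
from pptx import Presentation  # pip install python-pptx
from openai import OpenAI      # pip install openai

def deck_text(path: str) -> str:
    """Pull the raw text out of every slide in a PowerPoint deck."""
    lines = []
    for i, slide in enumerate(Presentation(path).slides, start=1):
        texts = [s.text for s in slide.shapes if s.has_text_frame and s.text.strip()]
        lines.append(f"Slide {i}: " + " | ".join(texts))
    return "\n".join(lines)

def summarize_deck(path: str) -> str:
    """Draft key points and open questions for an instructional designer to react to."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    prompt = (
        "Summarize this subject-matter-expert deck for an instructional designer. "
        "List the key points, then list what context seems to be missing.\n\n" + deck_text(path)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# print(summarize_deck("sme_deck.pptx"))
```

The point is the division of labor: the model produces a starting draft, and a human designer still decides the structure, the emphasis, and what's missing.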
I mean, I'm a big believer that anything that is educational at all really needs some sort of editorial oversight, to some extent. I worked in publishing for a while and then I worked in
actually creating video content based on best selling books and that sort of thing. And it's important to me to have
humans involved with that sort of thing. And obviously...
Because it gets it wrong.
Yeah, of course. Especially with large language models. With LLMs, you just... There's
so much there. And obviously, OpenAI, they had that sort of human oversight. I think that's
maybe going in the right direction. They understand that humans have to be involved there. But it's like, where do we pass the buck to just the AI?
Where do we say, okay, now you take this from here. That's a critical point that I don't think
I fully understand. We haven't really achieved understanding of that yet, maybe.
No, and I think that's going to be a constantly moving target because as it advances,
where that line is comfortable will shift. People being in the loop, they have to be, because you're going to have to make critical decisions on, well, okay, based on the context of this project we're doing, or the impact, or the risks associated with it, how much are we really comfortable handing over to automation? Well, I mean, that's something you're going to have to decide on an individual basis. But going back to your point on editorial stuff, in my opinion, that's never going to go away. I can already tell you there are people being like, well, we have AIs to do our editing instead of editors. And I'm like, it's going to blow
up in your face at some point and you may not even realize. So, when it comes to content more
generally, if we're talking about all the different media that you can distribute learning content
through, I mean, just as simple as, you know, video versus written versus audio versus whatever
it is. Do you have any broad guidelines or thoughts on what medium we should use in different
scenarios to achieve certain outcomes or just, you know, for specific learning purposes or
different roles? Is there any sort of guidelines that you have, especially when it comes to like
workflow learning, you know, without disrupting the flow of work? Do you have any sort of data or, you know, any ideas of what we should be
focusing on in terms of media? So people get really annoyed with me on this one.
Because I think in general, a lot of times people are looking for like, what is the answer? And my
answer is always it depends. Because there's not a universal answer to this. You know, I saw when
the video trend was growing, everybody's like, video's the thing, everything should be video. It's like, no, not everything should be video. But I think
really, what we can focus on more now, and this is, I think, where generative AI and other tech
can really help is, we don't have to limit ourselves to one medium anymore. And I think
that's something that historically was just never possible before. We used to be in this paradox of choice where it was like, okay, everybody, part of the strategy session is figuring out what kind of medium are we going to develop in because we can only do this once. And now it's like, well, says who? Why can't we repurpose a video and also publish it as audio? And hey, why don't we also turn this into some convenient job aids and takeaways? And heck, we might as well make it into this or that while we're doing it too. And I think
that's where we start to get to know the dynamic needs. Because the reality is, even if you look at a specific audience and say, this group needs this, that's still a very broad assumption. It's like, oh, well, that audience... I mean, there's people. If anything, part of the thing that fascinated me and got me to leave coding to study people was the fact that we are confusing beings. One plus one sometimes equals two, and sometimes it equals five, and we're like, what happened in there? Like, these are the same individuals, same circumstances, and yet the outcomes are completely different. And so I think when it comes to mediums, that's where being dynamic comes in, and saying, why does it have to be one? Why can't it be multiple?
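Part of that multi-medium flexibility is mechanical. As a small illustration of the repurposing step, here's a sketch that strips the audio track out of a finished video so the same recording can also ship as a podcast-style file. It assumes ffmpeg is installed and on the PATH, and the file names are hypothetical:

```python
import subprocess

def video_to_audio(video_path: str, audio_path: str) -> None:
    """Keep the audio stream and drop the video so one recording serves two media."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", video_path, "-vn", "-q:a", "2", audio_path],
        check=True,  # raise if ffmpeg fails
    )

# video_to_audio("leadership_module.mp4", "leadership_module.mp3")
```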
You answered that question the exact way that I answered it in my head when I was
formulating it a few days ago, because I know the answer is that it depends. But I also know
that videos, you know, it had its moment. And it's very important to acknowledge that the more
accessible, for instance, video becomes, you know, the more it gets used. I don't know, with the rise of TikTok and social media and everything, it's just becoming doubly present.
And here's the thing, I think there's some really important things to consider with that. And you do have to think, what are you trying to accomplish with this? Because are there things that video does really, really well? Yeah, it does. But also, can that also be done in audio? You know, can you feel more of a personal connection with someone in audio? Sure, but video too. So, it's just, honestly, this is where the
complexity of it always baffles me. My teams and I are always doing this. We're like, I think this
is going to be what works. And you try it and you're like, well, that completely flopped. And
like, I don't know why, because according to our calculations, that should have soared.
Well, testing is obviously very important, too. I had a few
conversations about this, though, that when it comes to metrics and data and analytics with
learning and development, the history isn't very strong. You know, it's a lot of, you know,
satisfaction type surveys and completion numbers. You know, did you enjoy doing this? Did you
complete it? Simple stuff there. And it's very little about what the ultimate outcomes of that
learning should be at a corporate level. Do you see any good examples of strong metrics being tracked right now?
Yeah, but I think this goes back, and this is just a whole philosophical change.
It's why I got out of traditional academics. Historically, L&D a lot of times treats what we do as education, and it's very difficult to have metrics when you're doing education, because it's really a matter of, do you know this and did you like it? Because that's what you care about in education: did you actually walk away with new knowledge, and was the experience enjoyable? But in a corporate setting, we should be going beyond do you know it and did you enjoy it. We really should be going, are you going to do something with it? And did you do it better than you did before as a result of this? And so when I look at the metrics, when I'm looking
at different things, like that's a big part of our process is defining what are we actually hoping
to accomplish with this? And to be fair, sometimes what we're hoping to accomplish is, did you enjoy
it? That's not a bad metric. I mean, it can be a good metric if that's what you're setting out to
do. If your goal is to improve sales numbers, then that's a terrible metric to set.
And so I think that's where actually defining some of this stuff makes a huge difference. I mean,
granted, there's some static metrics that work well, you know, we're constantly looking at
engagement, and 90 day retention, and things like that. So there's some consistent ones,
but I think when you get down to the granular level of a program, you actually have to understand what you're setting out to accomplish before you can start tracking anything.
Isn't it ultimately ROI though? Isn't at the end of the day, the C-suite with their learning
programs, they want to be able to identify some sort of a return on investment of that learning,
whatever that is, they want to see some profit or revenue from that, don't they? In most cases.
So in my experience, no. And now that may just be how I've approached it, where I set really
clear expectations on what I as a learning leader am responsible for. I can't control
the end game, the outcomes. Now that doesn't mean we won't play a role and we will track that kind of stuff. But some of it is the expectations I set in things of working with the
leaders to say, what are you really hoping to accomplish through this? And sometimes it's a
return on an investment. But sometimes it's we're seeing this problem in our organization. Can you
help that issue stop happening? It's not my job to judge whether or not that's a return on the
investment. If
that's really what you wanted to see happen and it stopped happening, then we've done our job.
And my experience has been nobody's come back to me and gone, I need you to do a reverse regression analysis on what was the dollars invested versus the dollars returned on that.
Because I'm like, you're the one who told me that was a problem. If it wasn't a problem and we fixed
it, like that's your fault. That's not my fault.
I feel like I've seen a handful of L&D posts on LinkedIn and on social media lately where,
you know, for certain cuts and layoff series, L&D was hit pretty hard.
And the explanation that I saw from those is, you know, it's tough for us to draw a
direct line to revenue through what we do.
And in some cases, it's not even being deliberately done, like you're saying.
Like, it's just not really a part of that goal. But I think maybe what I'm trying to say is
that with all other things being equal, in a relatively stable organization, an L&D program just has an ROI, even if it's not abundantly clear what that is, because when
you're improving your people, you know, you're probably making them do a better job at their job.
I know what you're talking about. And this is where I think sometimes L&D has a bad habit
of actually pissing off their stakeholders when we sit and get on the ROI train. I mean,
because again, sometimes the ROI conversation is a convenient way to not tell us what they
really want to say, which is, quite frankly, we don't really see any value in what you're doing
for us. And it goes back to because whenever we have a problem, you're too busy like doing other things or trying to tell us
what you think is better instead of actually hearing us out and helping us solve the problem.
Because I've been through many organizations, we've gone through some major cuts, even recently,
we didn't lose anybody. And it was because our focus is we're here as an enablement function
to the organization; we'll enable what you say are the important things to enable.
And if you pick wrong, I mean, we'll go down with you,
but you can't hang that on our shoulders and go, well, it's your fault.
You fixed this problem and it didn't solve it for the organization.
Like that's what you told us was going to fix it.
Yeah, I see what you're saying.
So when it comes to achieving those things,
we talked about, you know, medium a little bit, but what about content quality? I have a marketing background, and four or five years ago, like we already talked about, video was, you know, very important, especially in advertising. You're talking about social media advertising and online advertising, and there was pretty strong evidence that it was an interesting curve of the quality of video that you did as an ad.
If you did something that was very sort of, you know, low production value, like a selfie
video that was really short, lo-fi, straight to the point, that performed about as well
as like longer form, really high production value videos that we found. And this
is a specific niche and everything. But those two categories performed very similarly. Whereas
anything sort of in between that if you tried to, you know, have sort of like a middle ground
production value, it just it didn't perform as well. And ultimately, the high production value
stuff tended to have sort of a longer life overall, but the simple, really basic, off-the-top-of-the-head type thing versus the really well-thought-out, high-production-value stuff, those performed similarly well. I'm wondering if that holds for a learning and development employee cohort, where you can put together really high-level, really strong quality content and that will do really well, or you can really simplify it, maybe, you know, the shorter, more brief things that maybe even come locally, from internally, where you're not investing in some sort of high-level production value content. Would you say that those have maybe like a similar effectiveness there, whereas the stuff
in the middle isn't as strong? Is that maybe one way to think about this? This is kind of how I
think about content more broadly, but I'm not sure if that applies in a learning context like this.
What I would say I've seen, again, my observations on this have been similar to what you shared.
What I would say is I almost very rarely find instances where super high production quality
on learning content is worth it, just because of the reality of the shelf life of learning content. We'd like to think, oh, this is timeless. It's like, it's not. And so the
time and investment that goes into some of this stuff. And at the end of the day to most employees,
well, we like to have this idea that they're like sitting in beanbag chairs, you know, in a hookah lounge, like just doing nothing but
wanting to develop themselves. That's not how work happens. Usually, when you think about what people
are looking for, when they're looking for learning content, it's like I'm trying to get better at
something or I'm trying to accomplish something and I just need something that's going to help
me do that. It's different than marketing. You're trying to create this like emotional experience
that makes someone want to associate with this brand.
And I think sometimes that's where learning's like,
that's what we want too.
And it's like, but that's not what people want from learning.
Is this resource extremely helpful
for what I'm trying to accomplish?
And I think the one thing I will say though,
the really low, just like too much off-the-cuff, actually doesn't perform very well. Because I would say, especially when it comes to educational content, most people are
not good at just organically being good at education or development; it's just not the way they
think. So I think that is where L&D actually plays well is kind of in that messy middle,
where it's like, it's got to be guided enough, structured enough. We've got to kind of prompt it enough so that it's actually getting to the
point that we're trying to accomplish without having to do, well, let's get a whole sound
studio in here and all this kind of stuff. But if you try and put out learning content and you sound like you're in a fishbowl, forget it. I mean, nobody's going to pay attention to it.
Yeah, of course. That's really interesting, though, because it is very clear that what I'm talking about
is, you know, what do people want from content, which is what do people want from being captured
on their phones and then, you know, being enthralled by whatever it is that they're
just currently looking at, which is objectiveless, more or less outside of, you know, the serotonin
dopamine release and just, you know, chilling.
But when it comes to learning that, you know, that's a categorically different thing, especially
in the context of an organization where it's obviously structured,
and there's probably a serious end goal there, where if you're, you know, learning, you're
advancing yourself, and you're maybe getting promoted, then you're just, you know, developing
yourself more seriously. So that makes a lot of sense what you're saying. Yeah, I mean, I think
sometimes L&D wants to think we just want people to treat L&D like Netflix. And it's like, people just aren't. It's a different purpose. That's not what people look to L&D for.
I mean, that was kind of my initial thought is like, you know, can this be as entertaining
as a movie or a Netflix series, that sort of thing.
But I guess that's just not what it's about, ultimately.
As long as we're talking about employees and, you know, a hookah lounge, you know, just
working on self-development, I want to talk about the concept of a learning culture, because this word comes up a lot, and to me, I also shake my head. I'm glad you make that noise and shake your head too, because to me this is like asking the question of how to, you know, make a utopian society. How do we just be better human beings? How do we improve ourselves in this, like, cosmic way? You know, just me saying to you, like, how do you develop a learning culture? To me, that's not a good question. But what is a
good question is talking about the individual pieces and parts that contribute to a learning
culture that you can like put action items on and kind of develop there. And one of the things that
I've heard you talk about and some of your guests and folks that you've had conversations with is
communities of practice. And I think that this, can you just talk about communities of
practice really quickly? That is something that you speak of, right? So, my first thing on the
learning culture and why I sometimes roll my eyes when I hear people say we need to build that is,
it's actually kind of insulting to your employees because it implies that they're not learning
today. They're just these meat sacks sitting out there just going through the motions. And it's
like, no, people are dynamic. They're learning new things. They're figuring things out. They're running into issues
and they're solving them now. Where we sometimes get frustrated is we go, but they're not doing it
the right way. And I think we got to be careful there because it's like, well, that's assuming
we always know the right way and we don't. But it's also saying like, okay, maybe there are things we need to tweak. So how do we do that?
And I guess reform our learning culture or shape our learning culture to say, okay, and
I think that's where communities of practice come into play is instead of going back to
this ethereal utopian society where we're all just drinking tea and studying, instead
of saying, okay, who are the people that really know what's
going on in these areas? And how are we equipping them to reach people where they are? Going back even to our content thing, many people don't know how to create content. So how are we equipping
them to know, hey, you have this expertise, here's how to shape that. And here's how to get it in
people's hands. Here's how you use this technology to do all of that. And I think that's really where you do this well.
And I think that's why we're seeing collaborative learning rise so much right now is because people
are going, oh, like a good learning culture isn't a corporate university with green lawns and snacks
galore. It's people who understand the culture of the organization and are actually helping it
develop. So a community of practice is ultimately, like you
were saying, it's people who sort of share a common goal of learning more about a thing, getting better
at the thing, and maybe solving a specific issue. And they're willing to sort of actively do that.
Like you said, when it comes to learning in an organization, everybody's doing it, but there's
always people that have more enthusiasm about something specific, or maybe just more general enthusiasm
about learning in certain contexts. And identifying those people, I think, is very critical. Obviously,
you know, someone like you has probably spent a lot of time doing that and has systems for it.
But what is the best way to identify those people to empower them? Is it a matter of giving them the
right technology? Is it a matter of giving them voice within an organization? Is it all of the above, maybe? Yeah, I think it's all of the above. I mean,
the way I've done it historically, and even now: we don't call them communities of practice, because then it gets the nickname CoPs, and nobody likes that one. But we've got the different councils that we run. And again, we do it around shared goals,
shared things like that. And then we facilitate that as the learning leaders of the organization.
We're the one that then brings them together.
And it's not only us telling them what they should do, but also them telling us what they're
trying to accomplish and us then helping formulate, okay, these might be the tools you might need.
And this may be the right way to approach these kinds of things so that you do it well
type of thing.
And it's really a lot of best practice sharing in a community driven way. I've seen a lot of these things develop through things like
corporate universities, especially because that just really amplifies the amount of collaborative
learning that takes place. I've at least read pieces where certain corporate universities
were almost entirely driven, just democratically and autonomously, where you'd like sign up to teach
your own thing, like me at GetAbstract, or whoever at the company could just sign up to teach
a course. And then with a little bit of oversight and approval of what they're teaching, anybody can
also attend. And to me, that's pretty conducive to this. I mean, it's very democratic, and it
might be hard to identify, like, who really has the best ideas that are really going to sort of
move the organization forward here. But do you advocate for that sort of autonomy and democracy?
Very much so. And I think sometimes where you see this go south is L&D gets in the way of it.
Because in some ways, if you're not looking at this right, it can be a perceived threat,
because you can look at it and go, oh, well, if we're involving other people,
then we're not as important. Or if we're not the ones kind of isolating this all to ourselves, is our value diminished? And I would say my experience has been no, not at all. It
actually goes up significantly and allows you to scale in a way that you never could before.
But leaders as teachers, I have seen it, hands down, throughout my career be one of the most
effective ways to actually drive change. I think what you're beginning to hit on here is that L&D
is going through a bit of a transformation right now. There's a lot of experts that are advocating
for a total reconsideration of what learning and development really does for its employees and for
its organizations. Do you agree with that? Yeah, very much so. And the thing is, the unfortunate
part is, to me, this isn't a new trend. This isn't like, oh, now we need to be acting different. To
me, this is one of those, this is really how we should have been operating before. But I think
there were a lot of barriers that were in the way from us being able to do it. And I think now what
we're seeing is kind of the perfect intersection of organizations are recognizing what they really
need from us. We're in many ways maturing and what we bring to the organization and the tech
now is enabling some of this stuff that previously wasn't possible. I would say I do feel like it is a bit of, not a trend, but an important moment right now. One thing that I think, and I've heard others say this too: L&D needs to better educate people on emerging tech and
what's going on here in a way that is much more deliberate and active
and consistent than L&D might have been in the past, you know, more than a passive system,
but like instituting an active education on AI and automation and all those things,
not just because it's so important for general professional knowledge and to understand where
industry is going, but because of the broad risk that it applies to, you know, jobs, and the fact that your job will definitely change. You know, all of our jobs will change in some way in the next
couple of decades. Do you think that's true that L&D needs to be very deliberate in teaching people
about emerging tech right now? Very much so. And I would say this extends beyond just emerging tech.
I mean, in many ways, the edict I've given to my team, and what we look at ourselves as, is: we have to be 10 steps ahead of everyone else in the organization, whether
it comes to technology trends, whether it comes to workforce trends, whether it comes
to any of these things, because as the leaders of development of people, we have to be ahead
of where the rest of the organization is so that not necessarily that we're out teaching
everybody what the new tech is, but so that as we're approaching any topic, we're bringing that lens of how does this topic get
impacted by technology? How does the landscape of technology influence the conversations we have
with stakeholders as part of the project? And so all of those things lead, I think that's where
the L&D leaders that are going to really thrive in these next few years are the ones that are being very intentional on keeping themselves and their organizations ahead of everything that's going on everywhere else.
Because there is a need, because everybody right now is kind of going, I don't even know up from down right now.
And we actually have a unique opportunity to step in and go, let us help you figure that out.
Because not that we have all the answers, but we're at least a little bit further along and can help guide you through it.
So learning leaders ultimately need to do the most learning.
Yeah. And I feel like if we don't, we're not even eating our own dog food. How can we go tell the organization, you need to, you know, upskill and do all this other stuff and be on top of your game, and then we're like, yeah, well, you know, we haven't really upgraded ourselves since 1978? Like, it's hypocritical. I mean, practice what you preach. I think it's as simple as that.
Well, I think this is a good place to wrap. Christopher, thank you so much again for
joining us. Before you hop off, can you just let our audience know where they can learn more about
you and what it is that you do? Sure. Yeah. Well, thank you so much for having me. Hopefully,
folks got something out of the conversation. And the best place to find me is on either LinkedIn
or YouTube. I create a lot of
content on both places and that's where most people find me. Cool. All right. Well, thanks
again for joining us. Everybody at home. Thanks for tuning in. We'll catch you on the next episode.
Cheers.
You've been listening to L&D in Action, a show from GetAbstract. Subscribe to the show in your favorite podcast player to make sure you never miss an episode. And don't forget to give us a rating, leave a comment, and share the episodes you love.
Help us keep delivering the conversations that turn learning into action.