Monday, September 22, 2014

Subject Matter Experts or Learning Experts?

As I've stated many times, even in the title of this blog, I am a professional.
My profession is Workplace Learning and Performance, also known as Training & Development and, more recently, Talent Development.

I have a deep understanding of many areas of my profession, from Instructional Design and Training Delivery to Change Management, Knowledge Management, Evaluating Learning Impact and Coaching.

I have had that deep understanding certified by my professional body and since 2011 have been proud to claim and use the CPLP designation.

So clearly I am qualified to be a banker, or an insurance agent, or a system developer, or a biotech researcher.

My argument for that is simple.
Over the last few months, while looking for my next opportunity to help create a world and a workplace that works better, I've seen a lot of job postings. Time after time I've seen roles where the responsibilities are a great match for my skills and experience. However, when they list the necessary experience, the first items on the list look like:

  • 5-7 years in the <insert specific here> industry
  • deep understanding of the day-to-day role of a <insert title of person doing the job>

It's only towards the bottom of the list that I start to see things like:

  • Previous experience in a training or similar role a plus.


In a previous career I used to be a programmer. I would often hear my managers bemoan the fact that senior people in the accounting team would spend hours, days, even weeks building complex spreadsheets fed from multiple sources, and then use those spreadsheets to make business decisions. They wailed that these spreadsheets had no formal QA process to check that they were correct. They pointed to stats on the web showing the very high percentage of such spreadsheets that contained errors. In essence they were saying: why can't they leave this to us, the programming professionals?

And now those same IT managers, along with managers across the board, so often put Subject Matter Expertise above expertise in Learning & Development when hiring for L&D roles.

I've been fortunate a few times to work for managers who understand the importance of balancing the in-house Subject Matter Expertise they already have with someone who knows how people learn, what current best practice in the profession is, and how to pull the knowledge from an SME and use it to create engaging and effective learning.

I'm looking forward to doing that again, and again.

Happy Learning

Alan



Thursday, August 28, 2014

Gamification meetup

I attended a great meetup yesterday.
Organized by some folks at LinkedIn, it's the Bay Area Learning Design & Technology group.
I chatted with some old friends and made a couple of new ones. I was lucky to be seated at the table with Clark Quinn, one of the speakers, and had some great conversations with him.
I came away with two main learning points.
First: if you get the design right, the implementation tool really doesn't matter.
Second: adding gamification onto a well designed piece of learning is like taking your iPod with a great playlist along when you run. It just makes it better.

I live-tweeted some notes of the presentations, as did others.
Here they are.


Saturday, August 23, 2014

Walking the walk.



I was in an interview yesterday.

It's a job I like, and even though the long and very thorough selection process can be irritating, I have to admit it means that by this stage I have a very clear idea of what they want, and they have a clear idea of what I can do.

One of the things I've been a passionate advocate for is building learning around the business impact that's needed. I've mentioned it in prior blog posts and it usually comes up in an interview.

However, this time I got a great follow-up question: "Have you ever managed to work that way?" I had to think for a second. The CEO who asked me was very polite about it. He pointed out that we often find it hard to actually walk the walk that we would like.

But it came to me that I had.

I explained that at the last place I worked I had asked several times what business metrics my success would be judged on. In that case it was how quickly we could get the training to the customers (and therefore start their subscription and billing). That had been one of the guiding principles that dictated a blended, mostly on-demand, self-paced learning solution.

So if I get an opportunity to work for these folks, I already know the metrics that will show my success, and I plan to walk the walk again to design a solution that targets that need, as well as the individual needs of the learners.

Do you have a design philosophy and do you walk your walk?

Happy Learning

Alan

Tuesday, May 13, 2014

Evaluating the business impact of learning.


As my regular reader(?) will know, I was at the ASTD 2014 conference in DC this week.
On the plane back west I thought I'd put down some thoughts on one of the sessions I attended while I was there. With the news of the passing of Don Kirkpatrick, this is particularly relevant.

The session was led by Ken Phillips.  Ken is an old colleague from my CCASTD days. He is a past president of the chapter and a regular presenter both there and at the International Conference. 

I remember seeing Ken present at the Chicago conference back in 2010 on creating Level 1 evaluations that went beyond reaction. After seeing that I had to completely rewrite my Level 1 evaluations, but as what I got out of them was so much more useful, I forgave him for that.



A brief side note for anyone not familiar with training evaluation levels.
The great Don Kirkpatrick wrote a series of articles for ASTD's magazine some 40 years ago in which he defined four levels at which training can be evaluated. I first heard about them at a conference several years back from Bob Wilson, who supplied the subtitle explanations I've placed in parentheses below.

Level 1. Reaction (Did they like it?)
Level 2. Learning (Did they learn it?)
Level 3. Behavior (Are they using it?)
Level 4. Results (Is it making a difference?)

If you have ever had any workplace training you have almost certainly encountered some sort of Level 1 evaluation, often discounted by learning professionals as "smile sheets".

In addition to these four levels, Jack Phillips has added a fifth, ROI. (Personally, I see this as just a part of Level 4.)



OK, back to the session.
The session title was "Business Results Made Visible: Design Proof Positive Level 4 Evaluations."

Learning pros will tell you that Level 4 is the hardest to get to and will have all sorts of reasons why they just can't provide it. So I was happy to see some stats that Ken provided in his session on how much Level 4 evaluation is actually being done.

According to some notes I tweeted out, the survey said that 37% of organizations measure at least some of their training programs at Level 4. Quite frankly, I found that number unbelievably high. My experience has generally been that rarely, if ever, has a program I've been involved in had any Level 4 evaluation done. But then I started thinking. The majority of the work I've done over the last few years has been on behalf of software vendors who provide training for the products they sell. In such cases we, the vendors, are (or at least should be) responsible for Levels 1 and 2, and in some cases Level 3 if we have the ability to observe or measure how customers use the product we just trained them on. Level 4 in those cases is in the hands of the organizations whose employees we just trained.

So it looks like many of those companies are taking their responsibility seriously, as are my colleagues who run internal programs. 

So the next question and statistic is about how much training is being evaluated at Level 4. Well, the 37% that do any Level 4 evaluation do so on about 15% of the programs they run. That seems low, but on sober reflection it looks like there are some careful choices being made. Ken pointed out some best practice criteria in use for choosing which programs to evaluate, and the key point is that the programs of strategic importance to the organization are the ones where it makes the most sense to invest in Level 4 evaluation. The other important factor is that the metric the training is targeted against should be one the people doing the evaluation have a realistic chance of accessing. Many companies keep a tight rein on some of their most important metrics and may not be keen to share them with the (often very) junior levels of management that might be leading the L&D function. So pick data that you can have the needed level of access to.

But the good news in the world of big data is that much of that data about how people work is available, if not already being collected. And another interesting statistic: 75% of the organizations collecting data points as part of a Level 4 evaluation see the data as having high or very high value.

And that is good news all around, not just for the learning pros out there but also for CEOs. In another recent survey, CEOs were asked if they wanted to see the numbers on the business impact of training programs. 96% of them said they did want to see it. But only 8% of them said they got to see this information. (I think the question also had something to do with ranking how important that information was, and the 96% may have been how many of them ranked it at #1, but I'm not entirely sure - long week!)

One question I will need to follow up on with Ken is the reason for the large discrepancy. Even allowing that the 37% of organizations doing Level 4 only evaluate around 15% of their programs that way, you would still expect far more than 8% of CEOs to have seen at least some results. Something is getting lost between the evaluators and the boardroom.

So encouraging news on evaluation of business impact, but the learning profession still has a way to go on this to satisfy the wishes of our CEOs. 

Happy learning. 
And evaluating.

And Rest in Peace Don.

Alan

Friday, May 9, 2014

Talent Developer

Yesterday I sat in the audience at the ASTD conference as the CEO announced a rebranding.
As my regular reader will know, I had a big problem last year when my professional organization dropped the term Workplace Learning & Performance and returned to Training and Development. It turns out that they managed to piss me (and a few others) off for no reason. They were already planning to drop "Training & Development" in favor of something new that reflected the breadth of the work that professionals in our industry do.
So yesterday they announced it.

We are now the Association for Talent Development. 
Putting aside (or at least trying to) what I think about the way they handled the change, I have to decide if I'm on board with this.

I'm still a Certified Professional in Learning & Performance. And although I'd have liked to see the word "performance" in the name I'm going to be seeing so much of, I am welcoming the name change.

The work we do has always been more than just training. Developing talent and helping people and their organizations grow and improve has a far greater impact than running a training class. As a professional, I'm always looking at the organizational goal and how the work I'm doing can move needles.

I used to welcome the conversation explaining what I meant by workplace learning and performance. 

I welcome the opportunity to have that conversation about Talent Development. 

Happy Learning
(happy developing.)

Alan


Sunday, April 27, 2014

Helping Inspirer

A couple of weeks ago ASTD Golden Gate had a session on the Insights Discovery personality profiling tool. We had a speaker from Insights who introduced us to the concepts behind the tool, along with a Learning & Development specialist from Google who shared some of the ways that Google uses this and other similar tools.

As part of the evening all the attendees had been invited to fill out the online profile and were presented with their profile reports.

This was not my first time being profiled with this tool; in fact, it wasn't even my second time. I first came across these folks at the ASTD ICE in Chicago in 2010. I've always thought that self-awareness is the first step to any real learning or meaningful change, so when they offered a code to get my profile done for free, I took it.

When I got my report in the email I read it and had to admit it had just about summed me up. I was classified as a "Helping Inspirer", position 27 on their "wheel". This meant that my dominant colors were "Sunshine" yellow and "Earth" green. One of the cool things about the profile is that it shows you both your conscious persona (what you project to the world and the workplace) and your unconscious persona (the natural you). It's worth noting that when these two are aligned you don't spend a lot of time at work fighting to be something you are not. I'm pleased to say that although there were small differences in my individual scores, I came out as a 27 for both.

A couple of years later I was part of a group of trainers who all worked for the same group of companies, and at a best practice sharing summit we all got Insighted again. It was interesting to note that although my unconscious persona was still a 27, my conscious one had shifted slightly. I'd dialed up my "Fiery" red score a bit, most likely because by that time I was leading a team of junior folks.

Fast forward another couple of years, and I'm really interested to see what my scores come out at this time.

And when I open the report, what do I see? 27/27, Helping Inspirer again.