Friday, December 19, 2014

How to make time for training.

My client asked me if I could help out a couple of his people.

Some background

There is going to be a small meeting on Jan 15th and a big one in mid-February, and at both of those meetings a number of new technology pieces will be rolled out.

For one of those pieces I've already created some on-line job aids. Another is the project I'm working on at the moment.

But there are several others that I've not worked on and for some of them nothing exists at all by way of training or job aids.

It is now halfway through the final week before the office pretty much shuts down for 2 weeks.

The request

Could I create a '2 or 3 page pamphlet' to go with the new tech? You see, they don't have time for any training.

This started a conversation.

Me: "So, at the end of this, what do you want them to be able to do?"

Client: "I want them to be able to understand what is on their new computer."

Me: "What do you mean by 'understand', and how will you know if they do?"

Client: ... has to think for a bit.


The idea

I know what I'm good at.
I'm good at creating multi-level, learner-centric eLearning: the sort that lets learners know there is information available to them on several topics, and lets them choose how much detail they want or need on each of those topics.

I also know that if I try to condense large volumes of information it will take me a long time ("I would have written a shorter letter but didn't have the time" - not Mark Twain), and there will be a lot of discussion about what needs to be in or out of the document.

So I suggested that instead of putting a 'help' PDF on the desktop of these new laptops I could create something a little bit interactive. "Let me show you what I mean."

So I fired up Storyline, grabbed a Tabs template, and created a proof of concept in a couple of hours.

The Result



The first screen has one-line descriptions of the six new things. Clicking the icon next to each one opens a layer with room for a very brief description and maybe an image or two. The common feature is a 'Learn more...' link that takes them to a full page (or pages, if we need them) giving a bit more depth on the subject. I populated most of these pages with paragraphs from my favorite Lorem Ipsum generator (it's better with bacon!) and showed it to them.
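The three levels of depth in that proof of concept can be sketched as a simple data structure. This is a hypothetical illustration only; the topic name and text below are invented, and the real build is a Storyline project, not code:

```python
# Hypothetical sketch of the three-level content structure: a one-line
# description on the first screen, a brief layer, and a deeper
# "Learn more..." page. Topic name and text are invented for illustration.

topics = {
    "New laptop": {
        "one_liner": "Your new machine at a glance",
        "brief": "A very short description, maybe an image or two",
        "learn_more": "A full page (or pages) with more depth",
    },
}

def content_for(topic, depth):
    """Return what a learner sees: 0 = first screen, 1 = layer, 2 = deep page."""
    keys = ["one_liner", "brief", "learn_more"]
    return topics[topic][keys[depth]]
```

The point of the structure is that the learner, not the designer, chooses how far down to drill on each topic.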

So I know what I'm going to be doing over the next couple of weeks.

Happy Learning & Happy Holidays

Alan



Monday, December 1, 2014

Try and bribe me OR Compliance made interesting


It's Holiday time, and for so many companies that means end-of-year compliance training for so many of their staff.

One of the reasons that eLearning is, let's not mince words here, hated by so many people is that for many the only eLearning they take in any given year is the raft of compliance training their company mandates. So much of it is simply of the 'click next to continue' variety: taken from a classic Death by PowerPoint slide show, converted using a PowerPoint-to-Flash converter like Articulate Presenter, and given a voice-over. When it was an instructor-led class it was bad. Now it's simply awful.

And that is what people face all too often. 

But it could be worse.
I worked for a company that had a Sexual Harassment course that was almost that bad. As a manager living in California I had to have a minimum of 60 minutes of Sexual Harassment training each year. The eLearning took about 40 minutes to complete if you did it slowly. When I inquired how to make up my missing 20 minutes, I was told to take it again.

But it could also be better.

Let's look at ethics training. 
Most companies have something along these lines, and most include something about not accepting gifts from vendors or from people outside the organization where this could be seen as some sort of conflict of interest. And maybe there is something there about reporting contacts that could be seen as such to the ethics office or similar.

A good eLearning course will present scenarios and ask the learner to respond to them, giving the learner feedback as to how well those responses match the company policy. 

But how about an alternate approach? This is built on something I first heard out loud from Tom Kuhlman when he came to ASTD Golden Gate for a couple of days in 2013. (Blog post on that here.)

Send out the updated policy in a PDF via email and tell everyone to read it and follow it.
Send an email from outside the company to everyone. In the email ask them to do something in breach of policy, like giving a heads up on how sales are this quarter, or if that new product will be ready on time. Offer an incentive of some sort.
Then sit back and see what happens.
  1. Some people will follow policy perfectly, reporting the contact to the appropriate group and awaiting instructions.
  2. Most will simply ignore or delete the email: a minor breach of policy.
  3. Some might reply in an appropriate way, telling the sender that this is unacceptable behavior: an understandable reaction, if not one totally in keeping with policy.
  4. And a few might make other contact with the sender.

After a while the folks in group 1 get a 'gold star' and a pass on any further ethics training that year. Maybe a prize goes to the fastest reaction in this group.
Groups 2 and 3 have to take a short eLearning reminding them about the policy and why it is important to follow it.
Group 4 gets some serious training: maybe eLearning, maybe even instructor-led.
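The triage above maps each response group to a follow-up. As a hypothetical sketch only (the function name and the wording of the outcomes are mine, not part of any real compliance system):

```python
def follow_up(group):
    """Map a response group (1-4, as described above) to its follow-up.

    1 = reported the contact per policy, 2 = ignored/deleted the email,
    3 = replied to refuse, 4 = made other contact with the sender.
    Group numbers and outcome wording are illustrative assumptions.
    """
    if group == 1:
        return "gold star: a pass on further ethics training this year"
    if group in (2, 3):
        return "short eLearning refresher on the policy"
    if group == 4:
        return "serious training, maybe even instructor-led"
    raise ValueError("unknown group")
```

The interesting design choice is that the training each person gets is driven by what they actually did, not by a blanket mandate.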

Now that is something that is truly performance based training.

Happy Learning

Alan


Wednesday, November 19, 2014

Advocating for an industry


I had the pleasure earlier this week of spending some time in the company of Dr Allen Partridge. 

Allen was in San Francisco yesterday presenting to ASTD members and guests for a full day workshop (that I missed half of due to Jury Service!) and a Chapter meeting in the evening.
Allen works for Adobe and his title is Senior Technology Advocate for eLearning. What a cool title.

The job is even cooler.
He gets to listen to what the world of eLearning development folks want and need from their vendors and explain all that to the team at Adobe.
He also gets to listen to what the team at Adobe have built and how they see it being used and he gets to explain that to the eLearning developer community.

So Allen was the perfect person to show us around the new capabilities that have been built into Captivate 8, and he spent the workshop focusing on responsive design and the ways that, with the new tool, you can build a course that looks good on a mobile phone screen, a tablet, and a laptop- or desktop-sized screen as well.

For me, however, the evening was the most fascinating, because here he shared what he was hearing in those advocate conversations he has been having all around the world. I tweeted out a few notes and they are captured here.

But the overarching theme, as I saw it, was this:
  • The learners of today and certainly the learners of tomorrow want to learn
  • ELearning designers want to create great and effective eLearning
  • Companies need their workforces skilled up in effective, scale-able and cost efficient ways
  • And the technology is just about catching up with those needs, despite its attempts to trip itself up again and again.


What exciting times we live in.

Happy Learning


Friday, November 14, 2014

What Happened in Austin... Without me

I missed my first Applied Improvisation Network conference since 2004 due to a family medical emergency.
But so many of my AIN buddies are tweeters.
So I've captured the tweets and storify-ed them.
I'll be using this to review some key learning that I may have missed and as an excuse to connect with some people to ask them questions.



Monday, September 22, 2014

Subject Matter Experts or Learning Experts.

As I've stated many times, even in the title of this blog, I am a professional.
My profession is that of Workplace Learning and Performance, also known as Training & Development and more recently Talent Development.

I have a deep understanding of many areas of expertise in my profession, from Instructional Design and Training Delivery to Change Management, Knowledge Management, Evaluating Learning Impact and Coaching.

I have had that deep understanding certified by my professional body and since 2011 have been proud to claim and use the CPLP designation.

So clearly I am qualified to be a banker, or an insurance agent, or a system developer, or a biotech researcher.

My argument for that is simple.
Over the last few months, as I have been looking for my next opportunity to help create a world and a workplace that works better, I've seen a lot of job postings. Time after time I've seen roles where the responsibilities are a great match for my skills and experience. However, when they list the necessary experience, the first things on the list look like:

  • 5-7 years in the <insert specific here> industry
  • deep understanding of the day to day role of a <insert title of person doing the job>

It's only towards the bottom of the list that I start to see things like:

  • Previous experience in a training or similar role a plus.


In a previous career I used to be a programmer. I would often hear my managers bemoan the fact that senior people in the accounting team would spend hours, days, even weeks building complex spreadsheets fed from multiple sources, and then use those spreadsheets to make decisions about the business. They wailed that these spreadsheets had no formal QA process to check that they were correct. They pointed to stats on the web showing the very high percentage of such spreadsheets that had errors in them. In essence they were saying: why can't they leave this to us, the programming professionals?

And now those same IT managers, along with managers across the board, so often put Subject Matter Expertise above expertise in Learning & Development for L&D roles.

I've been fortunate a few times to work for managers who understand the importance of balancing the in-house Subject Matter Expertise they already have with someone who knows how people learn, what the current best practice in the profession is, and how to pull the knowledge from an SME and use it to create engaging and effective learning.

I'm looking forward to doing that again, and again.

Happy Learning

Alan



Thursday, August 28, 2014

Gamification meetup

I attended a great meetup yesterday.
Organized by some folks at LinkedIn, it's the Bay Area Learning Design & Technology group.
I chatted with some old friends and made a couple of new ones. I was lucky to be seated at the table with Clark Quinn, one of the speakers, and had some great conversations with him.
I came away with two main learning points.
First:
If you get the design right, then the implementation tool really doesn't matter.
Second:
Adding gamification onto a well designed piece of learning is like taking your iPod with a great playlist along when you run. It just makes it better.

I live tweeted some notes of the presentations as did others.
Here they are.


Saturday, August 23, 2014

Walking the walk.



I was in an interview yesterday.

It's a job I like, and even though the long and very thorough selection process can be irritating, I have to admit it means that by this stage I'm pretty sure I have a very clear idea of what they want and they have a clear idea of what I can do.

One of the things I've been a passionate advocate for is having learning that is built around the business impact needed. I've mentioned it before in prior blog posts and it usually comes up in an interview.

However, this time I got a great follow-up question: "Have you ever managed to work that way?" I had to think for a second. The CEO who asked me was very polite about it. He pointed out that we often find it hard to actually walk the walk that we would like.

But it came to me that I had.

I explained that at the last place I worked I had asked several times what the business metrics were that my success would be judged on. In that case it was how quickly we could get the training to the customers (and therefore start their subscription and billing). That had been one of the guiding principles that dictated a blended learning, mostly on-demand, self-paced solution.

So if I get an opportunity to work for these folks I already know what the metrics are that will show my success and I plan to walk the walk again to design a solution that will target that need, as well as the individual needs of the learners.

Do you have a design philosophy and do you walk your walk?

Happy Learning

Alan

Tuesday, May 13, 2014

Evaluating the business impact of learning.


As my regular reader(?) will know I was at the ASTD 2014 conference in DC this week.
On the plane back west I thought I'd put down some thoughts on one of the sessions I attended while I was there. With the news of the passing of Don Kirkpatrick this is particularly relevant.

The session was led by Ken Phillips.  Ken is an old colleague from my CCASTD days. He is a past president of the chapter and a regular presenter both there and at the International Conference. 

I remember seeing Ken present at the Chicago conference back in 2010 on creating Level 1 evaluations that went beyond reaction. After seeing that I had to completely rewrite my level 1 evaluations but as what I got out of them was so much more useful, I forgave him for that.



A brief side note for anyone not familiar with training evaluation levels. 
The great Don Kirkpatrick wrote a series of articles for ASTD's magazine some 40 years ago. There he defined 4 levels that can be evaluated. I first heard about them at a conference several years back from Bob Wilson, who supplied the subtitle explanations I placed in parentheses below.

Level 1. Reaction (Did they like it?)
Level 2. Learning (Did they learn it?)
Level 3. Behavior (Are they using it?)
Level 4. Results (Is it making a difference?)

If you have ever had any workplace training you almost certainly encountered some sort of level 1 evaluation, often discounted by learning professionals as smile sheets. 

In addition to these 4 levels, Jack Phillips has also added a 5th, ROI. (Personally I see this as just part of level 4.)



OK, back to the session.
The session title was Business Results Made Visible: Design Proof Positive Level 4 Evaluations.

Learning pros will tell you that level 4 is the hardest to get to and will have all sorts of reasons why they just can't provide it. So I was happy to see some stats that Ken provided in his session on how much level 4 is actually being done.

According to some notes I tweeted out, the survey said that 37% of organizations say they measure at least some of their training programs at level 4. Quite frankly I found that number unbelievably high. My experience has generally been that rarely, if ever, has a program I've been involved in actually had any level 4 evaluation done. But then I started thinking. The majority of the work I've done over the last few years has been on behalf of software vendors who provide training for the products they sell. In such cases we, the vendors, are (or at least should be) responsible for levels 1 & 2, and in some cases level 3 if we have the ability to observe or measure how the learners use the product we just trained them on. Level 4 in those cases is in the hands of the organizations whose employees we just trained.

So it looks like many of those companies are taking their responsibility seriously, as are my colleagues who run internal programs. 

So the next question and statistic is about how much training is being evaluated at level 4. Well, the 37% that do any level 4 do so on about 15% of the programs they run. That seems low, but on sober reflection it looks like there are some careful choices being made. Ken pointed out some best-practice criteria in use for choosing which programs to evaluate, and the key point is that the programs of strategic importance to the organization are the ones where it makes the most sense to invest in level 4 evaluation. The other important factor is that the metric the training is targeted against should be one the people doing the evaluation have a realistic chance of accessing. Many companies keep a tight rein on some of their most important metrics and may not be keen to share them with the (often very) junior levels of management that might be leading the L&D function. So pick data that you can have the needed level of access to.

But the good news in the world of big data is that much of that data about how people work is available if not already being collected. And another interesting statistic is that 75% of the organizations collecting data points as part of a level 4 evaluation see the data as having high or very high value. 

And that is good news all around. Not just for the Learning Pros out there but also for CEOs. In another recent survey CEOs were asked if they wanted to see the numbers on the business impact of training programs. 96% of them said they did want to see it.  But only 8% of them say they got to see this information. ( I think the question also had something to do with ranking how important that information was and that the 96% may have been how many of them ranked it at #1 but I'm not entirely sure - Long week!)

One question I will need to follow up with Ken is why the large discrepancy? If 37% do level 4s but only 8% of CEOs see the results then something is missing. 

So encouraging news on evaluation of business impact, but the learning profession still has a way to go on this to satisfy the wishes of our CEOs. 

Happy learning. 
And evaluating.

And Rest in Peace Don.

Alan

Friday, May 9, 2014

Talent Developer

Yesterday I sat in the audience at the ASTD conference as the CEO announced a rebranding.
As my regular reader will know, I had a big problem last year when my professional organization dropped the term Workplace Learning & Performance and returned to Training and Development. It turns out that they managed to piss me (and a few others) off for no reason. They were already planning on dropping "Training & Development" for something new that reflected the breadth of the work that professionals in our industry do.
So yesterday they announced it.

We are now the Association for Talent Development. 
Putting aside (or at least trying to) what I think about the way they handled the change I have to decide if I'm on board with this. 

I'm still a Certified Professional in Learning & Performance. And although I'd have liked to see the word 'performance' in the name I'm going to be seeing so much of, I welcome the name change.

The work we do has always been more than just training. Developing talent and helping people and their organizations grow and improve has a far greater impact than running a training class. As a professional I'm always looking at the organizational goal and how the work I'm doing can move needles.

I used to welcome the conversation explaining what I meant by workplace learning and performance. 

I welcome the opportunity to have that conversation about Talent Development. 

Happy Learning
(happy developing.)

Alan

ASTD2014 Tweets and reTweets

Sunday, April 27, 2014

Helping Inspirer

A couple of weeks ago ASTD Golden Gate had a session on the Insights Discovery personality profiling tool. We had a speaker from Insights who introduced us to the concepts of the tool along with a Learning & Development specialist from Google who shared with us some of the ways that Google use this and other similar tools.  

As part of the evening, all the attendees had been invited to fill out the online profile and were presented with their profile reports.

This was not my first time being profiled with this tool, in fact it wasn't even my second time. I first came across these folks at the ASTD ICE in Chicago in 2010. I've always thought that self awareness is the first step to any real learning or meaningful change and so when they offered a code to get my profile done for free I took it. 

When I got my report in the email I read it and had to admit it had just about summed me up. I was classified as a "Helping Inspirer", position 27 on their "wheel". This meant that my dominant colors were "Sunshine" yellow and "Earth" green. One of the cool things about the profile is that it also shows you both your conscious persona (what you project to the world and the workplace) and your unconscious persona (the natural you). It's worth noting that when these two are aligned you don't spend a lot of time at work fighting to be something you are not. I'm pleased to say that although there were small differences in my individual scores, I came out as a 27 for both.

A couple of years later I was part of a group of trainers who all worked for the same group of companies, and at a best-practice sharing summit we all got Insighted again. It was interesting to note that although my unconscious persona was still a 27, my conscious one had tweaked slightly. I'd dialed up my "Fiery" red score a bit, most likely because by that time I was leading a team of junior folks.

Fast forward another couple of years and I'm really interested to see what my scores come out at this time. 

And when I open the report, what do I see? 27/27 Helping Inspirer again.

Monday, February 17, 2014

The top learning missteps that start-ups make. Part 5

Living and working as a Learning Pro out in Northern California I see and hear about a lot of training that is not serving an organization in the way it should.

So I've put together a list of the most common missteps and mistakes I've seen and some thoughts on how these can be course-corrected.

The first posting on this was about how Founders think that their systems are more obvious and intuitive than they may be.

The second was about cramming too much content into a course.


The third was about moving content online without understanding the difference of the delivery channel.

The fourth posting was about how failing to cross train makes silos worse.

This is the fifth and final posting in the series.

All learning is created equal(ly).


Although too many startups have had their training materials created by people with no background in Instructional Design, many have hired trainers or instructional designers with experience. Most people working in the Learning & Development field follow some version of the Instructional Systems Design ADDIE model when designing learning. This approach is very much aligned with the old waterfall model of systems development, with each phase following on from the last. It's a really strong model for creating what I call learning products: pieces of learning that can be expected to be delivered time and time again, either as eLearning or in a regularly repeated class. ADDIE was created 70 years ago by the US Military as a tool to create training, and over the last 70 years it has been refined and reshaped to meet the needs of learning designers in an ever-changing world.

Not surprisingly, though, some people look at this 70-year-old technology and feel that there must be a better way.

As regular readers of this blog will know, I'm a Thiagi fan. One of my most transformative learning experiences was a full-day workshop he ran on Rapid Instructional Design at an AIN conference in Chicago in 2008. It was at that point that I realized that if your objective is to create learning, rather than learning content, then ADDIE is not the best model. For internal learning I now always default to a Rapid ID approach: one that starts from the premise of having the learners create the learning content as they learn, with Subject Matter Experts on hand to ensure that the content is factually correct and a learning facilitator on hand to guide the learning experiences.

But just as I'd never consider using ADDIE for a single-run internal class, I'd never use RID to create a training product.

My customers are the learners themselves AND the organization that is paying for the learning to happen. Understanding this balance and using the right tool is the only way to satisfy the needs of both sets of stakeholders.


Happy Learning

The top learning missteps that start-ups make. Part 4

Living and working as a Learning Pro out in Northern California I see and hear about a lot of training that is not serving an organization in the way it should.

So I've put together a list of the most common missteps and mistakes I've seen and some thoughts on how these can be course-corrected.

The first posting on this was about how Founders think that their systems are more obvious and intuitive than they may be.

The second was about cramming too much content into a course.


The third was about moving content online without understanding the difference of the delivery channel.

Here is the fourth, and there may be more to come.......


Failing to cross-train and allowing Silos to flourish.


Startups don't normally carry much spare capacity in terms of the people in the organization. The first few hires are usually engineers selected to build key pieces of functionality in the product. The next hires start to fill out areas like sales, finance, and professional services. Then maybe product management, customer support, and marketing make up the next wave, usually just a single individual in each area at first. Within engineering, your people become specialists in their part of the system. As you grow, each of these people reaches maximum capacity and you add someone new, usually because you need someone to work on the next project.

Everyone is working heads-down, flat out, to get their work done. No one really knows the details of what the person two cubes over is working on, and as for that other department, well, who has the time to find out? You might have a quarterly, or possibly even a monthly, 'all-hands' meeting, but all that can be hoped for there is to give everyone a big picture of what the company is doing.

This works, up to a point.

It falls down for two reasons.

The Talent War

In the Bay Area there is a cutthroat war going on for certain types of talent. Talk to any recruiter and they will tell you that finding the right people to allow for the growth that is expected is one of their biggest headaches. The investment in finding new people and getting them up to speed is a huge overhead. So when you do get someone in place and working on those critical projects the last thing you want to think about is that someone might poach them. After all isn't your company the best one out there? Don't you have great meals and regular fun times together? And of course everyone is just waiting for those stock grants to make them millionaires. 

But remember those incentives that you OK'ed for your recruiter to use to get your people to pull their friends and coworkers from former companies into your team? Other companies use them too.

So in the tightly knit community that is the Bay Area tech world you will have someone leave. 

And when that two weeks' notice lands on your desk it's going to be very tough to start cross-training someone else to pick up the crucial work that was being done.

Silos prevent cross functional cooperation.

When people on a team don't know what their colleagues on other teams do, what happens when the inevitable requests for help or input come in?

Without the bigger picture it's impossible for them to accurately evaluate those requests against all the other things on their desk.

Two problems that share the same cause, but should be addressed in different ways.


  • Peer reviews of work within teams are one of the best ways not only to produce better-quality outputs by trapping and eliminating errors early, but also to ensure that when your key person is on vacation or hands in their notice, there is someone else who won't be starting from scratch.


  • A well-planned and executed orientation can give new (and existing) employees some visibility into what each team does for the organization.
  • Regular communication on the projects being worked on gives context to requests for help on those projects.

The knowledge in a startup's employees' heads is its most valuable asset. A few well-thought-out actions can leverage that asset.


Happy Learning


Thursday, February 13, 2014

The top learning missteps that start-ups make. Part 3

Living and working as a Learning Pro out in Northern California I see and hear about a lot of training that is not serving an organization in the way it should.

So I've put together a list of the most common missteps and mistakes I've seen and some thoughts on how these can be course-corrected.

The first posting on this was about how Founders think that their systems are more obvious and intuitive than they may be.

The second was about cramming too much content into a course.

Here is the third and there are more to come.......


We'll move our classes online.


In their early days most start-ups do their training live, as part of a high-touch implementation. As they grow they often find that this no longer scales beyond the 1 or 2 sessions a month they ran in their first months, and maybe years. The solution so many of them reach for is to simply move their live class to a virtual classroom. As a concept the virtual classroom is fine and has been successfully implemented by many organizations.

Three problems however tend to rear their ugly heads.

  • Firstly, any weaknesses in the existing content get amplified by the physical disconnect between the facilitator and the learners. A good 'stand up' trainer is constantly scanning the faces of their learners, looking for nods of understanding or those puzzled looks that indicate it's worth spending a few extra moments on this point or that concept. Even the best online classroom tools do not give the facilitator that level of connection.
  • Secondly, most online facilitators don't really know the full functionality of their delivery platform, and those that do all too often don't use it. Trainers who would never dream of starting a live class without putting a flip-chart sheet up on the wall to act as a parking lot so often don't bother to get everyone using their tool's chat capability, if only to say hi.
  • Thirdly, using a virtual classroom makes it much harder to give the learners an opportunity to demonstrate that they have learned. In a live classroom people can work in groups on role plays or create teach-backs or other exercises. If you are dealing with software use, then a classroom will usually have the learners using some sort of sandbox to show that they can follow the processes they have just learned.



Moving your content online does not mean you have to completely rewrite it, but it is worth a good look to see if the three problems outlined above are risks to the learning being effective.

Lesson structure can be altered to get regular check-ins with the learners, facilitators can learn and use best practice with their tool sets, and facilitators can hand control of their environments to the learners to demonstrate learning.

There are solid solutions, but first you must understand that you need them.

Wednesday, February 5, 2014

The top learning missteps that start-ups make. Part 2


Living and working as a Learning Pro out in Northern California I see and hear about a lot of training that is not serving an organization in the way it should.

So I've put together a list of the most common missteps and mistakes I've seen and some thoughts on how these can be course-corrected.

The first posting on this was about how Founders think that their systems are more obvious and intuitive than they may be.

Here is the second. 
More to come


2. There is a lot they need to know.

Start-ups put years of development time into producing software with a wide array of features and functionality. Product managers have put long hours into conversations with existing and potential customers to define what the software needs to do. It is quite likely that the software has been enhanced multiple times based on the wishes, or needs, of specific customers. It is quite understandable, then, that you want your users to know about all the great features and options that have been painstakingly built into the product to give it the depth and fullness you are so rightly proud of. However, as this list gets longer and longer, the time it takes to introduce it all to a new user, in a way that is actually useful when they come to use the product, creeps up and up.
How long is reasonable for an online training class? 15 minutes?  4 hours? 
How many modules of self-paced learning is it reasonable to expect a new user to sit through to be able to use your product? 5? 10? 50?

I've seen classes grow and grow. What started out as a 90 minute web class grew to 2 hours, then 3. And then into a pair of 2 hour sessions. 

I'd ask you to think back to the last time you learned something for work. How long was your company prepared to invest in your learning time?

As Learning Professionals we love creating courses. The more the merrier. The longer the better. But if we are going to be honest advisers and partners in our businesses, we do need to ask if all this content needs to be in the training.
I take the 80:20 rule as my guide. 

I've seen, time and again, that 80% or more of the use of any piece of software touches 20% or less of the functionality. To deliver truly effective training, you need to identify that 20% and give your learners a chance to learn it and do it.

What about the rest?
  • Would a single page job aid work instead? 
  • Is your user manual set out in such a way that the information is easy to find? 
  • Do you have a social platform that your users might go to like a LinkedIn or Facebook group? Or a YouTube channel?  
  • Have you tried Googling the question? 
  • If you are building eLearning, can these be optional modules?
It's way too easy to overwhelm your learners. When that happens, they miss the 20% that they need to be able to use the product. If you are lucky, that means they call support for help. If not, they just give up, or tell their bosses, colleagues and peers that your software just doesn't do what they need it to do.

So decide what really needs to be in that precious time you get with a new learner, and find other ways to get the rest to them.

Monday, February 3, 2014

The top learning missteps that start-ups make. Part 1

Living and working as a Learning Pro out in Northern California I see and hear about a lot of training that is not serving an organization in the way it should.

So I've put together a list of the most common missteps and mistakes I've seen, along with some thoughts on how they can be course-corrected. Here is the first. More to come.

1. It's Obvious. It's Intuitive

By their very nature, start-ups are founded by smart people who see a need that is not being fulfilled. In Silicon Valley that usually means they have an idea that translates into a software solution to a problem that people have. Founders live and breathe that idea and that solution. They see it through iteration after iteration, continuously improving and refining it. The lucky ones convince backers, develop that solution into a product, and then iterate again and again. Then customers connect with the solution, and those first few connections are always very high touch interactions. All the people connected with the solution are passionate about it, and they invest heavily in making sure that the customers get the most out of it. Maybe you create some training for it, leveraging existing content from the sales cycle into a training presentation and reworking the sales demo into a hands-on user experience. Then, if all goes according to plan, there are more customers and more people connecting with the product.

Then you get that piece of feedback.
People aren't using the software fully.
Or.
People are calling support for help with something that you thought was covered in training.

When you take a look at what it is they are doing, you just don't see why they don't get it.
It's Obvious. It's Intuitive.

You are right, of course. It is obvious. But sadly it is not as obvious to someone who has not lived and breathed this idea and its execution for the last few months and years.

As a Learning Professional, I always start with the fresh novice mind. I have the luxury of looking at the product fresh and clean, without the impact of years of experience with the product, but with years of experience of people learning software.

Take a fresh look, or better yet ask someone else to take that look for you.

Part 2 follows soon

Happy Learning
Alan