There’s a lively discussion in the CLO LinkedIn group about how to build trust in our organizations. It’s heartening to see how many people in the learning community have thought deeply about this issue.
And it makes sense, because as learning professionals, we exercise significant influence over the propagation of our organization’s culture. The learning experiences we facilitate for our coworkers speak not only of our own vision, but of that of the organization writ large. What are we communicating, and how does that communication promote trust?
- Are our materials current, reflecting accurately the situation in the field?
- Are our materials relevant, dealing with issues central to our colleagues?
- Are learners freed from line responsibility while in training, so they can concentrate on what they are learning?
- Is there follow-up to see how learners are faring in implementing new skills on the job?
- Are learners’ managers involved with the training process?
I could probably come up with a dozen more questions, all of which probe the extent to which our learning organizations demonstrate competence in our field, and respect for our learners. We can’t build trust without these elements. And who pays attention to training deemed untrustworthy? Who should?
I’m sorry they took a blanket approach at Yahoo. It seems clear that something is lost when people don’t run into one another in the halls, and in the cafeteria, but also that there are times when one needs to be heads-down on a project and avoiding the distraction other people provide. Creating an office environment which supports both modes is a challenge which often goes unmet.
I think that’s true for learning situations, as well. I’m a big believer in social learning – that we humans learn best within a context of other humans comparing notes on how things work. But sometimes, we need to be heads down, concentrating on written material. Sometimes, we need to be alone in our own heads, reflecting on what we’ve learned, making the connections to the other things we know. That’s what writing papers was about in school. And it’s often what writing analyses and recommendations is about at work.
When I’m training somebody, I’m usually doing it via computer. And I often recommend that they see whether they might work from home for the training. Because I have a MUCH better chance of full learner focus when there won’t be co-workers stopping by to drop something on my learner’s desk.
I find it interesting that some of the discussion centers around whether a given individual is equipped with the skills which enable productive work when unsupervised. Self-management skills are indeed critical to success in this environment. Are they something the organization might be able to nourish?
Michael Echols at Chief Learning Officer has a post this week urging learning leaders to do the research to check whether “best practices” for improving performance which are articles of faith in their organizations are more than myths. It’s shocking how much of what guides common practice has no actual scientific basis, but is merely “the way we do it here.”
Some best practices really do depend on corporate culture – what works for some organizations may not work in others. Others, though, depend on more universal parameters, like human cognitive function, and hence can be applied widely across different organizations.
Way back in 2003, Ruth Colvin Clark and Richard E. Mayer published e-Learning and the Science of Instruction, an extremely practical book which reviewed the research on how various design elements in e-learning modules affect learner retention, and distilled that research into best practices for authoring these modules.
Ten years later, we’re still seeing uneven adoption of good design. For example, we’ve known for at least a decade that learners retain less when the audio narration channel “reads” the text on the page. For reasons probably related to the limits of the aural and visual cognitive channels (not to mention that the speed at which individuals can read text generally differs from the rate at which they can understand the spoken word!), study after study demonstrates that retention improves when the audio channel elaborates on the information presented to the visual channel rather than replicating it. Nevertheless, the how-to videos Adobe publishes for its Captivate authoring software not only “narrate” the text on the page, but use a robotic voice which mispronounces both technical and not-so-technical words!
When a major vendor to the profession demonstrates unawareness of good design (or, at the very least, a willingness to compromise design in the interest of showing off spiffy new features like that robotic voice!), I think we have a problem. I suppose it’s probably not the worst thing for those of us involved in instructional design to have to take Really Bad Training every once in a while, in order to equip ourselves to empathize with our learners. But we’ve been building multimedia computer-based training for a while now. It’s time our learners, and the organizations which pay us all, could count on designers getting the basics right.
Bill is going to share his 10 principles today — 2 p.m. Eastern. It’s free; come check out our latest thinking…
10 principles for selecting the right Learning Management System
Kellogg has LONG been identified with a collaborative approach to the teaching and learning of management skills, so this is an obvious fit. As a loyal alumna who has been part of that economy for the last decade or so, I am pleased to see this focus. I’ve met Sally Blount, Kellogg’s new dean, and think highly of her – she’s very sharp, a dynamic speaker, and seems to have her mind around an extremely dynamic environment. So I was disappointed to see that, in her article describing Kellogg’s focus on aligning with and driving the collaboration economy, she repeats the notion popularized by our least critical technology enthusiasts – that trust is somehow embedded within collaborative technology. As a professional in this field, I have sad experience with the powerlessness of collaborative technology to build trust where trust is not already part of the culture of the humans using it to work together.
The Collaboration Economy is also rooted in an emerging human culture of access, openness and trust. When people enter the digital world through their computers, smartphones and other devices, they ask questions, share information, and reach agreements with a fluidity seldom seen before in human history
A new global culture is emerging that transcends national, ethnic, and organizational boundaries – the old institutions that used to develop and regulate our shared, taken-for-granted rules for interaction. It is a culture that assumes 24/7 electronic access—for emailing, tweeting, posting, and texting.
This new culture is particularly powerful in the norms of trust that it engenders. Markets require trust to operate effectively, and the old rules of building trust over long periods of time have softened. Over the Internet, parties with limited histories of personal interaction readily connect, communicate, and take risks together.
Perhaps one of the earliest and best examples of this phenomenon is the “open source movement,” founded in 1998 by a group of free software advocates. Through that movement, the Linux operating system was created and is now widely adopted by corporate computing managers as a high-performance, lower-cost alternative to proprietary software from Microsoft, Sun, and others.
Kellogg is where I learned about how fundamental trust is to the efficient functioning of markets – it’s where we explored what happens in the Real World when the “perfect information” assumed by economists isn’t available, and people have to make leaps of faith.
I would argue that what the technology contributes to building trusting relationships is not so much its role in facilitating the meeting of people who might find something they can do together. It’s that it makes the reputations of those people transparent. It’s nice to be able to ask questions and share information – what’s even better is that if the information is incorrect, or incomplete, it’s possible to find out quickly and adjust one’s level of trust. What made, and makes, Linux such a success is the alignment of the contributors (everyone is in it to make better-working software) and the transparency inherent in software – if your code isn’t very good, people find out right away, and may fix what didn’t work so well, and possibly avoid your contributions in the future.
It seems to me that technology can indeed quicken the pace of our experiments with trust, teaching us faster who can be relied upon to follow through and who cannot. It makes it possible, in some situations, to take smaller risks to begin with – when I break a project up into phases, I can judge my collaborators’ performance on phase I before committing to phase II. It is also driving a cultural change in which we have started to expect a higher level of accountability when things go wrong—when the project plan for our joint venture is readable by the entire team, it’s quite clear who is missing their dates!
In the end, though, the path to success in the collaborative economy for any organization is the development of a track record of excellent performance. Technology gives our markets many new windows for observing us, so we need to shine more brightly than before.
Schectman reports: “The tool scoured messages for keywords such as ‘healthcare’ or ‘education,’ and displayed issues on a dashboard campaign staffers could look at to figure out what concerns or questions were surging in citizen correspondences with the campaign. The dashboard also allowed staffers to look at what issues were trending by state, city or town, allowing the campaign to adapt its ground game in real time, according to [Salesforce EVP Vivek] Kundra. Those insights could help staff in the field that had mobile versions of the dashboard.”
I find this development fascinating, partially for its creative use of a tool ostensibly designed for a somewhat different application (sales) and successfully applied to the campaign trail.
More importantly, it’s yet another instance of the “tools that build tools” which were foretold back when I was a management grad student, at the dawn of the personal computer.
The quest for human understanding often begins with the search for “a place to stand” from which one can get a new perspective on the situation. Powerful computing tools put to this use produced immersive simulators like the CAVE at the University of Illinois back in the mid-1990s. Now even the CAVE runs on a desktop machine, and regular folks without special grant funding can buy into cloud services on even more powerful servers.
The power is there. Tools like Salesforce make it possible to assemble custom reports and dashboards which report the metrics that matter most to our organizations.
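As a minimal sketch of the idea described above – scanning messages for issue keywords and tallying them by region so a dashboard can show what’s trending where – consider the following. The keyword set, message format, and state codes here are invented for illustration; this is not the actual Salesforce implementation.

```python
from collections import Counter, defaultdict

# Hypothetical issue keywords to track (illustrative, not the campaign's list).
KEYWORDS = {"healthcare", "education"}

def tally_issues(messages):
    """messages: iterable of (state, text) pairs -> {state: Counter of keyword hits}."""
    trends = defaultdict(Counter)
    for state, text in messages:
        # Normalize each word (strip punctuation, lowercase) before matching.
        words = {w.strip(".,!?").lower() for w in text.split()}
        for keyword in KEYWORDS & words:
            trends[state][keyword] += 1
    return trends

messages = [
    ("OH", "What is your plan for healthcare costs?"),
    ("OH", "Healthcare is my top concern."),
    ("VA", "Tell me about education funding."),
]
trends = tally_issues(messages)
print(trends["OH"]["healthcare"])  # 2
print(trends["VA"]["education"])   # 1
```

From there, something like `trends["OH"].most_common()` is the sort of ranked view a dashboard widget could surface per state.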
Are people in your organization mobilizing the latest tools for understanding your customers? Are the people who are doing it training others in their methods?
Does your training software give you “a place to stand” to see where your training efforts stand, and where needs may be emerging?
This client is a firm which provides health care, with employees who are on their feet during their work day, attending to patients.
Like most organizations today, they are concerned about the cost to the organization of taking people out of the production role for any length of time. Traditional instructor-led training creates staffing issues on the floor, and appointment unavailability in the clinic. Unlike many of our financial services clients, moving training to the desktop isn’t an obvious solution – to the extent that doctors and nurses HAVE desktops, they are not places at which these individuals spend a lot of time!
As electronic health records are implemented in health care organizations, more of these folks do have computers or tablets—and a need for training in their use! So this organization is wondering – does it make sense to move some training to the tablet?
We think it does, but that there are some principles which need to guide this transition:
- Putting references online (just in time learning) is perhaps the most direct path to a quick win. Healthcare professionals are already accustomed to looking up drug interactions and dosage indications online – structuring organizational references so that they are easily accessible from the charting tablet will likely drive more effective usage.
- On-line training still takes learner time. If it matters to the organization that training happens, it matters to schedule it at times learners can reasonably participate, and to compensate learners for that time.
- On-line training takes space. It’s unrealistic to expect learners to be able to concentrate on their training if they are sitting in the break room. Some learners may welcome the opportunity to do their training in the quiet and comfort of home. Others may have small children or other competing responsibilities at home, and need access to a conference room or some other space at work. Counting on learners to use the space provided on trains or buses during their commute is a risky move—especially if most drive themselves to work!
Given these principles, we recommend the following strategies to overcome these training obstacles:
- Structure training as a process, over time, in small chunks. A doc who has brought the tablet home after a full day in the office to catch up on her charting is not going to sit through a one-hour e-learning module. But she may be willing to knock off an activity which asks for 5 minutes of her attention to an article or some other content, followed by 10 minutes of answering questions about it.
- Keep the training as close to the task as possible. If you are training on use of an EHR, make sure there’s a “sandbox” version of the system for learners to experiment in (and that their login credentials work!).
- Recognize the training effort. Use a system which makes tracking learner progress effortless, and make it clear to the learners that people who matter are noticing their efforts.
In a recent issue of Chief Learning Officer Magazine, Mike Prokopeak reported on a poll by Lee Hecht Harrison.
An overwhelming 91 percent of workers said that job training and career development were among their top priorities… Six percent called it a “duty.” Just 3 percent said it is a hindrance.
The polling agency sees this as a shift – traditionally responses indicated that training was a hindrance to busy people getting their work done.
Kristen Leverone, SVP and Global Development Practice leader at Lee Hecht Harrison, suggests that a large number of employees are not getting the development they need, and that perhaps part of that issue is due to over-reliance on managers to manage the development of their direct reports.
I don’t know about you, but I see this phenomenon in the organizations I work with. Heck, in many places manager spots are vacant, with other managers sort of covering the responsibilities of a departed one. In this situation, the only thing that happens is the putting out of day-to-day fires. Identification of talent and setting up employees with optimal training opportunities is so far on the back burner that it might as well be off the stove.
Where there is a wider organizational commitment to development and training, and the systems to support it, this is less of a problem. Where the worker can see what training is required for the levels s/he aspires to, it’s possible for him/her to take the initiative in seeking out that training.
Who is accountable for talent management at your organization? Are employees empowered to do some of it on their own initiative?
We are unavailable today – Normally we all work from our respective globally distributed offices, but this week we’re having a company retreat in which we actually gather face-to-face. We’ll return to our virtual spaces next week.
Last Wednesday I attended a KnowledgeAdvisors (KA) users group meeting in San Jose. I love KA events because the companies and people who purchase the KA Metrics that Matter analytic tools and services care enough about the effectiveness of their talent management or training efforts to pay good money to measure it. It is exhilarating to be in a room full of people who have the courage to put their jobs on the line to demonstrate that the learning solutions they are delivering make a difference. (It’s just like hanging out with our xPERT eCampus customers, who are putting it out there designing and delivering their training-to-proficiency learning solutions.)
The presentations, and the ad hoc networking discussions at this meeting, revealed that virtually every learning solution that produced Kirkpatrick Level 3 or better results is a structured blended learning process which
- takes place over time (typically three months to a year)
- features required application and reinforcement activities
- involves coaching by SMEs or the learner’s manager, which is also monitored and tracked
- features accountability on the part of the learner’s manager for reinforcing the learning
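To make the tracking problem concrete, here is a hypothetical sketch of the kind of record such a blended process implies – activities spread over months, with application steps and coach sign-offs that must be monitored. Every name and field here is invented for illustration; it is not any particular LMS’s data model.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class Activity:
    name: str
    kind: str                 # e.g. "class", "application", "coaching"
    due: date
    completed: bool = False
    coach_signoff: Optional[str] = None  # SME or manager who verified it

@dataclass
class BlendedProgram:
    learner: str
    activities: list = field(default_factory=list)

    def outstanding(self):
        """Activities still awaiting completion or a required coach sign-off."""
        return [a for a in self.activities
                if not a.completed
                or (a.kind in ("application", "coaching")
                    and a.coach_signoff is None)]

program = BlendedProgram("J. Doe", [
    Activity("Kickoff workshop", "class", date(2013, 9, 1), completed=True),
    Activity("Apply on live account", "application", date(2013, 10, 1),
             completed=True, coach_signoff="SME: R. Lee"),
    Activity("Manager review", "coaching", date(2013, 11, 1)),
])
print([a.name for a in program.outstanding()])  # ['Manager review']
```

Multiply this by hundreds of learners, each with a coach and a manager in the loop, and it becomes clear why tracking by hand doesn’t scale.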
When I asked a number of folks, “What are the biggest challenges to overcome in getting these kinds of great results?” they pointed to two major challenges:
- Securing the level of sponsor commitment and resources needed to do a training-to-proficiency process right, without cutting corners (primarily learner, SME, and learner’s-boss time).
- Effectively tracking and supporting all the players involved in the application and coaching activities of the blended process.
The people I talked to agree that, of the two challenges, the most difficult challenge is tracking and supporting all the moving parts and players in their blended solution. While they agree that getting sponsors is tough, once they get commitment, the real problem is maintaining quality execution as the programs scale.
To a person, they say that their company’s LMS wasn’t designed to support these types of robust blended learning processes. The LMS is great for tracking the typical kinds of training – classes, WBTs and possibly virtual classroom activities – but doesn’t have the capability to support and track the more social, collaborative and coached kinds of learning activities that are central to the success of their programs. In response, they have adopted very expensive and creative work-arounds involving a bunch of people running around behind the scenes managing all the moving parts. Unfortunately, several folks said that even with a lot of “elves” performing this tracking work manually, it is very hard to scale.
Enter Q2, our next-generation social learning system, which enables our customers to successfully create, track and support these types of training-to-proficiency learning solutions.
If you would like to learn a little more, the links below will take you to several short videos that will give you a better idea of what I’m talking about.
Tour of formal learning features: http://www.q2learning.com/tour.php
Tour of informal learning features: http://q2learning.com/video_informal_learning.php
Tour of performance support features: http://q2learning.com/video_just_in_time_learning.php
Tour of On-boarding: http://q2learning.com/video_on-boarding.php
If you would like to contact me directly my email address is: email@example.com