Correlation is an interesting concept when applied to training. In the book “Big Data” the authors state that if actions “A” and “B” tend to occur together, then the presence of “B” is a reliable predictor of “A”, even if “A” cannot be observed or even measured.

A cited example was when Wal-Mart looked into past customer purchases in concert with multiple other factors, one of which was weather. What they found was that prior to a hurricane there was a corresponding increase in the sale of flashlights. This was certainly understandable. But there was also a similar increase in the sale of Pop Tarts. So the next time violent storms were forecast they located the Pop Tarts next to the flashlights, increasing the sales of both items.

So what does this have to do with eLearning? Correlations between web-based training activities and student success may reveal that MANY unexpected but NOT unrelated factors shape the overall efficacy of our educational efforts. Certainly we expect the attractiveness of the design, as well as the level of interactivity, to impact initial student engagement with the course. But what about the factors we overlook, never considering how they might influence or predict success as measured by changes in attitude or behavior post-training? Consider the launch date.

How many companies think about the launch date of a course as an indicator of the success or effectiveness of an eLearning initiative? I would be willing to bet that it falls far behind other considerations such as the availability of funding, resource scheduling, audit probabilities and administrative directives. I wonder if it is even considered at all.

It’s a fact that most people and families take vacation from late July through the month of August. Let’s suppose funding becomes available on June 1st to update your annual Sexual Harassment web-based training course. Based on an internal directive to meet compliance, all company employees must complete this training by the end of the third quarter. A review of internal development resources shows that the updates can comfortably be completed, with reviews, to allow a launch date of July 15th, giving employees a 45-day window to complete the training. Everyone, from administration through the developers, is pleased with the schedule.

The training is launched on schedule and completed before the end of August, keeping the company compliant. During the last quarter of the year harassment complaints reach an all-time high. The competency of everyone from the training developers to the Director of Learning and Development is brought into question. The course is re-reviewed and found to be of exemplary quality, covering all of the issues now plaguing the company, from unwanted advances through innuendos. What could have gone wrong? In a face-saving move the LOD director is fired, to at least demonstrate the company’s concern over the problem.

But the reason the training was unsuccessful may never be dealt with because it was overlooked. The truth may have been that people were focused on their upcoming vacations and their motivation for completing the training was “time-based” and not “comprehension-based”.

Simple statistical correlation analysis holds the key to identifying those factors, sometimes hidden, which influence or predict the success or failure of training initiatives. If we are going to take online education seriously it’s time to connect the “data dots” – and not rely on simple causation theory.
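As a sketch of what such a correlation analysis might look like, here is a minimal Pearson-correlation computation over rollout data. Every number and variable name below is invented for illustration; nothing is drawn from a real LMS, the Wal-Mart study, or any actual compliance program.

```python
# A minimal sketch, NOT production analytics: every number below is
# invented for illustration, not drawn from any real LMS or study.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One row per course rollout: days between the launch date and the start
# of the vacation window, and post-training complaints per 1,000 employees.
days_before_vacation = [60, 45, 30, 14, 7, 90, 75, 21]
complaints_per_1000 = [1.1, 1.6, 2.2, 3.0, 3.4, 0.8, 0.9, 2.7]

r = pearson(days_before_vacation, complaints_per_1000)
print(f"r = {r:.2f}")  # strongly negative in this invented data set
```

A strong correlation like this one would flag the launch date as a factor worth investigating; as the text notes, correlation alone does not establish the cause.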

Compliance = White Flag Training

The concept of personal responsibility seems to have become a point of negotiation in web-based compliance training. When did the reaction of the learner to the difficulty of the assessment become a concern in developing training?

Recently an Instructional Design colleague mentioned that he hoped his audience didn’t get upset with him because he was going to “clear their previous answers” AFTER they failed the test AND BEFORE they retook it. His primary concern was the reaction, not the level of understanding, of the learner.

When a fine is levied on a company due to a failure to comply with a regulation, could “soft” assessments be the cause? What responsibility does an Organization, Trainer, LOD Manager AND Trainee have in assuring training adequately assesses learner understanding?

Testing to competency is an acceptable goal in performance training and likewise should be in compliance training. In either case, failure in achieving a level of proficiency OR understanding can lead to some type of future loss.

A potential client asked about our experience in developing validated knowledge assessments. It was the first time in over seven years that anyone had shown a real interest in determining the true value of questions in measuring learner understanding.

When assessments are “too difficult” the learner gets frustrated. When learners take “too much time” in their training, employers get upset over lost man-hours. So the designer begins to select distractors that are more effective at indicating the correct answer than at inviting thoughtful consideration. This pleases BOTH the employee AND the business owner. It also accomplishes the goal of “faux” compliance.

But when an Instructional Designer consciously undermines his own value by “dumbing down” an exam, he performs a disservice to himself, his company, his audience AND his fellow designers. It is a better outcome, for the company AND the individual, to allow a person to fail. The company gets to identify who needs help with concepts and the employee gets the help he needs. Anything less is like putting a band-aid over an irregularly shaped mole.

I think it might be time to retire the word “compliance”. It’s too soft. Compliance implies surrender.

Instead, how about if we decided to call it partnership, brotherhood, cooperative or, even better, regulatory ownership training. These words indicate a shared personal interest in a common goal rather than concession, submission or consent.

We don’t need to put more teeth into regulatory training; we just need to do more than merely comply.

How do You know you NEED it?

How does a company define what is most important to their organization when it comes to web-based training?

I just returned from a capabilities presentation where the majority of the questions reflected a deep concern about Project Management or, more to the point, about how a company could competently develop courses without using Microsoft Project.

This was in reference to building multiple “page-turning style” courses in a short time frame, and the primary concern of the attendees was Project Management?

If you were to reach into your pocket to pay to take a training course, web-based or otherwise, what would be most important to you? I would hope your deepest concern would be that you learned something, followed by whether you enjoyed the course.

People select a way to evaluate a process by what is most comfortable rather than what is uncomfortable, the familiar rather than the unfamiliar. It’s certainly understandable, but probably not the most practical. Targeting Project Management tools as a way to evaluate elearning development competency is like measuring the quality of a programmer by the cleanliness of his desk. In a way the truth is disproportionate: the messier the desk, the better the programmer.

I think it’s time for us to become uncomfortable. Begin to consider practical methods of evaluating what is important to understand – NOT what is easy to understand. We shouldn’t focus on the easy-to-measure. We should instead look at impact, effectiveness, efficacy, experience, approach, understanding, engagement and comprehension.

These are the true evaluators of educational success, and the results may make you uncomfortable with your instructional approach and efforts to date. It’s time to take off the rose-colored glasses.

Our Solution: CHRONNICLE®

I recently had a conversation with a business colleague about how the world evaluates. My position was that, in the business world, the value of what we offer has more to do with perception than with the scope or level of our efforts. It’s a truth the world of marketing has understood and employed for a very long time.

In the world of elearning we are all governed by perceived value. It is not the guiding principle that attracted us to the field, but the reality to which we are introduced when we begin to ply the tools of our trade.

We, as providers, have been reluctant to play a role in defining the rules by which our efforts, products or services are measured and assigned a value. For our industry in general and developers individually to thrive, we must begin to influence the process.

So then, how do we begin to change the current business perception that the value of web based training is reflected by LMS completion records?

We recently released CHRONNICLE® as a Lextension® (Lectora Extension). We have been developing this patent-pending technology for the past five years.

CHRONNICLE® is based on the premise that data flows in two directions in web-based training. In the “downstream”, content is delivered or presented to the student. In the “upstream”, the learner reacts to the content, the presentation and the opportunity to learn. It is these reactions – or behaviors – that represent the true measurable value of web-based training.

Individual learner actions are measured by content or concept “engagement factors” indicating comprehension or completion, interest or disinterest, engagement or disassociation, challenge or success – for the student as well as for the course. This data is the “dark matter” of elearning – the 98% that has yet to be seen or evaluated but represents more value than what is presently considered:

  • Individual learner actions reveal the behavioral tendencies of the learner.
  • Aggregated actions indicate the learning tendencies of a defined group.
  • Collective experiences indicate the competency of the course itself.
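The three levels above can be illustrated with a toy aggregation. This is not CHRONNICLE code; the data, object names and threshold are all invented for the sketch.

```python
# Toy illustration of the three levels above -- not CHRONNICLE code.
# Per-learner seconds spent on each content object (invented numbers).
sessions = {
    "learner_a": {"intro": 40, "quiz_1": 55, "case_study": 210},
    "learner_b": {"intro": 35, "quiz_1": 60, "case_study": 195},
    "learner_c": {"intro": 38, "quiz_1": 50, "case_study": 205},
}

# Individual level: one learner's tendencies.
slowest_for_a = max(sessions["learner_a"], key=sessions["learner_a"].get)

# Group level: aggregate the same measure across all learners.
objects = sessions["learner_a"].keys()
avg_time = {o: sum(s[o] for s in sessions.values()) / len(sessions)
            for o in objects}

# Course level: an object where EVERYONE lingers suggests the course,
# not the learner, is the challenge (the 120-second cutoff is arbitrary).
trouble_spots = [o for o, t in avg_time.items() if t > 120]

print(slowest_for_a, trouble_spots)  # case_study ['case_study']
```

The same raw actions, rolled up at different levels, profile the learner, the group and the course itself.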

Beyond evaluating recorded learner actions after courses have been completed, CHRONNICLE® includes an Artificial Intelligence (AI) construct named Paige Turner® that monitors and analyzes learner behaviors in real time, providing a path and process to modify these behaviors – much like a teacher observing a student and correcting his actions. In fact, Paige monitors a student’s actions in a more granular manner: she can record not only the result of a student action but the sequential behaviors that lead up to it, and she can independently interrupt the student when the timing or sequencing of their behaviors indicates a challenge.

This pedagogical agent can also be “summoned” to the screen by the learner to engage in a “dialog” to assist the learner should they encounter a challenge during the course – just like raising a hand in a classroom and asking for help.

As each learner progresses in a course their actions are recorded to a database. These data include the sequence of actions, the time spent considering actions, the time between actions, the time required to conclude actions and the reconsideration of actions – a full spectrum of the learner’s engagement with the individual objects, the environment and the course itself.

These captured actions provide a rich behavioral profile of the learner’s experience while at the same time identifying their learning preferences and action tendencies.

In addition, a performance profile is constructed for each course so that it may be evaluated for challenges, competency and effectiveness.
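CHRONNICLE®’s actual schema is proprietary and not described here. As a hedged sketch, the kind of per-action record and timing derivation the preceding paragraphs describe might be modeled like this (all field names and values are assumptions):

```python
# Illustration only: CHRONNICLE's real schema is proprietary. This is a
# minimal sketch of the kind of per-action record the text describes;
# every field name and value here is an assumption.
from dataclasses import dataclass

@dataclass
class LearnerAction:
    learner_id: str
    object_id: str    # the content object acted upon
    action: str       # e.g. "click", "replay", "answer", "revise"
    timestamp: float  # seconds since the course session began

def time_between_actions(actions):
    """Gaps between consecutive actions -- one signal of the
    consideration and hesitation described above."""
    ordered = sorted(actions, key=lambda a: a.timestamp)
    return [b.timestamp - a.timestamp for a, b in zip(ordered, ordered[1:])]

session = [
    LearnerAction("s1", "q1", "answer", 12.0),
    LearnerAction("s1", "q1", "revise", 47.5),  # reconsidered an answer
    LearnerAction("s1", "q2", "answer", 51.0),
]
print(time_between_actions(session))  # [35.5, 3.5]
```

A long gap before a “revise” action, for instance, is exactly the sort of timing signal that could feed the behavioral profile described above.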

So what does this technology afford the individual professional and the elearning industry?

We now have a solution, for companies purchasing our development skills and products, to change the perceived value of our efforts.

We meet the expectations of Business by defining the process to measure the value of web-based training in terms that Businesses understand – Return On Investment (ROI).

By returning learner behavioral and course performance data as Business Intelligence we move elearning from the expense to the income side of the ledger.

As Businesses realize that richer training experiences return more valuable data, Instructional Designers will be given the resources required to design and develop the highly interactive, branching courses that reflect the breadth of their knowledge, while learners will be introduced to more interesting and engaging courses.

We can change the perception, and our industry, at the same time.

Next: How Do You KNOW You Need It?

The Symptom:

When we visit a doctor because we are not feeling well, they begin by questioning us about our symptoms. Why? Because a sniffle, a headache or a sore throat are merely indicators that something is wrong, not the root cause of the illness itself.

As an elearning advocate, I have railed against rapid development as anathema in the field of web-based training. I have spent too much time arguing with colleagues in print, in person and online that something that begins with PowerPoint cannot be run through a conversion engine and magically transformed into elearning.

It wasn’t until I recently realized that rapid development, like a cough or a sneeze, is merely a symptom of an unhealthy elearning industry, and not the illness itself, that I began to look for the root cause of elearning’s current state of health.

In my previous four blog posts, I have discussed various aspects of what elearning lacks by comparing web-based training to the classroom experience. It was during this journey that I came to identify the underlying issue generally impacting elearning. We continue to compare the efforts, concepts and products of our industry to a training ideal that is rooted in the past; in one teacher, a piece of chalk and a blackboard.

It is not that some of the tenets and strategies of this established and familiar training approach are ill-conceived, but that a web-based solution should yield far better results and provide a far richer experience due to the technologies and support systems currently available.

Pass/Fail captured by an LMS shouldn’t be the lone evaluative measure of teaching success.

Web technologies today can suggest sites to visit, or music to consider, based on our past selections. Commercial behaviors are tracked by our transaction history. Web searches are assisted and completed in real-time based on words we enter into a browser search field. Our location is requested and tracked through geo-location technologies to allow our movements to be plotted and our friends to find us. Smart phones provide instant access to language translation and object identification. Text message strings are populated based on syntax algorithms. Experts all over the world offer assistance to solve problems posted by motivated learners or those in need of a knowledge-based or experience-based solution.

The elearning industry does not need to replicate the support systems and instructional methods of the classroom. We need to build a better teacher who is constructed on available technologies and open to predicted possibilities.

Construction, though, does not define success. We will need to commit the resources necessary to prove the effectiveness of our work, as well as demonstrate the return on investment, to the industries that purchase and employ our development applications and technologies. If our industry does not advocate for itself by supporting research that evaluates the effectiveness of courses built using our concepts, tools and constructs, then we leave ourselves open and vulnerable to evaluation by others less qualified.

We must also demonstrate that the return on investment is directly related to the size of the investment itself. Inexpensive training solutions return simple results, such as checkmarks in an LMS. They will not provide the rich intelligence available from evaluating and analyzing captured training experiences: statistical data with the potential to influence business decisions and reveal individual competencies and challenges.

We need to consider elearning as a greater paradigm not only offering rich possibilities of learning to the individual but providing an opportunity to learn more about what motivates and interests the student.

A recent response to my earlier blog postings noted that I am making some very valid points but offering no real solutions. Stay tuned for the next blog.

Next: The Solution


In elementary school teachers assign seating alphabetically, perhaps later adjusting for student height, visual or hearing challenges. Future seating modifications may consider behavioral issues, placing disruptive students closer to the front to reduce classroom distractions.

In Middle or High School, students are allowed more autonomy. We select our own seat in class. We adjust our schedule based on friends or shared opinions of specific teachers. We voice our ideas and impact curricula through participation in student councils.

College offers us the freedom to select our own school, create unique class schedules, select professors, attend classes or skip them, study in groups or independently.

At each ‘education milestone’ our ability to influence and shape our learning environment expands to meet each new level of maturity. The desire for personalization is a natural expectation for the mature learner. As we age we value individuality and readily accept certain responsibilities associated with educating ourselves. The opportunity to play a role in influencing our learning environment provides for security and comfort, assuring a personalized experience.

Consider common experiences where your comfort is considered before engaging in an activity. The dentist adjusts your chair and offers Novocain. The restaurateur offers you a selection of seats then asks you how you want your meal prepared. An adjustment of an environment or situation to accommodate your personal needs creates an atmosphere of security and comfort.

Now consider more expensive experiences, such as the luxury vehicle that adjusts your seat, mirrors, headrest and sound levels to a pre-programmed state each time you prepare to drive. This is an automatic adjustment of an environment to accommodate your personal needs, based on previous knowledge and enabled by familiarity. For those who remember Cheers, think about the comfort level of the barfly Norm.

Let’s look at self-directed corporate web-based training. Many of us develop a product we like to call “customized courseware”. But does the student experience any personalization? Do we ask him before he begins what might make the experience more enjoyable for him? If he is fortunate we might include some “really advanced” tools that allow him to self-adjust the environment, like muting the audio track.

What if the course could customize itself as it ran, to reflect the personal preferences of each student? What if the behavioral tendencies and learning styles of the student became collected data to be mined in real time to assist in personalizing the learning experience?

What if content engagement or navigation were not forced upon students?

Consider that “intelligent content” could actually “learn from the learner” and thereby provide insight into the type of interactivity that invites the engagement of the student.

Think about a course that doesn’t ask you what makes you more comfortable because it already ‘knows’ what makes you more comfortable, and prepares the online environment by making the necessary adjustments for you.

These questions should start us thinking about the experience of enhanced web based training when it is built with the type of knowledge gleaned from more conventional classroom based student/teacher interactions. This new generation of online training can provide as much insight into the psyche of the student as it can deliver training in the most palatable and personalized format.

Next: The Symptom


Years ago I had an instructor who, in response to a request for clarification, would simply repeat his earlier instructions in a louder voice, as if volume had been the impediment to comprehension.

I have come to understand the value of appropriate and meaningful responses when help is requested, and the frustration of being offered only limited options or repetition as solutions. Like others, I now surround myself with assistive electronic gadgets, such as smart phones, GPS navigation aids and iPads, to get me the help I need, in real time, in the format that best suits my needs.

The exception to personalized, on-demand help is the corporate web-based training course, where the student learns in isolation and has limited access to supportive or assistive elements. Tools that provide assistance, found in the classroom, in our cars, or clipped to our belts, are missing from the online training environment.

Is an environment in which you can’t ask a question or request help really an invitation to learn?

Consider the simplicity of the Amazon Kindle. If, while reading an electronic book, you encounter a word you don’t understand, you leave the reading environment, select one of two available dictionaries, look up the definition of the word, and return to the book with little effort. A smart solution for word challenges with potential to increase our understanding and enjoyment.

The role of web based training is to transfer knowledge in order to help us understand new concepts or gain new skills. Dictionaries or glossaries are of less value in these instances.

When help is needed to clarify challenging ideas, we may seek a secondary resource to present the concept in an alternate way or format to facilitate comprehension. In a classroom this resource is the Teacher. In business it is the Mentor.

The role of the Instructional Designer is to create a course where information is structured and presented in a manner that facilitates learning and meets a training goal. To further require the I.D. to predict where individual learners might have difficulty in comprehending specific concepts and then create additional supportive options to address those potential learning issues is unfair and unrealistic.

A design or approach that presents with clarity to one person or group may be completely confusing to others. Learning styles and challenges have been recognized in the classroom for decades. These differences do not represent intelligence, but do need to be addressed for learning to take place.

Elearning developers can look to the classroom for a better solution. Teachers can’t predict where problems in a lesson plan will occur. Instead they watch challenges reveal themselves during the presentation and then respond accordingly. Instructors allow students to interrupt the flow and request clarification to support individual or group comprehension. The tool?  A raised hand. The same tool is available in presentation software such as WebEx to allow a participant to ask a question. The tool?  A raised hand icon.

Web-based training needs an engaged observer in order to respond to the needs of all students and ultimately meet the goal of comprehension. We need an on demand solution for the online student so they can request assistance and break the isolationism of the corporate elearning environment. We need a way of letting the student know that although they are self-directing, they are not alone in their education experience.

In short, we need a Teacher.

Next: Personalization

Blindfolded Design:

There’s an old joke about a husband and wife whose child was born perfect in all ways but never uttered a sound. For five years the child was silent, until the day he was served lima beans for lunch. “This tastes horrible,” said the boy. The parents were stunned. “You can speak,” they said. “Why haven’t you spoken before?” The child replied, “Everything has been good until now.”

Feedback. Without it we have nothing more to guide our actions but trust and faith. We will continue to perform our jobs the same way every day in the belief that we are delivering an acceptable performance or product unless our audience is given the opportunity to offer an opinion.

What we notice about classroom teaching directed by a seasoned professional is that, in spite of a “lesson plan”, there is room for modifying the instructional approach based upon audience reaction. The teacher’s real time assessment of the success of their own performance is referenced to modulate the rate and style with which the content is delivered.

While the syllabus (instructional design) needs to be followed, good teachers seek to engage students by stepping back or moving ahead based on visual and aural feedback. There is no reason to progress to the “next topic” unless general comprehension has been observed. Teachers are able to sense interest, hesitation, doubt or confusion by watching and then responding.

To date, elearning developers have been guided only by content and generally accepted instructional design theories. Our target audience has yet to be given an appropriate forum or vehicle with which to offer their opinion of our efforts. We might consider that developing courses without feedback from the student to be a form of “Blindfolded Design.”

A broader understanding of the unique learning needs of the student must encourage us to introduce elearning to the world of responsive customization. It is the only way to begin to replicate, in the online experience, the malleable and supportive environment of the classroom.

Consider this situation. Based on research indicating the value of including videos in training courses (Bassilli (2006), Barton and Haydn (2006), Gebhard (2005), etc.), your client wants to include a video in their training, with the requirement that the student view the entire video before progressing in the course. So you include the video in the course and employ some technique to disable forward navigation until the video ends – thus meeting the requirement.

What’s the “takeaway”? What have YOU – the elearning designer – learned about the efficacy of the video in imparting knowledge versus a simpler text-based approach? What have you learned about the resonance of the content with the student? What can you report back to the client about the training value of this expensive option? What PROOF can you offer as to the level of learner engagement OR enhanced training effectiveness with the video that would in any way validate, if challenged, the cost of developing the video, the programming time to include it in the training, the seat time required to view it, or the increased bandwidth to stream it?

Without hard data to support our belief in the instructional effectiveness of our designs, or of ANY of the training content/learning objects we include, our blindfolds will continue to directly influence our instructional approach and present an opportunity for anyone to challenge our assumptions on a cost-of-development basis. And perhaps rightly so.

Next: Isolation

The Problem:

Here are three commonly held assumptions about online courseware.

  1. Elearning is an efficient way to train people
  2. Rapid development equates to streamlined training
  3. People enjoy elearning

Consider the impact of these assumptions.

  • Corporations continue to transition leader-led training to WBT
  • Development costs influence WBT design
  • Seat time trumps comprehension
  • Learning theories evolve to support popular assumptions

In transitioning from classroom learning to web based training, we may have forgotten the first two tenets of effective teaching: “observe the student” and “evaluate the experience”.

What role do observation and evaluation play in assessing the effectiveness of your web-based training initiative? Do you follow the ADDIE model such that Evaluation arrives at the END of your process? Upon successfully launching a WBT, might the “E” actually become a “silent E” and then be quietly forgotten? Do you perform an Evaluation at all?

In training situations where evaluation is of value, unbiased feedback assists in identifying the success or failure of entire curricula; and, on a more granular level, the success or failure of individual learning objects in each course.

We must recognize “feedback” for the potential value it holds: unfiltered reactions to experiences that indicate interest, boredom, displeasure or neutrality. Without direct feedback, we have nothing upon which to base our beliefs except “smile sheet” opinions and a bit of guesswork.

For businesses, unlike online colleges, training is not a commercial product. Companies do not attract employees because their training courses are first class.

When businesses consider training an expense that marginalizes profit, it is natural for the organization to search for the best low-cost solution. Certainly “best” then becomes subjective, most often defined as fast or simple. In a profit-or-loss paradigm, one would agree that “best” cannot include anything that increases time, cost or effort.

So what is the overarching influence of this reality? Companies seek streamlined (fast and cheap) rapid development solutions. Application developers shift their focus to meet the demand for tools that allow training courses to be created faster and cheaper. Without the need for skilled (expensive) Instructional Designers, organizations provide these “easy” development tools to anyone who knows how to use PowerPoint. Students are expected to learn from rapidly developed page turners. Training is completed and success is measured by the only evaluation measurement required – the completion status in the LMS.

This is a bleak picture for those recognizing the value of the professional Instructional Designer’s contribution of learning theories to course development. It should disturb anyone who believes education/training should impart knowledge and impact job performance.

Solution? In a business sense we must move training to the profit side of the ledger. We can provide a greater scope of value by capturing mineable data from each student/course engagement that can help a business make decisions that positively impact its bottom line.

We need to show organizations the data returned from a student’s training experience is at least equally as valuable as the content and concepts presented in the training itself. This can be achieved by following teaching tenets 1 and 2…observe and evaluate.

Next: Blindfolded Design


We don’t do anything without feedback. It’s so much a part of us that our entire nervous system is built to provide it to help us determine appropriate responses.

In some situations we seek out feedback to validate our own assumptions – “Does this dress make me look fat?” At other times feedback is unsolicited – “That dress makes you look fat.” In either case the information is useful in developing a response action.

But what about elearning? How do we get feedback on the courses we’ve developed or purchased, and implemented?

If you’re like most of us you’re just glad to get the courses posted on the LMS and assigned to the target audience. The passing grade and completion checkmark are sufficient feedback for determining success for both the student and the course.

This passive approach to evaluation negatively impacts all stakeholders. As such, we need to collectively address the process by which our educational efforts are being evaluated, or those less invested in the process will define the terms by which we develop and implement web based training solutions.

Consider that 70% of all corporate online training is compliance-based. This means the company has not independently elected to develop or implement training in these areas. The natural impulse when we are TOLD to do something is first to resist, second to look for a way to avoid having to comply, and finally to identify the least expensive means of complying. For most organizations this defines elearning. It’s the less expensive option versus classroom instruction.

Let that concept sink in for a while. Consider COST as the original ROI of our educational efforts. We’re cheaper than a classroom.

Soon less expensive was no longer enough. Courses had to be developed faster. So in “response to market demand” application developers valiantly stepped up to the plate and offered “rapid development tools”. This streamlined approach was based on employing the sound pedagogical principles embodied by the Microsoft PowerPoint application to allow educators to construct and deliver less expensive courses in a shortened timeframe. Educational Nirvana had been achieved.

Or was it? Has fast and cheap helped achieve the goals of education or simply met the minimal requirements for proof of compliance?

Has the checkmark in the LMS report become the new standard by which we measure success?

If you share my values as an educator, then you cannot be happy with how our efforts and our industry are generally perceived. Less expensive than a classroom; constructed with a presentation tool.

How did we get here? How did we allow those who may not be as interested in the true comprehension we value as educators to define the tools we employ?

It began with feedback. We passively allowed the checkmark in the LMS to become the de facto feedback standard by which educational success was measured. The checkmark became the goal. Comprehension was an afterthought – and certainly not required by the audit report.

Puns aside, I believe it is time to make a drastic course correction. We must make COMPREHENSION the standard by which we measure our attempts at educating our students and employees. All invested parties need to play a role in this effort. That means Chief Learning Officers, Compliance Managers, Application Developers, Training Managers, Instructional Designers, Students and Budget Directors.

“How quickly can they get through it?” is no longer an acceptable question. Consider instead asking: “Will they understand the concepts and assimilate them into the way in which they approach their job?”