Son-of-Fire's Learning Landscape


Monday, January 10, 2011

It's Not the Weapon, but the One Who Wields It...

I have been spending some time on LinkedIn lately, specifically in group conversations. Recently the E-Learning 2.0 group got my attention. The discussion asked which eLearning technologies would be most relevant in the 21st century. Now, I am a product strategist, learning technologist, and lacrosse coach...
Whenever it comes to discussing how best to use technology, learning systems, or lacrosse sticks, and whether using them will make a difference, it always comes down to the same thing... "It's not the tool, but the one who wields it" that scores the goal... and that takes preparation...


This means that the future of Enterprise 2.0, social tools, mobile, or any other advances in eLearning will depend highly on whether those who implement these tools to make business-critical and/or academic information available are mapping the right tools to the right job roles, and for the right reasons. It all comes down to needs analysis and how providers design a set of learning and knowledge management systems around those needs.


 As a learning or knowledge management professional, what can you do?
  • Conduct a needs analysis.
  • Identify the organization's goals and business drivers.
  • Identify critical job- and team-based tasks.
  • Understand the parameters of the work environment and its context.
  • Know your learners and knowledge consumers.
  • Involve all stakeholders in the process (IT, HR, Operations, and any business unit that may be "touched").
  • Avoid the "flavor of the month"... do not implement a new technology just because everyone else is using it.
  • Bring in experts when you need them.
  • Choose the right tools for the right jobs.
New technologies are the weapons for success in the years to come, but remember, it's not the weapon, but the one who wields it...

Friday, June 26, 2009

Tips for Driving the Appropriate Use of Learning Technologies through Practical ISD

Like many powerful tools, learning technology can help or hurt – it depends on how it is used. Getting others to use these technologies appropriately can be a challenge. Those of you around when eLearning was young were able to observe mismatches between learning technology and solution. In those days, eLearning often mismatched the needs and job context of the learner, especially when the requirement included some form of behavior or skill acquisition that needed to be applied on the job. This problem is equally if not more prevalent today with new social media and mobile learning technologies available. Many are building wikis, writing blogs, and syndicating podcasts while grouping them under the Learning 2.0 umbrella, but does their content really support learning, or is it a form of communication that merely provides access to information? If we are going to use or evangelize any learning technology, we need to get back to the basics – instructional systems design. If other learning professionals lack this basic competency, we need to help them.

Ideally, learning professionals have some training and background in instructional systems design (ISD), but we do not always see that in the real world. Even instructional design degree and certification programs tend to focus on the academic over the practical, leaving graduates unprepared to apply what they learned once they get a job. Anecdotally, how many folks do you know with degrees in an area of expertise who lack the real-world experience required to apply it effectively? The old theory-versus-reality problem… At the same time, there is a basis for the application of ISD, and that's the trick: applying ISD at practical and appropriate levels.

At a simplistic level, practical ISD tells us learning technologists that we need to help stakeholders and other learning professionals focus on alignment with business goals, business tools, and the work environment. Then we need to identify the needs of the learner by role and what must be accomplished on the job in the ideal world. We need to determine whether a training or learning solution is required at all (it's not when the problem is systemic or motivational), create a profile of the learner that includes what they do in the real world, and map all of that to the type of content that will be required (auditory, visual, kinesthetic, blended). Factor in budget, obstacles, and time-to-proficiency requirements, and we can prescribe a set of learning technologies that meet those needs. As learning technologists, we need to push this approach.

If a learning technologist is not available, job aids like tool matrices or decision workflows can help stakeholders and development teams make decisions where learning technology is not a core competency. Later on, though, experts in the required technology need to be involved - especially in the analysis, design, and development phases. Learning technologists or experts will need to hold hands when their stakeholders are too far from their comfort zones.
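A tool matrix of the kind mentioned above can be as simple as a lookup keyed on the desired outcome and the work context. Here is a minimal sketch in Python; the outcome categories and technology mappings are illustrative assumptions, not a definitive prescription:

```python
# Hypothetical decision matrix mapping a desired learning outcome and a
# work-context constraint to candidate delivery technologies.
# All categories and mappings below are illustrative assumptions.

TOOL_MATRIX = {
    # (outcome_type, learner_is_mobile): candidate technologies
    ("knowledge_transfer", False): ["wiki", "blog", "recorded webinar"],
    ("knowledge_transfer", True): ["podcast", "mobile job aid"],
    ("skill_acquisition", False): ["simulation", "instructor-led lab"],
    ("skill_acquisition", True): ["mobile performance support", "coaching"],
}

def recommend(outcome_type: str, learner_is_mobile: bool) -> list:
    """Return candidate technologies, or flag the need for an expert
    when the situation falls outside the matrix."""
    return TOOL_MATRIX.get((outcome_type, learner_is_mobile),
                           ["consult a learning technologist"])
```

The point of the fall-through default is exactly the caveat above: when the job aid has no answer, the decision goes back to an expert rather than being forced into the matrix.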

If starting fresh or embarking on bleeding-edge technologies, do some research to benchmark what's been done successfully by others. When you have identified which technology will meet your needs, incubate and pilot. Start with small groups and the low-hanging fruit during the initial test phases, then focus on the larger wins with larger groups as you proceed. Over-communicate your wins, but document your mistakes and don't forget the lessons learned - they drive efficiency and cost savings later on.

Lastly, ensure you communicate the purpose and value of the learning technologies up and down the food chain. This will drive adoption with your stakeholders and learners, along with the prescriptive usage the learning professionals at your organization will need to drive.

Thursday, May 21, 2009

Instructional Design and Technology: Where’s the Beef?

It's important to address the science behind instructional design. For reasons unknown, programmed instruction (PI) seems all but abandoned in much of the eLearning content I review. In my opinion, this is a flaw in instructional design, made worse by popular eLearning development tools, which omit this capability. Which raises the question:

"Where's the beef?"

Skinner and Crowder pioneered this mode of instruction in the 1950s and early 1960s. For those unfamiliar, programmed instruction models assess a learner's needs through some form of testing, then loop back, branch forward, or follow multiple paths based on performance against pass/fail criteria. More robust PI models incorporate both pretesting and posttesting. The biggest advantage of this design model is that delivery of content is customized to a learner's needs because only filtered content is delivered (based on what the learner did not pass). In essence, a form of needs assessment is built into course delivery and the user experience. How cool is that?
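The loop-back/branch-forward logic described above fits in a few lines of code. This is a minimal sketch, assuming a simple per-module pass threshold; the threshold value and the callback shapes are hypothetical, not taken from any particular PI tool:

```python
# Sketch of a programmed-instruction loop: pretest each module, deliver
# content only for modules the learner fails, then posttest and loop
# back until criteria are met. The 0.8 threshold is an assumed value.

PASS_THRESHOLD = 0.8

def run_course(modules, pretest, deliver, posttest):
    """modules: ordered module names.
    pretest/posttest: callables mapping a module name to a score in 0..1.
    deliver: callable that presents a module's content to the learner.
    Returns the list of modules that were actually delivered."""
    delivered = []
    for module in modules:
        if pretest(module) >= PASS_THRESHOLD:
            continue                       # branch forward: learner tests out
        deliver(module)                    # deliver only filtered content
        while posttest(module) < PASS_THRESHOLD:
            deliver(module)                # loop back until pass criteria met
        delivered.append(module)
    return delivered
```

Note how the pretest acts as the built-in needs assessment: content the learner already passes is simply skipped.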

Development tools like Authorware or KnowledgeTRACK were great at facilitating this mode of design. Unfortunately, these tools are no longer available (in all honesty, they were cumbersome to use). I certainly don't see this approach in most of the Articulate or Captivate content I've come across lately.

Meanwhile, social and Learning 2.0 suites are starting to bake PI functionality into learning paths, where not only can a test assess performance, but a virtual instructor or coach can pass or fail the learner. Others even advance the learner automatically if they simply complete a task as instructed. More of these suites are capable of housing SCORM modules or acting as a friendlier front end to what is typically presented in the learning path of a traditional learning management system (LMS). For an example, check out the one by Q2 Learning at http://www.q2learning.com/. Nonetheless, this is a methodology we should revive. Take another look at the process described above and decide for yourself.

Have a comment or question? Leave one and I will get back to you.

Saturday, December 6, 2008

Analysis and Planning: Learner Analysis

The Learner Analysis identifies what is actually done on the job and required in the real world. A learner analysis (also known as a Person Analysis) should identify the range of learner needs based on the differences between job demands and role characteristics. It should assure coverage of the gap between actual learner skill sets, the expectations and standards established during the organizational analysis, and the criteria set from the Task-KSA Analysis. Based on the subsequent technical and media specification analysis, large differences between learner groups can be accommodated within an appropriate medium through design and delivery supports, or facilitated through blended delivery media. Learner analysis role summaries should be submitted. Data collected at this stage should:
  • Identify logistics of the base learning pool
  • Detail role characteristics for learner profiles
  • Gauge current performance measures and characteristics for comparison against what is required in training
  • Develop an approach to providing targeted recurrent training based on identified performance gaps as required by stakeholder
  • Ensure training program complements activities relating to recruitment and performance appraisal as required by stakeholder
Risks if not conducted or conducted improperly:
  • Not understanding the learner - your customer
  • A course that does not focus on the gap between tasks/KSAs assessed as required in “a perfect world” and task/KSAs the learner was doing before the learning event in “the real world”
  • Delivery modalities that do not meet the differing needs of course participants
  • Training that does not result in improved on-the-job performance
  • Dissatisfied course participants and a dissatisfied stakeholder
We will review technical and media requirements next.
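The gap the learner analysis must cover — between the Task-KSAs required in "a perfect world" and what the learner actually does in "the real world" — amounts to a set difference. A minimal sketch, where the role and KSA names are hypothetical examples:

```python
# Sketch of the learner-analysis gap: Task-KSAs required by the
# organizational and Task-KSA analyses versus a learner role's actual,
# observed skills. Role and KSA names below are hypothetical examples.

def skill_gap(required_ksas, actual_ksas):
    """Return the KSAs assessed as required in 'a perfect world' that
    the learner was not demonstrating in 'the real world'."""
    return set(required_ksas) - set(actual_ksas)

required = {"conduct needs analysis", "write objectives", "storyboard"}
actual = {"write objectives"}
gap = skill_gap(required, actual)   # what the course must target
```

A course built against `gap` rather than `required` is precisely what keeps the risks listed above — content that misses the gap, dissatisfied participants — at bay.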

Saturday, November 29, 2008

Analysis and Planning

The first phase of ADDIE is Analysis and Planning. This is the phase where we gather information and identify requirements for a course or curriculum (a learning path through a set of courses based on roles). The Analysis phase is also known as Needs Analysis or Needs Assessment. The goal is to establish requirements for success by targeting outcomes and setting up the design for a training curriculum and subsequent courseware solution. This phase entails an examination of mission-critical needs; identification of required skill sets mapped to respective performance criteria; an assessment of the learner's actual competencies (Task-KSAs), performance measures, demands, and role characteristics; and lastly, the training delivery media requirements.

The Analysis and Planning phase should include some level of the following sub-phases:
  • Organizational Analysis
  • Task-KSA Analysis
  • Learner Analysis
  • Technical and Media Specifications Analysis
We will break each sub-phase of analysis down in subsequent entries. Although Analysis is the most important phase, it is at times overlooked or not given the attention it deserves. This is most evident when training solutions and learning events appear ineffective or seem to "miss the mark." In my next entry, I will commence our breakdown of the Analysis phase by examining Organizational Analysis.