Measuring the Success of Learning Through Technology

BUILT-IN APPLICATION TOOLS

Building data collection tools into technology-based learning is one of the areas where designing for results works especially well. It is particularly helpful for blended learning and e-learning programs, where data collection can easily be made part of the program itself. The tools range from simple action plans to substantial job aids, and they come in a variety of types and designs. They serve as both application and data collection tools.

Action Plans

The action plan is a simple tool, completed during the program, that outlines specifically what the participant will accomplish after the program is completed and during its implementation. The action plan always captures application data and can easily include business impact data when a business measure is to be improved. Figure 6-2 shows an action plan where the focus is directly on improving a business measure. In this example, unplanned absenteeism in a call center is being reduced from a high of 9 percent to a planned level of 5 percent. The actions listed on the left side of the document are the steps that will be taken to improve the business measure. The information on the right provides more detail on the data, including the value the improvement delivers. While this tool serves as a data collection process, it also keeps the focus on business impact. As the data are collected, they can even be used to isolate the effects of the program on the impact data, validating that business alignment did occur. For this approach to work well, several steps must be taken before, during, and after the action plan to keep the focus on business impact. Figure 6-3 shows the steps that ensure the action plan is built into the process and becomes an integral part of achieving business success.

Figure 6-2. Example of Action Plan

Action Plan

Participant: ___  Program Manager: ___  Follow-Up: Sept. 1
Objective: Decrease unplanned absenteeism.  Evaluation Period: March – September
Improvement Measure: Absenteeism rate  Current Performance: 9%  Target Performance: 5%

Action Steps:

1. Meet with the team to discuss reasons for absenteeism. (March 10)
2. Review absenteeism records for each employee, looking for trends and patterns. (March 20)
3. Counsel with problem employees to correct habits and explore opportunities for improvement. (March 20)
4. Conduct a brief performance discussion with each employee returning to work after an unplanned absence. (March 20)
5. Provide recognition to employees with perfect attendance. (March 20)
6. Follow up after each discussion; note improvement and plan further action. (March 31)
7. Monitor improvement and provide recognition when appropriate. (March 31)

Analysis:

A. What is the unit of measure? One absence
B. What is the value (cost) of one unit? $54.00
C. How did you arrive at this value? Standard value
D. How much did the measure change during the evaluation period (monthly value)? 3.5%
E. What other factors could have contributed to this improvement? Changes in job market and disciplinary policy
F. What percent of this change was actually caused by this program? 40%
G. What level of confidence do you place on the above information? (100% = Certainty, 0% = No confidence) 80%

Intangible benefits: Less stress, greater job satisfaction

Comments: Great program. It kept me on track with this problem.
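Questions D, F, and G of the analysis feed a standard adjustment: the reported change is discounted by the portion attributed to the program and by the participant's confidence in the estimate. A minimal sketch of that arithmetic, using the Figure 6-2 numbers (the function name and structure are illustrative, not from the source):

```python
def adjusted_improvement(change, isolation_pct, confidence_pct):
    """Discount a reported improvement (question D) by the share caused
    by the program (question F) and the confidence estimate (question G)."""
    return change * isolation_pct * confidence_pct

# From the action plan: 3.5% monthly change, 40% attributed to the
# program, 80% confidence in the estimate.
result = adjusted_improvement(0.035, 0.40, 0.80)
print(f"Adjusted monthly improvement: {result:.2%}")  # 1.12%
```

Only this adjusted figure, not the raw 3.5 percent change, would be converted to money using the $54 standard value per absence.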


Figure 6-3. Sequence of Activities for Action Plans

Before

• Communicate the action plan requirement early.

• Require business measures to be identified by participants.

During

• Describe the action planning process.

• Allow time to develop the plan.

• Teach the action planning process.

• Have the program manager approve the action plan, if possible.

• Require participants to assign a monetary value for each proposed improvement (optional).

• If possible, require action plans to be presented virtually to the group.

• Explain the follow-up mechanism.

After

• Require participants to provide improvement data.

• Ask participants to isolate the effects of the program.

• Ask participants to provide a level of confidence for estimates.

• Collect action plans at the predetermined follow-up time.

• Summarize the data and calculate the ROI (optional).
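The final optional step, calculating the ROI, uses the standard formula: ROI (%) = (net program benefits ÷ program costs) × 100. A minimal sketch with hypothetical benefit and cost figures (the numbers are placeholders, not from the source):

```python
def roi_percent(monetary_benefits, program_costs):
    """ROI (%) = (net benefits / costs) x 100."""
    net_benefits = monetary_benefits - program_costs
    return (net_benefits / program_costs) * 100

# Hypothetical example: $90,000 in adjusted monetary benefits against
# $60,000 in fully loaded program costs.
print(roi_percent(90_000, 60_000))  # 50.0
```

A 50 percent ROI means every dollar invested returned the dollar plus 50 cents in net benefits.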

 
 