OBJECTIVES
After the decision was made to go with the program, the next step was to develop the detailed objectives at all five levels. This step was completed with input from a project manager, a sales manager, and subject matter experts. At Level 1, it was decided that participants should see the program as relevant to their work and important to their success, with content they intend to use and would recommend to others.
In terms of learning, a self-assessment on the five modules consisted of a simple true/false quiz at the end of each module. Each module had five questions, for a total of 25. A participant should score at least 20 out of 25, allowing for one missed question per module. The score was not punitive; there were no consequences for missing the target. It served only as a gauge of participant success, because participants immediately saw the correct answers with an explanation. Sales associates were encouraged to repeat the exercise if they scored fewer than four out of five correct answers on any module.
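To make the scoring rule concrete, here is a minimal sketch that checks a participant's results against both thresholds (at least four of five per module and at least 20 of 25 overall); the module labels and function name are illustrative, not part of the program.

```python
# Hypothetical check of the learning objective:
# at least 4 of 5 correct per module and at least 20 of 25 overall.
def meets_learning_objective(module_scores: dict[str, int]) -> bool:
    """module_scores maps each of the five modules to correct answers (0-5)."""
    per_module_ok = all(score >= 4 for score in module_scores.values())
    return per_module_ok and sum(module_scores.values()) >= 20

scores = {
    "Rationale for upgrade": 5,
    "Features of upgrade": 4,
    "Client profit impact": 4,
    "Pricing options": 5,
    "Implementation and support": 4,
}
print(meets_learning_objective(scores))  # True: 22 of 25, no module below 4
```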
For application, the objectives focused on sales associates using the skills quickly and making the first scheduled call within a week of completing the program. By the end of the month, the goal was routine use of each of the major tasks, actions, or skills from the five modules.
For business impact, the first sale should occur within three weeks of program completion, and associates should reach $10,000 in sales per month within three months. These targets were acceptable to the management team and, if met, would represent success for the program.
For the Level 5 objective, a 20 percent ROI was set. This is slightly above the rate Transoft would use for capital expenditures (for example, the headquarters building), and it would seem reasonable to executives. Each objective represented the minimum acceptable performance, not only for ROI but for the other levels as well.
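For reference, the ROI at Level 5 compares net program benefits with fully loaded program costs. The short sketch below shows how a result would be read against the 20 percent objective; the benefit and cost figures are hypothetical, not Transoft data.

```python
# Hypothetical figures, used only to show how the ROI objective is read.
program_benefits = 240_000   # monetary benefits attributed to the program
program_costs = 180_000      # fully loaded program costs

bcr = program_benefits / program_costs                             # benefit-cost ratio
roi_pct = (program_benefits - program_costs) / program_costs * 100  # ROI, in percent

print(f"BCR: {bcr:.2f}")       # 1.33
print(f"ROI: {roi_pct:.0f}%")  # 33% - clears the 20% objective
```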
Exercise
Based on these objectives, what is your recommended approach for data collection and analysis? Please complete the data collection plan and the ROI analysis plan for this program. See Figures 11-1 and 11-2.
Figure 11-1. Data Collection Plan
Program: ____________  Responsibility: ____________  Date: ____________

| Level | Broad Program Objective(s) | Measures | Data Collection Method/Instruments | Data Sources | Timing | Responsibilities |
|---|---|---|---|---|---|---|
| 1 | REACTION & PLANNED ACTION | | | | | |
| 2 | LEARNING & CONFIDENCE | | | | | |
| 3 | APPLICATION & IMPLEMENTATION | | | | | |
| 4 | BUSINESS IMPACT | | | | | |
| 5 | ROI | | | | | |

Baseline Data:

Comments:
Figure 11-2. Action Plan
PLANNING
Data Collection Plan
The evaluation planning meeting was conducted with the program manager, the designers and developers who were on contract, and the project manager for the program. The evaluator, an external consultant who was conducting the ROI study, moderated the meeting. Figure 11-3 is the data collection plan, which details the methods, sources, and timing for collecting data at four levels. Level 1 and 2 data were captured in the system as participants completed the five modules in the mobile learning program. Level 3 data came from a web-based questionnaire with simple questions. To achieve a good response rate, 20 techniques were used, which are shown in Table 11-1. Level 4 impact data were retrieved directly from the Salesforce.com system at Transoft; a sketch of that kind of retrieval follows the table.
Table 11-1. Techniques to Increase Response Rates
1. Provide advance communication.
2. Communicate the purpose.
3. Identify who will see the results.
4. Describe the data integration process.
5. Let the target audience know that they are part of a sample.
6. Design for simplicity.
7. Make it look professional and attractive.
8. Use the local manager's support.
9. Build on earlier data (Level 1 and 2).
10. Pilot test the questionnaire.
11. Recognize the expertise of participants.
12. Have an executive sign the introductory letter.
13. Send a copy of the results to the participants.
14. Report the use of results.
15. Introduce the questionnaire during the program (first and last module).
16. Use follow-up reminders.
17. Consider the appropriate medium for easy response.
18. Estimate and report the necessary time needed to complete the questionnaire.
19. Show the timing of the planned steps.
20. Collect data anonymously or confidentially.
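As noted above, the Level 4 impact data came straight from Salesforce.com. The sketch below shows one way such a pull could look, assuming the simple_salesforce Python library and the standard Opportunity object; the credentials, query, and field choices are placeholders rather than details from the Transoft study.

```python
# Hypothetical sketch: total closed-won opportunity amount per sales associate
# for the current month, one way to read Level 4 impact data from Salesforce.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",   # placeholder credentials
                password="password",
                security_token="token")

# SOQL aggregate query: sum of won amounts per owner, this month.
result = sf.query(
    "SELECT OwnerId, SUM(Amount) total "
    "FROM Opportunity "
    "WHERE IsWon = true AND CloseDate = THIS_MONTH "
    "GROUP BY OwnerId"
)

for record in result["records"]:
    print(record["OwnerId"], record["total"])
```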
Figure 11-3. Completed Data Collection Plan
Program: Product Upgrade With Mobile Learning  Responsibility: ____________  Date: ____________

| Level | Broad Program Objective(s) | Measures of Success | Data Collection Method/Instruments | Data Sources | Timing | Responsibilities |
|---|---|---|---|---|---|---|
| 1 | REACTION & PLANNED ACTIONS. Achieve positive reaction on: relevance to my work; recommend to others; important to my success; intent to use | Rating of 4 out of 5 on a composite of four measures | LMS survey, built into program | Participant | End of program | Program manager |
| 2 | LEARNING. Learn to use five concepts to sell new upgrade: rationale for upgrade; features of upgrade; how upgrade will increase client profit; pricing options; implementation and support | 4 out of 5 correct answers on each module; 20 of 25 total correct answers | True/false quiz | Participant | End of program | Program manager |
| 3 | APPLICATION/IMPLEMENTATION. Use of five skills: explain rationale for upgrade; identify key features of upgrade; describe how upgrade increases client profit; identify pricing options; explain implementation and support. Make the first call in 5 days | Rating of 4 out of 5 on a 1-5 scale; system check | Web-based questionnaire; performance monitoring | Participant; Salesforce.com | 1 month after program | Evaluator |
| 4 | BUSINESS IMPACT. Increase in sales to $10,000 per month; sell first upgrade in 3 weeks | Monthly sales per associate; actual sale | Business performance monitoring | Salesforce.com | 3 months after program; 1 month after program | Evaluator |
| 5 | ROI. 20% | | | | | |

Comments:
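The Level 1 measure in Figure 11-3 is a composite of four reaction items. Below is a minimal sketch of how that composite might be checked, assuming it is a simple average of the four ratings on the 1-5 scale; the ratings shown are hypothetical.

```python
# Hypothetical Level 1 check, assuming the composite is the mean of the four
# reaction items on a 1-5 scale.
from statistics import mean

reaction = {
    "Relevance to my work": 4.5,
    "Recommend to others": 4.0,
    "Important to my success": 4.5,
    "Intent to use": 4.6,
}

composite = mean(reaction.values())
print(f"Composite: {composite:.2f}")       # 4.40
print("Objective met:", composite >= 4.0)  # True
```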