Evaluation of financial education programmes

As highlighted in the INFE Guidelines in Annex A, monitoring and evaluation are essential components of the successful introduction of financial education programmes in schools. Evaluation evidence is key to improving the overall effectiveness of the programme and the accountability of the stakeholders involved.

Monitoring and evaluation should ideally focus on each stage of the programme implementation.19 Evaluation should account for both short-term outcomes and long-term impacts and may take several forms, depending on its focus.

  • One of the first important steps consists of monitoring the actual teaching of financial education, for example through case studies and oversight mechanisms that can be put in place by local or national authorities.
  • The second phase consists of evaluating the relevance and impact of the programme, the learning framework and the teaching through direct feedback from the stakeholders involved, such as students, teachers, educational system management, parents and the local community.
  • Finally, in order to test changes in students' level of financial literacy, their competencies might be assessed throughout the curriculum via testing in the classroom, formal examinations, or the inclusion of such evaluation in national tests.

The latter example is also the preliminary step towards the longer term impact evaluation of financial education programmes in schools. Such impact can also be partly measured through baseline surveys of students’ financial literacy, in order to set benchmarks and goals. The use of international survey results (such as the financial literacy assessment included in the PISA 2012 and 2015 exercises [OECD, 2013]) will add further value to this evaluation method (see Box 2.1 and also “Assessment of students’ achievement” in Chapter 3).

Box 2.1. The Financial Literacy Option in the OECD Programme for International Student Assessment (PISA)

The OECD Programme for International Student Assessment (PISA) began in 2000. It aims to assess the capacity of students to use their knowledge and experience in “real world” situations. The emphasis of the test is on understanding concepts and mastering skills in three areas: mathematics, reading and science. Around 470 000 students from 65 countries completed the fourth edition of the test in 2009. Through the Financial Literacy Option first introduced in 2012 (OECD, 2013), PISA also tests 15-year-olds for the first time on their knowledge of personal finances and their ability to apply it to financial problems.

The PISA 2012 Financial Literacy Assessment is the first large-scale international study to assess the financial literacy of young people. The dedicated framework published in 2013 is the first step in constructing this financial literacy assessment of international scope by providing an articulated plan for developing items, designing the instrument and providing a common language on financial literacy issues. This framework provides a working definition for financial literacy for youth and organises the domain around the content, processes and contexts that are relevant for the assessment of 15-year-old students.

Content areas described by the framework include money and transactions, planning and managing finances, risk and reward and financial landscape. The framework covers such mental processes as identifying financial information, analysing information in a financial context, evaluating financial issues, and applying financial knowledge and understanding. These contents and processes are applied in a number of contexts, comprising education and work, home and family, individual, and societal contexts. The assessment is illustrated with 10 sample items. Additionally, the framework discusses the relationship of financial literacy to non-cognitive skills and to both mathematics and reading literacy, and the measurement of students’ financial behaviour and experience.

In 2012, 65 countries or regions took part in the PISA test, which focused on testing mathematics literacy. Students from 18 of these countries also tackled problems related to financial literacy: Australia, Belgium (Flemish Community), Shanghai-China, Colombia, Croatia, Czech Republic, Estonia, France, Israel, Italy, Latvia, New Zealand, Poland, Russia, Slovak Republic, Slovenia, Spain and United States. Results for the 18 participating economies will be available in June 2014.

A second assessment of financial literacy is planned in the 2015 PISA Financial Literacy exercise, with the following volunteering countries: Australia, Belgium (Flemish Community), Brazil, Canada (some provinces), Chile, England, Italy, Lithuania, Netherlands, New Zealand, Peru, Poland, Russian Federation, Slovak Republic, Spain, and United States.

In spite of its recognised importance, evaluation of the relevance and impact of programmes (the second phase) remains relatively rare, although the situation is improving. In Spain, data compiled from 2011 onwards was assessed in 2012-13 to adapt the National Strategy on Financial Education, and the Netherlands has evaluated different teaching methodologies. In the province of British Columbia, Canada, there has been ongoing evaluation of the effectiveness of the financial education programme. In New Zealand, an independent evaluation of the draft financial education framework was undertaken and its findings were used to shape the final form of the framework. In Australia, ASIC has contracted the Australian Council for Educational Research to conduct an evaluation of the teaching in the MoneySmart programme pilot phase (2012) and notably to recommend, on the basis of its findings, how to track long-term behavioural change.

Public authorities also have a role to play in pushing private sector providers to evaluate their initiatives. In the United Kingdom, the Money Advice Service will deliver a voluntary “Code of Practice” for financial education providers in winter 2013/spring 2014. This Code of Practice aims to maximise the impact of industry-funded programmes, and will include an evaluation framework to help intervention providers assess impact and increase the body of evidence on what works.

Concerning tests on changes in the financial competencies of students (the third phase), in most countries financial education is not part of student examinations as a separate subject. Instead, countries evaluate financial education as part of the existing evaluation of the subject into which it is integrated, as in Korea, where personal finance competence is tested as part of other subjects.

Some countries have, however, set up formal and/or informal assessment of financial education (rather than examinations). Malaysia has monthly interactive games, self-assessment quizzes and writing competitions. The United Kingdom has qualifications accredited by the Qualifications and Curriculum Development Agency (QCA) and listed in the National Database of Accredited Qualifications (NDAQ), which contain units on personal finance education. Within the United Kingdom, there have been further evaluations in Scotland (one of the early adopters of financial education in schools) undertaken by George Street Research, and an evaluation of pfeg’s “Learning Money Matters” programme in England undertaken by the National Foundation for Educational Research.

The case studies in this section of the report are based on the evaluations of a wide pilot project in Brazil, the Planning 10 programme in the province of British Columbia, Canada and various provisions in England and Scotland, Italy, Malaysia and South Africa.

Brazil shows the benefits of evaluating a pilot programme ahead of a nation-wide implementation. British Columbia is an example of a monitoring exercise used to strengthen the effectiveness of financial education programmes and to inform the development of further programmes. England illustrates the need not only to count the number of schools effectively engaging in financial education, but also to understand whether this sample is representative of the overall educational system. Italy provides interesting insights into knowledge retention gauged through tests repeated over several years of financial education. Malaysia provides an interesting example of evaluation in a context characterised by a strong involvement of the private sector where financial institutions serve both as providers of content and as channels for monitoring and testing results. Scotland is a useful example of a two-stage evaluation that included a survey of the adoption of financial education programmes in different sectors of the educational system and secondly an analysis of the perception of their effectiveness. Finally South Africa shows how to develop guidelines for evaluation and a toolkit and how to make their use compulsory in all financial education programmes.
