Underdeveloped system-level monitoring of ECEC
The quality of ECEC is a multi-faceted concept (Box 2.6) and its interpretations vary across countries, making this a complex policy area. Developing adequate monitoring tools is becoming increasingly vital, as such tools would provide much-needed information on system performance. To use them to full effect, governments need to define the purpose and scope of their monitoring efforts; this may include assessing needs for staff training or mentoring, making funding decisions, adjusting curricula, or changing policies.
Most often, countries monitor minimum standards or child outcomes (the latter predominantly in Anglo-Saxon countries; OECD, 2006). The tools available include programme records, structured child observations and learning outcomes, but they need to be chosen carefully as they each provide different information.
Box 2.6. Quality of ECEC: A multi-faceted concept
The 2006 Starting Strong II report offers a coherent framework for understanding the different aspects of quality from the perspective of overall ECEC governance. It has seven inter-related elements:
Orientation quality: the type and level of attention that a government brings to early childhood policy, e.g. through national legislation, regulations and policy initiatives.
Structural quality: the overarching structures needed to ensure quality in ECEC, secured through the clear formulation and enforcement of legislation or regulations. These may include the quality of the physical environment, staff training levels, etc.
Educational concept and practice: centres’ educational concepts and practice are generally guided by the national curriculum framework which sets out the key goals of the early childhood system.
Interaction or process quality: the warmth and quality of the pedagogical relationship between educators and children, the quality of interaction between children themselves, and the quality of relationships within the educator team figure among the process goals most frequently cited.
Operational quality: operational quality is maintained by leadership that motivates and encourages working as a team and information sharing. It includes regular planning at centre and classroom level, opportunities for staff to engage in continuous professional and career development and time allowed for child observation.
Child-outcome quality or performance standards: ECEC services are provided not only to facilitate labour market participation or other aims, but above all to improve the present and future well-being of children.
Standards pertaining to parent/community outreach and involvement: this area is mentioned less than other quality standards in national regulations and curricula, but can emerge strongly in the requirements for targeted and local ECEC programmes.
Source: OECD (2006), Starting Strong II: Early Childhood Education and Care, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264035461-en.
In recent years, Latvia has tried to improve its system-level monitoring of ECEC. Since 2009 the State Education Information System has collected data about the children in ECEC institutions, as well as information on the staff working there. In January 2015 the Childcare Register was incorporated into the system to gain a more coherent overview of the full ECEC system (MoES, 2015).
Despite these recent efforts, data collection, monitoring, and the use of data and research for policy making all require considerable improvement in certain areas. For example, Calite (2010) noted the lack of accurate data about the number of children with disabilities in ECEC.
Furthermore, very little is known about the actual quality of ECEC in Latvia. The State Pre-school Education Guidelines (2012), as mentioned, describe the pedagogical process, content and learning outcomes of ECEC programmes, and how the evaluation process is to be organised. However, there are no national-level data available on the learning outcomes of children in ECEC, which arguably leaves the country guessing about the quality and effectiveness of ECEC provision. In Latvia, the monitoring of children's development is done solely by municipalities, whose approaches tend to vary (MoES, 2015), due to the absence of any national assessment instrument.
The evidence suggests this is an issue of concern. According to the PISA 2012 results, in most countries with available data, students who reported having attended ECEC for more than one year performed better in mathematics than those who reported they had not, even after accounting for students’ socio-economic status. Latvia was one of the few exceptions where such a relationship was not observed (OECD, 2013a). Though one can argue these data only provide an insight into the quality of Latvian ECEC in the early 2000s, the lack of national data sources makes it hard to dispute these findings.
Monitoring of child developmental and learning outcomes is crucial to informing ECEC staff and families about children’s skills and development. Such knowledge can improve staff interactions with children and facilitate the adaptation of curricula and standards to meet their needs (Litjens, 2013). In addition, the monitoring of ECEC can show how effective ECEC interventions or programmes have been.
The literature urges caution, however, and notes the importance of ensuring monitoring tools are developmentally appropriate (Copple and Bredekamp, 2009; Gestwicki, 2011; Kostelnik et al., 2011; Meisels and Atkins-Burnett, 2000; Sattler, 1998; Saracho and Spodek, 2013). The tools should be designed to identify children’s learning needs, abilities and skills according to their age groups. The best tool will vary according to the knowledge and skills children have or are expected to have at different developmental stages. For instance, young children are usually not able to complete a paper-and-pencil test. Children’s comprehensive development is also not just reflected in and affected by academic knowledge and cognitive skills, but also by physical well-being, motor development, social and emotional development, and approaches towards learning (Barblett and Maloney, 2010; Raver and Knitzer, 2002; Snow, 2007).
The review team learned that Latvia is in fact considering a pilot project (to be funded through the European Social Fund) to systematically monitor child development and outcomes. We agree this pilot initiative is important for exploring a suitable approach for monitoring the developmental outcomes of children in ECEC. The Early Development Index (EDI) may serve as a source of inspiration for this effort. The EDI is a population-level measure of children's development or well-being which was originally developed in Ontario, Canada. Other countries have since developed their own EDI according to their cultural and societal needs; Australia, for instance, developed the Australian Early Development Index. The EDI consists of a checklist on children's development which is completed by teachers. The results are aggregated at the group level (school, neighbourhood, city, etc.) to provide a population-based measure for communities and, if implemented at country level, across the country. The checklist measures five key domains of early childhood development: 1) physical health and well-being; 2) social competence; 3) emotional maturity; 4) language and cognitive skills (school-based); and 5) communication skills and general knowledge. The data are not reported at the child or class level, which means they are not used as a diagnostic tool for individual children or to assess their school readiness. The results of the EDI do allow local authorities, communities or providers to assess how local children are developing relative to other children (Litjens, 2013).