Open access peer-reviewed chapter

School-University Partnership for Evidence-Driven School Improvement in Estonia

Written By

Kätlin Vanari, Kairit Tammets and Eve Eisenschmidt

Submitted: 15 August 2019 Reviewed: 03 September 2019 Published: 15 October 2019

DOI: 10.5772/intechopen.89513

From the Edited Volume

Pedagogy in Basic and Higher Education - Current Developments and Challenges

Edited by Kirsi Tirri and Auli Toom

Abstract

It has been acknowledged that evidence-driven practices may lead schools to improved instructional practices, student learning, and organizational improvement; yet evidence remains underused by teachers and school leaders. This study focuses on analyzing how to strengthen evidence-driven school improvement in school-university partnership programs. Over one school year, five schools learned in collaboration with university coaches how to collect evidence at the classroom and organizational levels for the improvement process. The results of our study illustrate profiles of the schools based on their usage of data-informed evidence, research-based evidence, or both to make decisions at the instructional and organizational levels. Enablers and barriers of data use for implementing evidence-driven practices are discussed from the perspective of organizational, user, and data characteristics.

Keywords

  • data-informed evidence
  • research-based evidence
  • evidence-driven school improvement
  • school-university partnership

1. Introduction

The Estonian Lifelong Learning Strategy 2020 [1] aims to implement a learning and teaching approach that supports each learner's individual and social development, learning skills, creativity, and entrepreneurship at all levels and in all types of education. To achieve this demanding goal, new teaching practices should be developed in collaboration between universities and schools. This also means that every school should focus on its students' individual needs: instead of implementing already existing approaches, new solutions should be created or modified to fit the local context. When adapting new teaching and learning methods, important questions arise: what is the impact of these approaches, and what other factors influence the outcomes?

In this new situation, schools continuously develop their practices, analyze teachers' needs, and find ways to support their professional development. Hansen and Wasson [2] have pointed out that there is a need to change the format of teachers' professional development: instead of traditional participation in training courses, teachers should be supported in developing and improving their existing practice through teacher inquiry. Nowadays, capacity building, inquiry-oriented practice, and data-driven decisions are considered central themes of educational improvement [3, 4]. Concepts like practitioner research and teacher inquiry have been widely used for several decades, yet schools still face difficulties in using evidence for school improvement processes [1].

In the age of big data, it is difficult to imagine any educational improvement that does not include data as a key pillar [6]. Developing evidence-driven school improvement processes through school-university collaboration is one option for helping schools work with evidence. Therefore, school-university joint programs have been initiated, and the Future School Program was launched in Estonia. The aim of the Future School Program is to support whole-school innovation and the sustainable improvement of teaching practices by enhancing the teaching and learning culture through school-university co-creation of new methodologies and the implementation of evidence-driven innovation.

In this chapter, we analyze how to strengthen evidence-driven school improvement in a school-university partnership program. The following questions are discussed:

  • How is evidence-driven school improvement actualized in school development programs?

  • What are the enablers and barriers of using evidence in a school development program?


2. Evidence-driven school improvement: theoretical underpinnings

Nowadays, educational innovation is not only the "business" of scholars; practitioners are actively involved, and discussions about educational improvement revolve around the importance of evidence and data. Different authors use distinct terminology [7]: evidence-informed education [8], evidence-informed practices [9], evidence-based practice [10], evidence-based education [11, 12], data-based decision-making [13, 14, 18], data-informed practice [15], data-driven decision-making [16, 17], data use [7, 19, 20, 21], and practice-informed evidence [22]. The main ideas behind these concepts appear concurrent; however, the use of different terms is not incidental. One of the broadest explanations has been given by Davies [23], who sees evidence-based education as a set of principles and practices that can alter the way people think about education, the way they go about educational policy and practice, and the basis upon which they make professional judgments and deploy their expertise; it is not, however, the provider of ready-made solutions to the demands of modern education. In the following sections, we compare and analyze how the different concepts supplement each other and how evidence-based improvement can be identified for schools.

To start with, we need to unravel the concept of evidence, as it is widely used. Evidence is a kind of information that points to the truth or validity of a claim; this is the joint starting point for all authors, while opinions differ on how truth or validity is achieved. It is often assumed that the main source of evidence practitioners should consider when making decisions is social science research, namely experimental research and randomized controlled trials [10, 11, 24]. The idea that research can make a major contribution to improving practice stems from the assumption that it is systematic and rigorous and provides explicit evidence, which can be assessed objectively [10]. It can be concluded that one sub-concept of evidence-based education concentrates on the implementation of research results, especially of those teaching techniques and methods that have been found to have a positive effect on students' assessment results. In the following, we refer to this sub-concept as research-based school development.

Research-based evidence as a source for school development and teachers' professional development has been criticized from different angles. The criticism concerns the nature of research, its generalizability, and its objectivity. It is recognized that professional judgments cannot be made without taking into consideration the value-based foundation of education [11]. Moreover, research findings merely inform practitioners about the general outcomes of different kinds of decisions [24], and there are a variety of formal and informal sources of information that also contribute to the decision-making process [10, 24]. Schools and teachers cannot wait until valid and reliable research results show how to implement new teaching practices.

Evidence-based education operates at two levels. The first is to utilize existing evidence from worldwide research and literature on education and associated subjects [23]. This gives a broader base for professional knowledge-in-action [15]. The second level is to establish sound evidence where existing evidence is lacking, questionable, uncertain, or weak [23]. This requires acquiring, using, critiquing, and creating the evidence base through the lived experience of observing and assessing students in particular contexts on a regular basis [15]. This type of professional knowledge relies on multiple values, tacit judgment, local knowledge, and skill; research usually cannot supply what the notion of evidence-based practice demands of it, namely specific and highly reliable answers to questions about what works and what does not [10]. In this case, the basis for innovating instruction is the data that the context offers. Data about students, their backgrounds, and their previous achievements, as well as about teaching processes and the school as an organization, are extensive, yet the potential of these data remains unused.

The definition of data is broad. The focus is on raw data that must be organized, filtered, and analyzed to become information, and then combined with stakeholder understanding and expertise to become actionable knowledge. Data include not only student test results but also any other form of structurally collected qualitative or quantitative data on the functioning of the school, such as outcomes, inputs, processes, and perceptions [13, 25]. In short, data are the information that is collected and that represents some aspect of schools [26]. Whereas evidence incorporates both the question and the answer, data comprise only the question and the potential for an answer: evidence carries the interpretive and evaluative elements that data lack. In conclusion, we refer to this second sub-concept of evidence-driven school improvement as data-informed school development.
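To make the distinction concrete, the following minimal sketch in Python (our illustration; the type names and example values are hypothetical, not drawn from the cited frameworks) models evidence as data combined with the interpretive and evaluative elements described above.

from dataclasses import dataclass

@dataclass
class Data:
    """Collected information representing some aspect of the school [26]."""
    source: str   # e.g., "student satisfaction survey"
    values: list  # raw observations: scores, answers, counts

@dataclass
class Evidence:
    """Data plus the interpretive and evaluative elements that data lack."""
    data: Data
    interpretation: str  # what the data are taken to mean in this context
    claim: str           # the decision-relevant conclusion the data support

# Example: raw survey results become evidence only once they are interpreted.
survey = Data(source="student satisfaction survey", values=[3.1, 3.4, 2.8])
evidence = Evidence(
    data=survey,
    interpretation="satisfaction is below the school's own target",
    claim="the collaboration format needs revising",
)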

In addition to the data-informed and research-based dimensions of evidence, a distinction between the outcomes of evidence use can be identified [7]. The expected outcomes describe the goals for which the evidence is used, more specifically, the aspect of the school culture that is expected to be improved or changed according to the conclusions drawn from the evidence.

Discussions of evidence-based or evidence-informed practice typically refer to teachers, their classroom activities, and their interactions with students [7, 9]. Here, data and evidence use are implemented with the goal of improving instruction. The quality of teachers' instruction is an important influence on student achievement, and using data to improve instruction can enhance student achievement [13].

Data and evidence can also be used to inform decision-making at the school management and leadership levels. This process is often called data-based or data-driven decision-making [13, 16]. Data-driven decision-making is the purposeful process of selecting, gathering, and analyzing relevant data to define school problems, develop alternatives, estimate the outcomes of the alternatives, and choose the preferred alternative [16]. Data do not objectively guide decisions on their own; people do. They select particular pieces of data to negotiate arguments about the nature of problems as well as about potential solutions [14]. The use of data is not only a matter of new competencies and skills; it is more about a new culture arising. Good things do not happen simply thanks to data; data use must be supported by data-informed leaders. Leaders should take responsibility for evaluating what types of data are useful and for what purposes [17]. Organizational practices have an important role in shaping the way people in organizations think and work, so it is possible to shift patterns of practice by creating organizational supports and incentives that give greater prominence to the consideration of research findings and their implications [8]. In such a case, data and evidence are used for school development purposes, that is, schools use data to improve themselves; for instance, student satisfaction surveys and exam results can be used to evaluate the extent to which the school is achieving its goals [13]. The processes of decision-making and interpretation happen in parallel; this way, there is potentially higher coherence among the data, the decision, and contextual factors, but also a risk of misinterpretation or biased interpretation.

In conclusion, we have identified two dimensions of the concept of evidence-driven school improvement (Figure 1). The first is the input dimension, which refers to the different inputs of the evidence: the evidence can be data-informed or research-based. Data-informed evidence can come from assessment results, characteristics of the teaching staff, national or school surveys, etc. Research-based evidence can be the result of, for instance, an experimental study or a qualitative study of teachers' behavioral patterns. The second dimension characterizes the output of the evidence: whether the evidence influences decisions made for school development, incorporating organizational aspects such as structures and communication, or decisions made for the improvement of instruction by the teacher, usually in interaction with the student and the educational method used.

Figure 1.

Dimensions of evidence-driven school improvement and some examples.

Studies of data use have analyzed the factors influencing evidence-driven school improvement; based on a synthesis of recent studies [13], these factors can be grouped into organizational characteristics, user characteristics, and data characteristics. Organizational factors include a shared vision and clear norms for data usage, encouragement by the school leader, available expert support, and time and conditions provided for collaboration between teachers. Data use also depends on the user characteristics of teachers. In order to use data, teachers need the knowledge and skills to analyze and interpret different forms of data; they need to understand the quality criteria for data use and data-use concepts; and they need the skills to diagnose student learning needs and adjust instruction accordingly. Data characteristics include access to relevant student data and the usability and quality of the data. It is important to note that these factors can be enablers or barriers depending on the goal of the data use. The study [13] shows that data use for school development is influenced by organizational and data characteristics, whereas data use for improving teaching and learning is influenced by organizational and user characteristics.
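To make this factor framework easier to scan, the following minimal sketch (our illustration; the names are hypothetical and not an instrument from Schildkamp et al. [13]) encodes the reported relationship between the purpose of data use and the factor categories found to influence it.

from enum import Enum

class Purpose(Enum):
    SCHOOL_DEVELOPMENT = "school development"
    INSTRUCTION = "improving teaching and learning"

class FactorCategory(Enum):
    ORGANIZATIONAL = "organizational characteristics"  # vision, norms, leader support, time
    USER = "user characteristics"                      # data literacy, interpretation skills
    DATA = "data characteristics"                      # access, usability, quality

# Which factor categories the synthesis in [13] found to influence each purpose.
INFLUENCING = {
    Purpose.SCHOOL_DEVELOPMENT: {FactorCategory.ORGANIZATIONAL, FactorCategory.DATA},
    Purpose.INSTRUCTION: {FactorCategory.ORGANIZATIONAL, FactorCategory.USER},
}

def is_relevant(category: FactorCategory, purpose: Purpose) -> bool:
    """True if the category was found to influence data use for this purpose."""
    return category in INFLUENCING[purpose]

# Example: user characteristics matter for instruction but not for school development.
print(is_relevant(FactorCategory.USER, Purpose.INSTRUCTION))         # True
print(is_relevant(FactorCategory.USER, Purpose.SCHOOL_DEVELOPMENT))  # False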

Additionally, evidence-driven school improvement implemented in school-university partnerships is influenced by the character of the relationship. Two types of relationships between schools and universities can be distinguished: a transactional partnership is driven by individual purposes, and in this case the organizations remain unchanged; a transformational partnership is one in which the parties come together to pursue a common purpose and create the possibility of growth and change through mutual interaction as they apply their resources to address complex problems [27]. Studies [7, 17, 25, 28] investigating strategies of school-university partnerships for supporting evidence-driven school improvement have identified four key domains: (1) human support, (2) leadership, (3) technology support, and (4) designed routes.

One possibility for offering human support is to use coaches. For coaching on data use to be effective, teachers need to believe that the coach possesses strong interpersonal skills and content and pedagogical knowledge that are useful for them to learn. Facilitation by coaches includes assessing teachers' needs, modeling how to interpret and act upon data, and observing teachers while they attempt to engage in the data-use process. Another possibility is to support professional development, although previous studies show that structured training in how to use data is not common in schools. A third approach to human support is networking with a university: the researcher either guides the process of data analysis and brings a theoretical framework to the practice, or relies primarily on workshops and ongoing consultancy.

Schools make efforts to have technology support: data systems that organize and analyze interim assessment data, and data warehouses with current and historical student data. It is acknowledged that trainings for school teams on data use are rare and focus primarily on technological support and on how to access the data management system. Technological support therefore needs to be combined with other strategies.

School leadership is another key domain: principals are key players in facilitating data use among teachers, and they play an important role in allocating resources and time to enable teachers to use data effectively. Their espoused beliefs about data use are critical as well, as they help set the tone for data use in school teams. For the school leader, it is important to have a whole-school perspective on the improvement initiated. If the instructional and organizational improvements are not aligned, it is confusing and demotivating for the teachers to participate. It is important to communicate to teachers why evidence is being collected at the classroom level, how it helps to monitor the big picture of the improvement, and that data are not used to blame and shame teachers. Evidence-driven school improvement cannot be implemented without a data-literate and research-wise school leader, so school leaders are a crucial target for support strategies.

Finally, schools are required to follow norms and designed routes, that is, specific data-driven decision-making practices, when developing their school improvement plans, and teachers are required to follow them when using data to guide instruction. One of the primary ways to build teachers' capacity to use data is providing structured time for collaboration. This includes the adoption of data-discussion protocols to ensure that discussions about data occur and that actions are taken on the basis of these conversations.

Factors influencing the successful implementation of a school development program aimed at supporting evidence-driven school improvement have also been studied. Schools are more successful if the entire school team participates in the program, the school staff is stable, and the school leader provides teachers with sufficient time and materials. It has been concluded that school leaders and trainers should pay attention to developing clear guidelines and agreements on the execution of evidence-driven school improvement activities [18].


3. Methodology

3.1 Context: overview of the school improvement program

The research context is the school improvement program established at Tallinn University. The program aims to support evidence-driven improvement in Estonian schools in order to improve the teaching and learning culture. Five schools applied, based on their interest, to join the program in 2018/2019 (Table 1). Each school team consisted of 5–6 members, of whom 1–2 were members of the management, and each school formulated its own student-centered goal for the improvement it aimed to achieve.

Table 1.

Characteristics of schools participating in the school improvement program.

3.1.1 Evidence-driven improvement process

In the first phase of the program, each school prepared an action plan for improvement. Before creating the action plan, an analysis of the state of the school, built on existing evidence, had to be carried out. Some of the schools used data collected at the national level (satisfaction surveys, students' study results, and existing research studies) to understand the current situation, define the problem, and formulate the action plan. When analyzing the evidence, three school teams changed their initial goals, either because they did not find clear evidence of the problem they thought the school had or because they identified another problem based on the evidence. During the program activities, schools had to monitor and reflect on their own activities to understand their improvement processes. Each team agreed upon its own approach and tools for monitoring and data collection, which were discussed with their university coach. In addition to the regular monitoring, each school had to design its own action research plan, carry out the study in a classroom setting, analyze the collected data, and come up with suggestions on how the data would be used in the next decision-making steps.

3.1.2 School-university partnership

The program consisted of elements of human support, support for leadership, and designed joint activities. The school team, in which the school leader was a compulsory member, participated in monthly seminars, where the next steps of the program were explained through theoretical underpinnings and practical suggestions. Seminars were used in the program because studies have shown that supporting professional development is essential for raising the data-literacy skills of educational practitioners [25]. The networking aspect of the seminars is also an effective support for schools. Between seminars, each school team was supported by its university-assigned coach; the coach is recognized as one of the key elements of human support [25]. Each step was scaffolded with a special task designed according to the principles of change management and evidence-driven improvement, since data use can be improved by data-use routines, that is, recurrent and patterned interactions that guide how people engage with each other and with data [7].

3.2 Data collection and analysis

We followed the case-study approach, which has been acknowledged as an appropriate method in educational studies about evidence use [28]. Case studies do not aim to produce generalizable theories but to provide practical wisdom, which is "about understanding and behavior in specific situations" [29]. That was also the aim of our study: to better understand the collaborative practices supporting schools in implementing evidence-driven school improvement.

Data were collected throughout the program and after its completion, using a variety of data-gathering techniques summarized in Table 2.

Table 2.

Overview of data collection.

Data were analyzed based on the framework from the theoretical underpinnings, in which different dimensions of evidence use for school improvement were defined (Table 3). Instructional-level decision-making refers to teachers' decisions to improve their own teaching, assessment, feedback, etc. Organizational-level decisions refer to decisions made by the school management or the school improvement team to improve school-level processes, practices, curriculum design, etc.

Table 3.

Profiles of the cases based on dimensions of evidence-driven school improvement.

Evidence-driven practices of the five cases were classified according to nine possible profiles of evidence-driven school improvement. These profiles were created according to criteria defined from the dimensions of evidence-driven school improvement; a minimal classification sketch follows the list. The criteria were the following:

  • Whether the school collected (a) data-informed evidence, (b) research-based evidence, or (c) both. We classified a school as using data-informed evidence when the data were collected by the school or made available to the school by other stakeholders, and the analysis was done by the school team based on its own research and improvement interest. We classified a school as using research-based evidence when the data had been collected, analyzed, and published by researchers, and the results were used by the school in its improvement process.

  • Whether the school analyzed the results with the goal of (a) improving school management, (b) improving instruction in the classroom, or (c) both. A school was classified as using evidence at the management level when the school team made decisions about communication, professional development, work organization, procedures, etc. We classified the evidence as used for instructional improvement when the conclusions and recommendations were targeted toward teachers and their activities.
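The sketch below (our illustration in Python; the names are hypothetical) shows how the two criteria combine into the nine possible profiles: three evidence-input options crossed with three decision-level options.

from enum import Enum
from itertools import product

class EvidenceInput(Enum):
    DATA_INFORMED = "data-informed"     # collected/analyzed by the school team
    RESEARCH_BASED = "research-based"   # published research used by the school
    BOTH = "research-based and data-informed"

class DecisionLevel(Enum):
    ORGANIZATIONAL = "organizational"   # management: communication, procedures, etc.
    INSTRUCTIONAL = "instructional"     # classroom: teaching, assessment, feedback
    BOTH = "instructional and organizational"

# The nine possible profiles are the Cartesian product of the two dimensions.
PROFILES = list(product(EvidenceInput, DecisionLevel))
assert len(PROFILES) == 9

def profile(inputs: set, levels: set) -> tuple:
    """Collapse the observed evidence inputs and decision levels of one school
    into a single profile; more than one observed value maps to BOTH."""
    ev = EvidenceInput.BOTH if len(inputs) > 1 else next(iter(inputs))
    lv = DecisionLevel.BOTH if len(levels) > 1 else next(iter(levels))
    return ev, lv

# Example: a school that used both evidence types but made decisions only at
# the management level (compare school 4 in Table 4).
ev, lv = profile({EvidenceInput.DATA_INFORMED, EvidenceInput.RESEARCH_BASED},
                 {DecisionLevel.ORGANIZATIONAL})
print(f"{ev.value} decision-making at the {lv.value} level")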


4. Results

Evidence-driven practices were tightly embedded into the different phases of the program activities as part of the school improvement. Next, the schools' practices in actualizing evidence-driven school improvement, as well as the challenges and enablers of the process, are analyzed and discussed. The aim was to understand to what extent schools used evidence collected from wider research, whether they collected and analyzed data based on their own research interests, and whether the results were used in organizational-level or instructional (teacher-student) level decision-making processes.

Based on the teams' reflections, interviews, and analysis of the documents, the schools were profiled as follows (Table 4): usage of data-informed evidence, research-based evidence, or both to make decisions at the instructional level, the organizational level, or both.

Table 4.

The schools' evidence-driven profiles.

4.1 Data-informed decision-making at the organizational level

The aim of school 1 was to improve teachers' collaboration and, through that, the students' learning experience; for this purpose, a new initiative called "collaboration day" was established. Based on the reflections and document analysis, the school team focused mainly on collecting data from teachers and students to understand the usability and effectiveness of the collaboration format: questionnaires for teachers and students after each collaboration day, students' self-analysis, and observation sheets. Evidence regarding well-established methods and theoretical underpinnings was less emphasized by this school in its improvement process. The main outcome for the school team was that the intervention supported teachers' collaboration and the integration of subjects:

Teachers are more involved in collaborative learning: the number of teachers participating in more than 1–2 integration projects has increased by about 20%; teachers make more suggestions to colleagues for collaboration.

The majority of the decisions based on the collected and analyzed data were made at the management level: improving the format of the collaboration days, identifying needs for teacher training, sharing practices, and supporting the documentation of the integration projects.

4.2 Research-based and data-informed decision-making at the organizational level

School 4 focused on students' engagement in extracurricular activities. Interventions were carried out at the teacher-student level, and students' engagement was analyzed with observation sheets. Students' motivation was analyzed, and teachers' feedback was collected through self-analysis:

We analyzed what emerged from the teachers’ work analysis and students' motivation questionnaire.

Theories and studies on students' learning motivation to support engagement were used as evidence to plan the interventions and data collection:

We used motivation theories, introduced by the university, to plan our intervention.

Decisions were made mainly at the management level: the observation process and techniques needed to be improved:

Not everything is always visible—how to go on with the improvement of the observation sheet.

More focus was also put on supporting teachers' sharing of experiences and good practices.

4.3 Research-based and data-informed decision-making at the instructional and organizational levels

The aim of school 2 was to implement different learning strategies to support the development of students' learning-to-learn skills. Several data collection techniques were used to monitor the process: a teachers' empowerment survey, a survey about teachers' understanding of learning-to-learn skills, and teacher interviews about the different strategies. Students' self-analysis of the learning process was carried out, and students' learning skills and reading strategies were tested. Evidence from national-level satisfaction surveys was used when planning the activities and was analyzed again later:

National survey 2018 was used to plan the activities; National survey 2019 was used to analyze the state of the school.

Approved training programs on reading and meaningful learning were used when designing interventions in collaboration with the university team. To support a collaborative culture, a teachers' professional learning community was initiated, and research on teachers' professional communities was used to support teachers' collaborative learning. Teachers in this group were also studied:

We conducted interviews with the teachers who were part of the learning community.

Decisions were made at the management level (training and management support for teachers to implement the new strategies supporting students' learning-to-learn skills) and in instructional processes (new strategies will be implemented, and students' self-analysis process will be enhanced more systematically).

School 3 aimed to raise students' motivation to learn through more systematic integration of lessons and outside-the-classroom activities. Self-determination theory was used as the research grounding in different activities:

In designing and conducting action research, we relied on self-determination theory.

To analyze the effectiveness of the interventions, data were collected through a student survey after each intervention (based on self-determination theory) and through teachers' feedback. Evidence from the national-level student satisfaction survey was used for planning the interventions. Decisions regarding future activities were made at the student level: focusing on explaining the goals of different learning activities to enhance their meaningfulness, and on enhancing students' skills in giving feedback:

Students may not have taken the feedback seriously; the purpose of the survey should be better explained to the students.

At the management level, lesson observations based on self-determination theory were developed.

The aim of school 5 was to implement meaningful learning experiences for seventh-grade students by integrating more real-life situations into classroom activities. For data collection, an instrument was created to analyze to what extent students understand what they learn and how it supports their professional growth. In addition, all the students were tested with research instruments:

Grade 7 students took a motivation test and a social skills and learning-to-learn skills test.

Students and teachers gave weekly feedback, and teachers analyzed the students’ evaluation sheets:

In addition to the paper-based feedback, we also received feedback from students electronically, which makes feedback for teachers more concise.

Oral feedback was also collected from teachers and students for a more in-depth analysis of the new experiences. Some evidence from studies on the integration of subjects was also used; to some extent, research results were read by the team:

We read some research about integration of the subjects.

Decisions were made mainly at the management level: improving the evaluation sheets and reformulating learning outcomes to make them easier for the students to understand. At the instructional level, teachers will focus more in the future on creating a shared understanding with the students of what the learning outcomes mean and what students are actually expected to learn:

The teacher does not refer to the relation of the subject's learning outcomes to everyday life; as a result, the teacher will formulate the links between the learning outcomes together with the students.

The plan to create individual learning paths for the students is also a focus of future activities.

Our analysis indicates that all five schools participating in our program focused on collecting data and finding research evidence at the management level, and three schools also worked with evidence in instructional processes. Four schools out of five focused on improving students' learning experience; one school focused on teachers' collaboration, but still with the aim of implementing integration projects to improve teaching practices at the classroom level. It can also be concluded that all schools used data as part of their own studies to understand the effectiveness of the interventions, but the usage of research evidence was not systematic in all of the cases. Schools collected data from both students and teachers; the instruments were mainly prepared by the schools themselves. In a few cases, additional data were collected with research instruments proposed by the university (for instance, testing the skills of the students). Decisions made based on the data and research results were mainly focused on the management level: improving everyday processes and data collection techniques, formalizing methodologies, and better supporting teachers' collaboration. Some important decisions were also made at the student level: enhancing feedback skills, goal-setting of learning activities, enrichment of classroom activities, etc.

4.4 The enablers and barriers of using evidence in the school improvement program

Schildkamp and colleagues [5, 13] have proposed several factors influencing data use by school teams; they distinguish data use for accountability, for school development, and for instruction. In our research, we mainly focused on data use for school improvement and for classroom-level instruction. Deriving from Schildkamp et al. [5, 13], we analyze the enablers and barriers of data use from the perspective of organizational, user, and data characteristics.

Organizational characteristics include a shared vision, which comprises a joint understanding of the nature of good teaching, of student learning, and of ways to evaluate student learning. As our program focused on school improvement, building a shared understanding of the change and of the ways to monitor the process was crucial. Schildkamp et al. [13] emphasize that effective data use also requires collaboration: teachers should share and discuss their students' results and their own functioning with students, parents, and colleagues. In our case, all of the schools focused on improving teachers' collaboration, and different solutions were found to create time for sharing experiences as part of the program activities. However, school 2, which created a teachers' professional learning community where the collected data were analyzed and the results discussed, stood out among the others for its evidence-driven school improvement practices.

In our study, we learned that it was difficult for the schools to design and conduct empirical studies (in action-research form) on their own ("It is a very complex process for the school to develop research-based inquiry."). This was emphasized even by the school that collaborated most tightly with the university experts to carry out research activities. On the other hand, the schools that collaborated more tightly with the university used research-based evidence more systematically in their improvement process than the schools that used less university support in their activities. Therefore, the collaboration between the school team and the university became very important in our study. Research data were used, but schools needed help in this regard, because it was challenging for them to understand what research data they could use and for what purposes, and how to adapt research-based solutions to their school settings. In our program, it was the role of the coaches to find experts, refer to relevant studies, share validated tests and observation sheets for adaptation, collect research data, etc. This relates well to the user characteristics of Schildkamp et al. [13]: the data literacy of teachers needs promotion. It is not easy for teachers to have an inquiry mindset and the skills to collect data, interpret them, and act on them. Mandinach [21] has concluded that pedagogical data literacy is the ability to transform information into actionable instructional knowledge and practices by collecting, analyzing, and interpreting all types of data to help specify educational steps, combining an understanding of data with standards, disciplinary knowledge and practices, curricular knowledge, pedagogical content knowledge, and an understanding of how children learn.

Once teachers are prepared to work with the data, data characteristics, that is, quick and convenient access to accurate data, also become very important. In our study, the data-collection instruments were mainly prepared in collaboration with the university coaches and experts or by the school teams themselves. It can be concluded that planning data collection in collaboration with the university is something that schools can apply during the program activities. However, analyzing data quickly for feedforward purposes needs further planning. For instance, school 4, which developed a paper-based observation sheet, learned that such a documentation format does not support instant decision-making for classroom-level instruction, and school 1 changed its paper-based surveys to electronic surveys in the middle of the program for more efficient data analysis.


5. Further perspectives and practical implications

Our study indicated that in school-university partnerships, schools are able to acquire more easily the mindset of evidence-driven improvement, based on data collection and analysis by the school team or on evidence from theoretical and methodological underpinnings. However, some aspects need to be considered.

5.1 Human support

A coach has been suggested as one way to offer human support; this was also applied in the current program, where the university coach guided the process within the school-university partnership. Facilitation by coaches includes assessing teachers' needs, modeling how to interpret and act upon data, and observing teachers while they attempt to engage in the data-use process. It is recommended to design trainings for the school team with the following learning outcomes: learning the capabilities of the data system, understanding and using a cycle of instructional improvement, avoiding common data analysis mistakes, ensuring data transparency and safety, fostering a culture of data use, interpreting data in context, and using data to modify instruction. From the perspective of human support in the school-university partnership, our experience highlights the importance of the university coach. The school teams recognized the coaches' help with practical questions and choices. This opens a discussion on the role of the coach in the school-university partnership. The university coach is often conceptualized in the literature as a data coach [7] or a researcher [30] who pays attention primarily to evidence use. This may be too narrow an approach if the final aim is to find and co-create innovative teaching and leading practices for school improvement. Yet, in our case, the schools' evidence-driven school improvement profiles show that finding and selecting appropriate research-based evidence needs strengthening in the school improvement program. The main focus of the coaches was also on bringing in theoretical frameworks, fostering a culture of evidence use, and supporting understanding of the cycle of inquiry. Data analysis mistakes and accuracy were less emphasized by the schools; however, one of the schools mentioned that it would actually like to get feedback on whether its inquiry design, data collection, and analysis were adequate.

5.2 Technology support

When technology training exists, it often focuses primarily on technological support and on how to access the data management system. Studies show that schools make efforts to have data systems that organize and analyze interim assessment data, and data warehouses with current and historical student data. Our study indicated that elements scaffolding teachers to conduct teacher-led inquiry in the technology-enriched classroom, as suggested by Hansen and Wasson [2], could be better supported. In our program, the data were collected rather traditionally: mainly tests, surveys and questionnaires, and paper-based observation sheets. Focusing more on process-oriented data collection, with a variety of tools, efficient ways to conduct classroom observations, and timely access to students' learning results, might influence the use of data for improving classroom instruction. The growing use of technology as part of teachers' practice opens up the possibility of a change from researcher-centered studies to teacher-centered approaches to inquiry [2].

5.3 Leadership

School principals are key players in facilitating data use among teachers; they play an important role in allocating resources and time to enable teachers to use evidence effectively. Their espoused beliefs about data use are critical, as they help to set the tone for data use in school teams. School leaders also have access to a variety of data, performance indicators, and study results; making these available to the teachers is important for enhancing the data culture in the organization. However, we recognized that during the program, schools mostly used the data they gathered themselves, while data gathered at or for the national or municipal level were used rarely. This raises the question of the school teams' capability to interpret such data, and of the university coaches' capability to support and coach this interpretation. Moreover, our coaches recognized some hesitation and doubt among the school teams about using such data, owing to concerns about the meaningfulness of data gathered in this manner. We recognize the effective use of national data as an improvement area for the school development program.

5.4 Norms and designed routes

Schools are required to follow specific data-driven decision-making practices when developing their school improvement plans, and teachers are required to follow them when using data to guide instruction. Providing structured time for collaboration is one of the primary ways in which schools try to build teachers' capacity to use data. This includes the adoption of data-discussion protocols to ensure that discussions about data occur and that actions are taken on the basis of these conversations. Our program focused on understanding how we can better support schools in working with data; in the next iteration of the program, we can focus more systematically on supporting the development of practices that create norms and routes for more systematic evidence-driven school improvement.

Our study demonstrated that in a school-university partnership, when schools are scaffolded, evidence-driven practices are more widely adopted by the schools as part of the school improvement process. However, we also learned that the need for teachers to obtain complex data skills is becoming more and more important. Understanding the inquiry process is just one angle of the challenge; understanding how to read, interpret, critically evaluate, and act on data is equally important. In this iteration, the program did not systematically emphasize designing practices for collecting evidence both from data and from research, which could be better supported in the future. We also learned that schools understand quite well how to improve practices at the school level based on collected evidence; the synergy between instructional-level data collection and decision-making and organizational-level improvement can, however, be enhanced. In the future, it is important to analyze the impact of using classroom data in novel pedagogical and assessment approaches and for teachers' professional development, to determine whether it changes students' learning.

Our study also informs how to improve initial teacher education and school principals' preparation in Estonia. The main practical implication is rooted in the dimensions of evidence-driven school improvement. Currently, in initial teacher education, students are expected to carry out an action research project during their internship period. Individually, they learn how to collect data in the teaching process. They do not, however, experience how the data they collect from classroom interventions could feed the school improvement process, or what the relation is between classroom-level evidence and school-level evidence. It can be concluded that the dimension of evidence for school improvement needs strengthening in initial teacher training. Additionally, current initial teacher training tends to prepare future teachers to collect action research data rather traditionally, through surveys and interviews, whereas the use of learning analytics solutions as part of the inquiry could enable more efficient monitoring of practices. Simultaneously, in the principals' training program, topics like evidence-driven school improvement and schools' self-evaluation are treated rather theoretically. However, school principals need skills in how to collect, analyze, interpret, and integrate data about the instructional interventions conducted by teachers in order to plan improvements in school-level processes.


Acknowledgments

This research has received funding from the European Social Fund program “Development of Competence center for educational innovations in Tallinn University” and European Union’s Horizon 2020 research and innovation program under grant no. 669074 (CEITER).

References

  1. Estonian Ministry of Education and Research. Estonian Lifelong Learning Strategy 2020. 2015. Available from: https://www.hm.ee/en/estonian-lifelong-learning-strategy-2020 [Accessed: August 12, 2019]
  2. Hansen C, Wasson B. Teacher inquiry into student learning: The TISL heart model and method for use in teachers' professional development. Nordic Journal of Digital Literacy. 2016;11:24-49
  3. Hargreaves A, Shirley D. The international quest for educational excellence: Understanding Canada's high performance. Education Canada (Toronto). 2012;4:10
  4. Louis K, Stoll L. Professional Learning Communities: Divergence, Depth and Dilemmas. London/New York: Open University Press/McGraw Hill; 2007
  5. Schildkamp K, Smit M, Blossing U. Professional development in the use of data: From data to knowledge in data teams. Scandinavian Journal of Educational Research. 2017:1-19
  6. Datnow A, Park V. Data-Driven Leadership. Chichester, United Kingdom: John Wiley & Sons; 2014. 165 p
  7. Coburn C, Turner E. Research on data use: A framework and analysis. Measurement: Interdisciplinary Research and Perspectives. 2011;9(4):173-206
  8. Levin B. Leadership for evidence-informed education. School Leadership and Management. 2010:303-315. Available from: https://www-tandfonline-com.ezproxy.tlu.ee/doi/full/10.1080/13632434.2010.497483 [Accessed: July 05, 2019]
  9. Parr JM, Timperley HS. Teachers, schools and using evidence: Considerations of preparedness. Assessment in Education: Principles, Policy and Practice. 2008;15(1):57-71
  10. Hammersley M. Some Questions about Evidence-Based Practice in Education. 2001. Available from: http://www.leeds.ac.uk/educol/documents/00001819.htm [Accessed: June 30, 2019]
  11. Biesta GJ. Why 'what works' still won't work: From evidence-based education to value-based education. Studies in Philosophy and Education. 2010;5:491
  12. Slavin RE. Evidence-based education policies: Transforming educational practice and research. Educational Researcher. 2002;31(7):15-21
  13. Schildkamp K, Poortman C, Luyten H, Ebbeler J. Factors promoting and hindering data-based decision making in schools. School Effectiveness and School Improvement. 2017;28(1):242-258
  14. Spillane JP. Data in practice: Conceptualizing the data-based decision-making phenomena. American Journal of Education. 2012;118(2):113-141
  15. Jimerson JB. How are we approaching data-informed practice? Development of the survey of data use and professional learning. Educational Assessment, Evaluation and Accountability. 2016;28(1):61-87
  16. Childress M. Data-driven decision making: The development and validation of an instrument to measure principals' practices. Academic Leadership: The Online Journal. 2009;7(1):18
  17. Datnow A, Hubbard L. Teacher capacity for and beliefs about data-driven decision making: A literature review of international research. Journal of Educational Change. 2016;17(1):7-28
  18. van Geel M, Visscher AJ, Teunis B. School characteristics influencing the implementation of a data-based decision making intervention. School Effectiveness and School Improvement. 2017;28(3):443-462
  19. Breiter A, Karbautzki L. Data Use in Schools—A Cross-Country Study. Institute for Information Management, University of Bremen; 2012. 24 p. Available from: https://www.ifib.de/publikationsdateien/ICSEI_2012_WAB_1792909_Breiter_%26_Karbautzki_Data_Use_in_Schools.pdf [Accessed: September 30, 2019]
  20. Park V, Daly AJ, Guerra AW. Strategic framing: How leaders craft the meaning of data use for equity and learning. Educational Policy. 2013;27(4):645-675
  21. Wayman JC, Wilkerson SB, Cho V, Mandinach EB, Supovitz JA. Guide to Using the Teacher Data Use Survey. 2016. 60 p
  22. Bryk AS. 2014 AERA Distinguished lecture: Accelerating how we learn to improve. Educational Researcher. 2014;44(9):467-477
  23. Davies P. What is evidence-based education? British Journal of Educational Studies. 1999;47(2):108-121
  24. Spillane JP, Miele DB. Evidence in practice: A framing of the terrain. In: Yearbook of the National Society for the Study of Education. 2007;106(1):46-73
  25. Marsh JA. Interventions promoting educators' use of data: Research insights and gaps. Teachers College Record. 2012:1-48
  26. Schildkamp K, Karbautzki L, Vanhoof J. Exploring data use practices around Europe: Identifying enablers and barriers. Studies in Educational Evaluation. 2014;42:15-24
  27. Butcher J, Bezzina M, Moran W. Transformational partnerships: A new agenda for higher education. Innovative Higher Education. 2011;36(1):29-40
  28. Sheard MK, Sharples J. School leaders' engagement with the concept of evidence-based practice as a management tool for school improvement. Educational Management Administration and Leadership. 2016;44(4):668-687
  29. Thomas G. How to Do your Case Study: A Guide for Students and Researchers. London: Sage; 2011
  30. Geijsel FP, Krüger ML, Sleegers PJC. Data feedback for school improvement: The role of researchers and school leaders. Australian Educational Researcher. 2010;37(2):59-75
