Coaching Through Feedback: A Close and Critical Analysis
Helen S. Timperley, Judy M. Parr, Ngaire H. Hulsbosch,
Faculty of Education, University of Auckland, New Zealand
Paper presented to the American Educational Research Association Annual Meeting, New York, 24-28 March 2008.
NOT TO BE CITED WITHOUT PERMISSION
Acknowledgements: This study is part of a national professional development in literacy project in New Zealand. The authors wish to acknowledge the willingness of the literacy coaches and teachers to have their practice studied in such detail and the project leaders for supporting the research and their willingness to learn. The ongoing funding from the New Zealand Ministry of Education is also gratefully acknowledged.
Observing teachers' classroom practice, analyzing the lesson and providing feedback is often cited as a central feature of mentoring and coaching relationships that result in improved learning for students (e.g. Adey, 2004; Strong & Baron, 2004; Veenman, Denessen, Gerrits & Kenter, 2001). The particular advantage of this form of coaching is the direct embedding of the interactions between coaches and teachers in the context of teachers' daily work, a key tenet of effective professional development (Kinnucan-Welsch, Rosemary, & Grogan, 2006). This tenet has almost achieved the status of accepted wisdom, but few studies have undertaken an empirical analysis of what occurs in the coaching relationship as a basis both for improving its effectiveness and for developing improved theories to inform it.
The current study examined this aspect of literacy coaching alongside an evolving theoretical framework over two phases of data collection and training. Baseline data from audio-taped teacher-coach interactions and follow-up teacher interviews showed that coaches provided many indirect suggestions for change that, in most cases, the participating teachers reported they did not intend to enact. The coaches were then trained in the principles and practice of effective feedback using the protocols of learning conversations (Robinson, 1993; Timperley, 2001), which were consistent with key findings on how people learn (Donovan, Bransford, & Pellegrino, 1999). These protocols focused on a process of collaborative knowledge construction (Tillema & Orland-Barak, 2006): jointly deconstructing practice, engaging teachers' theories while doing so, and co-constructing new practices in ways that involved both coaches' and teachers' viewpoints. A second phase of analysis of the feedback between coaches and teachers revealed that the new coaching practices were mostly consistent with the principles presented in the training, and teachers rated the usefulness of the sessions very highly, with most indicating an intention to enact the suggestions discussed. A close analysis of the transcripts, however, revealed that the focus of the interactions was restricted to the immediate context. The coaches' suggestions, when made, were very practical and made no reference to wider principles or theories of effective teaching. In addition, the coaches failed to develop explicit strategies for how teachers would judge the effectiveness of proposed changes to their practice in the absence of the coach.
New principles and practices were developed with the coaches. These situated the practical advice in relevant theoretical frameworks of effective teaching and focused on developing teachers' self-regulatory strategies, so that teachers could monitor the effectiveness of any changes they made in terms of their impact on student learning in the immediate situation and could sustain those changes in the future.
Context for the study
The coaching episodes analyzed were part of a literacy professional development project in New Zealand involving nearly 200 schools. Although New Zealand is a small country of approximately four million people, it forms a single educational jurisdiction and so can be thought of as equivalent to a medium-sized state in the United States. One key difference, however, is that there is no equivalent to district administration. New Zealand schools are responsible for most operational decisions, with the central Ministry of Education responsible primarily for policy and funding, including national professional development initiatives such as the one described in this paper.
Structurally, the project functioned on four levels, not unlike similar projects internationally. At the apex was the Ministry of Education, whose officials contracted the professional development providers (Learning Media Ltd) and monitored the progress of the project. Key personnel from the provider's organisation combined with regional team leaders of school-based facilitators, two researchers (the first and second authors) and a Ministry of Education representative to constitute a leadership team who met regularly to review progress, make project adjustments and plan implementation strategies. The regional team leaders from the leadership team co-ordinated and headed a small team of facilitators in their geographic areas who met together regularly. In addition, the facilitators, as a group, met for several days on three occasions each year at national seminars. Facilitators each worked with a small number of schools, a number that varied according to school size and geographic isolation.
The project was designed as one in which the professional development was to be based on the individual and collective professional learning needs of the teachers and leaders in each school. While coaches were trained to use particular tools and to begin with an analysis of students' and professionals' learning needs, what happened next was designed to be sufficiently flexible to enable coaches to meet the students', teachers', and leaders' learning needs as they emerged.
A strong feature of the project has been the iterative nature of project learning, designed to parallel that of school and teacher learning. Cycles of feedback from the embedded research, further development of the project's approach, and ongoing testing of the impact of any changes have been integral to the project's development (Timperley & Parr, 2007). This coaching study was typical of this project learning approach and involved two phases of data collection and subsequent training, with a third phase planned.
The project's focus has been on improving students' literacy outcomes, and it has been highly effective. Writing achievement data were obtained from a criterion-referenced (to the national curriculum) measure of writing, Assessment Tools for Teaching and Learning (asTTle) (Glasswell, Parr & Aikman, 2001), which has associated national normative data (Grades 3-7). Reading achievement data were from a standardized test of reading, the Supplementary Test of Reading (STAR) (NZCER & Elley, 2001), alternating the use of an A and B form.
The average effect size gain (Cohen, 1988) (relative to where the students started) on standardised assessments for schools that chose to focus on writing in the first cohort of schools (data from a moderated sample of 1,064 students) was 1.28 (Ministry of Education, 2006), equivalent to 2.6 times the expected gain over the two years of the project. For the lowest 20% of students, the target group, the effect size was 2.05, approximately four times the expected gain over the two years. For reading (data from 3,787 students) the effect size gain was 0.87, equivalent to approximately twice the expected gain over the two years of the project. The lower effect size for reading can be explained partly by a ceiling effect on this assessment, so the results for the lowest 20% of students are possibly more relevant. For these students, the effect size gain in reading was 1.97 (Ministry of Education, 2006), equivalent to approximately four times the expected gain over the two years of the project. To put these gains in perspective, a recent review of the impact of professional development on reading and writing (Timperley, Wilson, Barrar & Fung, 2007) indicates that the average project gains, overall and for the lowest achievers, are at the high end for literacy interventions, particularly for populations who have been traditionally under-served by the education system.
This phase took place relatively early in the project. Coaches had received training in the principles of effective literacy instruction and the learning needs-based approach. They had not received any specific training in analysing teacher practice and providing feedback following classroom observations. They were, however, encouraged to interview students during the observed lessons using a set of questions designed to gauge the extent to which students understood the learning aims of the lesson, could identify what success looked like, and knew what their teachers wanted them to work on. The purpose of this encouragement was to help teachers become responsive to their students, a central principle of professional development that has an impact on student outcomes (Timperley & Alton-Lee, 2008).
Phase One: Method
Nine interactions between teachers and coaches (with seven different coaches) were audio-recorded and followed up with interviews with the participating teachers to find out what they had learned and whether they intended to change their practice as a result of the coaching episode. During the lesson, students were interviewed by the coach to ascertain whether or not they understood the learning aims and related constructs of the lesson. The transcripts were analysed according to a theoretical framework that focused primarily on the processes of interaction between teachers and their coaches and the students' interview responses.
The key features of the analytical framework were built on the findings by Donovan et al. (1999) related to how people learn. The first of the three findings was the importance of engaging learners' initial theories about how the world works. In the case of the professional learning of teachers, this finding was translated to engaging teachers' current theories of effective practice (Robinson, 1993). If these theories are not engaged, then new concepts and information that are presented are unlikely to be well understood because they will be interpreted in terms of existing theories. Darling-Hammond and Bransford (2005) refer to this problem in teaching as one of over-assimilation. Teachers believe they understand new concepts, but do so only partially and so enact them in ways consistent with their existing theories. This problem has also been well documented in mathematics and science instruction by Firestone, Schorr and Monfils (2004) and Spillane, Reiser and Reimer (2002).
The second key feature involved jointly deconstructing practice and co-constructing new practice. Part of the rationale underpinning this strategy was the importance of having teachers actively engaged in their learning (Kinnucan-Welsch et al., 2006), thereby addressing the requirements of Donovan et al.'s second finding on how people learn. This finding states that learners must understand facts and ideas in the context of a conceptual framework and organise knowledge in ways that facilitate retrieval and application. It was hoped that, by being active participants in the analysis process, teachers would develop these kinds of conceptual frameworks and be able to retrieve their salient features in the context of the competing demands of everyday classroom life.
The third key feature focused on developing a meta-cognitive or self-regulatory approach to learning, the third of Donovan et al.'s (1999) three findings. Fundamental to developing self-regulation is the need for learners to have learning goals and to monitor their progress towards achieving them (Butler & Winne, 1995). It was assumed that if students' literacy achievement was to improve, then the goals needed to pertain both to promoting literacy learning and to modifying teaching practice in response to information about students' progress (Kinnucan-Welsch et al., 2006; Timperley & Alton-Lee, 2008). This kind of self-regulation is central to maintaining an ongoing learning focus that is likely to lead to continued improvement of student learning (Timperley et al., 2008). In this phase, therefore, a feature of the analytical framework was the referencing of any lesson analysis by the coach and teacher to its impact on students. In the immediacy of a specific coaching session, short-term goals were substituted for those more typical of learning goals, such as improvements on standardized assessments. The questions the coaches asked the students, concerning their understanding of the learning aims of the lesson, what success looked like, and what their teachers had told them to work on, were intended to serve this purpose.
Phase One Results
The analysis of these first nine episodes showed that only two demonstrated consistent evidence of the features identified in the analytical framework. These two transcripts both involved the same coach working with different teachers. The coach summarised the key features of the teachers' practice, asked the teachers to identify the personal beliefs on which their practice was based, and outlined the consequences for students: because the teachers had not been explicit about the learning aims of the lesson and the related success criteria, students were considerably confused about them. Teachers and coach then moved on to jointly construct new practices designed to solve the problem of student confusion, to identify sources of information on which the teachers could draw to develop needed pedagogical content knowledge, and to develop systems of peer feedback to support them in the construction of their new practice. These teachers expressed high levels of motivation to change their practice and, over a four-month period, engaged further with the coach and managed to improve their students' writing levels significantly (ES = 1.04).
In the remainder of the episodes, only four coaches referred to the students' interview responses in the follow-up research interviews. One of the teachers reported in the interview that finding out about the students' confusions provided a highly salient learning experience, but she was unsure of how to change her practice to make the learning aims and success criteria clearer for her students. Another rejected the students' views as invalid, and the other two teachers did not comment. In these and the remaining episodes, the coaches failed to engage teachers' existing theories of practice in sufficient depth to allow them to understand the difference between existing and new practice. Many suggestions were provided in all episodes, most of them non-specific and dissociated from the analysis of the observed lesson or any specific theoretical construct. As identified in the mentoring literature (Strong & Baron, 2004), most of the suggestions were indirect, using phrases such as "I was wondering if you might ..." and "What do you think about ...". Despite the tentativeness with which the advice was offered, its worth was treated as self-evident by the coach because it was not justified in terms of any theory of literacy instruction. In nearly all cases the advice was consistent with such theories, but the links between the advice and these principles were not articulated to the teachers.
In the follow-up interviews, none of these teachers indicated that they intended to change their future practice as a result of the coaching session. Four could see some possible benefits to themselves or their students but did not know how to change their practice in line with the changes suggested. The remaining three indicated that they did not believe the changes would be of benefit to students, or they disagreed with the advice. For example, one stated that students would learn when "ready" and that changing her practice was not likely to accelerate their readiness. Another expressed the view that the coach's advice contradicted that offered in her earlier teacher training and that, if she waited out this phase of the project, the trend would swing back to support her existing practice.
Phase One: Developing New Strategies
Following the analysis of the above transcripts, all facilitators in the project were required to record a feedback conversation with one of their teachers as part of their training and were taken through a set of analysis protocols based on the principles and practices of learning conversations (Robinson, 1993; Timperley, 2001) and how people learn (Donovan et al., 1999) that were used for the original analysis framework. Central to the principles of learning conversations was the concept of shared responsibility for learning and the engagement of both the teachers' theories underpinning their practice and the coaches' reasons for advocating alternatives. In practice, this meant that deconstruction and discussion of the lesson needed to be undertaken jointly, with the meaning arising from this activity co-constructed, as were any suggestions about changes to practice. Reasons were to be provided for any questions asked by the coach so that the teacher did not feel interrogated but rather understood why considering such a question was important. In this way, the theories underpinning teachers' current practice, and both teachers' and coaches' theories of effective practice, were to be engaged.
It was hoped that this process would also provide the conditions for Donovan et al.'s (1999) second finding on how people learn: that learners understand facts and ideas in the context of a conceptual framework and organise knowledge in ways that facilitate retrieval and application. Through the discussion of the teachers' practice, it was assumed that meanings would be constructed and frameworks about effectiveness developed.
The third key finding by Donovan et al. (1999) related to developing a meta-cognitive or self-regulatory approach to learning. The main way in which this was enacted in the training was for coaches to help teachers understand the immediate impact of their practice in terms of students' understanding of their learning. The questions the coaches asked of the students (as outlined above) were consistent with developing this meta-cognitive awareness.
This second phase comprised similar data collection procedures. Eighteen of the coaches participated in the research and a similar cycle of analysis and feedback to the coaches and the project leaders was undertaken. This time, the new protocols were co-constructed with the project leadership, rather than by the researchers alone.
Phase Two: Method
The eighteen coaches who agreed to participate in the research submitted 50 practice analysis and feedback episodes. Given the larger number of episodes in this phase of the research, teachers were asked to complete a written questionnaire following the coaching session rather than have a face-to-face research interview. The questionnaire asked them to rate the usefulness of the feedback on a 1-6 scale, with 1 representing "definitely not useful" and 6 representing "definitely useful". They were also asked to indicate the extent to which they intended to change their practice by ticking a box against a set of four categories ranging from "Continuing with present practice" to "Making major changes", and to indicate the specifics of any intended changes. These questionnaires were sent directly to the researchers.
The same theoretical framework as in phase one was initially used for the analysis, but as its limitations became evident, two additional categories were added. The first comprised linking any suggestions about alternative practice to a theoretical framework in order to place discussions of teachers' pedagogical content knowledge in a theoretical structure. This new category was based on the second of Donovan et al.'s (1999) three findings on how people learn. If new knowledge was to promote the development of deep structures that would allow retrieval of principled knowledge rather than purely practical techniques, then the theoretical underpinnings of alternatives needed to be made more explicit.
The second comprised the more explicit development of self-regulated learning on the part of the teachers, a process closely associated with meta-cognition, the third of Donovan et al.'s (1999) findings on how people learn. Self-regulated learning depends on the learner having specific goals and monitoring strategies to judge progress towards those goals (Butler & Winne, 1995; Butler, Lauscher, Jarvis-Selinger, & Beckingham, 2004). If teachers are to continue to improve their practice independently of the presence of their coaches, then the development of such self-regulatory practices is essential.
Phase Two: Results
The dimensions previously analyzed and presented to the facilitators in the training were, overall, effectively implemented in most of the coaching episodes. In 42 of the 50 episodes, time was spent discussing the links between teaching practices and students' understanding of the learning aims of the lesson and the associated success criteria. In some cases, this discussion involved simply "telling" the teacher the responses of the students but, more frequently, it involved actively using them as evidence to support the coach's analysis of the lesson, as illustrated in the following quote about students' understanding of a hamburger metaphor the teacher had used to describe the structure of a speech.
"So when it came down to [asking students] what are you learning to do as a writer and how will you know that you've been successful... there was a big range of what they thought they were doing. So, [student's name] was, 'We're learning to write speeches in the hamburger form', so she clicked into what you were talking about the hamburger. 'And the audience will like it.' And then [student's name] has also hooked into the hamburger and how to produce the speech. He wasn't sure how he was going to know if he had done a good job, or not."
All but seven teachers accepted the validity of these students' responses, which proved a strong motivator to discuss changes in teaching practice. Reasons for rejection by the seven teachers included a statement that the students were reciting "formulaic phrases", a claim that even though students could not answer specific questions they had a "broad idea" of the lesson, and a view that it was unreasonable to expect students to be able to "think on the spot".
Engaging with teachers' current theories of effective practice and probing teachers' reasons for particular teaching practices were evident in only 22 of the episodes, suggesting that this was a difficult skill to enact. The only other specific skill addressed in the training that was not evident in the majority of transcripts related to coaches giving reasons for the questions they asked. Many more questions were asked without reasons given (295) than with reasons given (72).
Nearly all episodes showed strong evidence of jointly deconstructing the lesson and co-constructing new strategies together, but not at a level that engaged theory, as illustrated in the following interaction.
Teacher: "Well they seemed to (understand). Like when I was saying to them what an action verb is they could tell me it was a ... sophisticated doing word, and that sort of thing. But then they were saying things like, 'the slithering ... snail' and the 'slithering' they were saying... would be the action verb."
Facilitator: "And that's where that confusion arises, doesn't it?"
Teacher: "Yeah. Yeah. And I felt I was probably confusing them a bit because... I mean, 'slithering' is ... a verb. But in that context it's not."
Facilitator: "Yeah, I know. And ... there's no point in saying that we won't have that sort of word..."
And then together they work to reconstruct the practice:
Teacher: "... maybe if I did it again I'd do adjectives and verbs."
Teacher: "You know, rather than trying to push that into the action verbs... I think it was a little bit too much."
Facilitator: "Yes, and just do the verb first without upping the expectation that it will have a lot of ... quality to it."
In all episodes, suggestions about alternatives were more closely linked to the analysis of the lesson than in the phase one transcripts, but the suggestions remained at a practical level.
This approach to coaching through lesson analysis and feedback led to more positive responses from the participating teachers. Thirty-four of the fifty teachers gave a rating of six on the six-point scale, indicating that they found it "definitely useful". In all but three episodes, teachers indicated an intention to change, but most changes involved "tweaking practice" rather than making substantive change.
A deeper analysis of the transcripts, however, revealed that deliberate attention to theoretical understandings and to promoting self-regulatory processes was largely absent. These processes may not have been high on the teachers' agendas, given that they had received practical advice based on the joint analysis of the lesson and had deepened their understanding of what their students did or did not understand with respect to the learning aims or success criteria of that particular lesson. Only eight episodes showed evidence of linking the practical advice to a more theoretical idea. The most advanced instance involved a coach articulating a reason for revision: "Revising learning is a form of scaffolding because you are setting up [the idea] that these are the things you need to support your learning today. You have given them access to it because children often have strategies but they don't use them or can't access them."
Consistent with the limited number of instances of linking practical suggestions to theory or principle was the absence of linking the ideas discussed in the practice analysis and feedback to other sites of learning that formed part of the professional development. In only three episodes did the coach remind teachers of strategies learned and discussed in workshops in which the material was introduced within a theoretical framework of effective literacy practice.
The second strategy absent from any episode was the promotion of self-regulated learning, in the sense that by the end of the session teachers had set specific goals for themselves and their students and had articulated monitoring strategies to determine whether their new practice was more effective in promoting their students' learning. Much was made by both teachers and coaches of the information from the student interviews about students' understanding of the lesson, but in no case did the coaches discuss how the teachers might access the same information for their own learning purposes. "Next steps" were about changing teaching practice, not about monitoring the effectiveness of that practice.
Phase Two: Developing New Strategies
When the results of the analysis were discussed with the project leadership, a group of facilitators and leaders worked with the first author to develop strategies that would combine the useful features of the joint practice analysis (linking the analysis to students' responses, joint deconstruction of the lesson and co-construction of new strategies linked to the observed lesson, and probing teachers' reasons for their existing practice) with those less apparent. These latter features included situating the practice analysis and feedback in the context of a theoretical framework and developing self-regulatory systems so that ongoing improvement was not dependent on the presence of the coach.

In practice, this involved beginning any lesson analysis by co-constructing with the teacher, either prior to the lesson itself or prior to the lesson analysis, a theoretical framework for the analysis of whatever aspect was the focus of the observation, and identifying criteria for effective practice in relation to that framework. For example, if student engagement was to be the focus of the analysis and feedback, then the teacher and coach would co-construct criteria for effective engagement and decide how they would determine whether students were engaged. This co-construction would draw on other professional learning sessions undertaken by the facilitator, thereby linking sites of learning. The lesson analysis would then use this framework as the basis for critique of the specifics of the lesson. Reports from those who have trialled the framework indicate that they have found it challenging to articulate the theoretical underpinnings of particular concepts, such as student engagement, but that when they were able to do so clearly, the teachers led the lesson analysis and identified what they did that was likely to be effective and, conversely, ineffective.
These same informal reports suggest that developing ongoing self-regulatory processes for the teachers has proved more challenging. Traditional roles, involving a critique by coaches with a focus on next steps for the next lesson, appear to be more difficult to change. The concept that teachers should develop explicit monitoring strategies to assess their own effectiveness in the absence of the coach has yet to be established.
Lesson observation, analysis and feedback frequently form the basis of a coaching relationship. It cannot be assumed, however, that this process is effective. In the current study, reaching the stage where coaches operated in ways consistent with effective principles of promoting professional learning required several iterations of analysis and training. In the phase one interactions, coaches gave many indirect suggestions for improvement, treated the worth of their suggestions as self-evident, and assumed that teachers were able both to understand the suggestions as they did and to accept their relevance. This did not prove to be the case in most of the episodes.
The phase two transcripts and teachers' reactions showed that nearly all teachers found practical suggestions linked to a joint analysis of the lesson very useful. There did not appear to be any expectation on the part of either the teachers or the coaches that this advice should be situated in a framework of theoretical constructs or linked to other sites of learning. The danger of being satisfied with this level of learning is that it is likely to be episodic and not generalized or integrated into a wider picture of principled learning. Over-assimilation (Darling-Hammond & Bransford, 2005) in such situations is almost inevitable.
Similarly, the problem with not developing explicit self-regulatory learning processes, in terms of explicit goals and how progress towards them might be monitored, is that the practice analysis is restricted to a single event and the responsibility for monitoring the effectiveness of any change becomes that of the coach rather than the teacher. Most coaches established "next steps" with their teachers, which no doubt contributed to the teachers' high ratings of the usefulness of the practice analysis and feedback. What may add to these next steps is a more detailed articulation of the ways in which teachers could monitor their effectiveness in terms of their impact on student learning, so that both the professional learning and the student learning could be sustained.
The approach developed in this paper is, in many ways, antithetical to typical ways of mentoring and coaching, in which "extreme efforts of mentors to avoid giving advice" are usual (Strong & Baron, 2004, p. 55). Our phase one findings indicate that this avoidance applies to coaching also. If the analysis of practice is to be contextualised in both teachers' classrooms and theoretical frameworks, then the analysis and the suggestions arising from it must be explicit. It is difficult to situate any analysis in this way if indirect suggestions are embedded in a general conversation. We suggest that rather than avoiding the discussion of suggestions, mentors and coaches should avoid assuming that their suggestions are helpful, offering them instead in the spirit of possibilities that may or may not translate well into a particular teacher's practice context.
If coaching relationships are to maximize the possibilities for sustained teacher learning, then this kind of detailed theoretical analysis of particular approaches can serve two functions. The first is that the coaching process can become more effective. The second is that this kind of close analysis can further refine and develop theory. In this way, much of the advocacy about effective coaching might be replaced by a theory that has been both tested empirically and informed by evidence.
References
Adey, P. (2006). A model for the professional development of teachers of thinking. Thinking Skills and Creativity, 1(1), 49-56.
Butler, D. L., Lauscher, H. N., Jarvis-Selinger, S., & Beckingham, B. (2004). Collaboration and self-regulation in teachers' professional development. Teaching and Teacher Education, 20(5), 435-455.
Butler, D., & Winne, P. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245-274.
Darling-Hammond, L., & Bransford, J. (Eds.) (2005). Preparing teachers for a changing world: What teachers should learn and be able to do. San Francisco, CA: Jossey-Bass.
Donovan, S. M., Bransford, J. D., & Pellegrino, J. W. (Eds.) (1999). How people learn: Bridging research and practice. Washington, DC: National Academy Press.
Firestone, W. A., Schorr, R. Y., & Monfils, L. F. (2004). The ambiguity of teaching to the test: Standards, assessments, and education reform. Mahwah, NJ: Lawrence Erlbaum.
Glasswell, K., Parr, J. M., & Aikman, M. (2001). Development of the asTTle Writing Assessment Rubrics for Scoring Extended Writing Tasks (Tech. Rep. No. 6). Project asTTle, University of Auckland.
Kinnucan-Welsch, K., Rosemary, C., & Grogan, P. (2006). Accountability by design in literacy professional development. The Reading Teacher, 59(5), 426-434.
Ministry of Education. (2006). Student achievement findings for cohort 1 of the Literacy Professional Development Project 2004/05. Wellington: Ministry of Education.
Ministry of Education and the University of Auckland. (2001). Assessment Tools for Teaching and Learning: Project asTTle.
Robinson, V. M. J. (1993). Problem-based methodology: Research for the improvement of practice. Oxford: Pergamon Press.
Spillane, J. P., Reiser, B. J., & Reimer, T. (2002). Policy implementation and cognition: Reframing and refocusing implementation research. Review of Educational Research, 72, 387-431.
Strong, M., & Baron, W. (2004). An analysis of mentoring conversations with beginning teachers: Suggestions and responses. Teaching and Teacher Education, 20, 47-57.
Tillema, H., & Orland-Barak, L. (2006). Constructing knowledge in professional conversations: The role of beliefs on knowledge and knowing. Learning and Instruction, 16, 592-602.
Timperley, H. S. (2001). Mentoring conversations designed to promote student teacher learning. Asia-Pacific Journal of Teacher Education, 29(2), 111-123.
Timperley, H. S., & Alton-Lee, A. (2008). Reframing teacher professional learning: An alternative policy approach to strengthening valued outcomes for diverse learners. In G. Kelly, A. Luke, & J. Green (Eds.), Disciplines, knowledge and pedagogy. Review of Research in Education, Vol. 32 (pp. 328-369). Washington, DC: Sage Publications.
Timperley, H., Wilson, A., Barrar, H., & Fung, I. (2008). Best evidence synthesis on professional learning and development. Wellington, NZ: Ministry of Education.
Veenman, S., Denessen, E., Gerrits, J., & Kenter, J. (2001). Evaluation of a coaching program for cooperating teachers. Educational Studies, 27(3), 317-340.
Wang, J., & Odell, S. (2002). Mentored learning to teach according to standards-based reform: A critical review. Review of Educational Research, 72(3), 481-546.