Ainsley B. Rose

Ainsley B. Rose is an author, a presenter, and a consultant. An experienced elementary and secondary teacher and principal, he is former director of education for the Western Quebec School Board.

Data-Informed Versus Data-Driven PLC Teams

One of the three big ideas of a professional learning community calls for us to be data driven. A recent blog post raised an interesting perspective by suggesting that we should be more concerned with being data informed. Is this merely a matter of semantics, or should we pay attention to the apparent distinction?

On his informative blog, Larry Ferlazzo (2009) cites Ted Appel, who contends, “If schools are data driven, they might make decisions like keeping students who are borderline between algebra and a higher level of math in algebra so that they do well on the algebra state test. Or, in English, teachers might focus a lot of energy on teaching a ‘strand’ that is heavy on the tests—even though it might not help the student become a lifelong reader. In other words, the school can tend to focus on its institutional self-interest instead of what’s best for the students. In schools that are data informed, test results are just one more piece of information that can be helpful in determining future directions.”

Indeed, the data that school-based teams focus on should be derived from common formative assessments. Because assessment is intended to inform teaching, there is good reason to think of the data as informing rather than driving. So, if, as Ted Appel argues, “test results are just one more piece of information,” what should teams look for to enhance their conversations so they lead to greater levels of student achievement? Data generated from common formative assessments come from a single point in time. We also need to pay attention to trends and patterns, and those do not always emerge from common assessments, state test scores, benchmark assessments, or standardized assessments alone.

Professor John Hattie writes that teachers should see learning through the eyes of their students, and students should become their own teachers. In both instances, the basis for their respective actions is derived from information about student learning, attitudes, and dispositions, among many other attributes. Numbers or data generated from common assessments and state tests are necessary but insufficient for helping teachers and students make decisions about where they are going, how they are going, and where to next in the student’s learning (Hattie, 2008). Arriving at answers to these three questions will surely require more information than can be obtained from assessment data alone. It has to involve conversations with students, teachers, and the other adults who intervene in the learning equation.

The extent to which schools ask questions of their students will lead to a greater depth of understanding of the impact each adult is making. After all, whether we are giving good feedback to students about their learning is best judged by the students’ reaction to that feedback. How often do we really get the student’s opinion about what works best for him or her? So, as Hattie suggests in his book Visible Learning for Teachers, we need to be gathering data about student perceptions of what works best for their learning. Data about their levels of engagement, their desire to learn, their need for appropriate feedback, and their perspective on the quality of teaching should all be added to the scores students earn on common formative assessments. This additional information will inform teachers not only about test results, but also about students’ dispositions to learn.

What is interesting to me is that we could align our data gathering from students with the very questions PLC teams are encouraged to ask themselves:

What is it we expect students to learn?

How will we know when they have learned it?

How will we respond when they do not learn?

How will we respond when they already know it?

The questions Hattie (2012) encourages us to be mindful of for students again are: Where am I going? How am I going? Where to next?

In every case, we can gather data not only about student behavior, but also about the adult actions that have caused the student responses. This, I suggest, will give PLC teams more appropriate and complete data from which to proceed in the classroom and in the school.

So, for me, data informed versus data driven is not merely a matter of semantics. I say it is clearly more complicated. We certainly have to be purposeful, attentive, and professional in examining all aspects that contribute to the success, or lack thereof, in student learning and instructional practice. As Hattie contends, teachers and school leaders must constantly examine their impact on student learning and make informed decisions on the basis of a range of evidence. We would be wise to take heed of his advice.

References

Ferlazzo, L. (2009, August 26). Data-driven versus data-informed. Retrieved from http://larryferlazzo.edublogs.org/2009/08/26/data-driven-versus-data-informed

Hattie, J. (2008). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.

Hattie, J. (2012). Visible learning for teachers: Maximizing impact on learning. New York: Routledge.

Meier, D. (2009, March 5). Data informed, not data driven. Retrieved from http://blogs.edweek.org/edweek/Bridging-Differences/2009/03/dear_diane_sometime_i_imagine_1.html

Comments

jont

I really enjoyed reading and relating to this article. It seems that too often being data driven overruns our classes. If only more teachers would use data as an additional resource, or tool, rather than the end-all answer.


lagorham

I thoroughly enjoyed this article. I am one who believes that yes, being data driven may have its purpose, but we have to be more data informed. What politicians do not understand is that our education system is set up to pass students whether or not they pass the tests. Well, if this is the case, we need to worry more about the learning our students are engaging in rather than the test prep, pep talks, and so forth. Think of the hours spent on test prep rather than actually teaching a specific skill that can be used for life. As mentioned, these assessments do not consider behavior, or accurately assess those with learning difficulties.


JereHochman

Excellent article - a "must read" today. If I may - let's give credit where credit is due - had we stayed true to our roots, perhaps the other 20% wouldn't find themselves the subject of an achievement gap.

1949 Tyler's Rationale

http://education.stateuniversity.com/pages/2517/Tyler-Ralph-W-1902-1994.html#ixzz1aQNgRimJ

Answering a call from the participating schools in the study for more curriculum assistance, Tyler designed a curriculum planning rationale for the participating schools. After moving to the University of Chicago in 1938 to take the position of chairman in the Department of Education, Tyler continued to cultivate his ideas on the rationale, using it in a syllabus for his course on curriculum and instruction and eventually publishing it in 1949, under the title Basic Principles of Curriculum and Instruction. In the rationale, Tyler conceived of school action as moving across a continuum of concerns that speaks to school purposes, the organization of experiences, and the evaluation of experiences. His basic questions are now famous:

What educational purposes should the school seek to attain?
What educational experiences can be provided that are likely to attain these purposes?
How can these educational experiences be effectively organized?
How can we determine whether these purposes are being attained?



mickey

A student's success, I believe, should not be based on assessments alone. There are times when students are not in a test mode of recalling what they have learned. That one day of taking an assessment to give an account of what has been learned could be disastrous for a student who may have had a bad experience at home or within the classroom. Using data to assess a student's ability to learn is one thing, but in my opinion, being data driven is not the proper way to determine a student's success.

Michelle
