SCENARIOS: DISTRICT


Notice that this section is about Step 1: Assessing the Present State, because the group is:

  • Identifying existing assessment data sources
  • Developing questions about how students are performing in mathematics on two different assessments

Plan for Improving Achievement in Mathematics

At a meeting of the district leadership team of a moderately large urban school district, the assistant superintendent, Ms. Sullivan, is discussing a plan to increase data use in the district with the goal of improving student achievement in upper elementary mathematics.

The team's decision to take this action resulted from an analysis of state test data, which showed little change in scores at these grade levels for the past three years. They were concerned that students were not entering middle school with the necessary knowledge and skills to be successful. The leadership team had previously agreed that they would run a pilot project to increase data use and use lessons learned from the pilot to eventually scale up to a district-wide rollout of the data use plan.

Ms. Sullivan suggests that the first task for the pilot project is to identify existing assessment data sources to clarify expectations for their use.  She comments, “I am not sure that schools realize how much data are available or understand how the data can be used. I see this as a first step in increasing data use. Once we have this information clearly communicated and understood, we can start to help people analyze the data and make use of the results to improve mathematics achievement. Doing this will also help identify the kind of professional development that teachers and administrators need to support data use.”

Asking the Right Questions

Ms. Sullivan begins the next meeting: “I’ve been thinking quite a bit about the inquiry questions, and it seems to me that we should have a general framework of the kinds of questions that should be asked of any kind of data, within which we can tailor more specific questions. I’d like to suggest, for example, that we have four major foci.” At this point, Ms. Sullivan refers to a slide in her PowerPoint presentation that lists her proposed major foci:

  • Overall, how are students performing in mathematics?
  • What are the trends in student performance in mathematics over the past three years?
  • How are subgroups performing currently and over time?
  • What are our relative strengths and weaknesses in teaching and learning?

Mr. Ramirez, the person responsible for district data use, comments, “I think I see where you are going with this, but if we were to use these questions as the platform for inquiry, how would we make them specific to the state data and the interim (benchmark) data?” “Well,” adds Mr. Newell, the district’s professional development coordinator, “I can imagine that for the first focus we could ask something like ‘How do our students perform by grade level?’ and ‘How does our students’ performance compare with the state’s performance?’ Questions like these would help schools dig deeper into the data and begin to get a sense of the current status of student performance.” With general agreement that this is the right way to proceed, the group discusses the questions they would include under each of the major foci.
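The kind of grade-level comparison Mr. Newell describes can be sketched with ordinary tabular tools. The following is a minimal illustration only, assuming hypothetical district and state summary files with columns named grade and pct_proficient; it is not part of the district’s actual plan.

    import pandas as pd

    # Hypothetical summary files; the file names and the column names
    # (grade, pct_proficient) are assumptions made for illustration only.
    district = pd.read_csv("district_math_summary.csv")  # one row per grade
    state = pd.read_csv("state_math_summary.csv")        # statewide averages per grade

    # "How do our students perform by grade level?" and
    # "How does our students' performance compare with the state's?"
    comparison = district.merge(state, on="grade", suffixes=("_district", "_state"))
    comparison["gap"] = (comparison["pct_proficient_district"]
                         - comparison["pct_proficient_state"])

    print(comparison.sort_values("grade"))

A simple table like this, sorted by grade, would give schools a quick view of where they sit relative to the state before any deeper analysis.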

They develop questions that ask about trends in the data, comparisons with other similar schools’ performance, relative strengths and weaknesses according to content strand data, and evidence from new programs and initiatives. “Okay, I think we can be pretty happy with what we’ve got so far,” says Ms. Sullivan. “I suggest that we move on to think about the questions that we would want schools to ask about the interim or benchmark assessment data.” The rest of the group agrees.

Mr. Casey introduces the idea of looking at the state data and interim data in relation to one another: “It seems to me that we would want the schools to be examining whether the same content areas arise in both the state and interim data. For example, we could see if the weaknesses that show up on the interim data are the same as the ones shown by the state data. What do you all think?” The others nod in agreement with Mr. Casey.

Ms. Zacharius suggests, “Why don’t we start by identifying the questions within the major foci categories for them to focus on and then, when we’ve done that, we can go back and think about points where we’d want them to compare the state results and the interim results.” The group proceeds with Ms. Zacharius’ suggestion and eventually comes up with questions like “Are the interim test results consistent with the results of the state test?” and “How are students performing on the interim tests in those areas identified as weak on the state tests?” to show schools how they can use these two sources of evidence.
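One way to operationalize the comparison the group settles on, checking whether the content strands that look weak on the state test also look weak on the interim test, is sketched below. The data layout (one row per content strand with an average percent-correct score) and the 70 percent cutoff are assumptions made purely for illustration.

    import pandas as pd

    # Assumed layout: one row per content strand, with columns
    # content_strand and pct_correct (these names are illustrative).
    state_strands = pd.read_csv("state_by_strand.csv")
    interim_strands = pd.read_csv("interim_by_strand.csv")

    THRESHOLD = 70  # illustrative cutoff for flagging a strand as a relative weakness

    merged = state_strands.merge(interim_strands, on="content_strand",
                                 suffixes=("_state", "_interim"))
    merged["weak_on_state"] = merged["pct_correct_state"] < THRESHOLD
    merged["weak_on_interim"] = merged["pct_correct_interim"] < THRESHOLD

    # "Are the interim test results consistent with the results of the state test?"
    agree = (merged["weak_on_state"] == merged["weak_on_interim"]).sum()
    print(f"{agree} of {len(merged)} strands tell the same story on both assessments.")

    # "How are students performing on the interim tests in those areas
    # identified as weak on the state tests?"
    weak_on_both = merged.loc[merged["weak_on_state"] & merged["weak_on_interim"]]
    print(weak_on_both[["content_strand", "pct_correct_interim"]])

Whether a fixed threshold, a percentile rank, or a comparison with the district average is the right way to flag a weakness is exactly the kind of decision the leadership team would need to make when it defines the questions for schools.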

At the end of the meeting Mr. Ramirez comments: “I’m a little worried that we don’t have the capacity to display the results of our questions in a format that will be easy for the schools to understand. I’m going to go back to my team with these questions and see what we have and what we can come up with.” “Yes,” says Ms. Sullivan, “it’s going to be really important for the schools to have clear visualizations of the data – I really appreciate that you are going to do this. But there is one more thing that I suggest this group needs to do before you talk to your team: I think we need to help schools interpret the data. I’d like to propose that at our next meeting we focus on some interpretive questions within each of our categories. Maybe you could think of a few, email them to me, and I’ll have them ready for the meeting so that we are not starting from scratch. Thank you, everyone.”
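Mr. Ramirez’s concern about presenting results in a format schools can easily understand can often be met with very simple charts. The sketch below, again using hypothetical file and column names, turns the strand-level comparison above into a grouped bar chart that could be dropped into a school-facing report.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Assumed input: one row per content strand with columns
    # content_strand, pct_correct_state, pct_correct_interim.
    data = pd.read_csv("strand_comparison.csv")

    ax = data.plot(x="content_strand",
                   y=["pct_correct_state", "pct_correct_interim"],
                   kind="bar", rot=30, figsize=(8, 4),
                   title="Mathematics performance by content strand")
    ax.set_ylabel("Average percent correct")
    ax.legend(["State test", "Interim test"])
    plt.tight_layout()
    plt.savefig("strand_comparison.png")  # a chart schools can read at a glance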

 

Professional Development Resources


Leedy, P. D., & Ormrod, J. E. (2004). Practical research: Planning and design (8th ed.). Upper Saddle River, NJ: Prentice Hall. ISBN 978-0131108950.

Summary:

This book provides a broad overview of basic research methodology and takes a hands-on approach to understanding the processes necessary for conducting quality research and generating meaningful results. Topics covered include: an overview of the scientific method and research tools; methods for identifying the research problem and conducting a literature review; research design planning and proposal writing; qualitative, historical, and descriptive research approaches; experimental and causal-comparative designs; statistical techniques for quantitative data analysis; and writing and publishing the research report.

How To Use:

State personnel could use this resource as a foundation for professional development, breaking it into modules based on the topic descriptions above. District-level personnel could use this book to strengthen their knowledge and understanding of research methodology, including helping them identify data available from the district and individual schools and use those data to develop research questions.