21st Century Assessments in Jewish Day Schools
There is no accountability system forcing day schools to use assessment tests. But that does not mean there are no compelling reasons to do so. The truth is, assessment done right can help us answer many of the pressing questions facing our families, community members, staff and faculty. For example:
Parent: How is my child doing?
Teacher: What can I do to better support this child’s learning?
Principal: Are our students ready for the next phase of their educational journey?
Board member: How does our school compare to other schools?
Head of school: How can we market the academic advantages of attending our specialized programs and encourage donors to give?
Director of Jewish studies: How do we support a rigorous Jewish studies program and integrate Jewish concepts throughout the general studies program?
Director of admissions: What should I tell prospective parents about how students perform here?
Student: Have I learned this year?
Before we attempt to answer these important questions for our stakeholders, we need to ensure that the assessments we choose are appropriate for our community and demonstrate the kinds of learning that we value in education. Linda Darling-Hammond, emeritus professor at the Stanford Graduate School of Education, says, “What tests measure matters because what is on the tests tends to drive instruction.” That is, you get what you test.
Measuring 21st Century Skills
Many educational leaders still picture assessment as the typical fill-in-the-bubble test, but the testing industry has finally started to catch up with decades of research supporting students’ 21st century learning. With the new expectations set forth in national accountability systems, and with updated standards, most testing publishers are undergoing an overhaul and a change in mindset.
Most Jewish day schools have the freedom to choose assessments that align with what we value in education. Most of us agree that we would like assessments that not only answer some of our stakeholders’ questions above, but also give students the opportunity to demonstrate reasoning, individual personalized growth and design thinking. Do such assessments exist? Yes.
Measuring Reasoning to Inform Instruction
When students use reasoning, we often expect them to cite evidence. Thanks in large part to the cost reductions brought about by technology and to advances within the measurement community, innovative items that measure reasoning have now appeared on most assessments. They are often referred to as either “forced choice” or “open ended.” For example, one way the industry has turned a multiple-choice recall question into a “forced choice” question that measures higher-level thinking is to include justification or reasoning in the answer choices (Figure 1).
At Yavneh Day School, for example, our 4th through 8th grade students participated in the new Writing Assessment Program (WrAP) offered by the Educational Records Bureau (ERB). Our students were required to read 1-3 short articles and answer both forced choice and open-ended questions, both of which required citing evidence and demonstrating their reasoning process (Figures 1 and 2).
Figure 1: Forced Choice Question Measuring Reasoning
Figure 2: Open-Ended Question Measuring Reasoning
In mathematics, reasoning is at the heart of numerical proficiency. This is why, at Yavneh, we offer the Middle School Math Reasoning Inventory (MSMRI) to every 4th and 5th grade student to see if they have the prerequisite numerical reasoning to jump into middle school math. When we listen to students explain their thinking, we can hear the strategies and understandings they use to compute mentally and make estimates. For example, with the open-ended question “What is 99 plus 17?” we can listen to students’ reasoning process and record their explanations. We can see whether students apply reasoning appropriate to the numbers at hand, like “Added 100 + 17, and then subtracted 1,” or whether they use less appropriate reasoning, such as counting up by 1s, often using their fingers. Both students may get the answer correct, but the one with the more sophisticated reasoning has a better chance of succeeding in higher-level mathematics courses.
Figure 3: Students’ Explanations Recorded on the Math Reasoning Inventory
This example is among a series of questions in the Middle School Math Reasoning Inventory, designed by Marilyn Burns and colleagues at Math Solutions (www.mathsolutions.com), and offered for free online as part of a grant from the Bill and Melinda Gates Foundation (www.mathreasoninginventory.com). The tool includes assessments in three domains—whole numbers, fractions, and decimals. The benefit of this assessment is that the teacher/interviewer can hear firsthand students’ reasoning as they solve a multitude of math problems, and can focus on those areas in the next phases of instruction. A disadvantage is the time-intensive nature of this personalized assessment: it can take up to 90 minutes to test one student across the three mathematical domains.
Personalizing Assessments to Measure Individual Growth
One way to overcome the time-intensive nature of personalized assessment is to use technology to pose questions to multiple students at the same time. At Yavneh, we also administer the Children’s Progress Academic Assessment (CPAA), originally designed at Columbia and MIT and offered by the Northwest Evaluation Association (NWEA), to students in the primary grades (PK-3rd). In this 20-minute online assessment, students engage with a computerized instructor, answering questions and getting feedback on both literacy and mathematics skills. The assessment is adaptive (i.e., the questions get harder as students answer correctly), and it provides hints when students answer incorrectly. For example, as shown in Figure 4, students may be asked to spell “crown” by clicking on individual letters. If they spell it incorrectly, a follow-up question appears that scaffolds the information (Figure 4, right), asking them to choose the correct spelling from among common distractors. By providing immediate feedback, as well as a second chance with some hints, this assessment attempts to mimic the student-teacher interaction and embed instruction throughout the experience.
Figure 4: Scaffolding in Action—A Test That Teaches (https://mapnebraska.wikispaces.com/file/view/CPAA_Children%27s_Progress_one-sheet_June12%5B1%5D.pdf)
One of the most impressive advantages of this assessment, in addition to its short duration and built-in instruction, is its detailed narrative reports for parents and teachers, stating explicitly what students got correct (with or without a hint), along with at-home and in-class activities that support student learning. In addition, this assessment suggests student groupings to help teachers differentiate instruction. We offer the CPAA up to three times a year in order to measure growth and provide feedback within and across school years.
For students in grades three through eight, we administer NWEA’s Measures of Academic Progress (MAP), an online adaptive assessment whose results can be interpreted on a “growth chart.” Often touted as one of the premier innovators in measurement and assessment, NWEA has used its technological prowess to produce technology-enhanced items that span the Depth of Knowledge levels, a framework that categorizes tasks by the complexity of thinking required to complete them successfully (Recall and Reproduction, Skills and Concepts, Strategic Thinking, Extended Thinking).
Using NWEA’s easy-to-read reports, students can see their progress and set goals for themselves within and across years. In addition, the report provides a percentile rank, offering a comparison to peers within and across schools; it can also serve as an eligibility factor for national gifted and talented programs.
Interactive Scenario-Based Tasks to Measure Design Thinking
The assessment industry continues to develop new advances with greater educational benefits. This year Yavneh will have the opportunity to participate in the new innovative items offered on the National Assessment of Educational Progress (NAEP). In the new NAEP science interactive tasks (Figure 5), students have 20 minutes to “predict the effect of the freeze/thaw cycle on a concrete sidewalk.” Students first investigate what happens to the volume of water when it freezes. Then they use the results to predict and test what will happen when water freezes in the cracks of a concrete sidewalk.
Figure 5: Sample NAEP Science Interactive Task (http://www.nationsreportcard.gov/science_2009/ict_tasks.asp).
The rapid changes in education brought about by contemporary technology and the drive toward “21st century education” challenge educators to develop new tools for assessment. With the growing availability of cheaper technology, many schools have begun to focus on STEM (or STEAM), design thinking, engineering, innovation, and the like. How do we measure the ways in which our students are interconnecting ideas, making meaning and designing new prototypes? NAEP has developed a tool to measure the way students apply their knowledge in Technology, Engineering and Literacy (TEL) (www.youtube.com/watch?v=uexguF1674k). In this 60-minute interactive scenario-based task, students face the challenge of fixing a broken well in a remote village. They play the role of an engineer and must get at the root of the problem to help the villagers fix their well. They conduct research to get background on wells, talk to people in the village about the well, and identify the cause of the problem in order to fix it. As we embark on our first administration of NAEP at Yavneh, we will have the advantage of getting feedback on how well our new approach to design thinking and STEAM learning can be measured.
Answering our Stakeholders
For individual student reports that are shared with parents, students and teachers, we use the standard reports provided by NWEA, CPAA and MSMRI. These reports are relatively easy to read and interpret. They can help us answer our stakeholders in the following way (names and data changed):
Parent: Maya is well above grade level. She is at the 95th percentile for reading and the 80th percentile for math, so she is eligible to apply for the Center for Talented Youth program, offered by Johns Hopkins University. (from the NWEA MAP)
Teacher: Ava identified the sentence with the correct capitalization of the first word and names of people, but only after a hint. Perhaps, I can reinforce this concept in my work with her and the other students who struggled with this same concept. (from CPAA)
Teacher: It seems that Eli should work on understanding how ½ could be used as a benchmark number when reasoning with fractions. (from the MSMRI)
Student: I grew 15 points since fall on the math assessment. But I could still use some more practice in measurement and geometry.
As we embark on analyzing aggregate reports at the school and grade level, it is important to have someone who understands statistics. In addition to having some statisticians on staff at Yavneh, we have also had the benefit of working with Hanover Research, which helps K-12 organizations make data-driven decisions to improve student learning. With an outside lens, and 100+ researchers on staff, they help us think about our strategic plan goals and analyze the raw data from our student test scores. With their 10,000-foot view, they can offer the following answers to our stakeholders:
Principal: 100% of our students are at or above grade level, demonstrating that they are ready to take on the challenges of school, no matter where they go. 35% of the students qualified for the Gifted and Talented programs offered by Johns Hopkins University.
Board member: The histograms show, for example, how many students in our 5th grade achieved at or above the 90th percentile on the test in language usage compared to their peers. These are national comparisons with samples randomly drawn from 5.1 million students across 13,000 schools, public and private.
Head of school: We can market the advantages of attending our school by demonstrating that we value individual student growth; our assessments are adaptive, offering an individualized experience to meet the needs of the students. In addition, we can market the design thinking/STEAM approach, with our willingness to try new assessments like the ones offered by NAEP.
Director of Jewish studies: Our test scores from the Writing Assessment Program (WrAP) by ERB suggest that we need to work as a cross-disciplinary team to develop students’ ability to cite evidence as they respond to prompts in argument, informative and narrative genres.
Director of admissions: 85% of our students in 4th-5th grade score at or above the 90th percentile nationwide in mathematics. This is good information to share with my prospective families. (from NWEA MAP)
As you respond to your stakeholders and take stock of the system of measurement in your school, remember how important it is to determine which assessments make sense for your community. Your assessments must align with your philosophy of learning and move your school toward the vision you have set.
We are fortunate to have options, and as we explore them, we must continue to be on the lookout for new and innovative assessments that follow the trends of learning in our school community. For years there has been a perceived lack of alignment between what we value in education and the tests publishers have produced for consumer use. The new innovations in assessment show how far test publishers have come in supporting what we believe is meaningful learning. In fact, the assessments themselves have become opportunities for real instruction. This is a blessing.
Diana Wilmot PhD is the chief academic officer/principal of Yavneh Day School in Los Gatos, California; she is the former director of research, evaluation and assessment for the Palo Alto Unified School District, and the current president for the California Education Research Association (CERA). email@example.com