Thursday, 31 January 2013

Worksheets-Friend or Foe?


I use worksheets. There. I said it.

It seems like worksheets have gotten a bad name. They must appear to parents and educators as a soulless, uninspired way to teach. Any idiot can throw a piece of paper at a student; that doesn't take training. And what does that say about our students? Are we just training them to be obedient drones, quietly filling in bubbles without any sass? Where is the critical thinking?


Some of these things are not worth the paper they're printed on. With computer tutorials and iPads, wastes of paper like the example above are becoming a thing of the past. Already, my three-year-old daughter can trace all the letters of the alphabet and write her name - a precursor to writing.

I don't think this means that worksheets are archaic, antiquated approaches, but they're certainly being phased out in favor of digital approaches. Take the example below:


For my math class, I used this challenge activity for a math lab investigation. Working in pairs, students modelled perfect squares and used this worksheet to document their investigation by writing equations, then shared their findings with the whole class afterwards. Of course they could do it on an iPad, or on a piece of butcher paper, but in this case, I wanted each student to have a record to refer back to. Perhaps this is more inspired than doing 20 "drill for skill" type worksheets, but it's in the same ballpark, isn't it?

Worksheets are not the lesson. I think when parents see a worksheet, they assume the whole period was spent doing them. I usually spend no more than one third to one half of a class period on a worksheet, using it to assess "as" learning. We usually start by debriefing homework, then a math chat or lab, followed by some independent practice. The expectation is that students have some time for meaningful practice, and that they shouldn't feel compelled to finish all of it. After 30 minutes of practice, we move on to other things.

Worksheets are the entry point of scaffolding. A lot of worksheets look like pretty entry-level recall and knowledge, and usually they are. But I would argue that for math, some of those foundations are necessary in order to gain deeper understanding down the line. Proponents of inquiry-based/connected math approaches would say that students should start with the deeper understanding first, and then they'll have a better grasp of the surface-level questions. Either way, most teachers I've worked with prefer a mix of both. I like the mix of questions below:



Worksheets can be differentiated for readiness. Our math program has tiered resources which I think are great. More advanced students can choose more challenging material and struggling students can choose to do more entry level assignments. My teaching partner identifies "anchor" problems from the traditional textbook which he uses to challenge different groups of students. If you're thinking of subscribing to a math program, see if such resources exist.

With a little twist, worksheets become graphic organizers. Graphic organizers seem to be coveted by teachers, and they're a distant relative of worksheets. Strange, huh? I think proponents of graphic organizers would argue that they're more creative and allow students to construct their own meaning. In the case below, I transformed a supplemental worksheet into a graphic organizer for our science lesson "States of Matter".



There is much variation in the quality and use of worksheets. We must be careful about dismissing them completely and replacing them with video tutorials and iPad apps just because that seems "cutting edge". Students do respond well to these digital interfaces, but there is still an absence of evidence that they are clearly superior or lead to an overwhelming increase in student achievement.





Monday, 21 January 2013

Entry Interviews for Math Readiness


I've written a bit lately on the use of exit interviews to assess whether learning objectives have been met by the end of a lesson. Such assessments can signal a lack of understanding, or progress towards it. A number of teachers who favor a flipped classroom model (myself included) might consider using entry interviews paired with math stations.

Carol Ann Tomlinson, the guru of differentiation, notes that differentiation can apply not merely to a learning product, but to readiness for a learning activity. Using a flipped classroom model, teachers can quickly collect information through tech tools like Google Forms, so when students walk in the door a teacher knows exactly how ready each student is for certain station activities and can suggest activities appropriate to their background knowledge. Questions all teachers ask themselves at the beginning of a lesson include:
  1. Which students might need more time to review before independent practice?
  2. Can accelerated students compact out of doing some review activities?
  3. How can I create an individualized learning experience for all my students?

If you use an entry interview, make sure it is high quality and tied to curricular objectives.

Question #3 courtesy of Holt, Rinehart and Winston, "Holt Mathematics Course 3"
Quick analytical tools allow educators to assess which students may need further review and which are ready for certain stations. For example, students who struggle with the entry interview may benefit from watching videos, re-reading sections from the text, or personal reflection. Students who miss a mere 1-2 items may do smaller amounts of the above, but then move on to modelling activities with the teacher, or independent practice. Students who demonstrate understanding of all three items may start with independent practice, then move towards other activities such as content creation or application problems.
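The routing above can be sketched in a few lines of Python. This is just my own illustration, assuming a three-item entry interview collected through a Google Form; the station names, the helper function, and the sample students are all hypothetical, not a prescription.

```python
# Hypothetical sketch: route students to stations based on how many
# of the three entry-interview items they answered correctly.

def suggest_station(correct: int, total: int = 3) -> str:
    """Map an entry-interview score to a suggested starting station."""
    if correct == total:
        return "independent practice, then content creation / application"
    elif correct >= total - 2:  # missed a mere 1-2 items
        return "brief review, then modelling with the teacher"
    else:
        return "videos, re-reading the text, or personal reflection"

# e.g. scores pulled from the Form's response spreadsheet
scores = {"Suzy": 3, "Marco": 2, "Lee": 0}
for name, correct in scores.items():
    print(f"{name}: {suggest_station(correct)}")
```

Even a rough rule like this, run against a Google Forms response sheet before class, means station suggestions are ready the moment students walk in the door.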


Entry interviews may seem daunting to prepare, but if you have good resources and good classroom management, they can give all students the opportunity to meet the curriculum at a level for which they are ready and ensure a more individualized learning experience.


Sunday, 20 January 2013

Grade Level Referencing with MAP Test Scores

I recently wrote a post about how to look at MAP test data through a process called "Stop Highlighting" wherein individuals and departments could look at RIT scores and categorize students into groups that were:

a.) Exceeding expectations
b.) At or above grade level expectations
c.) Making progress to, or just under grade level expectations
d.) A concern

A number of districts publish cut scores in this manner, but you should not "borrow" a district's RIT scale norms, as they are referenced against that district's specific population.


How to Apply Grade Level Referencing
Another way to look at MAP test data is to determine who is at grade level, who is one or two grades above grade level, and who is one or two grades below. Teams can do this with a method called "Grade Level Referencing". Applying grade level referencing is quite easy, although you must be aware of how your school compares to RIT scale norms. NWEA publishes RIT scale norms that can be referenced through their website.

It works like this: after printing out your RIT scores, identify the mean in your subject by grade. For example, if your average is 219 for grade 7 and you teach grade 7, students just above this mean of 219 are just above grade level and making progress towards grade 8 means. The next step is to identify the means in grades 8, 9, 6 and 5 to determine who in your class is a grade or two above or below grade level, so you can provide enrichment or remediation/intervention activities. I decided to use our own mean math scores as reference norms, since our school's math scores are above NWEA's published RIT norms and I wanted to set a high bar for academic achievement. Our averages were:

Fall Mean Math RIT scores by grade level
Grade 9-253.3
Grade 8-243.3
Grade 7-239.9
Grade 6-237.8
Grade 5-227.7
Grade 4-213

From this, simply draw lines on your spreadsheet to see how students in your class compare to the means of the grades above or below yours:
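If you prefer to automate the line-drawing, the referencing step can be sketched in Python using our fall mean math RIT scores from the table above. The function is my own illustration, not part of any NWEA tool; scores below the grade 4 mean simply bottom out at grade 4.

```python
# A sketch of grade level referencing: given a student's RIT score and
# actual grade, estimate how many grades above or below level they sit.
# Means are our school's fall math averages from the table above.

FALL_MATH_MEANS = {4: 213.0, 5: 227.7, 6: 237.8, 7: 239.9, 8: 243.3, 9: 253.3}

def grade_level_offset(rit: float, actual_grade: int) -> int:
    """0 = at grade level, +1 = one grade above, -1 = one below, etc."""
    referenced = min(FALL_MATH_MEANS)  # floor: the lowest normed grade
    for grade in sorted(FALL_MATH_MEANS):
        if rit >= FALL_MATH_MEANS[grade]:
            referenced = grade  # highest grade whose mean is reached
    return referenced - actual_grade

# e.g. a grade 7 student scoring 245 reaches the grade 8 mean (243.3)
print(grade_level_offset(245, 7))  # 1, i.e. one grade above level
```

Run over a whole class export, this reproduces the breakdown described below the spreadsheet without any manual line-drawing.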






From this, I was able to determine the number of students who were at each grade level. For my breakdown:

2 Grades Above Grade Level-2 Students
1 Grade Above Grade Level-5 Students
At Grade Level-5 Students
1 Grade Below Grade Level-2 Students
2 or More Grades Below Grade Level-9 Students


The bottom 9 students should be targeted for closer monitoring, intervention, and consideration for IEPs.

The Most Important Norm to Disclose 
Be very careful in deciding whether to use your school's or published RIT scale norms when identifying students as at or below grade level. For example, if you use your school's mean scores for grade level referencing, it is important to say that you are using that benchmark. So if your fifth grader is reading at the fifth grade average of XYZ International School, be sure to make a point of saying "Suzy is reading at the fifth grade norm of XYZ International School".

The problem arises when your school is, on average, performing below worldwide RIT scale norms. If your 5th graders are reading at a 4th grade level as indicated by RIT scale norms, but at a 5th grade level based on your school's mean scores, be sure to address this discrepancy. There is nothing more confusing than getting differing views on this data, so be sure to indicate whether students are one or two grade levels above or below expected outcomes as referenced either by RIT scale norms or by your school's mean scores. It makes sense for a school, led by its administration, to be unified in this decision.

When I indicated that 9 students were 2 or more grades below grade level, this was based on grade levels as indicated by our own averages, not published norms. Those students are in fact well above the norms published by NWEA, so be tactful in how you present that data to parents. If parents hear that their children are two grades below grade level, it invites a calamity. Compared to published norms, their standing was much higher.

Friday, 18 January 2013

Google Spreadsheets in the Science Classroom


My sixth grade science class has really responded well to Google Apps. This is our second year in a 1:1 laptop program and I feel things are really humming along. We've used a variety of apps this year already, but have really focused on using Google Spreadsheets for our unit on chemistry. The physical sciences, as you might recall, are intensely laced with data and measurement, and it's important not merely to generate data, but to share and discuss our findings to look for consistencies, inconsistencies, patterns, and outliers. I have a very specific benchmark related to this practice:

  • Communicate scientific procedures, data, and explanations to enable the replication of results. Use computer technology to assist in communicating these results. Critical review is important in the analysis of these results.
    Level: Important

Google Spreadsheets and other 2.0 interfaces can allow students to compare and contrast their work.






For the last week, students have been investigating properties of materials, namely density. We've been focusing on the question: "Does the density of a substance change if it's divided into pieces?" Of course it doesn't: although each piece has a smaller mass than the mother piece, it also has proportionally less volume. This makes density independent of the amount of a substance. I'm astounded by the number of students who think ice floats because it's lighter than water. Many adults, too.
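A few lines of Python make the point concrete. The masses and volumes below are invented for illustration; the claim is only that the mass-to-volume ratio stays fixed no matter how the piece is divided.

```python
# A quick arithmetic check of the idea above: splitting a block of
# modelling clay changes its mass and volume, but not its density.

whole = {"mass_g": 100.0, "volume_cm3": 62.5}    # density 1.6 g/cm^3

# Break it into two unequal pieces; each keeps the same proportion.
piece_a = {"mass_g": 30.0, "volume_cm3": 18.75}
piece_b = {"mass_g": 70.0, "volume_cm3": 43.75}

for name, obj in [("whole", whole), ("piece A", piece_a), ("piece B", piece_b)]:
    density = obj["mass_g"] / obj["volume_cm3"]
    print(f"{name}: {density:.2f} g/cm^3")  # all three print 1.60
```

This is exactly the relationship the lab data should reveal, give or take measurement error.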

Comparing mass, volume and density through the sum of an object's parts

In the lab above, students in small groups investigated this relationship by measuring the mass and volume (through displacement) of irregularly shaped objects. Each group of 4-5 students copied a shared spreadsheet and then added its own data.

After they shared their documents with me, I was able to keep each group's data tabbed in my web browser so I could peruse their work. Any inconsistencies in data collection were immediately addressed, as were more subtle issues like labeling with the proper units. The old school way of doing this was on butcher paper, but in math and statistics, multiple measurements are required to ensure reliability and replication of results.

Web 2.0 interfaces allow teachers to toggle between tabs of groups to provide feedback and monitor progress

At the end of the lesson, we were able to assemble at the front of class and address the data collection. It's crucial that students have correct data; otherwise they fail to connect concepts and build their understanding of a topic. Often, an inquiry-based approach will yield findings that are incorrect, and students don't "see" the concept they were meant to discover.

Groups addressed inconsistencies in the data.
At the end of the lesson, each group presented its findings. In the case above, we brainstormed why, when a piece of modelling clay was broken into two parts, one piece had less mass than the other but, strangely, more volume. We were also curious why a crayon showed different densities, and saw that the sum of the pieces' volumes did not add up to that of the whole.

Spreadsheets also allow whole class collaboration for comparison

In short, if you teach math or science, consider applying Google Spreadsheets to data collection and analysis. It has some great applications, notably the ability to:
  • Give timely feedback
  • Graph data
  • Provide a digital record of investigations for digital portfolios




Wednesday, 16 January 2013

Mine MAP Test Data with Stop Highlighting



I recently attended a workshop entitled "Using Assessment Results" which had some great ways of disaggregating and looking at data. We have established a data committee PLC at our school, and we are now taking steps to use the data we receive from MAP to improve our program at the team and department level. Our school is lucky: we are using this data to improve teaching, not as a means to fire teachers, which seems to be the norm in many schools across the US.

MAP test data has been hard to analyze for us. We're all aware of how RIT scores show comparison and growth, but other than putting them on a scale, we haven't had any definitive ways of grouping students together for interventions.

Determining Cut Scores
One of the key workshop points was to determine "cut scores" as an institution. This should be done first and foremost and these cut scores give a basis for comparison. Some examples of what cut scores look like are as follows:
  • Who is exceeding expectations, meeting expectations, just below expectations, or a concern?
  • Who is at grade level, just below grade level, or grades above or below grade level?
Many schools and states have a board of education that determines cut scores, but those are based on the average for that particular state or district. In short, one should not adopt another state's cut scores because they look good; they've been determined for a particular population which may be different from yours. The question is: "How do we determine cut scores?"

Identifying Stanines on A Bell Distribution Curve
One method for determining cut scores is using the stanines on a bell distribution curve as seen below:

Although this could be an area for debate within your school, one could use stanines and percentile rank to determine cut scores. For example:
  • Students in stanine #8 and #9 are above average and higher. The sum of their percentages is 11% so students in the top 11% of the test taking population are exceeding expectations. This translates to "Students in the 89th percentile and up are exceeding expectations"
  • Students in stanine #7 are just above the middle. Stanine #6 could be included as it is near the middle, but if you're trying to develop a high-quality academic program, consider using stanine #7 alone for students who are meeting grade level expectations. This translates to: "Students in the 77th to 88th percentile are proficient, at or just above grade level norms"
  • Students in stanines #5 and #6 are close to, but just under, grade level norms. The sum of their percentages is 37%, so this translates to "Students in the 41st to the 76th percentile are just under grade level expectations but are making progress towards them"
  • Finally, students in stanines #1-#4 are well below average. The sum of their percentages is 40%, so "Students in the bottom 40% are a concern"

Now we have established cut scores for our population. They are:

89th-99th percentile: Exceeding Expectations
77th-88th percentile: Meeting Grade Level Expectations
41st-76th percentile: Just Below Grade Level Expectations
1st-40th percentile: Concern
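If you keep your RIT percentile ranks in a spreadsheet export, the categories can be applied automatically. Here is a minimal Python sketch of the cut scores above; the function name is my own.

```python
# The cut scores above, expressed as a quick lookup: given a student's
# percentile rank, return the category.

def categorize(percentile: int) -> str:
    if percentile >= 89:
        return "Exceeding Expectations"
    elif percentile >= 77:
        return "Meeting Grade Level Expectations"
    elif percentile >= 41:
        return "Just Below Grade Level Expectations"
    else:
        return "Concern"

# e.g. a quick tally for a class of percentile ranks
for p in [95, 80, 55, 12]:
    print(p, "->", categorize(p))
```

One nice property of writing the boundaries down as code is that the whole department is forced to agree on them exactly, edge cases included.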

Enter Stop Highlighting
Now that our cut scores have been established, we can determine who in the class falls into each category and look at distributions. Stop highlighting involves marking in green those who are exceeding expectations, in yellow those who are meeting or just below grade level expectations, and in red those who are a concern.






The black line through the middle is the 77th percentile. From this we can see that there is one student who is a concern, five who exceeded expectations, and a majority of the class progressing towards, or proficient in, grade level expectations.

We Have Cut Scores-Now What?
Instead of teaching to the middle, consider offering learning activities at various levels of complexity. For instance, the green students need to be challenged, while the red students need a lot of entry-level help and should be monitored more frequently in class. Rather than "pigeonholing" students into activities, which is likely to make them feel stereotyped, consider giving them choices for meeting the curriculum. These challenge-by-choice activities can be packaged at different levels of difficulty (for example, through Bloom's taxonomy), which leads to instruction targeted to ability.








Tuesday, 15 January 2013

Build Presentations with SlideRocket

SlideRocket is a nice presentation tool that is free, supports embeddable media, and allows shows to be embedded in blogs, wikis, or Moodle-type organizational platforms.

What really sets SlideRocket apart from other presentation tools is the ability to collaborate and build presentations with others, much in the way of a Google presentation.


Saturday, 12 January 2013

Assessment Of, For and As Learning


I'm in the middle of a workshop on data and assessment, and one of the most interesting things was the concept of assessment Of, For, and As learning, as shown by the pyramid below:

(Image courtesy of Jennifer Sparrow)
I really like this concept and will flesh out what each level means.

Assessment "Of" Learning
The top of the pyramid. This level shows the highest level of understanding and would be quantified through summative assessments such as final tests, research papers, or similar performance tasks. Usually, students do best on this level as they've had much practice leading up to it. Also, students take this kind of assessment the least often, maybe once at the end of a unit. The important point about assessment "of" learning is that it goes in the gradebook with no conferencing with the student.

Assessment "For" Learning
This sort of assessment happens more frequently, but the purpose is not merely to give a grade; it is to conference with students to help support learning. A typical example might be a quiz, but the quiz cannot merely be entered in the grade book; it has to be passed back to students with descriptive feedback and conferences to help them learn. The assessment is "for" learning. As these assessments are still early in the learning process, consider weighting them very low, or not counting them at all. I usually give one of these per lesson, either as an entry interview or an exit interview.

Assessment "As" Learning
These are the little bits of feedback that we give to students throughout the course of a lesson. In my case, having students use individual whiteboards and web 2.0 tools gives me immediate feedback about whether a student has learned something. These are assessed through self-assessment or other metacognitive strategies like "fist to five" or "thumbs up". I have about 5-6 of these in any lesson, and they are not graded nor entered into the gradebook.

Related Posts
Formative Assessment Ideas in the Math Classroom
Scaffolding Math Benchmarks with Bloom's Taxonomy
Using Flubaroo with Google Forms
Turning Student Failure into Information
Making the Most from Quizzes





Friday, 11 January 2013

Is There a Correlation between Homework Completion and Learning?

I've just started a weekend workshop on using data and assessment to drive student learning, facilitated by Jennifer Sparrow, the assessment and data instructor at Singapore American School. As a member of our school's data committee, I got the chance to meet with Mrs. Sparrow this afternoon, before the conference starts tomorrow, to discuss how our school could use data. I left thinking about homework and its assessment as a means of measuring student learning, then went home and spent Friday evening collecting and analyzing data from my first semester's math classes. Can you say "nerd"?

Does Doing Homework Lead to Learning?
Alfie Kohn argues no. In his book "The Homework Myth", he cites a number of case studies showing that homework is no indicator of student learning. I have met a number of educators who have accepted his conclusions as fact, but what Mr. Kohn fails to address is the variability and breadth of what homework is. For example, he does not discuss the forms homework can take, nor the wide range of ways homework is assessed. For his argument, homework is ultimately dreary, a chore, and unrelated to learning outcomes.

I kindly disagree. Some teachers have insightful discussions about a homework assignment. Some merely give it a cursory glance and a rubber stamp. Some grade it, some don't. Such variability puts the onus on the teacher to assess whether learning outcomes are supported by its usage, and not merely to use it as a means of grade inflation or deflation. It's an entry-level formative assessment. Homework, if assigned, must be meaningful, interesting, and open-ended.

Getting back to data, I took it upon myself to answer the question: "Is there a correlation between homework completion and learning?" For this case study, I took a spreadsheet and graphed the number of missing or late assignments against each student's semester grade as a percentage at the end of first semester. I use a flipped classroom model with very open-ended responses; if assignments were not completed, students were asked to resubmit.
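For those curious about the mechanics, the correlation itself can be computed in a few lines of Python on two columns pulled from the gradebook spreadsheet. This is a minimal sketch with invented numbers, not my actual class data.

```python
# Pearson correlation between late/missing assignment counts and
# semester grades. Sample numbers are hypothetical.
import math

late = [0, 0, 1, 2, 4, 4, 7, 12]       # late or missing assignments
grade = [92, 90, 88, 85, 91, 78, 74, 72]  # semester grade, percent

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"r = {pearson_r(late, grade):.2f}")  # negative: more lates, lower grades
```

A negative r is consistent with the trend described below, though outliers (like the student with 4 lates and a 91%) keep it from being anywhere near -1.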


Analysis of C Block
For my "C" block class above, there is a modest but noticeable negative correlation. Generally, the more assignments a student did not complete on time, the lower their semester grade. Students who completed all their assignments, with "0" marked as late, had an average semester grade of 90.2%, or an "A-". There were some outliers, though. One student with 4 late assignments achieved a 91%, while the student with the lowest grade in the class, at 70%, had only four incomplete assignments (as pictured above). At the far end of the range, a student with 12 late or incomplete assignments achieved 72%.

 
Analysis of H Block 
In the analysis of H block, there is also a negative correlation with noticeable outliers. One student had 12 late assignments and achieved an 85%, and another with 4 missing assignments sits well above the line of best fit. Still, group averages continue to support the hypothesis that more independent practice raises mean understanding.

Sources of Error
This measure does not take into account the work that students did in class following the flipped learning model. Generally, students who had completed their homework (notes, front-loading of concepts ahead of time) came better prepared and used in-class practice time more efficiently. Students who had not required more modelling, which left less time for independent practice. The data points also represent individuals or small groups, which may affect sample size. There is also subjectivity regarding what it means for homework to be "well done". As the classroom teacher, if assignments were well written with explanations, definitions, and conjectures supported by examples, I reported the assignment as "complete".

Conclusions
In Malcolm Gladwell's book "Outliers", he argues that more practice increases mastery, using a benchmark of 10,000 hours of practice at a task. The data above supports this claim: when students put in more time outside of class to review or prepare for concepts, they generally show higher levels of understanding on summative assessments. Another author, Paul Tough, says in his book "How Children Succeed" that non-academic skills such as time management, grit, and the ability to ask for and seek help are the precursors to academic achievement. This suggests another area of inquiry: "Do students not understand because they don't do the homework, or because they are unmotivated?"
It's no wonder that students who are detached, uninterested, and apathetic generally show lower levels of understanding, although this is a statistical average, not a foregone conclusion.

We as educators must find ways to connect to these students. We must find ways to give them the practice and pedagogical approaches that adapt to their learning style. Every educator has a nagging feeling that when students don't put forth effort, they are setting themselves up for failure, so the data came as no surprise to me.

What's interesting about my data workshop has been this: everyone can collect data. Often there is a mountain of statistics that fall through the hands of teaching professionals every year. We as professionals must take the time to wade through this data of what we are doing and ask ourselves: "Is this an effective high-quality assessment that leads to learning?" We need not be published professionals, but rather humble teachers that are looking to improve our professional trade.

By the way, if you're reading this Mr. Kohn, I'm still a big fan. 

Bibliography
Alfie Kohn "The Homework Myth"
Paul Tough "How Children Succeed"
Malcolm Gladwell "Outliers"

Related Posts
Formative Assessments in the Math Classroom
Turning Student Failure into Information
Using Data to Support Instructional Practice
Making the Most from Quizzes
Using Homework Effectively
Making Flipped Lessons Meaningful





Wednesday, 9 January 2013

App Review: Haiku Deck

Haiku Deck is a remarkably simple app for making presentations in an iPad classroom. Best of all, after you type titles or text on slides, it automatically suggests Creative Commons images you might use. The images are great and make for a very entertaining format.


Haiku Deck is the best application for creating presentations on iPad

Tuesday, 8 January 2013

Edistorm


It's been two years since I used "Edistorm" so I thought I'd revisit it to see if it's improved with any new features. Turns out it has. Big time.


Edistorm is one of the many web 2.0 tools that support blended learning environments, namely by enabling online brainstorming with others. What makes Edistorm so great is the huge list of graphic organizers it allows users to upload and work on. After creating a 'storm', simply grab the URL and send it. The only downside is that users have to register.

In the case here, I wanted to activate prior knowledge with a KWL chart for my chemistry class. Rather than use the old school butcher paper, Edistorm allowed me to create an online forum similar to Google Docs/Drawings, or Wallwisher. It also allows the easy integration of video and images.





With any 2.0 tool, there is a signature for every single student. It's not just the brave or smart kids who raise their hands; everyone has a voice. With tools such as these, there is a record of each student's individual inquiries, and teachers know if they've been met.

Monday, 7 January 2013

Facing the Future


My first post of 2013. It's been nice to take some time off from writing, but I'm excited to be back. All my classes gave an astounding round of applause yesterday and today, so I think they're jazzed too. It's a good sign when middle schoolers are excited about being in the math and science classroom. Keep it up, I tell myself.

Despite being a geeky teacher with an emphasis on curriculum and assessment, I am a big advocate for service learning and environmental awareness. I recently came across "Facing the Future" which has a number of resources directed towards sustainability projects that can be implemented in local communities.


I've used a website called CIESE for collaborative projects and have done the international boiling point project for three years in a row now. Such projects allow students to share and communicate data and procedures with others which complements math and science with a little humanity.

We need to infuse learning with such values. If not, why study them at all?