
2016-2017 Year In Review: PreCalculus

Overview

This was the first time I taught a true PreCalculus course in six years. At my current school, the course serves the following functions:

  • Preparing tenth grade students for IB mathematics SL or HL in their 11th grade year. Many of these students were strong 9th grade students who were not yet eligible to enter the IB program, since it must begin in grade eleven.
  • Giving students the skills they need to be successful in Advanced Placement Calculus in their junior or senior year.
  • Providing students interested in taking the SAT II in mathematics some guidance in the topics that are covered by that exam.

For some students, this is also the final mathematics course taken in high school. I decided to design the course to extend knowledge from Algebra 2, continue developing problem-solving skills, move a bit further into abstraction of mathematical ideas, and provide a baseline for further work in mathematics. I cut some topics that I used to think were essential to the course but did not properly serve the many different pathways that students can follow in our school. Like Algebra 2, this course can be the Swiss Army knife course that "covers" a lot so that students have been exposed to topics before they really need to learn them in higher-level math courses. I have always thought that approach waters down much of the content and the potential for a course like this. What tools are going to be the most useful to the broadest group of students for developing their fluency, understanding, and communication of mathematical ideas? I designed my course to answer that question.

I also found that this course tended to be the one in which I experimented the most with pedagogy, class structure, new tools, and assessment.

The learning standards I used for the course can be found here:
PreCalculus 2016-2017 Learning Standards

What worked:

  • I did some assessments using Numbas, Google Forms, and the Moodle built-in quizzes to aid with grading and question generation. I liked the concept, but some of the execution is still rough around the edges. None of these did exactly what I was looking for, though I think they could each be hacked into a form that does. I might be too much of a perfectionist to ever be happy here.
  • For the trigonometry units, I offered computer programming challenges that were associated with each learning standard. Some students chose to use their spreadsheet or Python skills to write small programs to solve these challenges. It was not a large number of students, but those that decided to take these on reported that they liked the opportunity to think differently about what they were learning.
  • I also explicitly taught using spreadsheet functions to develop students' computational thinking skills. This required designing some problems that were just too tedious to solve by hand. This was fun.
  • Differentiation in this course was a challenge, but I was happy with some of the systems I used to manage it. As I have found is common since moving abroad, many students are computationally well developed, but not conceptually so. Students would learn tricks in after school academy that they would try to use in my course, often in inappropriate situations. I found a nice balance between problems that started low on the ladder of abstraction, and those that worked higher. All homework assignments for the course in Semester 2 were divided into Level 1, Level 2, and Level 3 questions so that students could decide what would be most useful for them.
  • I did some self-paced lessons with students in groups using a range of resources, from Khan Academy to OpenStax. Students reported that they generally liked when I structured class this way, though there were requests for more direct instruction among some of the students, as I described in my previous post about the survey results.
  • There was really no time pressure in this course after my decision to cut out vectors, polar equations, linear systems, and some other assorted topics that don't show up again except in Mathematics HL or Calculus BC, where it's worth seeing them again anyway. Some students also gave very positive feedback on the final unit on probability. I took my time with things there. Some of this was out of necessity when I was out sick for ten days, but there were many times when I thought about stepping up the challenge faster than I really needed to.
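One of the trigonometry programming challenges could be sketched like this. This is a hypothetical example in the Python some students used, not one of the actual class challenges: numerically checking a double angle identity across many angles, which connects the identity on paper to computed values.

```python
import math

def check_double_angle(samples=1000):
    """Numerically test the identity sin(2x) = 2*sin(x)*cos(x)
    at many angles in [-pi, pi); return the largest absolute
    difference found (should be near zero, up to rounding)."""
    worst = 0.0
    for i in range(samples):
        x = -math.pi + 2 * math.pi * i / samples
        diff = abs(math.sin(2 * x) - 2 * math.sin(x) * math.cos(x))
        worst = max(worst, diff)
    return worst

print(check_double_angle())  # a tiny number on the order of float rounding error
```

A challenge like this also invites the follow-up question of why the differences aren't exactly zero, which opens a nice conversation about floating point arithmetic.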

What needs work:

  • I wrote about how I did the conic sections unit with non-numerical grades - just comments in the grade book. The decision to do that was based on a number of factors. The downside was that when I switched back to numerical grades for the final unit, the grade calculation for the entire quarter was based only on those grades, and not on the conic sections unit at all. The conic sections unit did appear on the final exam, but for the most part, there wasn't any other consequence for students that did not reassess on the unit.
  • Students did not generally like when I used Trello. They liked the concept of breaking up lessons into pieces and tasks. They did not like the forced timelines and the extra step of the virtual Trello board for keeping track of things. This Medium article makes me wonder about doing this in an analog form if I try it in the future. I also could make an effort to instill the spirit of Scrum early on so that it's less novel, and more the way things are in my classroom.
  • I should have done a lot more assessment at the beginning of units to see what students knew and didn't know. It sounds like the student experiences in the different Algebra 2 courses leading to PreCalculus were quite different, which led to a range of success levels throughout. Actually, I should probably be doing this more often for all my courses.
  • Students could create their own small reference sheet for every exam. I did this because I didn't want students memorizing things like double angle identities and formulas for series. The reason this needs work is that some students are still too reliant on having this resource available to ever reach any level of procedural fluency. I know what students need to be fluent later on in the more advanced courses, sure, but I am not convinced that memorization is the way to get there. Timed drills don't seem to do it either. This challenge is compounded by the fact that not all students need that level of fluency for future courses, so what role does memorization play here? I have struggled with this in every year of my fourteen-year career, and I don't think it's getting resolved anytime soon. This is especially the case when Daniel Willingham, who generally makes great points that I agree with, writes articles like this one.

Conclusion

This course was fun on many levels. I like being there to push students to think more abstractly as they form the foundation of skills that will lead to success in higher levels of mathematics. I also like crafting exercises and explorations that engage and equip the students that are finishing their mathematics careers. We should be able to meet the needs of both groups in one classroom at this stage.

I frequently reminded myself of the big picture by reading through Jonathan Claydon's posts on his own Precalc course development over the years. If you haven't checked him out, you should. It's also entertaining to pester him about a resource he posted a few years ago and hear him explain how much things have changed since then.

2015-2016 Year in Review: IB Mathematics SL/HL

This was my second year working in the IB program for mathematics. For those that don't know, this is a two year program, culminating in an exam at the end of year two. The content of the standard level (SL) and higher level (HL) courses cross algebra, functions, trigonometry, vectors, calculus, statistics, and probability. The HL course goes more into depth in all of these topics, and includes an option that is assessed on a third, one-hour exam paper after the first two parts of the exam.

An individualized mathematics exploration serves as an internally assessed component of the final grade. This began with two blocks at the end of year one so that students could work on it over the summer. Students then had four class blocks spread out over the first month of school of year two to work and ask questions related to the exploration during class.

I taught year one again, as well as my first attempt at year two. As I have written about previously, this was run as a combined block of both SL and HL students together, with two out of every five blocks as HL focused classes.

What worked:

  • I was able to streamline the year one course to better meet the needs of the students. Most of my ability to do this came from knowing the scope of the entire course. Certain topics didn't need the emphasis I had given them in my first attempt last year. It also helped that the students were much better aware of the demands of higher level vs. standard level from day one.
  • I did a lot more work using IB questions both in class and on assessments. I've become more experienced with the style and expectations of the questions and was better able to speak to questions about those from students.
  • The two HL-focused blocks per rotation in this combined class were really useful from the beginning of year one, and continued to be an important tool for year two. I don't know how I would have done this otherwise.
  • I spent more time in HL on induction than last year, both on sums and series and on divisibility rules, and the extra practice seemed to stick better than it did last year in year one.
  • For students that were self-starters, my internal assessment (IA) schedule worked well. The official draft submitted for feedback was turned in before a break so that I had time to go through them. Seeing students' writing was quite instructive in knowing what they did and did not understand.
  • I made time for open ended, "what-if" situations that mathematics could be used to analyze and predict. I usually have a lot of this in my courses anyway, but I did a number of activities in year one specifically to hint at the exploration and what it was all about. I'm confident that students finished the year having seen me model this process, and having gone through mini explorations themselves.
  • After student feedback in the HL course, I gave many more HL level questions for practice throughout the year. There was a major disconnect between the textbook level questions and what students saw on the HL assessments, which were usually composed of past exam questions. As a result, students became more comfortable floundering for a bit before mapping a path to a solution to each problem.
  • For year two, the exam review was nothing more than extended class time for students to work past papers. I did some curation of question collections around specific topics as students requested, but nearly every student had different needs. The best way to address this was to float between students as needed rather than do a review of individual topics from start to finish.
  • The SL students in year two learned modeling and regression over the Chinese new year break. This worked really well.
  • Students that had marginally more experience doing probability and statistics in previous courses (AP stats in particular) rocked the conditional probability, normal distribution, and distribution characteristics. This applied even to students who were exposed to that material, but did poorly on it in those courses. This is definitely a nod to the idea that earlier exposure (not mastery) of some concepts is useful later on.
  • Furthermore, regarding distributions, my handwaving to students about finding area under the curve using the calculator didn't seem to hurt the approach later on when we did integration by hand.
  • This is no surprise, but being self sufficient and persevering through difficult mathematics needs to be a requirement for being in HL mathematics. Students that are sharp, but refuse to put in the effort, will be stuck in the 1-3 score range throughout. A level of algebraic and conceptual fluency is assumed for this course, and struggling with those aspects in year one is a sign of bigger issues later on. Many of the students I advised this way in year one were happier and more successful throughout the second year.
  • I successfully had students smiling at the slick way the parts of Section B questions on the IB exam are all connected to each other.

What needs work:

    For year one:

  • I leaned far too heavily on computer-based tools (Geogebra, Desmos) rather than the graphing calculator during class. The ease of doing it this way left students unsure of how to use the graphing calculator for the same tasks (finding intersections and solutions numerically) during an assessment. I definitely need to emphasize the calculator as a diagnostic tool before really digging into a problem, to know whether an integer or algebraic solution is possible.
  • Understanding the IB rounding rules needs to be something we discuss throughout. I did more of this in year one on my second attempt, but it still didn't seem to be enough.
    For year two:

  • Writing about mathematics needs to be part of the courses leading up to IB. Students liked the mini explorations (mentioned above) but really hated the writing part. I'm sure some of this is because students haven't caught the writing bug. Writing is one of those things that improves by doing more of it with feedback though, so I need to do much more of this in the future.
  • I hate to say it, but the engagement grade of the IA isn't big enough to compel me to encourage students to do work that mattered to them. This element of the exploration was what made many students struggle to find a topic within their interests. I think engagement needs to be broadened in my presentation of the IA to something bigger: find something that compels you to puzzle (and then un-puzzle) yourself. A topic with a low floor and high ceiling serves much more effectively than picking an area of interest and then finding the math within it. Sounds a lot like the arguments against real world math, no?
  • I taught the Calculus option topics of the HL course interspersed with the core material, and this may have been a mistake. Part of my reason for doing this was that the topic seemed to most easily fit in the context of a combined SL/HL situation. Some of the option topics like continuity and differentiability I taught alongside the definition of the derivative, which is in the core content for both SL and HL. The reason I regret this decision is that the HL students didn't know which topics were part of the option, which appear only on a third exam section, Paper 3. Studying was consequently difficult.
  • If for no other reason, the reason not to do a combined SL/HL course is that neither HL nor SL students get the time they deserve. There is much more potential for great explorations and inquiry in SL, and much more depth that is required for success in HL. There is too much in that course to be able to do both courses justice and meet the needs of the students. That said, I would have gone to three HL classes per two-week rotation for the second semester, rather than the two that I used throughout year one.
  • The HL students in year two were assigned series convergence tests. The option book we used (Haese and Harris) had some great development of these topics, and full worked solutions in the back. This ended up being a miserable failure due to the difficulty of the content and the challenge of pushing second semester seniors to work independently during a vacation. We made up some of this through a weekend session, but I don't like to depend on out-of-school instruction time to get through material.

Overall, I think the SL course is a very reasonable exercise in developing mathematical thinking over two years. The HL course is an exercise in speed and fluency. Even highly motivated students of mathematics might be more satisfied with the SL course if they are not driven to meet the demands of HL. I also think that HL students must enjoy being puzzled and should be prepared to use tricks from their preceding years of mathematics education outside of being taught to do so by teachers.

2014-2015 Year In Review: IB Physics SL/HL

This was my first year teaching IB Physics. The class consisted of a small group of SL students with one HL, and we met every other day according to the block schedule. I completed the first year of the sequence with the following topics, listed in order:

    Semester 1

  1. Unit 1 - Experimental Design, Uncertainty, Vectors (Topic 1)
  2. Unit 2 - Kinematics & Projectile Motion (Topic 2.1)
  3. Unit 3 - Newton's Laws (Topic 2.2)
  4. Unit 4 - Work, Energy, and Momentum (Topic 2.3)
    Semester 2

  5. Unit 5 - Circular Motion, Gravitation, and Orbits (Topics 6.1, 6.2)
  6. Unit 6 - Waves and Oscillation (Topic 4, AHL Topic 9, AHL Engineering Option Topics B3.1, B3.2)
  7. Unit 7 - Thermal Physics (Topic 3, Engineering Option Topic B2)
  8. Unit 8 - Fluid Dynamics (Engineering Option Topic B3)

For the second semester of the course, there was at least one block every two weeks that was devoted to the HL student and the HL only content - the SL students worked on practice problems or other work they had for their IB classes during this time. Units 7 and 8 were concurrent, so the HL student had to work on both the thermodynamics content and the fluid dynamics content together. This was similar to how I did it previously while teaching the AP physics B curriculum.

One other fact that is relevant - none of my students are native speakers of English. More on this later.

What worked:

  • The growth students made during the year was significant. I saw students improve in their problem solving skills and their organization in the process of doing textbook style assessment problems.
  • I learned to be honest about the IB expectations for answering questions on assessments. In the beginning, I tried to shield students from questions that combined conceptual understanding, computation, and complex language, often choosing two of the three for any one question that I either wrote or selected from a bank. My motivation was to isolate assessment of the physics content from assessment of the language. I wanted answers to these separate questions:
    1. Does the student understand how the relevant physics applies here?
    2. Does the student understand how to apply the formulas from the reference table to calculate what the question is asking for?
    3. Can the student process the text of the question into a physics context?
    4. Can the student effectively communicate an answer to the question?

    On official IB assessment items, however, this granularity doesn't exist. The students need to be able to do all of these to earn the points. When I saw a significant difference between how my students did on my assessments versus those from IB, I knew I needed to change. Acknowledging that and making the change was a good move.

  • Concise chunks of direct instruction followed by longer problem solving sessions during class worked extremely well. The students made sense of the concepts and thought about them more while they were working on problems, than when I was giving them new information or guiding them through it. That time spent stating the definitions was crucial. The students did not have a strong intuition for the concepts, and while I did student centered conceptual development of formulas and concepts whenever possible, these just didn't end up being effective. It is very possible this is due to my own inexperience with the IB expectations, and my conversations with other teachers helped a lot to refine my balance of interactivity with an IB pace.
  • Students looked forward to performing lab experiments. I was really happy with the way this group of students got into finding relationships between variables in different situations. Part of this was the strong influence I've developed with the Modeling Instruction curriculum. As always, students love collecting data and getting their hands dirty because it's much more interesting than solving problems.

What needs work:

  • My careless use of the reference sheet in teaching directly caused students to rely excessively upon it. I wrote about this previously, so check that post out for more information. In short: students used the reference sheet as a list of recipes, as if they provided a straight line path to solutions to questions. It should be used as a toolbox, a reminder of what the relationships between variables are for various physics concepts. I changed this partly at the end of the year, asking students to describe to me what they wanted to look for on the sheet. If their answer was 'an equation', I interrogated further, or pointed out that they weren't about to use the reference sheet for what it was designed to do. If their answer was that they couldn't remember if pressure was directly or inversely related to temperature, I asked them what equation describes that relationship, and they were usually able to tell me.
    Both of these are examples of how the reference sheet does more harm than good in my class. I fault myself here, not the IB, to be clear.
  • The language expectations of IB out of the gate are more of an obstacle than I expected at the beginning of the year. I previously wrote about my analysis of the language on IB physics exams. There tends to be a lot of verbal description in questions. Normally innocuous words get in the way of students simultaneously learning English and understanding assessment questions, and this makes all the difference. These questions are noticeably more complex in their language use than that used on AP exams, though the physics content is not, in my opinion, more difficult. This is beyond physics vocabulary and question command terms, which students handled well.
  • Learning physics in the absence of others doesn't work for most students. Even the stronger students made missteps working while alone that could have been avoided by being with other students. I modified my class to involve a lot more time working problems during class and pushed students to at least start the assigned homework problems while I was around to make the time outside of class more productive. Students typically can figure out math homework with the various resources available online, but this just isn't the case for physics at this point. It is difficult for students to get good at physics without asking questions, getting help, and seeing the work of other students as it's generated, and this was a major obstacle this semester.
  • Automaticity in physics (or any subject) shouldn't be the goal, but experience with concepts should be. My students really didn't get enough practice solving problems so that they could recognize one situation versus another. I don't want students to memorize the conditions for energy being conserved, because a memorized fact doesn't mean anything. I do want them to recognize a situation in which energy is conserved, however. I gave them a number of situations, some involving conservation, others not, and hoped to have them see the differences and, over time, develop an awareness of what makes the two situations different. This didn't happen, partly because of the previous item about working physics problems alone, but also because they were too wrapped up in the mechanics of solving individual problems to do the big picture thinking required for that intuition. Group discussions help on this, but this process is ultimately one that will happen on the individual level due to the nature of intuition. This will take some time to figure out.
  • Students hated the formal process of writing up any parts of the labs they performed. This was in spite of what I already said about the students' positive desire to do experiments. The expressions of terror on the students' faces when I told them what I wanted them to do with the experiment broke my heart. I required them to do a write-up of just one of the criteria for the internal assessment, just so they could familiarize themselves with the expectations when we get to this next year. A big part of this fear is again related to the language issue. Another part of it is just inexperience with the reality of writing about the scientific process. This is another tough nut to crack.

There was limited interest in the rising junior class for physics, so we won't be offering year one to the new class. This means that the only physics class I will have this year will be with the same group of students moving on to the second year of IB physics. One thing I will change for physics is a set of memorization standards, as mentioned in my post about standards based grading this year. Students struggled to remember quick concepts, which made problem solving more difficult (e.g. "What is the relationship between kinetic energy and speed?"), so I'll be holding students responsible for that in a more concrete way.
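The kinetic energy question above is a good illustration of the kind of quick relationship I mean. A short numerical check (a hypothetical illustration, not a class assignment) makes the point that KE grows with the square of speed, so doubling the speed quadruples the kinetic energy:

```python
def kinetic_energy(m, v):
    """Kinetic energy KE = (1/2) * m * v^2, with mass m in kg
    and speed v in m/s, giving energy in joules."""
    return 0.5 * m * v ** 2

# Doubling the speed (3 m/s -> 6 m/s) quadruples the kinetic energy:
print(kinetic_energy(2.0, 3.0))  # 9.0 J
print(kinetic_energy(2.0, 6.0))  # 36.0 J
```

This is exactly the relationship I want students to produce quickly from memory rather than reconstruct from the reference sheet mid-problem.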

The issues that need work here are big ones, so I'll need some more time to think about what else I will do to address them.

2014-2015 Year In Review: Web Programming

This was the first year I've taught a computer programming course. The class was a broad survey of programming in HTML5. This was the overall sequence:

    Semester 1:

  1. Hacking a webpage from the browser console
  2. HTML tags, structures, and organization
  3. CSS - page design, classes and IDs, along with using Bootstrap
  4. Javascript - variables, structures, conditionals
  5. jQuery - manipulating the page using events and selectors, animations
    Semester 2:

  6. Mongo Databases & Queries
  7. HTML Templates using Blaze
  8. Writing Meteor Apps
  9. Meteor, Media, and the HTML5 Canvas
  10. HTML5 Games using Phaser

I have posted the files and projects I used with students at this repository on Github:
https://github.com/emwdx/webprogramming2014-2015

What did I do?

The class generally began with a warm-up activity that involved students analyzing, writing, or running code that I gave them. This always led into what we were going to explore in a given day's lesson. I would show the class a few lines of code and ask them to predict what would happen. This might be a visual request - what will this look like? Will there be an error? Was this error intentional or not?

This was all done while students had their laptops closed and notebooks open. I usually designed a series of tasks for students to complete using some code snippets that were saved in the directory on the school server.
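A warm-up prompt of the kind described above might look something like this. This is a hypothetical example, not one of the actual snippets from the course, but it shows the flavor: students predict the output before anyone runs the code.

```javascript
// Warm-up: predict each output before running this in the console.
var x = "5";
var y = 3;

console.log(x + y);  // "53" - the + operator concatenates when one operand is a string
console.log(x - y);  // 2    - the - operator coerces the string "5" to a number
```

Snippets like this reliably generate disagreement in predictions, which is exactly the hook needed before explaining type coercion in Javascript.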

We didn't use any textbook, so I knew I needed to create a reference that students could refer back to whenever they got stuck. For each class, I took notes either in Microsoft OneNote or the SMART Notebook software and saved the notes in PDF form. I don't know if students used this or not.

I had three types of assessment:

  • Mini-projects, which were fairly straightforward and had unique answers. These were assessed by general completion (4.5/5) with a (5/5) given for effort to creatively make the code their own. I was fairly loose on that final half point, giving it whenever I saw students clearly engaged by the task. You can see an example of this assignment here.
  • Projects, which had clear guidelines and requirements to meet the minimum grade that ranged from 80 - 90 percent, and then a series of additional tasks that raised the grade up to 100%. The additional task points weren't awarded until the basic requirements were met, though that didn't stop students from trying (see below).
  • Blog posts, which were required for every class. The expectations required a summary of what we learned for each class, along with code snippets, questions about what we learned, or confusion about something they wanted to go over in the next class. As the students became more skilled, this turned into questions that started as "How can we.../Is it possible to...".

Once every two weeks, and usually on a Friday, I had a 20% day during which students could work on anything they wanted related to web programming. Some students worked on previous projects to resubmit them, others experimented with code from the previous class or week. In a couple of cases, students worked on their own pet projects, which included a chat application, a mathematical formula parser, and applying visual design principles to the pages we created in class. I often made suggestions for what students could do at the beginning of the class block, including providing some basic code they could use to experiment.

What worked:

  • Based on feedback from the end of the year, students enjoyed the course. They had a lot of positive comments on the way I ran the class and the fact that they always got help when they needed it.
  • Forcing students to write down code helped with retention and building a useful reference for later. I didn't require them to write down long blocks of code, but for things like HTML tags and Javascript syntax, I wanted some written reinforcement of the things that were important. I was pretty strict about deciding when I wanted students to write down code (to activate that part of the brain) and when I wanted them to copy it directly into a text editor and run it.
  • Forcing students to recreate code (and not copy and paste) led to higher activity and interaction between students while learning to code. I saved some code as images, not text, which required students to go line by line and see what they were doing. This was a decision I made early on because it helped me when learning to code. That extra step of needing to look at the code while I was typing it in led me to take a closer look at what it said, and I wanted to give a similar opportunity to my students.
  • The more open ended projects led to much richer questions and interaction between students. I really liked the range of responses I received when I gave open ended projects. Some students were exceptionally creative or went well beyond the requirements to make code that mattered to them.
  • Students were constantly helping each other with their code...when they eventually asked for this help. I was called over many times by students calling out the blanket statement "my code doesn't work" and then handing me their laptop, but over time they learned that I wasn't going to just fix their code for them. They became careful readers of each other's code once they finally took the step of asking someone for help, though this took some time.
  • I succeeded in having students do more than listen. I never talked for more than 15 minutes before students were actually writing and experimenting with code. This was exactly what I wanted.
  • 20% days were a big hit. Some students wanted this time as extra processing time to complete the mini projects from the rest of the week. Others liked being able to ask me how to do anything, or to find tutorials for HTML elements that they wanted to learn to use. I really liked how well this worked with this group of students and looked forward to it, and not just because it was a reduction in the planning required for class.
  • Videos offered an effective and preferred method for learning to write code in this class. I put together a number of screencasts in which I spoke about the code, and in some cases coded it live. Students were able to pause, copy code to the editor, and then run it pretty easily. Some zipped through it, others took longer, but this is nothing new. The time required to do this, as is always a consideration for me, was often more than I could afford. Luckily, there is plenty of material available already out there, so I was able to step back and give another voice and face a chance to teach my students.

What needs work:

  • The bonus elements for projects were the first things most students wanted to figure out. Many students did not put in the time to read and complete the basic requirements for projects, resulting in submitted projects that were sent right back as being incomplete. Some of this was a language issue, as there were many ESOL students in the class, but most of it was what we always encounter when working with adolescents: not reading the directions.
  • Students reused a lot of old (and unrelated) code. I emphasized writing short programs from scratch throughout the year, as my expectations were relatively modest. For many students, copying and pasting code was a crutch that caused many more problems than writing simple, clean code from the start would have. I get it - I copy and paste code myself - but I also know how to clean it up. They knew why not to do it (because they all tried it at some point), but some students continued doing it to the end. I need a better plan for helping students avoid this trap.
  • Many students did not pick up on error messages in the console that said precisely where the problem with the code was located. At times, I expected too much from students, because the console is a scary place. That said, I think I could do a better job of emphasizing how to find the line numbers referenced in these error messages, regardless of what the error message says.
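To make this concrete, here is a small illustration of the kind of error I mean. The snippet and file name are my own hypothetical example, and the exact message wording varies by browser, but the pattern is the same: the console names the error and points at a line number students can learn to read.

```javascript
// A one-character typo: "documnt" instead of "document".
// In a browser console this shows up as something like:
//   Uncaught ReferenceError: documnt is not defined    script.js:6
// where "script.js:6" is the line number worth teaching students to find.
try {
  documnt.title = "Hello";        // the line the console points to
} catch (e) {
  // Logging the error name and message mirrors what the console reports.
  console.log(e.name + ": " + e.message);
}
```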

I really enjoyed teaching this class, and not just because of the awesome group of students that took it. It helped me refine my knowledge and get better at some of the HTML, CSS, and Javascript coding skills that I had used, but often had to relearn every time I wanted to use them.

Feedback, as always, is welcome!

2014-2015 Year-In-Review: Standards Based Grading

This was my third year using standards based grading with my classes. I wrote last year and the year before about my implementation.

What did I do differently?

  • I had my WeinbergCloud implementation working from the beginning of the year, so it was part of the expectations I introduced on day one.
  • I also adjusted this system a bit to make it easier to link the reassessments and the content of the standards. There seemed to be too much uncertainty about what each standard represented, which translated into more confusion when signing up for reassessments than I wanted. Creating a list of standards and resources associated with each standard shrank this gap.
  • I did not limit the number of reassessments per day explicitly. I expected that students would not sign up for a ridiculous number given the limitations on their credits, which students earned by doing homework or coming to tutoring.
  • I included time within at least one class a week per student during which students could do reassessments without having to come in outside of class time.
  • Unit exams continued to be assessed purely on course standards, not points. Semester final exams were percentage based.
  • I scaled all of my standards levels from 1 - 5 to be from 6 - 10 to make it easier to communicate the levels to parents and be consistent with our school grading policy of not giving numerical grades below 50%. No student actually received lower grades due to my system of adding a base grade to each standard, but the process of explaining to students and parents that a 1 was really a 60% (5 for the base grade + 1 for the standard level) was clearly more complex than it needed to be.
  • For my combined IB HL/SL class, the HL students had standards that only they were responsible for learning, while also being responsible for the SL standards. More on this later.
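The scale shift described above (a base grade of 5 added to a 1-5 standard level, versus scoring standards on 6-10 directly) can be sketched as follows. This is a hypothetical illustration of the arithmetic, not my actual gradebook code.

```javascript
// Old scheme: standards scored 1-5, with a base grade of 5 added
// before converting to a percentage. Explaining this took work.
function oldReportedGrade(standardLevel) {   // standardLevel: 1-5
  return (5 + standardLevel) * 10;           // level 1 -> 60%
}

// New scheme: standards scored 6-10 directly, so the conversion
// to a percentage needs no explanation.
function newReportedGrade(standardLevel) {   // standardLevel: 6-10
  return standardLevel * 10;                 // level 6 -> 60%
}

console.log(oldReportedGrade(1), newReportedGrade(6)); // 60 60
```

The two schemes report identical percentages; only the explanation to students and parents gets simpler.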

What worked:

  • Students seemed to have a better understanding from the beginning of the year of what standards based grading and assessment was all about. I did a bit more deliberate instruction on the ideas behind it at the beginning of the year. I also had smaller classes than before, so I was better able to have individual conversations about signing up for reassessments and talking about the process.
  • A small proportion of students were fully sold on the idea of reassessment as a learning tool. Some students reassessed at least twice a week throughout the semester, and these students had strong performances on the cumulative final exams.
  • By the second unit exam, students were generally not leaving questions blank on assessments. They were trying their best to do some amount of work on each question.
  • As with last year, I gave more challenging questions to assess the range of student ability. Most of these involved multiple standards combined in one, more open-ended responses, or questions requiring explanation. Assessing at the higher levels of mastery became much more subjective, and students accepted this, though they occasionally advocated for themselves as to why they deserved to be marked higher. They generally felt that it was fair when arithmetic errors kept them in the 8/10 range.
  • Having students report their mastery level when signing up for a reassessment made it much easier for me to know what problem type or category to give them. It also made it easier to justify raising the mastery level after a successful reassessment without jumping straight to the top of the scale. A student at a 6 that answered a couple of questions correctly might move to an 8, whereas a student previously at an 8 would be given more challenging questions and a conversation explaining their understanding in order to move to a 10.
  • It was my priority to get assessments back within the same period, and I estimate that I was able to do this more than 95% of the time. Simple, short, and carefully designed assessments can reveal quite a bit about what students do/don't understand.

What needs work:

  • Similar to previous semesters, I had high participation of a small group of students, with far too many students choosing not to reassess until the very end of each semester. Some students did not initiate their own reassessments at all.
  • Students again hoarded their credits to the end of the semester. I flirted with the idea of adding an expiration date to credits to discourage holding on to credits for long periods of time, but time constraints kept me from implementing this.
  • As a consequence of credit-hoarding, students near the end of the semester signed up for absurd numbers of reassessments in a day - I believe the record was nine. I shared with students that a good rule of thumb for planning purposes is ten minutes per reassessment, so doing five reassessments before school isn't practical, but that didn't come across well. Students that couldn't do all of their reassessments in the morning simply pushed them to later in the day. This was a problem for me because I never knew whether students were going to show up at their scheduled time or just do everything after school. Canceling sign-ups after no-shows, however, fixed this problem pretty efficiently.
  • When a student would answer all questions correctly on an unannounced standards quiz, I generally assigned this a mastery level of 8 on a 6 - 10 scale. Students that had less than an 8 in this case usually had trouble with the same questions on a unit assessment or reassessment on the same standard later on. In other words, the students that had trouble initially learning a concept did not necessarily get the help they needed to make progress before the unit exam. This progress often happened after the exam, but this led to a lot of students falling behind pretty early on. I need to introduce interventions much earlier.

Under consideration for next year:

These are the ideas I am mulling over implementing before school gets started in a month, and I'd love to hear what you think.

  • Make credit expiration happen. This has been an issue for the year and a half of WeinbergCloud's existence. When I threatened to implement this in conversation with students, they immediately asked me not to because it would prevent them from putting off reassessments as they preferred to do. This included students that were doing the practice problems between classes anyway, so this wasn't just about losing the credits. A "why not just give a reassessment a try" nudge worked in face-to-face conversation with students that were hoarding credits, so forcing the process might be worth the effort. I understand that learning takes time, but many of the students putting off reassessment weren't actively reviewing the standards over time anyway. I'd rather force the feedback cycle through more iterations, since that is when students seem to learn the most.
  • Introduce submitting work into the process of reassessment. This could be electronic ("To complete your sign up, submit a scan/photo of the work you have done to prepare") or could just be shown before I give them a reassessment. This would reduce some of the sign-ups that happen only based on the mastery score rather than reviewing the concepts that come with it. Students earn credits by doing practice problems or coming to tutoring, and these let them sign up for reassessments - this won't change. To actually go the final step and take the reassessment, I need to see what students have done to prepare. In some cases (students that see me the day before, for example) I may waive this requirement.
  • Require X number of reassessments per two week cycle of the block schedule. This might be in lieu of the previous change, but I'm afraid this might encourage (rather than prevent) a rush of reassessments at the end of a two week period. On the other hand, if the goal is to increase opportunities for feedback, this might be more effective.
  • Make it possible for students to sign up for an appointment to go over (but not be assessed on) material for a given standard. Reassessments are great opportunities for feedback, but sometimes students want to come in just to go over material. I get emails from students asking for this, but it might be easier to include it within WeinbergCloud.
  • Introduce skills/definition standards for each unit. This would be a standard for each unit that covers basic recall of information. I'll discuss why I want these (particularly in physics) in more detail within a later post. The short story is that I want to specifically assess certain concepts that are fundamental to all of the standards of a unit with a single binary standard.
  • Classify standards mastery levels in terms of 'likelihood of success'. This is a lower priority, and when I tried to explain this to a colleague, she wasn't convinced it would be worth the effort. If you have a 10, it means you have a 95% or higher likelihood of answering anything I give you correctly. The probabilities might not scale linearly - a 9 might mean between 90% and 95%, an 8 between 75% and 90%, and so on. I don't know. The reason I want to do this is to justify giving a 10 to students that have demonstrated solid proficiency without requiring perfection, and to have a better reason for only raising a student from a 6 to an 8 after a couple of correct answers on a single reassessment.

    Right now the difference between an 8, 9, and 10 is defined (in order) by answering questions correctly on a single-standard quiz, on a comprehensive unit exam, and on stretch questions. A student that gets an 8 on a standards quiz before an exam might then answer related questions incorrectly on the multi-standard exam and remain an 8. If this student then takes a quiz on a single standard and answers its question correctly, does it make sense to raise their mastery level above 8? This is what I often do. I can also control for this by giving a more challenging question, but I'm not sure I need to.

    In short, something is fishy here, and I need to think it out more in order to properly communicate it to students. In my head, I understand what I want to communicate: "yes, you answered these questions correctly, but I'm still not convinced that you understand well enough to apply the concepts correctly next time." This is not the highest priority out of the ones I've mentioned here.
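As a sketch of the 'likelihood of success' idea, here is one hypothetical mapping from mastery level to an estimated probability. Only the values for 8 through 10 come from the ranges I described above; the values for 6 and 7 are placeholder guesses, and the function name is my own.

```javascript
// Hypothetical mapping from mastery level (6-10) to a rough lower-bound
// probability of answering a question on that standard correctly.
// Levels 8-10 follow the ranges described above; 6 and 7 are guesses.
const successLikelihood = {
  10: 0.95,  // 95% or higher
  9:  0.90,  // between 90% and 95%
  8:  0.75,  // between 75% and 90%
  7:  0.55,  // assumed value
  6:  0.35,  // assumed value
};

function likelihood(level) {
  return successLikelihood[level];
}

console.log(likelihood(10)); // 0.95
```

Note the deliberately nonlinear spacing: the jump from 6 to 8 represents a much bigger change in expected success than the jump from 9 to 10, which is part of what I would want the scale to communicate.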

As always, I appreciate your feedback. Thanks for reading!

Standards Based Grading, Year Two (Year-In-Review)

This was my second year using standards based grading with my classes. I wrote last year about how my first iteration went, and made some adjustments this year.

What did I do?

  • I continued using my 1-5 standard scale and scale rubric that I developed last year. This is also described in the post above.
  • As I wrote about in a previous post, I created an online reassessment organization tool that made it easier to have students sign up and organize their reassessments.
  • The new requirement for signing up for reassessments involved credits, which students earned through doing homework or seeing me for tutoring.
  • I included a number of projects that were assessed as project standards using the same 1-5 scale. The rubric for this scale was given to students along with the project description. Each project, like the regular course learning standards, could be resubmitted and reassessed after getting feedback and revising.

What worked:

  • My rate of reassessment was substantially better in the second semester. I tweeted out a graph of my reassessments over the course of the semester. [Graph: reassessment volume over the semester.] There was a huge rush at the end of the semester to reassess - that was nothing new - but the rate was significantly more consistent throughout, and the volume of reassessments was substantially higher. There were also fewer students than in the first semester that did not take advantage of reassessment opportunities. Certain students did make up a large proportion of the total set of reassessments, but the distribution was nowhere near as skewed as in the first semester.
  • Students took advantage of the project standards to revise and resubmit their work. I gave a living proof project that required students to make a video in which they went through a geometric proof and explained the steps. Many students responded to my feedback about mathematical correctness, quality of their video, and re-recorded their video to receive a higher grade.
  • Student attitude about SBG was positive at the end of the year. Students knew what they could do to improve their grade. While I did have blank questions on some unit assessments, students seemed more likely to attempt questions than in the past. This is purely a qualitative observation, so take it for what it is.

What needs work:

  • Students hoarded their reassessment credits. This is part of the reason the reassessment rush was so severe at the end of the semester. Students didn't want to use their credits until they were sure they were ready, which meant that many went unused: more than a quarter of the credits earned were never spent on reassessments. I don't know if this means I need to make credits expire, or that I need to be more aggressive in pursuing students to use the credits they earned. I'm wrestling a lot with this as I reflect this summer.
  • I need to improve the system for assessing during the class period. I had students sign up for reassessments knowing that the last 15 - 20 minutes of the class period would be available for it, but not many took advantage of this. Some preferred to do this before or after school, but some students couldn't reassess then because of transportation issues. I don't want the system to unfairly advantage those who live near the school.
  • I need to continue to improve my workflow for selecting and assigning reassessments. There is still some inefficiency in the time between seeing what students are assessing on and selecting a set of questions. I think part of this can be improved by asking students to report their current grade for a given standard when signing up. Some students want to demonstrate basic proficiency, while others are shooting for a 4 or 5, requiring questions that are a bit higher level. I also might combine my reassessment sign up web application and the quiz application so that I'm not switching between two browser windows in the process.
  • Students want to be able to sign up to meet with me to review a specific standard, not just be assessed on it. If students know specifically what they want to go over, and want some one-on-one time on it since they know that works well for them, I'm all for making that happen. This is an easy change to my current system.
  • Students should be able to provide feedback to me on how things are going for them. I want to create a simple system that lets students rate their comprehension on a scale of 1 - 5 for each class period. This lets students assess me and my teaching on a similar scale to what I use to assess them, and might yield good information to help me know how to plan for the next class.

I've had some great conversations with colleagues about the ways that standards based grading has changed my teaching for the better. I'm looking forward to continuing to refine my model next year. The hard part is deciding exactly what refinements to make. That's what summer reflection and conversations with other teachers is all about, so let's keep that going, folks.

2012-2013 Year In Review – Learning Standards

This is the second post reflecting on this past year and I what I did with my students.

My first post is located here. I wrote about this year being the first time I went with standards based grading. One of the most important aspects of this process was creating the learning standards that focused the work of each unit.

What did I do?

I set out to create learning standards for each unit of my courses: Geometry, Advanced Algebra (not my title - this was an Algebra 2 sans trig), Calculus, and Physics. While I wanted to do this for the entire semester at its start, I ended up working unit by unit due to time constraints. The content of my courses didn't change relative to previous years, though, so it was more a matter of deciding what themes existed in the content that could be distilled into standards. This involved combining some concepts into a single standard to avoid having too many. In some ways, this was a neat exercise in seeing that two separate concepts really weren't that different. For example, treating absolute value equations and inequalities as the same standard led to both a presentation and an assessment process that emphasized the common application of the absolute value definition to both situations.

What worked:

  • The most powerful payoff in creating the standards came at the end of the semester. Students were used to referring to the standards and knew that they were the first place to look for what they needed to study. Students would often ask for a review sheet for the entire semester. Having the standards document available made it easy to ask the students to find problems relating to each standard. This enabled them to then make their own review sheet and ask directed questions related to the standards they did not understand.
  • The standards focus on what students should be able to do. I tried to keep this focus so that students could simultaneously recognize the connection between the content (definitions, theorems, problem types) and what I would ask them to do with that content. My courses don't involve much recall of facts and instead focus on applying concepts in a number of different situations. The standards helped me show that I valued this application.
  • Writing problems and assessing students was always in the context of the standards. I could give big picture, open-ended problems that required a bit more synthesis on the part of students than before. I could require that students write, read, and look up information needed for a problem and be creative in their presentation as they felt was appropriate. My focus was on seeing how well their work presented and demonstrated proficiency on these standards. They got experience and feedback on the quality of their work (misspelled words in student videos, for example), but my focus was on their understanding.
  • The number of standards per unit was limited to 4-6 each...eventually. I quickly realized that seven was on the edge of being too many, but had trouble cutting them down in some cases. In particular, I had trouble doing this with the differentiation unit in Calculus. To keep that unit from counting more than the others, each of its standards was weighted 80%, a fact that turned out not to be very important to students.

What needs work:

  • The vocabulary of the standards needs to be more precise and clearly communicated. I tried (and didn't always succeed) to make it possible for a student to read a standard and understand what they had to be able to do. I realize now, looking back over them all, that I use certain words over and over again but have never specifically said what they mean. What does it mean to 'apply' a concept? What about 'relate' a definition? These explanations don't need to be in the standards themselves, but they should live somewhere and be explained in some way so students can better understand them.
  • Example problems and references for each standard would be helpful in communicating their content. I wrote about this in my last post. Students generally understood the standards, but wanted specific problems that they were sure related to a particular standard.
  • Some of the specific content needs to be adjusted. This was my first year being much more deliberate in following the Modeling Physics curriculum. I haven't, unfortunately, been able to attend a training workshop that would probably help me understand how to implement the curriculum more effectively. The unbalanced force unit was crammed in at the end of the first semester and worked through in a fairly superficial way. Not good, Weinberg.
  • Standards for non-content skills need to be worked into the scheme. I wanted to have some year- or semester-long skills standards. For example, unit 5 in Geometry included a standard (not listed in my document below) on creating and presenting a multimedia proof. This was to give students opportunities to learn to create a video in which they clearly communicate the steps and content of a geometric proof. They could create their video, submit it to me, and get feedback to make it better over time. I would also love to include some programming or computational-thinking standards that students can work on long term. These standards need to be communicated and cultivated over a long period of time; otherwise they will be just like the others in terms of the rush at the end of the semester. I'll think about these this summer.

You can see my standards in this Google document:
2012-2013 - Learning Standards

I'd love to hear your comments on these standards or on the post - comment away please!