
2016 - 2017 Year In Review: Standards Based Grading

Overview

I've used Standards Based Grading, or SBG, with most of my classes for the past five years. It transformed the way I think about planning, assessment, classroom activities...and pretty much everything else around my teaching practice. I have a difficult time imagining what would happen if I had to go back. I've written a lot about it this year - here are some of the posts:

Scaling up SBG for the New Year
Standards Based Grading and Leveling Up
The How and Why of Standards Based Grading @ Learning 2.0
Too Many Reassessments, Just in Time for Summer

As I wrote in that last post, I still wrestle with the details, but I'm fully invested in the philosophy. I'm glad my administrators support me in adapting it to work within the more traditional system. I've also had some great conversations with colleagues who are excited by the concept but wonder how to make it work in their courses.

Here's the rundown of how it went this year.

What worked:

  • Students really bought into the system. The most common responses on student surveys about what I needed to keep involved the grade being defined by standards and the reassessment system. I found students were often the system's best advocates when other teachers and parents had questions, which made communication much easier.
  • The system was the gateway to many very positive conversations with students around learning, improvement, and the role of feedback. Conversations were around understanding concepts and applying them, not asking for points. Many students would finish a reassessment and tell me that their grade should stay the same, but that they would keep trying. Other students would try to argue their way to a higher score, but using the vocabulary I use to define my standard descriptors (linked here). They understood that mistakes are informative, not punitive. Transplanting this understanding to students at my new school was a major success of the year.
  • I developed a better understanding of what I'm looking for at each level on my 5 - 10 scale. Part of this came from being at a new school and needing to articulate this to students, parents, and administrators. The SBG and Leveling up project (linked above) helped refine my definitions of what distinguishes a 9 from a 10, or a 6 from a 7.

What needs work:

  • I had way too many reassessments. Full stop. I wrote about this in my post Too Many Reassessments, Just in Time for Summer and am exhausted just thinking about doing it again. There are a couple of elements of this to unpack. One is that my credit system allows reassessments to occur more frequently than I believe deep learning can really take place. I'm thinking about locking students out of reassessing on a standard for a set period of time, at least when going for a score of 8 or above, where the goal is transfer of skills and flexibility of application. The other thing I am considering is limiting students to a single reassessment per week, or day, or some other interval. I have some time to decide on this, which is good, because both require a rewrite of my online signup tool, WeinbergCloud.
  • Long term retention was still not where it needs to be. I wrote about this already in my post about my IB Mathematics Year 1 course. As I have taught more and more in this system, I have believed ever more strongly that clear communication about what grades signify about a student matters. A lot. Moving from quarters to semester grades is one part of improving this, a change that my administrator team made for this coming year, but a lot of it still sits with me. I need to spiral, I need to reassess on old standards, and still hold students accountable for older material.
  • Communicating the role of semester exams was a major challenge for me this year. In a small school, I found it was easy to communicate with individual students and parents about the role of semester exams. I based much of my outreach on what I understood about these exams and the role of learning standards grades throughout the year. A standards based grade book breaks down the entire topic into bite-sized pieces, which makes it easier both to communicate strengths and weaknesses, and for students and teachers to decide on the best next step. Semester exams are opportunities to put all of these pieces together and assess a student's ability to decide which standards apply in a given problem. Another way of looking at it is a soccer practice versus soccer game mentality.

    Ultimately, I do want students to be successful across the breadth of the content on which a course is based. Semester exams serve as one way to measure that progress in the bigger picture of an entire course, rather than a unit. This also serves as a third scale on which to consider assessment in my course. Quizzes assess a standard, exams assess a unit of standards (with a few older standards thrown in), and semester exams assess mastery of a portion of the course. That different scale is why the 80% quarter grade, 20% exam grade proportions that I've followed for seven years are entirely reasonable.

    A student that aces all of the standards with a 100 but gets a 50 on the final ends up with a 90. This student receives the same semester grade as someone that has a 90 up until the final and gets a 90 on the final. I'm fine with this parity in grades. I would, however, have very different conversations with those two students heading into the next semester of mathematics in their plans.

    The main challenge I found was that students and parents often looked at that final exam grade in isolation from, not together with, the rest of the scores in the grade book. The parent of the first student (100, then 50) that asks me to explain that disparity is certainly justified in doing so. Where I fell short was communicating the reality that in a standards based system, grades usually drop after a semester exam. It's a fundamentally different brand of assessment.

    I'll also point out that the report card presented a semester of assessment in table form as quarter 1 grade, quarter 2 grade, exam grade, and then semester grade. This artificially presents the exam grade as more consequential to the final grade than it actually is. This isn't in my realm of influence, so I'll stop talking about it. The bottom line is that I need to do a better job of communicating these realities to everyone involved.
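As a thought experiment for the reassessment limits I floated above, here's a minimal Python sketch of a lockout rule. All names, thresholds, and intervals here are hypothetical illustrations; this is not how WeinbergCloud actually works today.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical parameters -- illustrative only, not WeinbergCloud's real rules.
LOCKOUT = timedelta(days=7)   # wait time before retrying the same standard
MAX_PER_WEEK = 1              # reassessments allowed per trailing week
TRANSFER_LEVEL = 7            # at 7+, a student is going for 8 or above,
                              # where the goal is transfer of skills

@dataclass
class Attempt:
    standard: str
    score: int
    when: datetime

@dataclass
class Student:
    attempts: list = field(default_factory=list)

def can_reassess(student, standard, now):
    """Return (allowed, reason) for a reassessment signup request."""
    same = [a for a in student.attempts if a.standard == standard]
    if same:
        last = max(same, key=lambda a: a.when)
        # Lock out repeat attempts on a standard at the transfer level.
        if last.score >= TRANSFER_LEVEL and now - last.when < LOCKOUT:
            return False, "wait before reassessing this standard again"
    # Cap total reassessments in the trailing week.
    recent = [a for a in student.attempts if now - a.when < timedelta(days=7)]
    if len(recent) >= MAX_PER_WEEK:
        return False, "weekly reassessment limit reached"
    return True, "signed up"
```

The per-standard lockout and the weekly cap are independent checks, which would let me tune them separately once I decide what intervals actually support deep learning.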

Conclusion

I'm glad to be starting another year soon and to continue to make this system do good things for students. Cycle forward.

2016 - 2017 Year In Review: IB Mathematics SL

Overview

This was my third time around teaching the first year of the IB mathematics SL sequence. It was different from my previous two iterations given that this was not an SL/HL combined class. This meant that I had more time available to do explorations, problem solving sessions, and in-class discussions of the internal assessment (also called the exploration). I had two sections of the class with fourteen and twenty students respectively.

I continued to use standards based grading for this course. You can find my standards (which define the curricular content for my year one course) at this link:

IB Mathematics SL Year 1 - Course Standards

What worked:

  • My model of splitting the 80 - 85 minute block into twenty-minute sub-blocks works well. I can plan what happens in those sub-blocks and try as hard as I can to keep students doing something for as much of that time as possible. The first block is a warm-up, some discussion, a check-in about homework or whatever, and then usually some quick instruction before the next block, which often involves an exploration activity. The third is a summary of the explorations or preceding activities along with example problems, and the fourth has me circulating and helping students work.
  • Buffer days, which I threw in as opportunities for students to work on problems, ask questions, and play catch up, were a big hit. I did little more on these days than give optional sets of problems and float around to groups of students. Whenever I tried to just go over something quick on these days, those lessons quickly expanded to fill more time than intended. It took a lot of discipline to instead address issues as they came up.
  • I successfully did three writing assignments in preparation for the internal assessment, which students will begin writing officially at the beginning of year two. Each one focused on a different one of the criteria, and was given at the end of a unit. Giving students opportunities to write, and get feedback on their writing, was useful both for planning purposes and for starting the conversation around bad habits now.

    I had rolling deadlines for these assignments, which students submitted as Google Docs. I would go through a set of submissions for a class, give feedback to those that made progress, and gentle reminders to those that hadn't. The final grade that went into PowerSchool was whatever grade students had earned by the end of the quarter.

    The principle I applied here (and one to which I have subscribed more fervently with each year of teaching) is that my most valuable currency in the classroom is feedback. Those that waited to get started in earnest with these didn't get the same amount of feedback as students that started early, and the quality of their work suffered dramatically. I'm glad I could have the conversations I had with students now so that I might have a chance in changing their behavior before their actual IA is due.

    An important point - although I did comment on different elements of the rubric, most of my feedback was on the criterion that titled the assignment. For example, in my feedback I occasionally referenced reflection and mathematical presentation in the communication assignment. I gave the most detailed feedback for communication, and graded solely on that criterion.

    These were the assignments:

  • I budgeted some of my additional instruction time for explicit calculator instruction. I've argued previously about the limitations of graphing calculators compared to Geogebra, Desmos, and other tools that have substantially better user experiences. The reality, however, is that these calculators are what students can access during exams. Without some level of fluency in accessing their features, students would be unable to solve some problems. I wrote about this in my review of the course last year. This time was well spent, as students were not tripped up by questions that could only be solved numerically or graphically.
  • Students saw many past paper questions, and seem to have some familiarity with the style of questions that are asked.

What needs work:

  • I've come to the conclusion that preemptive advice is ineffective. "Don't forget to [...]" or "You need to be extremely careful when you [...]" is what I'm talking about. It isn't useful for students that don't need the reminder. It doesn't help the students that don't have a context for what you are telling them not to do, not having solved problems on their own. I have found it to be much more effective to address those mistakes after students get burned by them. Some of my success here comes from my students subscribing to a growth mindset, which is something I push pretty hard from the beginning. Standards based grading helps a lot here too.
  • I desperately need a better way to encourage longer retention of knowledge, particularly in the context of a two year IB course. I'll comment more on this in a later post, but standards based grading and the quarter system combined were factors working against this effort. I did some haphazard spaced repetition of topics on assessments in the form of longer form section two questions. The fact that I was doing this did not incentivize enough students to regularly review. I also wonder if my conflicted beliefs on fluency versus understanding of process play a role as well.
  • Students consistently have a lot of questions about rounding, reporting answers, and follow through in using those answers in the context of IB grading. The rules are explicitly stated in the mark schemes for questions - answers should be reported exactly or to three significant figures unless otherwise noted. The questions students repeatedly have relate to multiple part questions. For example, if a student does a calculation in part (a), reports it to three significant figures, and then uses the exact answer to answer part (b), might that result in a wrong answer according to the mark scheme? What if the student uses the three significant figure reported answer in a subsequent part?

    I did a lot of research in the OCC forum and reading past papers to try to fully understand the spirit of what IB tries to do. I'd like to believe that IB sides with students that are doing the mathematics correctly. I am not confident in my ability to explain what the IB believes on this, which means my students are uncertain too. This bothers me a lot.

  • Students still struggle to remember the nuances of the different command terms during assessments. They also will do large amounts of complex calculations and algebraic work in spite of seeing that a question is only worth two or three marks. There is clearly more work to do on that, though I expect it will improve as we move into year two material because, well, it usually does. I wish there were a way to start the self-reflection process earlier.
  • Students struggle to write about mathematics. They also struggle with the reality that there is no way to make it go faster or do it at the last minute without the quality suffering. I still believe that the way you get better is by writing more and getting feedback, and that's the main reason I'm glad I made the changes I did regarding the exploration components. That said, students know how to write filler paragraphs, and I call them out on filler every single time.
  • We spent a full day brainstorming and thinking about possible topics for individual explorations. Surveying the students, only four of them are certain about their topics. The rest have asked for additional guidance, which I am still figuring out how to provide over the summer. I think this process of finding viable topics remains difficult for students.
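The rounding questions above can be made concrete with a short Python sketch. The `sig3` helper is my own hypothetical illustration of the three-significant-figure rule, and the example shows exactly the follow-through ambiguity students ask about: carrying the exact value versus the reported value into a later part can produce different final answers.

```python
from math import floor, log10

def sig3(x):
    """Round x to three significant figures (the IB reporting convention)."""
    if x == 0:
        return 0.0
    decimals = 2 - floor(log10(abs(x)))  # decimal places for 3 sig figs
    return round(x, decimals)

# Part (a): compute a value and report it to three significant figures.
exact = 7 / 3
reported = sig3(exact)          # 2.33

# Part (b): square the part (a) result. The two paths now disagree.
print(sig3(exact ** 2))         # 5.44 (carrying the exact value forward)
print(sig3(reported ** 2))      # 5.43 (carrying the rounded value forward)
```

A one-unit difference in the last significant figure is exactly the kind of discrepancy that makes students nervous about mark schemes.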

Conclusion

I'll be following these students to year two. We have the rest of probability to do first thing when we get back, which I'll combine with some dedicated class time devoted to the exploration. I like pushing probability and calculus to year two, as these topics are, by definition, plagued by uncertainty. It's an interesting context in which to work with students in their final year of high school.

2016 - 2017 Year In Review: PreCalculus

Overview

This was the first time I taught a true PreCalculus course in six years. At my current school, the course serves the following functions:

  • Preparing tenth grade students for IB mathematics SL or HL in their 11th grade year. Many of these students were strong 9th grade students that were not yet eligible to enter the IB program since this must begin in grade eleven.
  • Giving students the skills they need to be successful in Advanced Placement Calculus in their junior or senior year.
  • Providing students interested in taking the SAT II in mathematics some guidance in the topics that are covered by that exam.

For some students, this is also the final mathematics course taken in high school. I decided to design the course to extend knowledge from Algebra 2, continue developing problem solving skills, move a bit further into abstraction of mathematical ideas, and provide a baseline for further work in mathematics. I cut some topics that I used to think were essential to the course but did not properly serve the many different pathways that students can follow in our school. Like Algebra 2, this course can be the Swiss Army knife course that "covers" a lot so that students have been exposed to topics before they really need to learn them in higher-level math courses. I've always thought that approach waters down much of the content and the potential of a course like this. What tools are going to be the most useful to the broadest group of students for developing their fluency, understanding, and communication of mathematical ideas? I designed my course to answer that question.

I also found that this course tended to be the one in which I experimented the most with pedagogy, class structure, new tools, and assessment.

The learning standards I used for the course can be found here:
PreCalculus 2016-2017 Learning Standards

What worked:

  • I did some assessments using Numbas, Google Forms, and the Moodle built-in quizzes to aid with grading and question generation. I liked the concept, but some of the execution is still rough around the edges. None of these did exactly what I was looking for, though I think they could each be hacked into a form that does. I might be too much of a perfectionist to ever be happy here.
  • For the trigonometry units, I offered computer programming challenges that were associated with each learning standard. Some students chose to use their spreadsheet or Python skills to write small programs to solve these challenges. It was not a large number of students, but those that decided to take these on reported that they liked the opportunity to think differently about what they were learning.
  • I also explicitly taught using spreadsheet functions to develop students' computational thinking skills. This required designing some problems that were just too tedious to solve by hand. This was fun.
  • Differentiation in this course was a challenge, but I was happy with some of the systems I used to manage it. As I have found is common since moving abroad, many students are computationally well developed, but not conceptually so. Students would learn tricks in after school academy that they would try to use in my course, often in inappropriate situations. I found a nice balance between problems that started low on the ladder of abstraction, and those that worked higher. All homework assignments for the course in Semester 2 were divided into Level 1, Level 2, and Level 3 questions so that students could decide what would be most useful for them.
  • I did some self-paced lessons with students in groups using a range of resources, from Khan Academy to OpenStax. Students reported that they generally liked when I structured class this way, though there were requests for more direct instruction among some of the students, as I described in my previous post about the survey results.
  • There was really no time rush in this course after my decision to cut out vectors, polar equations, linear systems, and some other assorted topics that really don't show up again except in Mathematics HL or Calculus BC, where it's worth seeing them again anyway. Some students also gave very positive feedback regarding the final unit on probability. I took my time with things there. Some of this was out of necessity when I was out sick for ten days, but there were many times when I thought about stepping up the challenge faster than I really needed to.
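To give a flavor of the programming challenges from the trigonometry units mentioned above, here's a hypothetical example in the same spirit (the actual challenge prompts differed): count the solutions of a trig equation on an interval by scanning for sign changes, which is exactly what a spreadsheet column of values can do.

```python
from math import sin, pi

def count_roots(f, lo, hi, steps=10000):
    """Count sign changes of f on [lo, hi] -- a numerical solution counter."""
    count, prev = 0, f(lo)
    for i in range(1, steps + 1):
        cur = f(lo + (hi - lo) * i / steps)
        if prev * cur < 0:       # a sign change brackets a root
            count += 1
        prev = cur
    return count

# How many solutions does sin(2x) = 0.5 have on [0, 2*pi)?
print(count_roots(lambda x: sin(2 * x) - 0.5, 0, 2 * pi))  # 4
```

This only counts simple roots that produce a sign change, which is good enough for students to check a numerical answer against their algebraic one.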

What needs work:

  • I wrote about how I did the conic sections unit with non-numerical grades - just comments in the grade book. The decision to do that was based on a number of factors. The downside was that when I switched back to numerical grades for the final unit, the grade calculation for the entire quarter was based only on those grades, and not on the conic sections unit at all. The conic sections unit did appear on the final exam, but for the most part, there wasn't any other consequence for students that did not reassess on the unit.
  • Students did not generally like when I used Trello. They liked the concept of breaking up lessons into pieces and tasks. They did not like the forced timelines and the extra step of the virtual Trello board for keeping track of things. This Medium article makes me wonder about doing this in an analog form if I try it in the future. I also could make an effort to instill the spirit of Scrum early on so that it's less novel, and more the way things are in my classroom.
  • I should have done a lot more assessment at the beginning of units to see what students knew and didn't know. It sounds like the student experiences in the different Algebra 2 courses leading to PreCalculus were quite different, which led to a range of success levels throughout. Actually, I should probably be doing this more often for all my courses.
  • Students could create their own small reference sheet for every exam. I did this because I didn't want students memorizing things like double angle identities and formulas for series. The reason this needs work is that some students are still too reliant on having this resource available to ever reach any level of procedural fluency. I know what students need to be fluent in later on in the more advanced courses, sure, but I am not convinced that memorization is the way to get there. Timed drills don't seem to do it either. This challenge is compounded by the fact that not all students need that level of fluency for future courses, so what role does memorization play here? I have struggled with this in every year of my fourteen-year career, and I don't think it's getting resolved anytime soon. This is especially the case when Daniel Willingham, who generally makes great points that I agree with, writes articles like this one.

Conclusion

This course was fun on many levels. I like being there to push students to think more abstractly as they form the foundation of skills that will lead to success in higher levels of mathematics. I also like crafting exercises and explorations that engage and equip the students that are finishing their mathematics careers. We should be able to meet the needs of both groups in one classroom at this stage.

I frequently reminded myself of the big picture by reading through Jonathan Claydon's posts on his own Precalc course development over the years. If you haven't checked him out, you should. It's also entertaining to pester him about a resource he posted a few years ago and hear him explain how much things have changed since then.

2016 - 2017 Year In Review: Technology Tools

Overview

I've always been a pretty heavy user of technology, though in the past few years I've been more careful to use it for a reason, not for its own sake. I balance that care with a healthy desire to try new things, in a way that I would actually use them in the classroom.

This is also the first year I've been able to take advantage of the Google Tools suite since Vietnam is not subject to the limitations of China's Great Firewall. Though there have been times when the internet connection to the entire country has been subject to shark attacks, connections in general have been smooth. Seeing how effectively some folks use Google in the classroom after being unable to use it for six years makes me feel seriously behind the times. Luckily, my colleagues are really eager to share what they do. I might be caught up by now.

Here's the rundown of the tools in my classroom this year:

  1. A Macbook Pro where I do most of my lesson planning. I connected an external widescreen monitor that mirrored all projected content. The second screen was sent through AirPlay to an Apple TV, which was then connected to the projector.
  2. Class worksheets electronically created and stored in Google Docs. These are printed out on A5 size sheets for students to tape into their notebooks for a physical record of what we did.
  3. For IB Mathematics SL and PreCalculus, I had two students per class make an additional Google Doc that was a copy of the handout. In this document, students would paste solutions to class work, homework, and whatever else they thought might be important to their classmates. The student responsibility for doing this was on a rotating schedule, similar to what I've used in my previous classes.
  4. Notability app for class notes, with a Wacom Tablet for input. I used the wireless accessory kit for around two days, because it disconnected too frequently.
  5. iPhone as a document camera for capturing student work for sharing answers or for conversation during the class. I would take pictures of student work and use AirDrop to upload them for inclusion in the notes.
  6. Moodle as a repository for all of the above documents and links. I also used it occasionally for distributing quizzes and automatic grading.
  7. My WeinbergCloud website for managing, assigning, and recording reassessments throughout the semester.
  8. PearDeck on a trial basis for first semester, and then regularly during second. I sometimes used an iPad to manage the class, but every time I regretted it, and just used my computer.
  9. Desmos Calculator usually at least once per lesson
  10. Desmos Activity Builder about once per unit per course
  11. EdPuzzle for self paced lessons, videos, and quizzes in Algebra 2. Most of the videos were produced by my colleague, Scott Hsu.
  12. Spreadsheets for building useful calculators (like discriminants for quadratics, arithmetic/geometric series sums, etc)
  13. Khan Academy for practice exercises and monitoring of student effort in reviewing material.
  14. Geogebra for checking exam questions and demonstrating its use as a work-checking tool for students.
  15. Camtasia for occasionally recording videos of solving problems.
  16. Quizster as a way to have students submit specific homework problems for feedback.
  17. Wireless keyboard and trackpad, though these lasted about a week and a half.
  18. I dabbled with GoFormative, primarily when I was on sick leave for a while. Connection issues that were inconsistent across the class led to my abandoning it for regular use.
  19. For two units in PreCalculus, I used Trello as a way to organize units and help students organize their work for each day of class.
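As an aside on item 12 above, those spreadsheet calculators translate directly into a few one-line functions. This is a hypothetical Python rendering of the kind of formulas students built; the actual class versions lived in spreadsheet cells.

```python
def discriminant(a, b, c):
    """b^2 - 4ac for ax^2 + bx + c; its sign gives the number of real roots."""
    return b * b - 4 * a * c

def arithmetic_sum(a1, d, n):
    """Sum of the first n terms: n/2 * (2*a1 + (n - 1)*d)."""
    return n * (2 * a1 + (n - 1) * d) / 2

def geometric_sum(a1, r, n):
    """Sum of the first n terms: a1 * (1 - r^n) / (1 - r), for r != 1."""
    return a1 * (1 - r ** n) / (1 - r)

print(discriminant(1, -3, 2))    # 1 -> two distinct real roots
print(arithmetic_sum(2, 3, 10))  # 155.0
print(geometric_sum(1, 2, 10))   # 1023.0
```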

What worked:

  • The process of cutting and pasting images of problems or student work into Notability, and then annotating them was great for recording important information during class. These notes were then either pasted into the document created by students for each class, or exported as PDF for posting on Moodle. This felt like a good way to have a record of what went on during a given class block in case students missed a block.
  • I liked automated grading of quizzes through Google Forms and Moodle. This definitely saved time, but the process of getting feedback back to students is still awkward. When student work is analog, but answer checking is digital, where should that feedback go? Quizster offers some way of making all of this occur in the same tool, but the workflow was never smooth enough to fully commit to it.
  • The combination of PearDeck and Desmos Activity builder, along with photos of student work, made for great sources of understanding (and misunderstanding) that helped me decide how or whether to proceed with other material. These also made for great motivating elements for direct instruction when it needed to happen. The students really liked using these tools, and said they looked forward to them, according to the end of year survey results.
  • I don't think Khan Academy exercises work well for assessing students beyond a basic level. I think they can provide the practice some students need on procedural skills like factoring or evaluating trigonometric functions. It's just one tool among many to serve the needs of my students.
  • When I provided guidance on how spreadsheets could be used for more than just making charts, students appreciated it. One student went so far as to say that this instruction was "actually useful". I decided not to ask what this student thought about the rest of the class.

What needs work:

  • I posted homework problems on the printed class handout, on the digital handout, and on a dedicated assignments document that is an expectation across classes in our high school division. Consistently updating all three documents was a challenge, despite my best efforts.
  • The student notebook entries for each class were among the least favorite elements of the class, as reported by students. I've written previously about the need to have some record of what happens during class, and about my frustration with students who do not produce their own record through regular use of a dedicated notebook. This isn't the best solution, but I think it's the closest I've come to something that actually strikes the right balance. I just wish I could figure out how to get students to buy into its usefulness.
  • I still have not figured out the best way to bring the class back together after letting them work at their own pace through lessons. The only times it makes a lot of sense to do this are at the very beginning of class and at the very end.
  • PearDeck, Desmos Activity Builder, and GoFormative each offer features that I really like. None of them do everything. I'm ok with this, but I wonder whether the fragmentation of activities is good for students, or a problem since their work is distributed across these tools.
  • While I liked using Trello, and some students reported that they also appreciated it, many students did not. I'm not sure if it actually is the self-paced lesson tool I'm looking for, but it was better than a static Google document.
  • At the end of the year, despite my own research and attempts to improve this, the Apple TV disconnected at least once every class period, if not more frequently.

Conclusion

My focus continues to be on using technology to free up time for the ways that I can best add value in the classroom. Many students don't need my help in making progress. Some do, and some like having me explain ideas to them. It's hard to simultaneously meet these different needs without technology, which enables me to be in multiple places at once.

Having the range of tools I describe above, and not fully committing to any one of them, is both a blessing and a curse. The fragmentation means the residue of learning is distributed across many web addresses. The variety helps keep students (and me) from getting into a rut. I don't know if this balance is appropriately tuned yet.

2016-2017 Year in Review: Being New

Overview

This was my first year since 2010 being the new kid in school. Developing a reputation takes time, so I was deliberate about establishing who I am as a teacher from the beginning. I wrote about learning names at the beginning of the year, for example. My school, surpassing 1,000 students this year, is the second largest of those at which I have worked. The high school division is just over 320 students. There are many systems that are in place to manage the reality of this group of ninth through twelfth graders that has a tremendous diversity of interests, programs (IB and AP), extra-curricular organizations, and obligations outside of the school walls. I walked in admittedly intimidated by the scope of this place and what it aims to accomplish.

A lot of information was shared in our meetings before student orientation. After one of them, I asked our high school principal what the priority needed to be in the first quarter in terms of processing all of that information. He put me at ease - the focus should be on figuring out how this place works. He promised (and certainly delivered on) a pledge to remind us of what was important throughout the year, with the understanding that there would be a learning curve for our group of newbies. The faculty is passionate about teaching and creative in how they go about designing classroom experiences. They were intensely committed to sharing what they do and to helping those of us who were new figure out how to prioritize at any given time.

What worked:

  • The beginning of school was a mix of content and getting-to-know-them/me activities that were deliberately designed for those purposes. This sort of thing is important at the beginning of any year if the composition of a class is new. It's essential if the teacher is new too. Each group is unique and has a chemistry that is not only important to suss out in the beginning, but must be regularly assessed as the year proceeds. I thought this series of activities worked really well. I will modify these for next year's student groups more to offer variety than to make improvements.
  • I was able to get most of my lesson preparation done at school during my prep periods. Exceptions were after exams, the end of the year, and near reporting deadlines. This required serious prioritization and disciplined decisions about what I could actually accomplish in those blocks of time. While I maintained to-do lists, a major component of my success came from block-scheduling those tasks and sticking to the schedule. This left time after school and at home to spend designing the explorations, experiments, and bigger-picture puzzles that were nice, but not necessary.
  • I streamlined many of the administrative procedures I had created in my previous schools. I rebuilt spreadsheets that had been unchanged for several years rather than hacking old ones to work. Part of this was required to address the fact that my class sizes were substantially larger, but I also decided it was time.
  • As I had hoped to do, I spent much of the year watching. I did not want to come in and identify everything that this school did not have that I may have had a hand in organizing in past school years, and then add it myself. That is how I came to feel burnt out every time June came around. I was quite picky with what I involved myself in. I said no to things. When I was ready to design a VEX robotics sprint (more on that later) at the end of the year, however, this meant I had the energy and drive to do so.
  • The level of support I have felt from administrators and colleagues this year has been incredible. Nothing makes you feel so effective as a team that has your back, and that is realistic about what should, what can, and what cannot be accomplished with a given set of resources.

What needs work:

  • I did not get out and visit my colleagues anywhere near as frequently as I wanted. This is a seriously impressive group of teachers trying different things. Part of the problem was my commitment to getting things done during my prep periods, so I take responsibility for that. It would not have been too devastating to that structure, however, if I had also planned blocks of time to visit specific colleagues. I ate in the lunchroom with colleagues fairly regularly, and that was great for learning what everyone was doing. It was not enough. More of that next year.
  • I originally planned on doing outreach with parents more regularly this year. They are incredibly trusting of what we as teachers design for students, and this was evident at parent-teacher conference nights during both semesters. I want more than that though. I want them to understand my philosophy for why learning mathematics right now is important. I don't think the parents understand standards based grading, and although the students made solid attempts to explain it during conferences, these conversations don't happen nearly as frequently as they should. I need to think more about what that communication looks like, and why I feel it is important, because I don't think I can fully articulate that here. I do know that there is a lost opportunity when it comes to parents really understanding what we do in the classroom on a regular basis.
  • I now believe that the ease of establishing community and connections with others is inversely related to the ease of living in that place. I often tell the story of how it was easy to rally a group of colleagues in my early days of China to go find cheese, which was difficult to find. Many of my closest bonds were formed during those adventures of first world adversity. Here in District 7 of HCMC, there is no such difficulty. Life is really good and easy here. This means, however, that one must work a little bit harder to leave the comfortable bubble of life to find adventure and make new friends. This is especially the case for me as the father of a now toddling ball of energy. It takes effort and time to build those relationships. That's definitely something that I need to work on deliberately planning more frequently next year.

Conclusion

The second year anywhere is always less scattered than the first. The next few weeks are all about figuring out how to use the time not spent learning the ropes.

2015-2016 Year in Review: IB Mathematics SL/HL

This was my second year working in the IB program for mathematics. For those that don't know, this is a two-year program, culminating in an exam at the end of year two. The content of the standard level (SL) and higher level (HL) courses covers algebra, functions, trigonometry, vectors, calculus, statistics, and probability. The HL course goes into more depth in all of these topics, and includes an option that is assessed on a third, one-hour exam paper after the first two parts of the exam.

An individualized mathematics exploration serves as an internally assessed component of the final grade. This began with two blocks at the end of year one so that students could work on it over the summer. Students then had four class blocks spread out over the first month of school of year two to work and ask questions related to the exploration during class.

I taught year one again, as well as my first attempt at year two. As I have written about previously, this was run as a combined block of both SL and HL students together, with two out of every five blocks as HL focused classes.

What worked:

  • I was able to streamline the year 1 course to better meet the needs of the students. Most of my ability to do this came from knowing the scope of the entire course. Certain topics didn't need the emphasis I had given them in my first attempt last year. It also helped that the students were much better aware of the demands of higher level vs. standard level from day one.
  • I did a lot more work using IB questions both in class and on assessments. I've become more experienced with the style and expectations of the questions and was better able to speak to questions about those from students.
  • The two HL-focused blocks in this combined class were really useful from the beginning of year one, and continued to be an important tool for year two. I don't know how I would have done this otherwise.
  • I spent more time in HL on induction than last year, both on sums and series and on divisibility rules, and the extra practice seemed to stick better than it did last year in year one.
  • For students who were self-starters, my internal assessment (IA) schedule worked well. The official draft submitted for feedback was turned in before a break so that I had time to go through them. Seeing students' writing was quite instructive in knowing what they did and did not understand.
  • I made time for open ended, "what-if" situations that mathematics could be used to analyze and predict. I usually have a lot of this in my courses anyway, but I did a number of activities in year one specifically to hint at the exploration and what it was all about. I'm confident that students finished the year having seen me model this process, and having gone through mini explorations themselves.
  • After student feedback in the HL course, I gave many more HL level questions for practice throughout the year. There was a major disconnect between the textbook level questions and what students saw on the HL assessments, which were usually composed of past exam questions. Students were more comfortable floundering for a bit before mapping a path to a solution to each problem.
  • For year two, the exam review was nothing more than extended class time for students to work past papers. I did some curation of question collections around specific topics as students requested, but nearly every student had different needs. The best way to address this was to float between students as needed rather than do a review of individual topics from start to finish.
  • The SL students in year two learned modeling and regression over the Chinese new year break. This worked really well.
  • Students that had marginally more experience doing probability and statistics in previous courses (AP stats in particular) rocked the conditional probability, normal distribution, and distribution characteristics. This applied even to students who were exposed to that material, but did poorly on it in those courses. This is definitely a nod to the idea that earlier exposure (not mastery) of some concepts is useful later on.
  • Furthermore, regarding distributions, my handwaving to students about finding area under the curve using the calculator didn't seem to hurt the approach later on when we did integration by hand.
  • This is no surprise, but being self sufficient and persevering through difficult mathematics needs to be a requirement for being in HL mathematics. Students that are sharp, but refuse to put in the effort, will be stuck in the 1-3 score range throughout. A level of algebraic and conceptual fluency is assumed for this course, and struggling with those aspects in year one is a sign of bigger issues later on. Many of the students I advised this way in year one were happier and more successful throughout the second year.
  • I successfully had students smiling at the slick way the parts of Section B questions on the IB exam all connect to each other.
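To make the induction point above concrete, here is a minimal proof of the sums-and-series flavour practised in HL (my own illustration, not an IB past-paper question):

```latex
\textbf{Claim.}\quad \sum_{k=1}^{n} k = \frac{n(n+1)}{2} \ \text{ for all } n \in \mathbb{Z}^{+}.

\textbf{Base case } (n = 1)\text{:}\quad \sum_{k=1}^{1} k = 1 = \frac{1(1+1)}{2}. \checkmark

\textbf{Inductive step:}\ \text{assume the result holds for } n = m. \text{ Then}
\[
\sum_{k=1}^{m+1} k \;=\; \frac{m(m+1)}{2} + (m+1) \;=\; \frac{(m+1)(m+2)}{2},
\]
\text{which is the statement for } n = m+1. \text{ By induction, the claim holds for all positive integers.}
```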

What needs work:

    For year one:

  • I leaned far more heavily on computer-based tools (Geogebra, Desmos) than on the graphing calculator during class. The ease of doing things this way left students unsure of how to use the graphing calculator for the same tasks (finding intersections and solutions numerically) during an assessment. I definitely need to emphasize the calculator as a diagnostic tool before really digging into a problem, to know whether an integer or algebraic solution is possible.
  • Understanding the IB rounding rules needs to be something we discuss throughout. I did more of this in year one on my second attempt, but it still didn't seem to be enough.
    For year two:

  • Writing about mathematics needs to be part of the courses leading up to IB. Students liked the mini explorations (mentioned above) but really hated the writing part. I'm sure some of this is because students haven't caught the writing bug. Writing is one of those things that improves by doing more of it with feedback though, so I need to do much more of this in the future.
  • I hate to say it, but the engagement grade of the IA isn't big enough to compel me to encourage students to do work that mattered to them. This element of the exploration was what made many students struggle to find a topic within their interests. I think engagement needs to be broadened in my presentation of the IA to something bigger: find something that compels you to puzzle (and then un-puzzle) yourself. A topic that has a low floor, high ceiling serves much more effectively than picking an area of interest, and then finding the math within it. Sounds a lot like the arguments against real world math, no?
  • I taught the Calculus option topics of the HL course interspersed with the core material, and this may have been a mistake. Part of my reason for doing this was that the topic seemed to most easily fit in the context of a combined SL/HL situation. Some of the option topics like continuity and differentiability I taught alongside the definition of the derivative, which is in the core content for both SL and HL. The reason I regret this decision is that the HL students didn't know which topics were part of the option, which appear only on a third exam section, Paper 3. Studying was consequently difficult.
  • If for no other reason, the reason not to do a combined SL/HL course is that neither HL nor SL students get the time they deserve. There is much more potential for great explorations and inquiry in SL, and much more depth required for success in HL. There is too much in that course to do both courses justice and meet the needs of the students. That said, I would have gone to three HL classes per two-week rotation for the second semester, rather than the two that I used throughout year one.
  • The HL students in year two were assigned series convergence tests. The option book we used (Haese and Harris) had some great development of these topics, and full worked solutions in the back. This ended up being a miserable failure due to the difficulty of the content and the challenge of pushing second semester seniors to work independently during a vacation. We made up some of this through a weekend session, but I don't like to depend on out-of-school instruction time to get through material.
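For readers unfamiliar with these tests, the ratio test is representative of what students were asked to learn independently (a standard textbook example, not one from the Haese and Harris option book):

```latex
\text{For } \sum_{n=1}^{\infty} \frac{n}{2^{n}}:\quad
\lim_{n \to \infty} \left| \frac{a_{n+1}}{a_{n}} \right|
= \lim_{n \to \infty} \frac{(n+1)/2^{n+1}}{n/2^{n}}
= \lim_{n \to \infty} \frac{n+1}{2n}
= \frac{1}{2} < 1,
```

so the series converges by the ratio test. Applying this correctly is mechanical; knowing which test to reach for is the part that demands guidance.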

Overall, I think the SL course is a very reasonable exercise in developing mathematical thinking over two years. The HL course is an exercise in speed and fluency. Even highly motivated students of mathematics might be more satisfied with the SL course if they are not driven to meet the demands of HL. I also think that HL students must enjoy being puzzled and should be prepared to use tricks from their preceding years of mathematics education outside of being taught to do so by teachers.

2014-2015 Year in Review: Work & Life Balance

This is more of a comment on things I did outside of the classroom rather than in, but it was something that my wife and I made a focused effort to do during the second semester.

The idea was simple: buck the routine of the house (and classroom) during the week with something specific that didn't involve work. Make dinner with friends. Go for a walk to somewhere new in the neighborhood. Watch a movie. Work on a fun side project.

These scheduled, specific plans meant I had a reason to leave my classroom and end planning earlier than the usual, which often pushed well past 5:00 PM. If there was a need to do more before the next day, I'd take a look at it before going to bed. I took the time to ask myself whether the work left unfinished was actually going to make the learning better the next day. Sometimes it was, often it was not.

I realize now that Parkinson's Law is notoriously problematic for perfectionists like me:

From Wikipedia:

Parkinson's law is the adage that "work expands so as to fill the time available for its completion...."

There is always more tweaking that can be done. The law of diminishing returns (and importance) is a major reason not to do so, particularly in light of the restorative energy that comes from spending time with good people.

These reasons for wrapping up work and being more efficient also made a big difference in my use of planning time throughout the day. I prioritized much more effectively knowing that I had a limited time to complete planning for the next day.

One important comment here: specificity was crucial. I couldn't just say I wanted to finish early to have more free time at home. It made a big difference to be able to picture the end goal of these time limitations. The goal is having a specific activity to look forward to rather than just a negative space formed by the absence of work.

I will be deliberate about continuing this throughout the coming year. This is too important.

2014-2015 Year In Review: IB Physics SL/HL

This was my first year teaching IB Physics. The class consisted of a small group of SL students with one HL, and we met every other day according to the block schedule. I completed the first year of the sequence with the following topics, listed in order:

    Semester 1

  1. Unit 1 - Experimental Design, Uncertainty, Vectors (Topic 1)
  2. Unit 2 - Kinematics & Projectile Motion (Topic 2.1)
  3. Unit 3 - Newton's Laws (Topic 2.2)
  4. Unit 4 - Work, Energy, and Momentum (Topic 2.3)

    Semester 2

  5. Unit 5 - Circular Motion, Gravitation, and Orbits (Topics 6.1, 6.2)
  6. Unit 6 - Waves and *Oscillations (Topic 4, AHL Topic 9, *AHL Engineering Option Topics B3.1, B3.2)
  7. Unit 7 - Thermal Physics (Topic 3, Engineering Option Topic B2)
  8. Unit 8 - *Fluid Dynamics (Engineering Option Topic B3)

For the second semester of the course, there was at least one block every two weeks devoted to the HL student and the HL-only content - the SL students worked on practice problems or other work for their IB classes during this time. Units 7 and 8 ran concurrently, so the HL student had to work on the thermodynamics and fluid dynamics content together. This was similar to how I did it previously while teaching the AP Physics B curriculum.

One other fact that is relevant - none of my students are native speakers of English. More on this later.

What worked:

  • The growth students made during the year was significant. I saw students improve in their problem solving skills and their organization in the process of doing textbook style assessment problems.
  • I learned to be honest about the IB expectations for answering questions on assessments. In the beginning, I tried to shield students from questions that combined conceptual understanding, computation, and complex language, often choosing two out of the three for any one question that I either wrote or selected from a bank. My motivation was to isolate assessment of the physics content from assessment of the language. I wanted answers to these separate questions:
    1. Does the student understand how the relevant physics applies here?
    2. Does the student understand how to apply the formulas from the reference table to calculate what the question is asking for?
    3. Can the student process the text of the question into a physics context?
    4. Can the student effectively communicate an answer to the question?

    On official IB assessment items, however, this granularity doesn't exist. The students need to be able to do all of these to earn the points. When I saw a significant difference between how my students did on my assessments versus those from the IB, I knew I needed to change. In retrospect, this was a good move.

  • Concise chunks of direct instruction followed by longer problem solving sessions during class worked extremely well. The students made sense of the concepts and thought about them more while they were working on problems, than when I was giving them new information or guiding them through it. That time spent stating the definitions was crucial. The students did not have a strong intuition for the concepts, and while I did student centered conceptual development of formulas and concepts whenever possible, these just didn't end up being effective. It is very possible this is due to my own inexperience with the IB expectations, and my conversations with other teachers helped a lot to refine my balance of interactivity with an IB pace.
  • Students looked forward to performing lab experiments. I was really happy with the way this group of students got into finding relationships between variables in different situations. Part of this was the strong influence I've developed with the Modeling Instruction curriculum. As always, students love collecting data and getting their hands dirty because it's much more interesting than solving problems.

What needs work:

  • My careless use of the reference sheet in teaching directly caused students to rely excessively upon it. I wrote about this previously, so check that post out for more information. In short: students treated the reference sheet as a list of recipes, as if each equation provided a straight-line path to a solution. It should be used as a toolbox: a reminder of what the relationships between variables are for various physics concepts. I changed this partly at the end of the year by asking students to describe what they wanted to look for on the sheet. If their answer was 'an equation', I pushed further, since that usually meant they weren't about to use the reference sheet for what it was designed to do. If their answer was that they couldn't remember whether pressure was directly or inversely related to temperature, I asked them what equation describes that relationship, and they were usually able to tell me.
    Both of these are examples of how the reference sheet does more harm than good in my class. I fault myself here, not the IB, to be clear.
  • The language expectations of the IB out of the gate were more of an obstacle than I expected at the beginning of the year. I previously wrote about my analysis of the language on IB physics exams. There tends to be a lot of verbal description in questions. Normally innocuous words get in the way of students simultaneously learning English and understanding assessment questions, and this makes all the difference. These questions are noticeably more complex in their language use than those on AP exams, though the physics content is not, in my opinion, more difficult. This is beyond physics vocabulary and question command terms, which students handled well.
  • Learning physics in the absence of others doesn't work for most students. Even the stronger students made missteps working while alone that could have been avoided by being with other students. I modified my class to involve a lot more time working problems during class and pushed students to at least start the assigned homework problems while I was around to make the time outside of class more productive. Students typically can figure out math homework with the various resources available online, but this just isn't the case for physics at this point. It is difficult for students to get good at physics without asking questions, getting help, and seeing the work of other students as it's generated, and this was a major obstacle this semester.
  • Automaticity in physics (or any subject) shouldn't be the goal, but experience with concepts should be. My students really didn't get enough practice solving problems to recognize one situation versus another. I don't want students to memorize the conditions for energy being conserved, because a memorized fact doesn't mean anything. I do want them to recognize a situation in which energy is conserved, however. I gave them a number of situations, some involving conservation, others not, and hoped to have them see the differences and, over time, develop an awareness of what makes the two situations different. This didn't happen, partly because of the previous item about working physics problems alone, but also because they were too wrapped up in the mechanics of solving individual problems to do the big picture thinking required for that intuition. Group discussions help with this, but the process is ultimately one that happens at the individual level due to the nature of intuition. This will take some time to figure out.
  • Students hated the formal process of writing up any parts of the labs they performed. This was in spite of what I already said about the students' positive desire to do experiments. The expressions of terror on the students' faces when I told them what I wanted them to do with the experiment broke my heart. I required them to do a write-up of just one of the criteria for the internal assessment, just so they could familiarize themselves with the expectations when we get to this next year. A big part of this fear is again related to the language issue. Another part is just inexperience with the reality of writing about the scientific process. This is another tough egg to crack.
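For the record, the pressure-temperature exchange described above comes down to a single relationship, assuming constant volume and absolute temperature (my notation, not the reference sheet's):

```latex
% Gay-Lussac's law: at constant volume, pressure is directly
% proportional to absolute temperature.
\frac{P_1}{T_1} = \frac{P_2}{T_2}
\quad\Longrightarrow\quad
P \propto T \quad \text{(a direct, not inverse, relationship)}
```

A student who can reconstruct this from the sheet is using it as a toolbox; one who hunts for "an equation with P and T in it" is using it as a recipe book.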

There was limited interest in the rising junior class for physics, so we won't be offering year one to the new class. This means that the only physics class I will have this year will be with the same group of students moving on to the second year of IB physics. One thing I will change for physics is a set of memorization standards, as mentioned in my post about standards based grading this year. Students struggled to remember quick facts (e.g. "What is the relationship between kinetic energy and speed?"), which made problem solving more difficult, so I'll be holding students responsible for that in a more concrete way.

The issues that need work here are big ones, so I'll need some more time to think about what else I will do to address them.

2014-2015 Year In Review: Web Programming

This was the first year I've taught a computer programming course. The class was a broad survey of programming in HTML5. This was the overall sequence:

    Semester 1:

  1. Hacking a webpage from the browser console
  2. HTML tags, structures, and organization
  3. CSS - page design, classes and IDs, along with using Bootstrap
  4. Javascript - variables, structures, conditionals
  5. jQuery - manipulating the page using events and selectors, animations

    Semester 2:

  6. Mongo Databases & Queries
  7. HTML Templates using Blaze
  8. Writing Meteor Apps
  9. Meteor, Media, and the HTML5 Canvas
  10. HTML5 Games using Phaser

I have posted the files and projects I used with students at this repository on Github:
https://github.com/emwdx/webprogramming2014-2015

What did I do?

The class generally began with a warm-up activity that involved students analyzing, writing, or running code that I gave them. This always led into what we were going to explore in a given day's lesson. I would show the class a few lines of code and ask them to predict what would happen. This might be a visual request - what will this look like? Will there be an error? Was this error intentional or not?

This was all done while students had their laptops closed and notebooks open. I usually designed a series of tasks for students to complete using some code snippets that were saved in the directory on the school server.
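A hypothetical warm-up of the kind described above might be just a few lines whose output is easy to mispredict (this particular snippet is my illustration, not one of the actual class files):

```javascript
// Predict each value before running: which '+' concatenates,
// and which '-' coerces the string to a number?
const a = 1 + 1;    // plain number addition
const b = "1" + 1;  // '+' with a string operand concatenates
const c = "3" - 1;  // '-' has no string meaning, so it coerces

console.log(a, b, c);
```

Students would write their predictions in their notebooks first, then open the laptop and check.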

We didn't use any textbook, so I knew I needed to create a reference that students could refer back to whenever they got stuck. For each class, I took notes either in Microsoft OneNote or the SMART Notebook software and saved the notes in PDF form. I don't know if students used this or not.

I had three types of assessment:

  • Mini-projects, which were fairly straightforward and had unique answers. These were assessed by general completion (4.5/5), with a (5/5) given for effort to creatively make the code their own. I was fairly loose with that final half point, giving it whenever I saw students clearly engaged by the task. You can see an example of this assignment here.
  • Projects, which had clear guidelines and requirements to meet the minimum grade that ranged from 80 - 90 percent, and then a series of additional tasks that raised the grade up to 100%. The additional task points weren't awarded until the basic requirements were met, though that didn't stop students from trying (see below).
  • Blog posts, which were required for every class. The expectations required a summary of what we learned for each class, along with code snippets, questions about what we learned, or confusion about something they wanted to go over in the next class. As the students became more skilled, this turned into questions that started as "How can we.../Is it possible to...".

Once every two weeks, and usually on a Friday, I had a 20% day during which students could work on anything they wanted related to web programming. Some students worked on previous projects to resubmit them, others experimented with code from the previous class or week. In a couple of cases, students worked on their own pet projects, which included a chat application, a mathematical formula parser, and applying visual design principles to the pages we created in class. I often made suggestions for what students could do at the beginning of the class block, including providing some basic code they could use to experiment.

What worked:

  • Based on feedback from the end of the year, students enjoyed the course. They had a lot of positive comments on the ways I ran the class and that they always got help when they needed it.
  • Forcing students to write down code helped with retention and building a useful reference for later. I didn't require them to write down long blocks of code, but for things like HTML tags and Javascript, I wanted there to be some written reinforcement that things were important. I was pretty strict on deciding when I wanted students to write down code (to activate that part of the brain) and when I wanted them to copy it directly into a text editor and run it.
  • Forcing students to recreate code (and not copy and paste) led to higher activity and interaction between students while learning to code. I saved some code as images, not text, which required students to go line by line and see what they were doing. This was a decision I made early on because it helped me when learning to code. That extra step of needing to look at the code while I was typing it in led me to take a closer look at what it said, and I wanted to give a similar opportunity to my students.
  • The more open ended projects led to much richer questions and interaction between students. I really liked the range of responses I received when I gave open ended projects. Some students were exceptionally creative or went well beyond the requirements to make code that mattered to them.
  • Students were constantly helping each other with their code...when they eventually asked for this help. I was called over many times by students announcing the blanket statement "my code doesn't work" and then handing me their laptop, but over time they learned that I wasn't going to just fix their code for them. Once they finally took the step of asking someone for help, they became careful readers of each other's code, though this took some time.
  • I succeeded in having students do more than listen. I never talked for more than 15 minutes before students were actually writing and experimenting with code. This was exactly what I wanted.
  • 20% days were a big hit. Some students wanted this time as extra processing time to complete the mini projects from the rest of the week. Others liked being able to ask me how to do anything, or to find tutorials for HTML elements that they wanted to learn to use. I really liked how well this worked with this group of students and looked forward to it, and not just because it was a reduction in the planning required for class.
  • Videos offered an effective and preferred method for learning to write code in this class. I put together a number of screencasts in which I spoke about the code, and in some cases coded it live. Students were able to pause, copy code to the editor, and then run it pretty easily. Some zipped through it, others took longer, but this is nothing new. The time required to do this, as is always a consideration for me, was often more than I could afford. Luckily, there is plenty of material available already out there, so I was able to step back and give another voice and face a chance to teach my students.

What needs work:

  • The bonus elements were the first things most students wanted to figure out in each project. Many students did not put in the time to read and complete the basic requirements, resulting in submitted projects that were sent right back as incomplete. Some of this was a language issue, as there were many ESOL students in the class, but most of it was what we always encounter when working with adolescents: not reading the directions.
  • Students reused a lot of old (and unrelated) code. I emphasized creating simple code from scratch throughout the year, as my expectations were relatively simple. For many students, copying and pasting code was a crutch that led to many more problems than simply writing simple, clean code from the start. I get it - I copy and paste code myself - but I also know how to clean it up. They knew why not to do it (because they all tried it at some point) but some students continued doing it to the end. I need a better plan for helping students not fall into this trap.
  • Many students did not pick up on error messages in the console that said precisely where the problem with the code was located. At times I expected too much from students, because the console is a scary place. That said, I think I could do a better job of emphasizing how to find the line numbers referenced in these error messages, regardless of what the message says.
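As a small illustration of what those messages contain (`brokenSum` and the `bee` typo are invented for this example):

```javascript
// A deliberate typo: 'bee' should be 'b', so calling this
// function throws a ReferenceError.
function brokenSum(a, b) {
  return a + bee;
}

let caught = null;
try {
  brokenSum(1, 2);
} catch (err) {
  caught = err;
}

// err.name and err.message say exactly what went wrong, and
// err.stack includes the file name and line number of the typo.
console.log(caught.name + ": " + caught.message);
```

Reading the name, message, and the first line:number in the stack trace is usually enough to land within one line of the bug.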

I really enjoyed teaching this class, and not just because of the awesome group of students that took it. It helped me refine my knowledge and get better at some of the HTML, CSS, and Javascript coding skills that I had used, but often had to relearn every time I wanted to use them.

Feedback, as always, is welcome!