2016 – 2017 Year In Review: IB Mathematics SL

Overview

This was my third time around teaching the first year of the IB mathematics SL sequence. It was different from my previous two iterations given that this was not an SL/HL combined class. This meant that I had more time available to do explorations, problem solving sessions, and in-class discussions of the internal assessment (also called the exploration). I had two sections of the class with fourteen and twenty students respectively.

I continued to use standards based grading for this course. You can find my standards (which define the curricular content for my year one course) at this link:

IB Mathematics SL Year 1 – Course Standards

What worked:

  • My model of splitting the 80 – 85 minute block into twenty-minute sub-blocks works well. I can plan what happens in each sub-block and try to keep students actively doing something for as much of that time as possible. The first block is a warm-up, some discussion, a check-in about homework, and usually some quick instruction before the second block, which often involves an exploration activity. The third is a summary of the explorations or preceding activities along with example problems, and the fourth is me circulating and helping students work.
  • Buffer days, which I threw in as opportunities for students to work on problems, ask questions, and play catch up, were a big hit. I did little more on these days than give optional sets of problems and float around to groups of students. Whenever I tried to just go over something quick on these days, those lessons quickly expanded to fill more time than intended. It took a lot of discipline to instead address issues as they came up.
  • I successfully did three writing assignments in preparation for the internal assessment, which students will begin writing officially at the beginning of year two. Each one focused on a different one of the criteria, and was given at the end of a unit. Giving students opportunities to write, and get feedback on their writing, was useful both for planning purposes and for starting the conversation around bad habits now.

    I had rolling deadlines for these assignments, which students submitted as Google Docs. I would go through a set of submissions for a class, give feedback to those that made progress, and gentle reminders to those that hadn’t. The final grade that went into PowerSchool was whatever grade students had earned by the end of the quarter.

    The principle I applied here (and one to which I have subscribed more fervently with each year of teaching) is that my most valuable currency in the classroom is feedback. Those that waited to get started in earnest didn’t get the same amount of feedback as students that started early, and the quality of their work suffered dramatically. I’m glad I could have these conversations with students now so that I might have a chance of changing their behavior before their actual IA is due.

    An important point – although I did comment on different elements of the rubric, most of my feedback was on the criterion named in the assignment’s title. For example, in my feedback on the communication assignment I occasionally referenced reflection and mathematical presentation, but I gave the most detailed feedback on communication and graded solely on that criterion.

    These were the assignments:

  • I budgeted some of my additional instruction time for explicit calculator instruction. I’ve argued previously about the limitations of graphing calculators compared to Geogebra, Desmos, and other tools that have substantially better user experiences. The reality, however, is that these calculators are what students can access during exams. Without some level of fluency accessing the features, they would be unable to solve some problems. I wrote about this in my review of the course last year. This time was well spent, as students were not tripped up by questions that could only be solved numerically or graphically.
  • Students saw many past paper questions, and seem to have some familiarity with the style of questions that are asked.

What needs work:

  • I’ve come to the conclusion that preemptive advice is ineffective. “Don’t forget to […]” or “You need to be extremely careful when you […]” is what I’m talking about. It isn’t useful for students that don’t need the reminder, and it doesn’t help students who, not having solved problems on their own yet, have no context for what you are telling them not to do. I have found it much more effective to address those mistakes after students get burned by them. Some of my success here comes from my students subscribing to a growth mindset, which is something I push pretty hard from the beginning. Standards based grading helps a lot here too.
  • I desperately need a better way to encourage longer retention of knowledge, particularly in the context of a two year IB course. I’ll comment more on this in a later post, but standards based grading and the quarter system combined were factors working against this effort. I did some haphazard spaced repetition of topics on assessments in the form of longer, section-two-style questions. The fact that I was doing this did not incentivize enough students to regularly review. I also wonder if my conflicted beliefs on fluency versus understanding of process play a role as well.
  • Students consistently have a lot of questions about rounding, reporting answers, and follow through in using those answers in the context of IB grading. The rules are explicitly stated in the mark schemes for questions – answers should be reported exactly or to three significant figures unless otherwise noted. The questions students repeatedly have relate to multiple part questions. For example, if a student does a calculation in part (a), reports it to three significant figures, and then uses the exact answer to answer part (b), might that result in a wrong answer according to the mark scheme? What if the student uses the three significant figure reported answer in a subsequent part?

    I did a lot of research in the OCC forum and read many past papers to try to fully understand the spirit of what IB tries to do. I’d like to believe that IB sides with students that are doing the mathematics correctly, but I am not confident in my ability to explain the IB’s position, which means my students are uncertain too. This bothers me a lot. A small numerical illustration of the rounding issue appears after this list.

  • Students still struggle to remember the nuances of the different command terms during assessments. They will also do large amounts of complex calculation and algebraic work in spite of seeing that a question is only worth two or three marks. There is clearly more work to do on that, though I expect it will improve as we move into year two material because, well, it usually does. I wish there were a way to start the self-reflection process earlier.
  • Students struggle to write about mathematics. They also struggle with the reality that there is no way to make it go faster or do it at the last minute without the quality suffering. I still believe that the way you get better is by writing more and getting feedback, and that’s the main reason I’m glad I made the changes I did regarding the exploration components. That said, students know how to write filler paragraphs, and I call them out on filler every single time.
  • We spent a full day brainstorming and thinking about possible topics for individual explorations. When I surveyed the students afterward, only four were certain about their topics. The rest have asked for additional guidance, which I am still figuring out how to provide over the summer. The process of finding viable topics remains difficult for students.
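
To make the rounding and follow-through issue above concrete, here is a minimal sketch in Python. The numbers are invented for illustration and are not from any actual paper:

```python
# Invented numbers, purely to illustrate the follow-through problem:
# reusing a rounded part (a) answer in part (b) can change the final
# three-significant-figure answer.

def round_sig(x, figures=3):
    """Round x to the given number of significant figures."""
    return float(f"{x:.{figures}g}")

# Part (a): suppose the answer is a common ratio r = 7/11.
r_exact = 7 / 11                # 0.636363...
r_rounded = round_sig(r_exact)  # 0.636, as a student would report it

# Part (b): the 10th term of a geometric sequence with first term 500.
u1 = 500
print(round_sig(u1 * r_exact ** 9))    # 8.56
print(round_sig(u1 * r_rounded ** 9))  # 8.51 -- a different answer to 3 s.f.
```

Both students did the mathematics correctly, yet their reported part (b) answers disagree at three significant figures, which is exactly the ambiguity my students keep asking about.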

Conclusion

I’ll be following these students to year two. We have the rest of probability to do first thing when we get back, which I’ll combine with some dedicated class time for the exploration. I like pushing probability and calculus to year two, as these topics are, by definition, plagued by uncertainty. It’s an interesting context in which to work with students in their final year of high school.

2016 – 2017 Year In Review: PreCalculus

Overview

This was the first time I taught a true PreCalculus course in six years. At my current school, the course serves the following functions:

  • Preparing tenth grade students for IB mathematics SL or HL in their 11th grade year. Many of these students were strong 9th graders who were not yet eligible to enter the IB program, since it must begin in grade eleven.
  • Giving students the skills they need to be successful in Advanced Placement Calculus in their junior or senior year.
  • Providing students interested in taking the SAT II in mathematics some guidance in the topics that are covered by that exam.

For some students, this is also the final mathematics course taken in high school. I decided to design the course to extend knowledge from Algebra 2, continue developing problem solving skills, move a bit further into abstraction of mathematical ideas, and provide a baseline for further work in mathematics. I cut some topics that I used to think were essential to the course but that did not properly serve the many different pathways students can follow in our school. Like Algebra 2, this course can become the Swiss Army knife course that “covers” a lot so that students have been exposed to topics before they really need to learn them in higher level math courses. I think that approach waters down much of the content and the potential of a course like this. What tools are going to be the most useful to the broadest group of students for developing their fluency, understanding, and communication of mathematical ideas? I designed my course to answer that question.

I also found that this course tended to be the one in which I experimented the most with pedagogy, class structure, new tools, and assessment.

The learning standards I used for the course can be found here:
PreCalculus 2016-2017 Learning Standards

What worked:

  • I did some assessments using Numbas, Google Forms, and the Moodle built-in quizzes to aid with grading and question generation. I liked the concept, but some of the execution is still rough around the edges. None of these did exactly what I was looking for, though I think they could each be hacked into a form that does. I might be too much of a perfectionist to ever be happy here.
  • For the trigonometry units, I offered computer programming challenges that were associated with each learning standard. Some students chose to use their spreadsheet or Python skills to write small programs to solve these challenges. It was not a large number of students, but those that decided to take these on reported that they liked the opportunity to think differently about what they were learning.
  • I also explicitly taught spreadsheet functions to develop students’ computational thinking skills. This required designing some problems that were simply too tedious to solve by hand; one example of what I mean appears in the sketch after this list. This was fun.
  • Differentiation in this course was a challenge, but I was happy with some of the systems I used to manage it. As I have found is common since moving abroad, many students are computationally well developed, but not conceptually so. Students would learn tricks in after-school academies that they would try to use in my course, often in inappropriate situations. I found a nice balance between problems that started low on the ladder of abstraction and those that worked higher. All homework assignments for the course in Semester 2 were divided into Level 1, Level 2, and Level 3 questions so that students could decide what would be most useful for them.
  • I did some self-paced lessons with students in groups using a range of resources, from Khan Academy to OpenStax. Students reported that they generally liked when I structured class this way, though there were requests for more direct instruction among some of the students, as I described in my previous post about the survey results.
  • There was really no time rush in this course after my decision to cut out vectors, polar equations, linear systems, and some other assorted topics that really don’t show up again except in Mathematics HL or Calculus BC, where it’s worth seeing them again anyway. Some students also gave very positive feedback regarding the final unit on probability. I took my time with things there. Some of this was out of necessity when I was out sick for ten days, but there were many times when I thought about stepping up the challenge faster than I really needed to.
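
Here is the kind of problem I have in mind for the spreadsheet work mentioned above, sketched in Python. The scenario and numbers are hypothetical; each pass through the loop plays the role of one spreadsheet row:

```python
# Hypothetical problem that is tedious by hand but natural with iteration:
# deposit $100 a month into an account earning 5% annual interest,
# compounded monthly. In which month does the balance first pass $10,000?

balance = 0.0
deposit = 100.0
rate = 0.05 / 12   # monthly interest rate
target = 10_000.0

month = 0
while balance < target:
    month += 1
    balance = balance * (1 + rate) + deposit

print(f"Month {month}: balance = {balance:,.2f}")
```

The point is not the answer itself, but that students have to think about the structure of the calculation in order to set it up at all.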

What needs work:

  • I wrote about how I did the conic sections unit with no numerical grades – just comments in the grade book. The decision to do that was based on a number of factors. The downside was that when I switched back to numerical grades for the final unit, the grade calculation for the entire quarter was based only on those grades, and not on the conic sections unit at all. The conic sections unit did appear on the final exam, but for the most part, there wasn’t any other consequence for students that did not reassess on the unit.
  • Students did not generally like when I used Trello. They liked the concept of breaking up lessons into pieces and tasks. They did not like the forced timelines and the extra step of the virtual Trello board for keeping track of things. This Medium article makes me wonder about doing this in an analog form if I try it in the future. I also could make an effort to instill the spirit of Scrum early on so that it’s less novel, and more the way things are in my classroom.
  • I should have done a lot more assessment at the beginning of units to see what students knew and didn’t know. It sounds like the student experiences in the different Algebra 2 courses leading to PreCalculus were quite different, which led to a range of success levels throughout. Actually, I should probably be doing this more often for all my courses.
  • Students could create their own small reference sheet for every exam. I did this because I didn’t want students memorizing things like double angle identities and formulas for series. The reason this needs work is that some students are still too reliant on having this resource available to ever reach any level of procedural fluency. I know what students need to be fluent in later on in the more advanced courses, sure, but I am not convinced that memorization is the way to get there. Timed drills don’t seem to do it either. This challenge is compounded by the fact that not all students need that level of fluency for future courses, so what role does memorization play here? I have struggled with this in every year of my fourteen-year career, and I don’t think it’s getting resolved anytime soon. This is especially the case when Daniel Willingham, who generally makes great points that I agree with, writes articles like this one.

Conclusion

This course was fun on many levels. I like being there to push students to think more abstractly as they form the foundation of skills that will lead to success in higher levels of mathematics. I like also crafting exercises and explorations that engage and equip the students that are finishing their mathematics careers. We should be able to meet the needs of both groups in one classroom at this stage.

I frequently reminded myself of the big picture by reading through Jonathan Claydon’s posts on his own Precalc course development over the years. If you haven’t checked him out, you should. It’s also entertaining to pester him about a resource he posted a few years ago and hear him explain how much things have changed since then.

2016 – 2017 Year In Review: Surveys

Overview

Last year I took Julie Reubach’s survey and used it for the students in my final set of classes at my previous school. This year I gave essentially the same survey. Probably the most important thing for me was to compare some of the results to make sure the essential elements of my teaching identity made the transition intact.

The positives:

  • Students responded that the reassessments and the quizzing system were important elements to keep for next year. I’ll share more about my reflection on the reassessment system in a later post.
  • Students liked having plenty of time during class to work and get help if they needed it. I tried to strike a balance between this, exploration, and direct instruction. More on that last point below.
  • Students appreciated the structures of class and the materials. They liked having warm-up activities for each class, the organization of documents on Google Drive, and the use of PearDeck for assessment of their ideas during class.
  • The stories, personal anecdotes, and jokes at the start of class apparently go over well with students. I don’t think I could stop this completely anyway, so I’m glad students don’t necessarily see this as being unfocused or as a waste of class time.
  • Students like structured opportunities to work together and solve problems that are not just sets from the handouts. Explorations got strong reviews, which is good because I think they are good uses of class time too.

What needs work:

  • Students want more example problems. I consistently did some in each class, but I always struggled with the balance between doing more problems and addressing issues as they came up individually. Some students want a bit more guidance that doesn’t necessarily require whole group instruction, but say that the individual group explanations or suggestions aren’t meeting their needs completely. This might mean I record some videos or present worked problems as part of the class resources in case students want them.

  • Related to the previous point is the use of homework. Some students want more help on homework, but again don’t necessarily want to spend whole-class instruction time on it. I admit that I still struggle with the usefulness of going over homework, particularly as a whole class, and collecting information on what students struggled with is not a smooth process. The classroom notebook doesn’t solve that problem to my satisfaction either. Short, focused presentations of how to get started on certain problems (and not full solutions) might be all that is needed to address this shortcoming, which many students mentioned in their surveys.
  • Despite my efforts to make learning the unit circle easier, students continue to report their dislike for learning it. I present students with a series of approaches to understanding how to evaluate functions around the unit circle. This is also one of the few topics where I encourage both understanding (through creative assessment questions) and accuracy in evaluating functions correctly using whatever means students find necessary. Memorization, if that is what students choose to do, is one way students could approach this. I think part of the issue is that proficiency in this topic requires more genuine effort than others. There are no shortcuts here, and facility with evaluating trigonometric functions goes a long way in making other topics easier. I’m not sure what the solution is. This is one area where I think procedural fluency has no valid replacement, particularly in the context of IB or PreCalculus preparation.
  • The other topic that students reported finding the most difficult was the binomial theorem, which is surprising given that it is one of the more procedurally straightforward topics in the course. Do I need to consider teaching these in a more formulaic way so that students are more successful? I wonder if I have swung too far in the wrong direction with respect to avoiding activities that demand fluency or practice.
  • Students want more summary of what we’ve done each class and where we are going. I think this is a completely valid request, and is perhaps made easier to do with each course defined in terms of learning standards.
Conclusion

I appreciate how consistently students are willing to give feedback about my classes. There were some really useful individual comments that will help me think about how the decisions I make might affect the spectrum of students in each course. I promised students that I wouldn’t look at the results until after grades were in, just in case that might encourage more honesty. This was an anonymous survey, and with the larger class sizes this year, I think there was a greater degree of anonymity in individual responses. There is a lot to sift through here, which is why I’m glad I still have the better part of the summer to do so.

    2016 – 2017 Year In Review: Technology Tools

    Overview

    I’ve always been a pretty heavy user of technology, though I’ve been more careful in the past few years to use it for a reason rather than for its own sake. I balance that caution with a healthy desire to try new things in ways I would actually use in the classroom.

    This is also the first year I’ve been able to take advantage of the Google Tools suite since Vietnam is not subject to the limitations of China’s Great Firewall. Though there have been times when the internet connection to the entire country has been subject to shark attacks, connections in general have been smooth. Seeing how effectively some folks use Google in the classroom after being unable to use it for six years makes me feel seriously behind the times. Luckily, my colleagues are really eager to share what they do. I might be caught up.

    Here’s the toolkit I used this year:

    1. A MacBook Pro where I do most of my lesson planning. I connected an external widescreen monitor that mirrored all projected content. The second screen was sent through AirPlay to an Apple TV, which was then connected to the projector.
    2. Class worksheets electronically created and stored in Google Docs. These are printed out on A5 size sheets for students to tape into their notebooks for a physical record of what we did.
    3. For IB Mathematics SL and PreCalculus, I had two students per class make an additional Google Doc that was a copy of the handout. In this document, students would paste solutions to class work, homework, and whatever else they thought might be important to their classmates. The student responsibility for doing this was on a rotating schedule, similar to what I’ve used in my previous classes.
    4. Notability app for class notes, with a Wacom Tablet for input. I used the wireless accessory kit for around two days, because it disconnected too frequently.
    5. iPhone as a document camera for capturing student work for sharing answers or for conversation during the class. I would take pictures of student work and use AirDrop to upload them for inclusion in the notes.
    6. Moodle as a repository for all of the above documents and links. I also used it occasionally for distributing quizzes and automatic grading.
    7. My WeinbergCloud website for managing, assigning, and recording reassessments throughout the semester.
    8. PearDeck on a trial basis for first semester, and then regularly during second. I sometimes used an iPad to manage the class, but every time I regretted it, and just used my computer.
    9. Desmos Calculator, usually at least once per lesson.
    10. Desmos Activity Builder, about once per unit per course.
    11. EdPuzzle for self paced lessons, videos, and quizzes in Algebra 2. Most of the videos were produced by my colleague, Scott Hsu.
    12. Spreadsheets for building useful calculators (like discriminants for quadratics, arithmetic/geometric series sums, etc.); a sketch of these appears after this list.
    13. Khan Academy for practice exercises and monitoring of student effort in reviewing material.
    14. Geogebra for checking exam questions and demonstrating its use as a work-checking tool for students.
    15. Camtasia for recording videos of problem solutions from time to time.
    16. Quizster as a way to have students submit specific homework problems for feedback.
    17. Wireless keyboard and trackpad, though these lasted about a week and a half.
    18. I dabbled with GoFormative, primarily when I was on sick leave for a while. Connection issues that were inconsistent across the class led to my abandoning it for regular use.
    19. For two units in PreCalculus, I used Trello as a way to organize units and help students organize their work for each day of class.
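
The spreadsheet calculators in item 12 were nothing fancy: a few input cells and a formula cell each. Here is the same logic written as Python functions, with invented example values, as a sketch of what those sheets computed:

```python
def discriminant(a, b, c):
    """Discriminant of ax^2 + bx + c; its sign gives the number of real roots."""
    return b**2 - 4*a*c

def arithmetic_sum(a1, d, n):
    """Sum of the first n terms of an arithmetic series with first term a1."""
    return n * (2*a1 + (n - 1)*d) / 2

def geometric_sum(a1, r, n):
    """Sum of the first n terms of a geometric series (requires r != 1)."""
    return a1 * (1 - r**n) / (1 - r)

print(discriminant(1, -4, 4))    # 0 -> one repeated real root
print(arithmetic_sum(3, 2, 10))  # 3 + 5 + ... + 21 = 120
print(geometric_sum(2, 3, 5))    # 2 + 6 + 18 + 54 + 162 = 242
```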

    What worked:

    • The process of cutting and pasting images of problems or student work into Notability, and then annotating them was great for recording important information during class. These notes were then either pasted into the document created by students for each class, or exported as PDF for posting on Moodle. This felt like a good way to have a record of what went on during a given class block in case students missed a block.
    • I liked automated grading of quizzes through Google Forms and Moodle. This definitely saved time, but the process of getting feedback back to students is still awkward. When student work is analog, but answer checking is digital, where should that feedback go? Quizster offers some way of making all of this occur in the same tool, but the workflow was never smooth enough to fully commit to it.
    • The combination of PearDeck and Desmos Activity builder, along with photos of student work, made for great sources of understanding (and misunderstanding) that helped me decide how or whether to proceed with other material. These also made for great motivating elements for direct instruction when it needed to happen. The students really liked using these tools, and said they looked forward to them, according to the end of year survey results.
    • I don’t think Khan Academy exercises work well for assessing students beyond a basic level. I think they can provide the practice some students need on procedural skills like factoring or evaluating trigonometric functions. It’s just one tool among many to serve the needs of my students.
    • When I provided guidance on how spreadsheets could be used for more than just making charts, students appreciated it. One student went so far as to say that this instruction was “actually useful”. I decided not to ask what this student thought about the rest of the class.

    What needs work:

    • I posted homework problems on the printed class handout, the digital handout, and a dedicated assignments document that is an expectation across classes in our high school division. Consistently updating all three documents was a challenge, despite my best efforts.
    • The student notebook entries for each class were among the least favorite elements of the class, as reported by students. I’ve written previously about the need to have some record of what happens during class, and about my frustration with students that do not produce their own record through regular use of a dedicated notebook. This isn’t the best solution, but I think it’s the closest I’ve come to something that actually reaches the right balance. I just wish I could figure out how to get students to buy into its usefulness.
    • I still have not figured out the best way to bring the class back together after letting them work at their own pace through lessons. The only times it makes a lot of sense to do this are at the very beginning of class and at the very end.
    • PearDeck, Desmos Activity Builder, and GoFormative each offer features that I really like. None of them do everything. I’m ok with this, but I wonder whether the fragmentation of activities is good for students, or a problem since their work is distributed across these tools.
    • While I liked using Trello, and some students reported that they also appreciated it, many students did not. I’m not sure if it actually is the self-paced lesson tool I’m looking for, but it was better than a static Google document.
    • At the end of the year, despite my own research and attempts to improve this, the Apple TV disconnected at least once every class period, if not more frequently.

    Conclusion

    My focus continues to be on using technology to free up time for the ways that I can best add value in the classroom. Many students don’t need my help in making progress. Some do, and some like having me explain ideas to them. It’s hard to simultaneously meet these different needs without technology, which enables me to be in multiple places at once.

    Having the range of tools I describe above, and not fully committing to one, is both a blessing and a curse. The fragmentation means the residue of learning is distributed across many web addresses. The variety helps keep students (and me) from getting into a rut. I don’t know if this balance is appropriately tuned yet.

    2016-2017 Year in Review: Being New

    Overview

    This was my first year since 2010 being the new kid in school. Developing a reputation takes time, so I was deliberate about establishing who I am as a teacher from the beginning. I wrote about learning names at the beginning of the year, for example. My school, surpassing 1,000 students this year, is the second largest of those at which I have worked. The high school division is just over 320 students. There are many systems that are in place to manage the reality of this group of ninth through twelfth graders that has a tremendous diversity of interests, programs (IB and AP), extra-curricular organizations, and obligations outside of the school walls. I walked in admittedly intimidated by the scope of this place and what it aims to accomplish.

    After one of our meetings before student orientation, there was a lot of information that had been shared. I asked our high school principal what the priority needed to be in the first quarter in terms of processing all of that information. He put me at ease – the focus should be on figuring out how this place works. He promised to remind us of what was important throughout the year (and certainly delivered), with the understanding that there would be a learning curve for our group of newbies. The faculty is passionate about teaching and creative in designing classroom experiences. They were intensely committed to sharing what they do and to helping those of us who were new figure out how to prioritize at any given time.

    What worked:

    • The beginning of school was a mix of content and getting-to-know-them/me activities that were deliberately designed for those purposes. This sort of thing is important at the beginning of any year if the composition of a class is new. It’s essential if the teacher is new too. Each group is unique and has a chemistry that not only is important to suss out in the beginning, but must be regularly assessed as the year proceeds. I thought this series of activities worked really well. I will modify these to offer variety for next year’s student groups, but not really for improvement.
    • I was able to get most of my lesson preparation done at school during my prep periods. Exceptions were after exams, the end of the year, and near reporting deadlines. This required serious prioritization and disciplined decisions about what I could actually accomplish in those blocks of time. While I maintained to-do lists, a major component of my success came from block-scheduling those tasks and sticking to the schedule. This left time after school and at home to spend designing the explorations, experiments, and bigger picture puzzles that were nice, but not necessary.
    • I streamlined many of the administrative procedures I had created in my previous schools. I rebuilt spreadsheets that had been unchanged for several years rather than hacking old ones to work. Part of this was required to address the fact that my class sizes were substantially larger, but I also decided it was time.
    • As I had hoped to do, I spent much of the year watching. I did not want to come in and identify everything that this school did not have that I may have had a hand in organizing in past school years, and then add it myself. That is how I came to feel burnt out every time June came around. I was quite picky with what I involved myself in. I said no to things. When I was ready to design a VEX robotics sprint (more on that later) at the end of the year, however, this meant I had the energy and drive to do so.
    • The level of support I have felt from administrators and colleagues this year has been incredible. Nothing makes you feel so effective as a team that has your back, and that is realistic about what should, what can, and what cannot be accomplished with a given set of resources.

    What needs work:

    • I did not get out and visit my colleagues anywhere near as frequently as I wanted. This is a seriously impressive group of teachers trying different things. Part of the problem was my commitment to trying to get things done during my prep periods, so I do take responsibility for that. It would not have been too devastating to that structure, however, if I had also planned blocks of time when I would visit specific colleagues. I ate in the lunchroom with colleagues fairly regularly, and that was great for learning what everyone was doing. It was not enough. More of that next year.
    • I originally planned on doing outreach with parents more regularly this year. They are incredibly trusting of what we as teachers design for students, and this was evident at parent teacher conference nights during both semesters. I want more than that though. I want them to understand my philosophy for why learning mathematics right now is important. I don’t think the parents understand standards based grading, and although the students made solid attempts to explain it during conferences, these conversations don’t happen nearly as frequently as they should. I need to think more about what that communication looks like, and why I feel it is important, because I don’t think I can fully articulate that here. I do know that there is a lost opportunity when it comes to parents really understanding what we do in the classroom on a regular basis.
    • I now believe that the ease of establishing community and connections with others is inversely related to the ease of living in that place. I often tell the story of how it was easy to rally a group of colleagues in my early days of China to go find cheese, which was difficult to find. Many of my closest bonds were formed during those adventures of first world adversity. Here in District 7 of HCMC, there is no such difficulty. Life is really good and easy here. This means, however, that one must work a little bit harder to leave the comfortable bubble of life to find adventure and make new friends. This is especially the case for me as the father of a now toddling ball of energy. It takes effort and time to build those relationships. That’s definitely something that I need to work on deliberately planning more frequently next year.

    Conclusion

    The second year anywhere is always less scattered than the first. The next few weeks are all about figuring out how to use the time not spent learning the ropes.

    Building Models: Drone Intercept and Desmos

    I put together an activity using Desmos Activity Builder that was a variation on an older air traffic control task as part of my unit on parametric equations and vectors.

    Here’s some of the copy for the activity:

    Students could only see six seconds of the drone animation before the drones disappeared from the screen. I had students detail their process of finding the intersection point and intersection time as part of the follow-up for this activity.

    My favorite product of this activity though came with the superposition of everyone’s models on top of the drone footage. Here’s that result (click to see the animation):

    We had some really productive discussions as part of evaluating this result. The students noticed how most people had the correct initial location, but dramatically different results based on the velocity vectors used to generate the parametric expressions. Some students saw it as cheating to use Desmos to gather data, make calculations to create an approximate solution, and then tweak that solution. I shared that I saw that as a natural role of feedback in the modeling process.
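
For anyone who wants the mathematical gist: each drone’s path is modeled as position(t) = start + velocity·t, and an intercept is a time at which both parametric positions agree. Here is a minimal Python sketch with invented starting points and velocities (the actual activity used the drone footage as the data source):

```python
# Invented numbers: each drone follows position(t) = start + velocity * t.
p1, v1 = (0.0, 0.0), (3.0, 2.0)    # intercepting drone
p2, v2 = (10.0, -5.0), (1.0, 3.0)  # target drone

# Intercept: p1 + v1*t == p2 + v2*t, i.e. (v1 - v2)*t == p2 - p1 per component.
dvx, dvy = v1[0] - v2[0], v1[1] - v2[1]
dpx, dpy = p2[0] - p1[0], p2[1] - p1[1]

t = dpx / dvx                  # solve the x-component equation...
if abs(dvy * t - dpy) < 1e-9:  # ...and check it against the y-component
    print(f"intercept at t = {t}: ({p1[0] + v1[0]*t}, {p1[1] + v1[1]*t})")
else:
    print("no exact intercept -- tweak the velocity vector and try again")
```

The tweak-and-check branch at the end is the feedback loop I described to students as a natural part of modeling.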

    The activity has one slide with some behind-the-scenes Activity Builder features, and I’m not sure I should release that at this point. If you are interested in using this activity with your students, let me know, and I can create a version without that slide.

    What Does the Desmos for Probability Look Like?

    Desmos, the online graphing calculator, activity builder, and general favorite of the MTBoS, does phenomenal work.

    I found myself wondering over the past few days about the statistics and probability world and how there isn’t a Desmos parallel in that realm for easy experimentation. You can put an algebraic expression in Desmos and graph it in a few keystrokes. You can solve a problem algebraically, and then graph to see if you are correct. There are multiple ways to confirm graphically or numerically what you have solved algebraically, or any other permutation of the three.

    It’s also easy to put a bunch of marbles in a jar, pick one, replace, and repeat, though this becomes tedious beyond a few trials. Such a small data set often isn’t enough to really see long term patterns, particularly in cases where you are trying to test whether the theoretical probability you have calculated is correct or not. For subtle cases that involve replacement versus no replacement, the differences between the theoretical probabilities of events are small if there are enough marbles in the jar.

    Creating a simulation and running it half a million times is possible in a spreadsheet or any number of computer languages, but the barrier to entry there is not trivial. I’ve written simulations of various problems myself, and usually make predictions for what I think is going to happen. I will then usually work to find the theoretical probability by hand.
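
For concreteness, here is a minimal Python sketch of the marble experiment described above, run half a million times with and without replacement. The jar contents are invented for illustration:

```python
import random

def draw_two(marbles, replace):
    """Draw two marbles from the jar, with or without replacement."""
    jar = list(marbles)
    first = random.choice(jar)
    if not replace:
        jar.remove(first)
    return first, random.choice(jar)

marbles = ["red"] * 7 + ["blue"] * 5
trials = 500_000

for replace in (True, False):
    both_red = sum(
        draw_two(marbles, replace) == ("red", "red") for _ in range(trials)
    )
    print(f"replace={replace}: P(both red) ~ {both_red / trials:.4f}")

# Theoretical values: 7/12 * 7/12 = 0.3403 with replacement,
# 7/12 * 6/11 = 0.3182 without.
```

That gap between 0.3403 and 0.3182 is exactly the kind of subtlety a few dozen physical trials can never resolve, and why the barrier to entry for simulation matters.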

    So what would this sort of probability playground look like? There are some examples out there already. Here’s one from CPM for small numbers of trials. I haven’t done an exhaustive search, but I haven’t seen anything that truly allows full experimentation at the level I’m hoping to achieve. Here are some ideas for what I would love to see exist:

    • Natural language definitions for sources of possible outcomes. By this, I mean being able to define outcomes verbally. This might mean “rain” and “no rain”, with the assumption that having only two labels means these events are complementary. This might mean we define numbers of items for each possible outcome, or simply enter the probability of each as a decimal. The key thing is that I do not want to require labeling events as A or B, and throwing notation around. Let’s see if we can make this as visual and easy to explore as possible.
    • Ease of setting up conditional outcomes for compound events. If event A (I know, I’m breaking the previous rule here) happens, only B and C are possible, and event D is only possible if event A does not occur.
    • Sinks that easily allow for large numbers of trials (see the sketch after this list). I might want to have a single trial generated a million times – tell me the proportion of all of the different outcomes. Make it easy for me to count up instances of binomial probability and see how many times, out of ten, I get three or more successes. Tell me when I’m not looking at all of the possibilities. For example, give me some visual indication that when I’m picking two marbles from a jar, if I only have both-red and both-blue in my possible outcomes, I’m missing the outcomes in which there is one of each.
    • Make it easy to tap into existing complex data sets for exploration purposes. Include some data sets that are timely and relevant. The US election comes to mind.
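
Here is roughly what I mean by a sink for large numbers of trials, sketched in Python with invented parameters (ten attempts with success probability 0.2 each, repeated a million times):

```python
import random

p, attempts, trials = 0.2, 10, 1_000_000

# One compound trial: count successes in ten attempts; record whether
# there were three or more. Repeat a million times.
hits = sum(
    sum(random.random() < p for _ in range(attempts)) >= 3
    for _ in range(trials)
)
print(f"P(3 or more successes in {attempts}) ~ {hits / trials:.4f}")

# Exact binomial answer for comparison: 1 - P(0) - P(1) - P(2) ~ 0.3222
```

The tool I’m imagining would do this without any code at all, and would flag it when the outcomes I’ve listed don’t exhaust the possibilities.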

    I realize this is a tall order, but I’ve seen how far the Desmos team has explored the algebraic/numerical space. Now that they have expanded into the Geometry space through their beta, I wonder whether they (or someone else, for that matter) have something like this probability exploration tool on their roadmap.

    Building Arguments with Probability and the Clips App

    I don’t like projects for assessment. I do like in-class projects for the purposes of fostering discussion and other forms of interaction. I decided to put together something fun to build time into the unit while students developed their skills in applying binomial probability. From student feedback, they actually said it was fun, so this wasn’t just hopeful thinking (this time). This also had the added value of giving students a chance to work on Common Core mathematical practice standard 3: Construct viable arguments and critique the reasoning of others.

    I gave pairs of groups of students a statement. The center paragraph was the same for both – a statement about probabilities. The paragraphs preceding and following that were different – conflicting contexts for each statement. Here’s an example.

    I ended up writing four sets of situations to make sure that each class had at least two groups working on the same probability statement, but different arguments.

    I asked students to do calculations and write a 100-word abstract stating their argument. After learning that the Clips app, recently released by Apple, made for a really easy way for students to creatively describe and document their thinking, I also asked students to create a two-minute video documenting the situation and their argument. You can see a selection of the video results below.

    Students were really challenged to search for the calculations and results that supported their arguments. Some reported that they felt dishonest doing so.

    You can check out all four sets of scenarios and the rubric I used here. The students said that working in teams and working through this task was enjoyable and actually reinforced their understanding of how to use binomial probability. As with a previous unit, this project was graded for completion only, not for a numerical grade, a fact I stated up front. So far, the students haven’t actually said this was a problem for them, and the quality of what they produced didn’t seem to suffer much.

    An Experiment: Swapping Numerical Grades for Skill-Levels and Emoji

    I decided to try something different for my PreCalculus class over the past three weeks. There was a mix of factors that led me to do this when I did:

    • The quarter was ending in one week, with spring break beginning at the end of the next. Not a great time to start a full unit.
    • I knew I wanted to include some conic sections content in the course since it appears on the SAT II, and since the graphs appear in IB and AP questions. Some familiarity might be useful. In addition, conic sections also appear as plus standards within CCSS.
    • The topic provides a really interesting opportunity to connect the worlds of geometry and algebra. Much of this connection, historically, is wrapped up in algebraic derivations. I wanted to use technology to do much of the heavy lifting here.
    • Students were exhibiting pretty high levels of stress around school in general, and I wanted to provide a bit of a break from that.
    • We are not in a hurry in this class.

    Before I share the details of what I did, I have to share the other side of this. A long time ago, I was intrigued by the conversation started around the Twitter hashtag #emojigrading, a conversational fire stoked by Jon Smith, among many others. I like the idea of using emoji to communicate, particularly given my frustrations over the past year with how communicating grades as numbers distorts their meaning and implies precision that doesn’t exist. Emoji can be used to communicate quickly, but can’t be averaged.

    I was also very pleased to find out that PowerSchool comments can contain emoji, and will display them correctly based on the operating system being used.

    So here’s the idea I pitched to students:

    • Unit 7 standards on conic sections would not be assessed with numerical grades, ever. As a result, these grades would not affect their numerical average.
    • We would still have standards quizzes and a unit exam, but instead of grades of 6, 8, and 10, there would be some other designation that students could help select. I would grade the quizzes and give feedback during the class, as with the rest of the units this year.
    • Questions related to Unit 7 would still appear on the final exam for the semester, where scores will be point based.

    I also let students submit some examples of an appropriate scale. Here’s what I settled on based on their recommendations:

    I also asked them for their feedback before this all began. Here’s what they said:

    • Positive Feedback:
      • Fourteen students made some mention of a reduction in stress or pressure. Some also mentioned that the grade being less specific was a good thing.
      • Three students talked about being able to focus more on learning as a result. Note that since I already use a standards based grading system, my students are pretty aware of how much I value learning being reflected in the grade book.
    • Constructive Feedback:
      • Students were concerned about their own motivation about studying or reassessing knowing that the grades would not be part of the numerical average.
      • Some students were concerned about not having knowledge of where they are relative to the boundaries of the grades. Note: I don’t see this by itself as a bad thing, but perhaps as the start of a different conversation. Instead of how to raise my grade, the question becomes how to develop the skills needed to reach a higher level.
      • There were also mentions of ‘objectivity’ and how I would measure their performance relative to standards. I explained during class that I would probably do what I always do: calculate scores on individual standards, and use those scores to inform my decisions on standards levels. I was careful to explain that I wasn’t going to change how I generate the standards scores (which students have previously agreed are fair) but how I communicate them.

    I asked an additional question about what their parents would think about the change. My plan was to send out an email to all parents informing them of the specifics of the change, and I wanted students to think proactively about how their parents would respond. Their response in general: “They won’t care much.” This was surprising to me.

    So I proceeded with the unit. I used a mix of direct instruction, some Trello style lists of tasks from textbooks, websites, and Desmos, and lots of circulating and helping students individually where they needed it. I tried to keep the only major change to this unit to be the communication of the scores through the grade book using the emoji and verbal designation of beginner, intermediate, expert. As I also said earlier, I gave skills quizzes throughout.

    The unit exam was a series of medium level questions that I wanted to use to gauge where students were when everything was together. As with my other units, I gave a review class after the spring break where students could work on their own and in groups, asking questions where they needed it. Anecdotally, the class was as focused and productive as for any other unit this year.

    I was able to ask one group some questions about this after their unit test, and here’s how they responded:

    The fact that the stress level was the same, if not less, was good to see. The effort level did drop in the case of a couple of students here, but for the most part, there isn’t any major change. This class as a whole values working independently, so I’m not surprised that none reported working harder during this unit.

    I also asked them to give me general feedback about the no-numerical-grades policy. Some of them deleted their responses before I could take a look, but here’s some of what they shared:

      • Three students confirmed a lower stress level. One student explained that since there was no numerical grade, she “…couldn’t force/motivate [her]self to study.”
      • Five students said the change made little to no difference to them. One student summed it up nicely: “It wasn’t much different than the numerical grades, but it definitely wasn’t worse.”
      • One student said this: “The emojis seemed abstract so I wasn’t as sure of where I was within the unit compared to numbers.” This is one of a couple of the students that had concerns about knowing how to move from one level to the next, so the unit didn’t change this particular student’s mind.


    This was a really thought-provoking exercise. A move away from numerical grades is a compelling proposition, but a frequent argument against it is that grades motivate students. By no means have I disproven that claim with the results of my small case study. If a move like this can have a minimal effect on motivation, and students get the feedback they need to improve, it offers an opportunity for considering similar experiments in my other classes.

    There are a couple of questions I still have on this. The first: will students choose to reassess on the learning standards from unit 7, given that doing so won’t change the numerical average when we return to numerical grades for unit 8? The second involves the longer term retention of this material. How will students do on these questions when they appear on the final exam?

      I’ll return to this when I have more answers.


    SBG and Leveling Up – Part 2: Machine Learning

    In my 100-point scale series last June, I wrote about how our system does a pretty cruddy job of classifying students based on raw point percentages. In a later post in that series, I proposed that machine learning might serve as a way to make sense of our intuition around student achievement levels and help provide insight into refining a rubric to better reflect a student’s ability.

    In my last post, I wrote about my desire to become more methodical about my process of deciding how a student moves from one standard level to the next. I typically know what I’m looking for when I see it. Observing students and their skill levels relative to a given set of tasks is often required to identify a student’s level. Defining the characteristics of different levels is crucial to communicating those levels to students and parents, and for being consistent among different groups. This is precisely what we intend to do when we define a rubric or grading scale.

    I need help relating my observations of different factors to a numerical scale. I want students to know clearly what they might expect to get in a given session. I want them to understand my expectations of what is necessary to go from a level 6 to a level 8. I don’t believe I have the ability to design a simple grid rubric that describes all of this to them though. I could try, sure, but why not use some computational thinking to do the pattern finding for me?

    In my last post, I detailed some elements that I typically consider in assigning a level to a student: previously recorded level, question difficulty, number of conceptual errors, and numbers of algebraic and arithmetic errors. I had the goal of creating a system that lets me go through the following process, sketched in code after the list:

    • I am presented with a series of scenarios with different initial scores, arithmetic errors, conceptual errors, and so on.
    • I decide what new numerical level I think is appropriate given this information. I enter that into the system.
    • The system uses these examples to make predictions for what score it thinks I will give a different set of parameters. I can choose to agree, or assign a different level.
    • With sufficient training, the computer should be able to agree with my assessment a majority of the time.
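
My working prototype ran in the browser using React and a JavaScript neural network library, but the idea fits in a few lines of Python. Here is a hedged sketch using scikit-learn’s MLPRegressor; the features, training examples, and network size are all invented for illustration:

```python
from sklearn.neural_network import MLPRegressor

# Features: [previous level, question difficulty, conceptual errors,
#            algebraic errors, arithmetic errors]
# Target: the new level I decided to assign in that scenario.
X = [
    [6, 2, 0, 0, 0],  # clean work on a medium question -> moved up
    [6, 2, 1, 1, 0],  # a conceptual slip -> stayed put
    [8, 3, 0, 0, 1],  # hard question, one arithmetic slip -> moved up
    [8, 1, 2, 0, 0],  # easy question, conceptual errors -> moved down
]
y = [7, 6, 9, 7]

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

# Suggested level for an unseen scenario; I can accept it or override,
# and each override becomes a new training example.
print(model.predict([[7, 2, 0, 1, 0]]))
```

With enough of my real decisions recorded, the predictions start agreeing with what I would have assigned anyway, which is the point.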

    After a lot of trial and error, more learning about React, and figuring out how to use a different machine learning library than I used previously, I was able to piece together a working prototype.

    You can play with my implementation yourself by visiting the CodePen that I used to write this. The first ten suggested scores are generated by increasing the input score by one, but the next ten use the neural network to generate the suggested scores.

    In my next post in this series, I’ll discuss the methodology I followed for training this neural network and how I’ve been sharing the results with my students.
