2014-2015 Year In Review: IB Physics SL/HL

This was my first year teaching IB Physics. The class consisted of a small group of SL students plus one HL student, and we met every other day according to the block schedule. I completed the first year of the sequence with the following topics, listed in order:

    Semester 1

  1. Unit 1 - Experimental Design, Uncertainty, Vectors (Topic 1)
  2. Unit 2 - Kinematics & Projectile Motion (Topic 2.1)
  3. Unit 3 - Newton's Laws (Topic 2.2)
  4. Unit 4 - Work, Energy, and Momentum (Topic 2.3)

    Semester 2

  5. Unit 5 - Circular Motion, Gravitation, and Orbits (Topics 6.1, 6.2)
  6. Unit 6 - Waves and Oscillation (Topic 4, AHL Topic 9, AHL Engineering Option Topics B3.1, B3.2)
  7. Unit 7 - Thermal Physics (Topic 3, Engineering Option Topic B2)
  8. Unit 8 - Fluid Dynamics (Engineering Option Topic B3)

For the second semester of the course, at least one block every two weeks was devoted to the HL student and the HL-only content; during this time, the SL students worked on practice problems or other work they had for their IB classes. Units 7 and 8 ran concurrently, so the HL student had to work on the thermodynamics content and the fluid dynamics content together. This was similar to how I handled it previously while teaching the AP Physics B curriculum.

One other fact that is relevant - none of my students are native speakers of English. More on this later.

What worked:

  • The growth students made during the year was significant. I saw students improve their problem-solving skills and their organization while working through textbook-style assessment problems.
  • I learned to be honest about the IB expectations for answering questions on assessments. In the beginning, I tried to shield students from questions that combined conceptual understanding, computation, and complex language, often choosing two out of the three for any one question that I either wrote or selected from a bank. My motivation was to isolate assessment of the physics content from assessment of the language. I wanted answers to these separate questions:
    1. Does the student understand how the relevant physics applies here?
    2. Does the student understand how to apply the formulas from the reference table to calculate what the question is asking for?
    3. Can the student process the text of the question into a physics context?
    4. Can the student effectively communicate an answer to the question?

    On official IB assessment items, however, this granularity doesn't exist. The students need to be able to do all of these to earn the points. When I saw a significant difference between how my students did on my assessments versus those from IB, I knew I needed to change. In retrospect, acknowledging this was a good move.

  • Concise chunks of direct instruction followed by longer problem solving sessions during class worked extremely well. The students made sense of the concepts and thought about them more while they were working on problems than when I was giving them new information or guiding them through it. That time spent stating the definitions was crucial. The students did not have a strong intuition for the concepts, and while I attempted student-centered conceptual development of formulas and concepts whenever possible, these approaches just didn't end up being effective. It is very possible this is due to my own inexperience with the IB expectations, and my conversations with other teachers helped a lot to refine my balance of interactivity with an IB pace.
  • Students looked forward to performing lab experiments. I was really happy with the way this group of students got into finding relationships between variables in different situations. Part of this was the strong influence the Modeling Instruction curriculum has had on my teaching. As always, students love collecting data and getting their hands dirty because it's much more interesting than solving problems.

What needs work:

  • My careless use of the reference sheet in teaching directly caused students to rely excessively upon it. I wrote about this previously, so check that post out for more information. In short: students used the reference sheet as a list of recipes, as if it provided a straight-line path to solutions. It should instead be used as a toolbox: a reminder of the relationships between variables for various physics concepts. I changed this partly at the end of the year by asking students to describe what they wanted to look for on the sheet. If their answer was 'an equation', I pressed further, or pointed out that they weren't about to use the reference sheet for what it was designed to do. If their answer was that they couldn't remember whether pressure was directly or inversely related to temperature, I asked them what equation describes that relationship, and they were usually able to tell me.
    Both of these are examples of how the reference sheet does more harm than good in my class. I fault myself here, not the IB, to be clear.
  • The language expectations of the IB were a bigger obstacle out of the gate than I expected at the beginning of the year. I previously wrote about my analysis of the language on IB physics exams. There tends to be a lot of verbal description in questions. Normally innocuous words get in the way of students simultaneously learning English and understanding assessment questions, and this makes all the difference. These questions are noticeably more complex in their language than those on AP exams, though the physics content is not, in my opinion, more difficult. This goes beyond physics vocabulary and question command terms, which students handled well.
  • Learning physics in the absence of others doesn't work for most students. Even the stronger students made missteps while working alone that could have been avoided by working with other students. I modified my class to involve a lot more time working problems during class, and I pushed students to at least start the assigned homework problems while I was around, to make the time outside of class more productive. Students typically can figure out math homework with the various resources available online, but this just isn't the case for physics at this point. It is difficult for students to get good at physics without asking questions, getting help, and seeing the work of other students as it's generated, and this was a major obstacle this semester.
  • Automaticity in physics (or any subject) shouldn't be the goal, but experience with concepts should be. My students really didn't get enough practice solving problems to recognize one situation versus another. I don't want students to memorize the conditions for energy being conserved, because a memorized fact doesn't mean anything. I do want them to recognize a situation in which energy is conserved, however. I gave them a number of situations, some involving conservation, others not, and hoped to have them see the differences and, over time, develop an awareness of what makes the two situations different. This didn't happen, partly because of the previous item about working physics problems alone, but also because they were too wrapped up in the mechanics of solving individual problems to do the big picture thinking required for that intuition. Group discussions help with this, but this process ultimately happens at the individual level due to the nature of intuition. This will take some time to figure out.
  • Students hated the formal process of writing up any part of the labs they performed, in spite of what I already said about their positive desire to do experiments. The expressions of terror on the students' faces when I told them what I wanted them to do with an experiment broke my heart. I required them to do a write-up of just one of the criteria for the internal assessment, just so they could familiarize themselves with the expectations before we get to this next year. A big part of this fear is again related to the language issue. Another part is simple inexperience with writing about the scientific process. This is another tough nut to crack.

There was limited interest in the rising junior class for physics, so we won't be offering year one to the new class. This means that the only physics class I will have this year will be with the same group of students moving on to the second year of IB physics. One thing I will change for physics is a set of memorization standards, as mentioned in my post about standards based grading this year. Students struggled to remember quick facts, which made problem solving more difficult (e.g. "What is the relationship between kinetic energy and speed?"), so I'll be holding students responsible for these in a more concrete way.

The issues that need work here are big ones, so I'll need some more time to think about what else I will do to address them.

Leave a Comment

Filed under IB, physics, year-in-review

#TeachersCoding: Building a VotingBooth with MeteorPad

I previously wrote about a game that I had students build as part of our Meteor app building unit.

Here's another video series that would be of much better use to teachers: Have students vote on an answer to a question. The example I gave in class was of students voting on which picture of my dog Mileaux was the best.

This is a fairly simple application of Meteor principles that could be useful in a classroom setting. Since it's all done on MeteorPad, you don't have to install anything on your computer. If you threw this together and then wanted to actually give it to a class, you could give them the link highlighted in red in the image below. NOTE: This link will change whenever you reload the MeteorPad page, so make sure you share the one corresponding to your instance of the application.

[Screenshot: the MeteorPad page, with the shareable app link highlighted in red]

You can visit the fully functioning code and try it out at this link:
VoteForMileaux - MeteorPad
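At its core, the app just keeps a running tally of votes keyed by choice. Stripped of the Meteor collection and template layer, the tallying logic amounts to something like the sketch below (a plain-JavaScript sketch with hypothetical names, not the actual app's code, which stores votes in a Mongo collection):

```javascript
// Plain-object vote tally; the real app would persist this in a collection.
var votes = {};

function castVote(choice) {
  // Increment the count for this choice, starting from zero if unseen.
  votes[choice] = (votes[choice] || 0) + 1;
}

function winner() {
  // Return the choice with the highest tally so far.
  return Object.keys(votes).reduce(function (best, choice) {
    return votes[choice] > (votes[best] || 0) ? choice : best;
  }, null);
}

castVote("mileaux-photo-1");
castVote("mileaux-photo-2");
castVote("mileaux-photo-1");
console.log(winner());  // "mileaux-photo-1"
```

In the Meteor version, each button click would call something like this increment on the server, and the reactive template would redraw the totals automatically.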

Leave a Comment

Filed under programming

2014-2015 Year In Review: Web Programming

This was the first year I've taught a computer programming course. The class was a broad survey of programming in HTML5. This was the overall sequence:

    Semester 1:

  1. Hacking a webpage from the browser console
  2. HTML tags, structures, and organization
  3. CSS - page design, classes and IDs, along with using Bootstrap
  4. Javascript - variables, structures, conditionals
  5. jQuery - manipulating the page using events and selectors, animations

    Semester 2:

  6. Mongo Databases & Queries
  7. HTML Templates using Blaze
  8. Writing Meteor Apps
  9. Meteor, Media, and the HTML5 Canvas
  10. HTML5 Games using Phaser

I have posted the files and projects I used with students at this repository on Github:
https://github.com/emwdx/webprogramming2014-2015

What did I do?

The class generally began with a warm-up activity that involved students analyzing, writing, or running code that I gave them. This always led into what we were going to explore in a given day's lesson. I would show the class a few lines of code and ask them to predict what they thought would happen. This might be a visual prediction: what will this look like? Will there be an error? Was this error intentional or not?

This was all done while students had their laptops closed and notebooks open. I usually designed a series of tasks for students to complete using some code snippets that were saved in the directory on the school server.
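A warm-up in this spirit might look like the following snippet (a hypothetical example, not one of my actual warm-ups), where students predict each logged value before running the code:

```javascript
// Predict what each line logs before running it.
var x = "5";
var y = 2;
console.log(x + y);  // "52": with a string operand, + concatenates
console.log(x - y);  // 3: the - operator coerces the string to a number
```

The surprise that two near-identical lines behave so differently is exactly the kind of thing that makes a good prediction prompt.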

We didn't use any textbook, so I knew I needed to create a reference that students could refer back to whenever they got stuck. For each class, I took notes either in Microsoft OneNote or the SMART Notebook software and saved the notes in PDF form. I don't know if students used this or not.

I had three types of assessment:

  • Mini-projects, which were fairly straightforward and had unique answers. These were assessed by general completion (4.5/5), with a (5/5) given for effort to creatively make the code their own. I was fairly loose on that final half point, giving it whenever I saw students clearly engaged by the task. You can see an example of this assignment here.
  • Projects, which had clear guidelines and requirements to meet the minimum grade that ranged from 80 - 90 percent, and then a series of additional tasks that raised the grade up to 100%. The additional task points weren't awarded until the basic requirements were met, though that didn't stop students from trying (see below).
  • Blog posts, which were required for every class. The expectations required a summary of what we learned for each class, along with code snippets, questions about what we learned, or confusion about something they wanted to go over in the next class. As the students became more skilled, this turned into questions that started as "How can we.../Is it possible to...".

Once every two weeks, and usually on a Friday, I had a 20% day during which students could work on anything they wanted related to web programming. Some students worked on previous projects to resubmit them, others experimented with code from the previous class or week. In a couple of cases, students worked on their own pet projects, which included a chat application, a mathematical formula parser, and applying visual design principles to the pages we created in class. I often made suggestions for what students could do at the beginning of the class block, including providing some basic code they could use to experiment.

What worked:

  • Based on feedback from the end of the year, students enjoyed the course. They had a lot of positive comments about the way I ran the class and the fact that they could always get help when they needed it.
  • Forcing students to write down code helped with retention and built a useful reference for later. I didn't require them to write down long blocks of code, but for things like HTML tags and Javascript, I wanted some written reinforcement that these things were important. I was deliberate in deciding when I wanted students to write down code (to activate that part of the brain) and when I wanted them to copy it directly into a text editor and run it.
  • Forcing students to recreate code (and not copy and paste) led to higher activity and interaction between students while learning to code. I saved some code as images, not text, which required students to go line by line and see what they were doing. This was a decision I made early on because it helped me when learning to code. That extra step of needing to look at the code while I was typing it in led me to take a closer look at what it said, and I wanted to give a similar opportunity to my students.
  • The more open ended projects led to much richer questions and interaction between students. I really liked the range of responses I received when I gave open ended projects. Some students were exceptionally creative or went well beyond the requirements to make code that mattered to them.
  • Students were constantly helping each other with their code...when they eventually asked for this help. I was called over many times by students calling out the blanket statement "my code doesn't work" and then handing me their laptop, but over time they learned that I wasn't going to just fix their code for them. They became careful readers of each other's code once they finally took the step of asking someone for help, though this took some time.
  • I succeeded in having students do more than listen. I never talked for more than 15 minutes before students were actually writing and experimenting with code. This was exactly what I wanted.
  • 20% days were a big hit. Some students wanted this time as extra processing time to complete the mini projects from the rest of the week. Others liked being able to ask me how to do anything, or to find tutorials for HTML elements that they wanted to learn to use. I really liked how well this worked with this group of students and looked forward to it, and not just because it was a reduction in the planning required for class.
  • Videos offered an effective and preferred method for learning to write code in this class. I put together a number of screencasts in which I spoke about the code, and in some cases coded it live. Students were able to pause, copy code to the editor, and then run it pretty easily. Some zipped through it, others took longer, but this is nothing new. The time required to do this, as is always a consideration for me, was often more than I could afford. Luckily, there is plenty of material available already out there, so I was able to step back and give another voice and face a chance to teach my students.

What needs work:

  • The bonus elements for projects were the first things most students wanted to figure out. Many students did not put in the time to read and complete the basic requirements for projects, resulting in submitted projects that were sent right back as incomplete. Some of this was a language issue, as there were many ESOL students in the class, but most of it was what we always encounter when working with adolescents: not reading the directions.
  • Students reused a lot of old (and unrelated) code. I emphasized creating simple code from scratch throughout the year, as my expectations were relatively simple. For many students, copying and pasting code was a crutch that led to many more problems than simply writing simple, clean code from the start. I get it - I copy and paste code myself - but I also know how to clean it up. They knew why not to do it (because they all tried it at some point) but some students continued doing it to the end. I need a better plan for helping students not fall into this trap.
  • Many students did not pick up on error messages in the console that said precisely where the problem with the code was located. At times, I expected too much from students, because the console is a scary place. That said, I think I could do a better job of emphasizing how to find the line numbers referenced in these error messages, regardless of what the error message is.
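For example, a one-character typo produces an error whose name and reported line number point directly at the problem. A minimal sketch (with the error caught here so its name can be inspected; in the browser console it would simply appear in red with a file and line reference):

```javascript
// A single-letter typo ('totl' for 'total') triggers a ReferenceError.
var total = 0;
var errorName = null;

try {
  total = totl + 5;  // typo: 'totl' was never declared
} catch (e) {
  errorName = e.name;  // the part of the message worth teaching students to read
}

console.log(errorName);  // "ReferenceError"
```

Teaching students to read just the error name and the line number, before anything else in the message, covers most of the debugging cases they hit at this level.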

I really enjoyed teaching this class, and not just because of the awesome group of students that took it. It helped me refine my knowledge and get better at some of the HTML, CSS, and Javascript coding skills that I had used, but often had to relearn every time I wanted to use them.

Feedback, as always, is welcome!

Leave a Comment

Filed under reflection, year-in-review

2014-2015 Year-In-Review: Standards Based Grading

This was my third year using standards based grading with my classes. I wrote last year and the year before about my implementation.

What did I do differently?

  • I had my WeinbergCloud implementation working from the beginning of the year, so it was part of the expectations I introduced on day one.
  • I also adjusted this system a bit to make it easier to link the reassessments and the content of the standards. There seemed to be too much uncertainty about what each standard represented, which translated into more confusion when signing up for reassessments than I wanted. Creating a list of standards and resources associated with each standard shrank this gap.
  • I did not limit the number of reassessments per day explicitly. I expected that students would not sign up for a ridiculous number given the limitations on their credits, which students earned by doing homework or coming to tutoring.
  • I included time within at least one class a week per student during which students could do reassessments without having to come in outside of class time.
  • Unit exams continued to be assessed purely on course standards, not points. Semester final exams were percentage based.
  • I scaled all of my standards levels from 1 - 5 to be from 6 - 10 to make it easier to communicate the levels to parents and be consistent with our school grading policy of not giving numerical grades below 50%. No student actually received lower grades due to my system of adding a base grade to each standard, but the process of explaining to students and parents that a 1 was really a 60% (5 for the base grade + 1 for the standard level) was clearly more complex than it needed to be.
  • For my combined IB HL/SL class, the HL students had standards that only they were responsible for learning, while also being responsible for the SL standards. More on this later.
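The HL/SL split amounts to attaching a flag to each standard and filtering by student level; a hypothetical sketch of that idea (the data and field names below are illustrative, not taken from WeinbergCloud):

```javascript
// Each standard carries an hlOnly flag; SL students see only shared standards.
var standards = [
  { code: "6.1", description: "Circular motion", hlOnly: false },
  { code: "9.1", description: "Simple harmonic motion (AHL)", hlOnly: true }
];

function standardsFor(level) {
  // HL students are responsible for everything, SL students for the rest.
  return standards.filter(function (s) {
    return level === "HL" || !s.hlOnly;
  });
}

console.log(standardsFor("SL").length);  // 1
console.log(standardsFor("HL").length);  // 2
```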

What worked:

  • Students seemed to have a better understanding from the beginning of the year of what standards based grading and assessment was all about. I did a bit more deliberate instruction on the ideas behind it at the beginning of the year. I also had smaller classes than before, so I was better able to have individual conversations about signing up for reassessments and talking about the process.
  • A small proportion of students were fully sold on the idea of reassessment as a learning tool. Some students reassessed at least twice a week throughout the semester, and these students had strong performances on the cumulative final exams.
  • By the second unit exam, students were generally not leaving questions blank on assessments. They were trying their best to do some amount of work on each question.
  • As with last year, I gave more challenging questions to assess the range of student ability. Most of these involved either multiple standards combined in one, more open ended responses, or questions requiring explanation. Assessing at the higher levels of mastery became strongly subjective, and students accepted this, though they occasionally advocated for themselves as to why they deserved to be marked higher. They generally felt that it was fair when arithmetic errors kept them in the 8/10 range.
  • Having students report their mastery level when signing up for a reassessment made it much easier for me to know what problem type or category to give them. Furthermore, this made it easier to justify changing the mastery level higher after a successful reassessment, but not making it the highest level on the scale. A student that was a 6 and answered a couple of questions correctly might move to an 8, whereas a student that was previously an 8 would be given more challenging questions and some conversation explaining their understanding in order to move to a 10.
  • It was my priority to get assessments back within the same period, and I estimate that I was able to do this more than 95% of the time. Simple, short, and carefully designed assessments can reveal quite a bit about what students do/don't understand.

What needs work:

  • Similar to previous semesters, I had high participation of a small group of students, with far too many students choosing not to reassess until the very end of each semester. Some students did not initiate their own reassessments at all.
  • Students again hoarded their credits to the end of the semester. I flirted with the idea of adding an expiration date to credits to discourage holding on to credits for long periods of time, but time constraints kept me from implementing this.
  • As a consequence of credit-hoarding, students near the end of the semester signed up for absurd numbers of reassessments in a day - I believe the largest quantity was nine. I shared with students that a good rule of thumb for planning purposes is 10 minutes per reassessment, so doing five reassessments before school isn't practical, but that didn't come across well. Students that couldn't do all of their reassessments in the morning simply pushed them to later in the day. This was a problem for me because I never knew if students were going to show up according to their scheduled time, or just do everything after school. Canceling after no-shows at the end fixed this problem pretty efficiently, however.
  • When a student would answer all questions correctly on an unannounced standards quiz, I generally assigned this a mastery level of 8 on a 6 - 10 scale. Students that had less than an 8 in this case usually had trouble with the same questions on a unit assessment or reassessment on the same standard later on. In other words, the students that had trouble initially learning a concept did not necessarily get the help they needed to make progress before the unit exam. This progress often happened after the exam, but this led to a lot of students falling behind pretty early on. I need to introduce interventions much earlier.

Under consideration for next year:

These are the ideas I am mulling over implementing before school gets started in a month, and I'd love to hear what you think.

  • Make credit expiration happen. This has been an issue for the year and a half of WeinbergCloud's existence. When I raised the idea with students, they immediately asked me not to implement it, because it would prevent them from putting off reassessments as they preferred to do. This includes students that were doing the practice problems between classes anyway, so this wasn't just about losing the credits. Adding a "why not just give a reassessment a try" argument worked in face-to-face conversation with students that were hoarding credits, so forcing the process might be worth the effort. I understand that learning takes time, but many of the students putting off reassessment weren't actively reviewing the standards over time anyway. I'd rather force the feedback cycle through more iterations, since that is when students seem to learn the most.
  • Introduce submitting work into the process of reassessment. This could be electronic ("To complete your sign up, submit a scan/photo of the work you have done to prepare") or could just be shown before I give them a reassessment. This would reduce some of the sign-ups that happen only based on the mastery score rather than reviewing the concepts that come with it. Students earn credits by doing practice problems or coming to tutoring, and these let them sign up for reassessments - this won't change. To actually go the final step and take the reassessment, I need to see what students have done to prepare. In some cases (students that see me the day before, for example) I may waive this requirement.
  • Require X number of reassessments per two week cycle of the block schedule. This might be in lieu of the previous change, but I'm afraid this might encourage (rather than prevent) a rush of reassessments at the end of a two week period. On the other hand, if the goal is to increase opportunities for feedback, this might be more effective.
  • Make it possible for students to sign-up for an appointment to go over (but not be assessed) material on a given standard. Reassessments are great opportunities for feedback, but sometimes students want to come in to go over material. I get emails from students asking this, but it might be easier to just include this within WeinbergCloud.
  • Introduce skills/definition standards for each unit. This would be a standard for each unit that covers basic recall of information. I'll discuss why I want these (particularly in physics) in more detail within a later post. The short story is that I want to specifically assess certain concepts that are fundamental to all of the standards of a unit with a single binary standard.
  • Classify standards mastery levels in terms of 'likelihood of success'. This is a lower priority, and when I tried to explain this to a colleague, she wasn't convinced it would be worth the effort. If you have a 10, it means you have a 95% or higher likelihood of answering anything I give you correctly. The probabilities might not scale linearly - a 9 might mean between 90-95%, an 8 between 75% and 90, etc. I don't know. The reason I want to do this is to justify giving a 10 to students that have demonstrated solid proficiency without requiring perfection, and have a better reason for only raising a student from a 6 to an 8 after answering a couple questions on a single reassessment.

    Right now the differences between an 8, 9, and 10 are defined (in order) by answering questions correctly on a single-standard quiz, on a comprehensive unit exam, and on stretch questions. A student that gets an 8 on a standards quiz before an exam might then answer related questions incorrectly on the multi-standard exam and remain an 8. If this student then takes a quiz on a single standard and answers that question correctly, does it make sense to raise their mastery level above 8? This is what I often do. I can also control for this by giving a more challenging question, but I'm not sure I need to.

    In short, something is fishy here, and I need to think it out more in order to properly communicate it to students. In my head, I understand what I want to communicate: "yes, you answered these questions correctly, but I'm still not convinced that you understand well enough to apply the concepts correctly next time." This is not the highest priority out of the ones I've mentioned here.
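The mapping I have in mind can be sketched as a simple lookup from mastery level to a probability band. This is a hypothetical sketch of the idea; the band boundaries below are the illustrative guesses from above, not settled values:

```javascript
// Map a mastery level to a [low, high] band for 'likelihood of success'.
// Levels below 8 are left undefined here, since the scheme is still a draft.
function successBand(level) {
  var bands = {
    10: [0.95, 1.00],
    9:  [0.90, 0.95],
    8:  [0.75, 0.90]
  };
  return bands[level] || null;
}

console.log(successBand(9));   // [0.9, 0.95]
console.log(successBand(6));   // null
```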

As always, I appreciate your feedback. Thanks for reading!

2 Comments

Filed under standards based grading, teaching philosophy

Social Interactions and Time

Social work is important but social work will require, by its nature, more wait time than automated work.
--p. 131, Functionary: Learning To Communicate Mathematically In Online Environments by Dan Meyer

This quote from Dan's dissertation gets at a theme of my lesson design this year: the time requirements of social interactions in the classroom are critical to honestly working them into classroom routines. Dan is referring to the time required to wait for another student to refactor and resubmit a verbal description online. My takeaway from this point gets at a reality of making student socialization a tool for learning in the classroom.

Conversations about learning take time. 

Exit tickets at the end of class are quick ways to assess specific skills presented during a class period, but they are essentially one-way channels, since they can't be acted upon until the next class. Time in class for lightly structured conversation around a lesson reveals understanding (or a lack thereof); it is not just interactive for students, but also allows me to hear a range of responses and parse them for what my students have learned. This conversation can be limited to small chunks of one or two minutes, so the payoff-to-investment ratio is big if those conversations are carefully designed and motivated.

Identifying what is and is not useful in those conversations is essential to working in an environment with peers, and it is a valuable skill for students to develop. It's difficult, if not impossible, to plan for every possible response students will have to everything that is said, and there will always be unexpected or off-topic elements. This 'noise' can be managed but shouldn't be eliminated; doing so denies the ebb and flow of real conversations that students have outside our classrooms all the time. If we are to leverage socialization in our classrooms for learning, we have to acknowledge that the efficiency will never be perfect. This is especially the case given that Dan's research suggests students best learn to communicate mathematically through revision and feedback.

I could go much faster through material if all I used was direct instruction. My students would be forced to be compliant with such a structure, and probably wouldn't enjoy my class as much, which I've decided is important to me. It is satisfying as a teacher to see students working through their understandings without my help, and this can only happen if I provide time for it during class. Scheduling time for it is a way to show students that I value what comes out of these conversations.

2 Comments

Filed under teaching philosophy

MeteorPad Tutorial: GoldMine

In a unit on Meteor applications for my web design class, I wrote a series of applications to help my students see the basic structure of a few Meteor applications so that they could eventually design their own. The students had seen applications that tallied votes from a survey, compiled links, and a simple blog. This one was about being competitive, and the students were understandably into it.

This tutorial was designed to use MeteorPad due to some challenges associated with installing Meteor on student computers. The first involved permissions issues, since students are not given access to the terminal by default. The second involved connectivity issues to the Meteor servers from China, which, unfortunately, I never fully resolved. MeteorPad offered a nice balance between the ease of making an application and minimizing the need for the terminal. For anyone looking to get started with Meteor, I recommend trying MeteorPad, as it doesn't require any knowledge of the terminal or any software installation.

I was able to take data on the students clicking away as I created new pieces of gold and put them out into the field. I've written previously about my enthusiasm for collecting data through clicks, so this was another fun source of data.
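
The data collection itself comes down to counting clicks per student. The actual app stores these in a Meteor collection; as a rough, framework-agnostic sketch (the names `recordClick` and `leaderboard` are illustrative, not from the real GoldMine code), the tally logic looks something like:

```javascript
// Minimal sketch of the click-tally logic behind a GoldMine-style game.
// The real app uses Meteor collections and methods; this version keeps
// the counts in memory so the idea is visible without the framework.

// Maps a player's name to how many pieces of gold they have collected
const clicks = new Map();

// Record that a player clicked a piece of gold
function recordClick(player) {
  clicks.set(player, (clicks.get(player) || 0) + 1);
}

// Return [player, count] pairs sorted by gold collected, highest first
function leaderboard() {
  return [...clicks.entries()].sort((a, b) => b[1] - a[1]);
}

// Simulated clicks from a class period
recordClick('Alice');
recordClick('Bob');
recordClick('Alice');

console.log(leaderboard());
```

In the Meteor version, the same idea maps onto a collection insert inside a template event handler, with the leaderboard published reactively to every client.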

Code can be downloaded here from Github.

Leave a Comment

Filed under programming, studentwork, Uncategorized

On Grant Wiggins

Like many others in the world of education, I was saddened by the loss of Grant Wiggins on May 26th. Before I begin my summer period of writing on what I've learned this year, it seems appropriate to share just how much Grant and his ideas helped shape my classroom into the place of learning it has become.

I was lucky to have met Grant when he came to my school in the Bronx in my fourth year of teaching. My assistant principal at the time worked to bring him in and was understandably excited to share the news of his approaching visit. I had not read Understanding by Design from start to finish in my education courses, but the principles it describes were frequently referenced. I was embarrassed to realize that I knew Grant's ideas but not his name. My wife pulled out her copy of UbD when I told her who was coming to visit and pointed to Grant's name on the cover, and I realized this wasn't going to be just another disconnected day of PD staring at a PowerPoint presentation.

The time he spent with us began a transformative period of refining my planning process, possibly the most significant of my twelve-year career.

His beliefs around assessing content skills independently pushed me to experiment with standards-based grading. His famous analogy distinguishing between practicing soccer skills and playing in a game revealed clearly the mismatch between the different types of assessments I was using and the mixed levels of success my students had on them. I experimented more with open-ended problems to give my students the experience of playing the game of mathematics. I came to shed my fear of exposing students to problems they hadn't seen before, and instead embraced those problems as opportunities to expand student intuition around the associated skills. This shift away from the 'skills first, application later' philosophy became central to my teaching. I also changed my lesson planning routine to be oriented around end goals rather than dictated by sections in a textbook or a pacing guide. It took longer to feel comfortable using essential questions to plan units and lessons, but I knew when I first learned about their power that I wanted to develop my ability to do so.

I also learned a great deal about the power of sharing ideas from reading Grant's blog. It was clear that he saw his work helping teachers as a process of leading them to discover these truths for themselves, not as a keeper of secret knowledge to be doled out by buying the next book. He was always describing his experiences with teachers as they developed their craft, and he wrote openly about the struggles he faced along the way. When I started blogging myself, I felt obligated to serve my own teaching through a similar level of honesty in writing. I was honored that he also discussed and shared my ideas on a couple of occasions.

A colleague of mine once said that much of the professional development we receive as teachers is little more than stating the obvious. The ideas that Grant shared were not new, but they also were not what I was told from the beginning of my training as a teacher. They should have been. Start from the end, give students opportunities to think big, and assess authentically what you want your students to be able to do. Keeping these ideas at the front of my teaching has not always resulted in the outcomes I expected, but I love how they have shaped my priorities when sitting down to plan what comes next. 

It is often the small shifts in thinking that make the big differences in what we do daily. I am thankful to Grant for starting this process for me. I know his work lives on in the many classrooms that have been touched by his ideas, and students are the ultimate beneficiaries of the changes he promoted in our classrooms.
Thank you, Grant, for sharing your life with us.

Leave a Comment

Filed under teaching philosophy

Coding The Feud with Meteor

Now that I'm cleaning up loose ends from the year, I'm finding time to share some of the projects that have kept me from posting here as of late. Sorry, folks.

We decided to shift from our usual Quiz Bowl activity at the end of the year to a new format of Family Feud. This process developed over the final quarter of the year, so I was able to get some student help putting a web application together for the visuals. A big shout out to Alex Canon in 9th grade who did prototyping of the HTML templates using Blaze in my Web Programming class.

The application is written entirely in Meteor and was a big hit. I've posted the code here on Github and a demo application at http://HISfeud.meteor.com. The looping music and authentic sound effects made for a good show while students tried to remember what they had answered on their survey a month earlier. This was part of our end-of-year house competition, which complicated things a bit, since Family Feud is played two teams at a time. Still, I like how it worked out.

Lots more to share, so stay tuned.

Leave a Comment

Filed under programming, teaching stories

Reaction Time & Web Data Collection

If you put out an open call through email to complete a task for nothing in return, it might make sense not to expect much. Still, I tried to make it as simple as possible to gather some reaction time data for my IB Mathematics SL class to analyze. My goal has been to bring each class an interesting data set and see what students can make of it. After several hours of having this open, I had a really nice set of data to give the class.

I know my social networks connect some phenomenal people. Even so, I didn't know that the interest in trying this out would be so substantial, nor that, in several cases, people would try multiple times to beat their own best time. In less than a week, I collected more than 1,000 responses to my request to click a button.

I coded this pretty quickly and left out the error handling I would have included had I known how many people would participate. I've been told that between phones, tablets, desktops, laptops, and even SmartBoards, there have been many different use cases, with recorded times ranging from hundredths of a second to more than five minutes - clearly an indication that this badly needs to be tweaked and fixed. That said, I am eager to share the results with the community that helped me out, along with the rest of the world. A histogram:

There's nothing surprising to report on a first look. It is clear that my lazy use of jQuery to handle the click event made for a prominent second peak at around 0.75 seconds for those tapping on a screen rather than clicking; some anecdotal reporting from Facebook confirmed this might be the explanation. The rest of the random data outside the reasonable range is nothing more than the result of my poorly coded user experience. Sorry, folks.
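
Before handing a data set like this to a class, it's worth a quick cleaning pass. As a rough sketch (the cutoffs of 0.1 and 2 seconds are my assumptions for illustration, not values from the actual analysis), one could drop the implausible times and summarize with a median, which holds up better than the mean against a long tail of outliers:

```javascript
// Sketch of a cleaning pass for crowd-sourced reaction time data.
// The cutoffs below are assumed plausible bounds for a human reaction
// to a visual stimulus, chosen for illustration.

// Keep only times within a plausible human range (in seconds)
function cleanTimes(times, min = 0.1, max = 2.0) {
  return times.filter((t) => t >= min && t <= max);
}

// Median of a list of times; robust against remaining outliers
function median(times) {
  const sorted = [...times].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

// A made-up sample including the kind of junk the real data contained
const raw = [0.31, 0.28, 0.75, 312.4, 0.02, 0.33]; // seconds
const clean = cleanTimes(raw);

console.log(clean.length, median(clean));
```

The five-minute SmartBoard entries and the hundredth-of-a-second accidents both fall outside the window, leaving a distribution students can actually reason about.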

This isn't the first time I've done a data collection task involving clicking a button - far from it. It's amazing what can be collected with a simple task and a low cost of entry, even when the task is a mathematical one. One of the things I wonder about these days is what tools are needed to make it easy for anyone (including students) to build a collection system like this and investigate something of personal importance. This has become much easier with tools such as Google Docs, but it still isn't easy to get a clean interface that strips away the surrounding material and makes the content the focus. For all I know, there may already be a solution out there. I'd love to hear about it if you know of one.

2 Comments

Filed under Uncategorized

Maintaining Sanity, Reviewing Priorities

I've had a really busy year. At the start of each school year, I've always said that I'm going to say 'no' more frequently, in as polite a way as possible. I've said I'd be more honest about priorities. Instead of spending time writing code for something that might be really cool as part of a lesson next week, I need to get tests graded today. I've had more preps this year than ever before. I have big-scale planning to do for my IB classes and their two-year sequence of lessons, labs, and assessments. In a small school like ours, it's difficult to avoid being on multiple committees that all want to meet on the same day.

Probably the hardest part has been figuring out what my true classroom priorities are. I'd love to look at every student's homework, but I don't have time. I'd love to make videos of all of my direct instruction, but I don't have time. I'd love to curate a full collection of existing resources for every learning standard in my courses, but despite designing my own system to do this, I haven't had time.

Over the course of the year, however, I've found that the set of goals I have for every class can be boiled down to three big ones:

Give short SBG assessments as frequently as possible.

These need to be looked at and given back within a class period, or they lose their effectiveness for students and for my own course correction when needed.

Provide more time for students to work during class. Use the remaining time to give direct instruction only as needed, and only to those that really need it.

Time I spend talking is unnecessary for the students who get the concepts, and it doesn't help the students who do not. If I'm going to spend time doing this, it needs to be worth it. This also means that I may not know what we need to review until during class, so forget having fully detailed lesson plans created a week at a time. I think I've accepted that I'm better at correcting errors along the way than I am at creating a solid, clear presentation of material from start to finish, at least given my time constraints.

It has been more efficient for me to give students a set of problems and see how they approach them than tell them what to do from the start. There are all sorts of reasons why this is also educationally better for everyone involved.

Focus planning time on creating or finding interesting mathematical tasks, not on presentation.

I've always thought this, but a tweet from Michael Pershan made it really clear:

What I teach comes from the learning standards that I either create or am given. Maximizing opportunities for students to do the heavy cognitive lifting also maximizes the time these ideas spend simmering in their heads. This rarely occurs as a result of a solid presentation of material, and it doesn't necessarily (or even usually) happen by watching a perfect video crafted by an expert. When you have a variety of mental situations in which to place your students and can see how they react, you understand their needs and can provide support only when necessary. Anything can be turned into a puzzle, and finding a way to do that pays bigger dividends than spending an extra ten minutes perfecting a video.


Going back to these three questions has helped me move forward when I am overwhelmed. How might I assess students working independently? What do I really need to show them how to do? What can I have my students think about today that will build a need for content, allow them to engage in mathematical practice, or be genuinely interesting for them to ponder?

What are your priorities?

4 Comments

Filed under reflection, teaching philosophy