Category Archives: studentwork

An Experiment: Swapping Numerical Grades for Skill-Levels and Emoji

I decided to try something different for my pre-Calculus class for the past three weeks. There was a mix of factors that led me to do this when I did:

  • The quarter ended one week, with spring break beginning at the end of the next. Not a great time to start a full unit.
  • I knew I wanted to include some conic sections content in the course since it appears on the SAT II, and since the graphs appear in IB and AP questions. Some familiarity might be useful. In addition, conic sections also appear as plus standards within CCSS.
  • The topic provides a really interesting opportunity to connect the worlds of geometry and algebra. Much of this connection, historically, is wrapped up in algebraic derivations. I wanted to use technology to do much of the heavy lifting here.
  • Students were exhibiting pretty high levels of stress around school in general, and I wanted to provide a bit of a break from that.
  • We are not in a hurry in this class.

Before I share the details of what I did, I have to share the other side to this. A long time ago, I was intrigued by the conversation started around the Twitter hashtag #emojigrading, a conversational fire stoked by Jon Smith, among many others. I like the idea of using emoji to communicate, particularly given my frustrations over the past year with how communicating grades as numbers distorts their meaning and implies precision that doesn't exist. Emoji can be used to communicate quickly, but can't be averaged.

I was also very pleased to find out that PowerSchool comments can contain emoji, and will display them correctly based on the operating system being used.

So here's the idea I pitched to students:

  • Unit 7 standards on conic sections would not be assessed with numerical grades, ever. As a result, these grades would not affect their numerical average.
  • We would still have standards quizzes and a unit exam, but instead of grades of 6, 8, and 10, there would be some other designation that students could help select. I would grade the quizzes and give feedback during the class, as with the rest of the units this year.
  • Questions related to Unit 7 would still appear on the final exam for the semester, where scores will be point based.

I also let students submit some examples of an appropriate scale. Here's what I settled on based on their recommendations:

I also asked them for their feedback before this all began. Here's what they said:

  • Positive Feedback:
    • Fourteen students made some mention of a reduction in stress or pressure. Some also mentioned that the grade being less specific was a good thing.
    • Three students talked about being able to focus more on learning as a result. Note that since I already use a standards-based grading system, my students are pretty aware of how much I value learning being reflected in the grade book.
  • Constructive Feedback:
    • Students were concerned about their own motivation to study or reassess, knowing that the grades would not be part of the numerical average.
    • Some students were concerned about not having knowledge about where they are relative to the boundaries of the grades. Note: I don't see this by itself as a bad thing, but perhaps as the start of a different conversation. Instead of how to raise my grade, it becomes how I develop the skills needed to reach a higher level.
    • There were also mentions of 'objectivity' and how I would measure their performance relative to standards. I explained during class that I would probably do what I always do: calculate scores on individual standards, and use those scores to inform my decisions on standards levels. I was careful to explain that I wasn't going to change how I generate the standards scores (which students have previously agreed are fair) but how I communicate them.
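To make concrete what this change in communication amounts to, here is a rough sketch of converting a standard score into a level and emoji. The cutoffs and the specific emoji are my own hypothetical choices for illustration, not the scale my students settled on:

```python
# Convert a numerical standard score into a skill level and emoji.
# NOTE: the cutoffs and emoji below are hypothetical illustrations;
# the underlying standard scores are calculated the same way as always.

def skill_level(score):
    """Return (label, emoji) for a standard score out of 10."""
    if score >= 8.5:
        return ("expert", "🏆")
    elif score >= 6.5:
        return ("intermediate", "🙂")
    else:
        return ("beginner", "🌱")

print(skill_level(9))  # ('expert', '🏆')
print(skill_level(7))  # ('intermediate', '🙂')
print(skill_level(5))  # ('beginner', '🌱')
```

The point of the exercise is exactly what this sketch makes visible: the levels communicate where a student is, but there is no meaningful way to average them.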

I asked an additional question about what their parents would think about the change. My plan was to send out an email to all parents informing them of the specifics of the change, and I wanted students to think proactively about how their parents would respond. Their response in general: "They won't care much." This was surprising to me.

So I proceeded with the unit. I used a mix of direct instruction, some Trello-style lists of tasks from textbooks, websites, and Desmos, and lots of circulating and helping students individually where they needed it. I tried to keep the only major change in this unit to be the communication of the scores through the grade book, using the emoji and the verbal designations of beginner, intermediate, and expert. As I said earlier, I gave skills quizzes throughout.

The unit exam was a series of medium level questions that I wanted to use to gauge where students were when everything was together. As with my other units, I gave a review class after the spring break where students could work on their own and in groups, asking questions where they needed it. Anecdotally, the class was as focused and productive as for any other unit this year.

I was able to ask one group some questions about this after their unit test, and here's how they responded:

The fact that the stress level was the same, if not less, was good to see. The effort level did drop in the case of a couple of students here, but for the most part, there isn't any major change. This class as a whole values working independently, so I'm not surprised that none reported working harder during this unit.

I also asked them to give me general feedback about the no-numerical-grades policy. Some of them deleted their responses before I could take a look, but here's some of what they shared:

    • Three students confirmed a lower stress level. One student explained that since there was no numerical grade, she "...couldn't force/motivate [her]self to study."
    • Five students said the change made little to no difference to them. One student summed it up nicely: "It wasn't much different than the numerical grades, but it definitely wasn't worse."
    • One student said this: "The emojis seemed abstract so I wasn't as sure of where I was within the unit compared to numbers." This is one of a couple of students who had concerns about knowing how to move from one level to the next, so the unit didn't change this particular student's mind.


This was a really thought-provoking exercise. A move away from numerical grades is a compelling proposition, but a frequent argument against it is that grades motivate students. My small case study by no means disproves that claim. But if a move like this can have a minimal effect on motivation, and students still get the feedback they need to improve, it opens the door to similar experiments in my other classes.

There are a couple of questions I still have. Will students choose to reassess on the learning standards from Unit 7, given that doing so won't change the numerical average once we return to numerical grades for Unit 8? The second involves longer-term retention: how will students do on these questions when they appear on the final exam?

I'll return to this when I have more answers.


Class Notes and Workflow (On The Other Side of the Wall)

I've struggled in the past with the role of class notes. I wrote more than a year ago about my solution using Microsoft OneNote. Since moving out of China, I've realized just how far behind I was in my awareness of what Google Docs can do. My new school uses them extensively for all sorts of organizational and administrative purposes, not to mention applications in the classroom. I decided to upgrade my class notebook system this year to make better use of these tools. Now that we're approaching three months in, I'm feeling pretty happy with my system thus far.

I now make all my handouts on Google Docs. The bandwidth and lack of a Great Firewall make it a reliable way to have access to files both at school and at home, which means that I'm not dragging my computer back and forth anymore. There's something to be said for carrying a minimalist backpack, especially given the temperatures here. I relied on iCloud Drive last year, which worked well enough, but not worrying about files syncing between home and school is a clear change for the better. These files are titled U3D02 - CW - Title of Day's Lesson to signify 'Unit 3, Day 2' for ease of identifying files and their order. These are starting points for class activities and collect resources to use during class, such as Desmos activities, videos, or anything else that might be useful to students learning a given topic.
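The naming convention is what makes the files easy to order. As a sketch of why it works (the regex, helper, and file titles here are my own illustration, not part of any school tool):

```python
import re

# Parse filenames following the 'U3D02 - CW - Title' convention described
# above. The titles in the example list are made up for illustration.
PATTERN = re.compile(r"U(\d+)D(\d+) - (CW|NB) - (.+)")

def sort_key(filename):
    """Order handouts by unit, then day, regardless of zero-padding."""
    m = PATTERN.match(filename)
    return (int(m.group(1)), int(m.group(2)))

files = [
    "U3D10 - CW - Graphing Ellipses",
    "U3D02 - CW - Introduction to Conics",
    "U2D05 - NB - Polynomial Division",
]
for f in sorted(files, key=sort_key):
    print(f)
```

Zero-padding the day number ('D02' rather than 'D2') also keeps plain alphabetical sorting in Google Drive consistent with chronological order, which is the real reason the convention pays off.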


Each handout is shared with the class through the Hapara teacher dashboard and Google Drive, and I give students read access on each file. Two students are randomly picked to be responsible for class notes. These two students make a copy of the handout during class, name it with the same title and unit/day designation, and change CW (class work) to NB (notebook) to indicate the purpose of the file.

I take notes during class using Notability and my Wacom tablet. It's easy to copy and paste images from the digital handout into the notes, and then annotate them as needed. I take photos of student work with my phone and use AirDrop to get them to my classroom laptop. At the end of class, I paste images of my notes into the relevant part of the notes file. The two students are responsible for solving problems from the class handout and from homework, taking pictures, and putting them into the notes file on Google Docs. Links to these files are then shared on the course website with the rest of the class.

My class handouts are still printed on A5 paper as an analog backup, and quizzes are usually still on paper as well. I still insist on students doing problems by hand since that's ultimately how they will be assessed. The computer is there for access to Desmos, Geogebra, and the digital handout.

The most satisfying part of all of this is that students are being remarkably proactive about asking for materials to be shared, letting me know when they think something should be added to a handout, or adding it themselves when they have editing access to the file. There is also a flow of suggestions and comments to the students that are responsible for each day's lesson.

It's pretty amazing what is possible when a major world power isn't disrupting the technology you want to use in the classroom (or for whatever) on a regular basis.

My Unscientific Case Study on Helpful Explanations

I've been fascinated by the discussion on Dan Meyer's blog about explanations and their role in a math class. This was prompted by an article that makes assertions about the usefulness of these explanations in indicating understanding. The question of what merits the label of explanation, and how that relates to 'showing work', is an important one, and has been hashed around by the commenters on Dan's blog. I decided to pitch a question to students that asked them to explain, and then nudge them in a discussion to draw meaning out of their responses.

Here's the question, which is from the Amsco Integrated Algebra textbook on page 115:
[Screenshot of the textbook question]

I took pictures of their responses and then put them up two at a time in front of the class. I didn't pair them up deliberately, which might have been more interesting. After putting them up, I asked students to first share their observations about what made them different. There wasn't much of a response, but I wrote what was shared underneath. For each pair, I also asked students to vote on which of the two was more helpful to understanding the answer. Here are the results:

Pair A:
[Screenshot: Pair A student responses]

3 voted for the one on the left, 11 voted for the one on the right.

The one student who spoke up said that the one on the left makes more sense because the one on the right merely shows the pattern. I didn't get more of an explanation out of this student, and other students weren't stepping up to share.

Pair B:
[Screenshot: Pair B student responses]

10 voted for the one on the left, 4 voted for the one on the right.

The left example is the sort of diagram that I think I've seen in those Facebook posts knocking Common Core. I've never shown them this kind of diagram, though - this was 100% from the student who, knowing this student's history, has never stepped into a CCSS classroom in the United States to be taught it explicitly. She decided to make this diagram because she felt it best showed her understanding of the problem. On the right is a set of arithmetic problems that show precisely the same thing. The students preferred the diagram, but weren't willing to share why.

Pair C:
[Screenshot: Pair C student responses]

In a move that surely would appease the writers of the article Dan referenced in his post, 2 students voted for the left one, and 12 voted for the right.

I'm not sure what these results mean beyond the comments I've already shared. It would be easy for Garelick and Beals to point to my students' preferences as evidence that supports their argument. One complication of this result is that the role of showing answers in this context is different from its role in testing. The other is that asking which answers are 'helpful' might be dramatically different from asking which answers best show 'understanding'.

Certainly a more carefully designed experiment might tease out more. This might be the sort of task for Desmos Activity Builder or PearDeck, which would give students a chance to share their thoughts in a less public setting.

Crutches and Exponents

Math teachers frequently discuss how students forget what the exponent rules actually mean when they make mistakes applying them. The layer of abstraction that these rules lay over the numbers and operations is at fault, of course. The reason we teach the rules is that they show structure that goes beyond the operations. They simplify our work in calculating expressions.

I was really glad that a student used this approach today when she forgot the rules:
[Screenshot of the student's work]

I would much rather a student move back to a method they know rather than blindly apply the rules they don't. This method, or crutch, is less efficient, but holds more meaning for the student. We dissuade students from crutches like counting on their fingers because they should be able to do the arithmetic in other ways. Building meaning is important, however, and the better approach would be to show how learning the mathematical ideas and structures can simplify the process. In speaking with this student afterwards, it was clear that going back to this method that we used to motivate the rule helped her understand what it meant.
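A tiny sketch of the same 'write out the factors' move for quotients of powers, generated in code rather than memorized as a rule. This is my own illustration of the crutch, not something we used in class:

```python
# Mimic the 'write out the factors, then cancel' crutch for a^m / a^n.
# This is an illustration of the student's method, not classroom code.

def expand_quotient(base, m, n):
    """Show a^m / a^n as explicit factors, then the cancelled result."""
    top = "·".join([base] * m)
    bottom = "·".join([base] * n)
    remaining = "·".join([base] * (m - n)) if m > n else "1"
    return f"({top})/({bottom}) = {remaining}"

print(expand_quotient("x", 5, 2))  # (x·x·x·x·x)/(x·x) = x·x·x
```

Seeing the factors disappear in pairs is exactly the meaning the rule a^m / a^n = a^(m-n) compresses away.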

I continued with this approach in reviewing zero and negative exponents today. Of the students who said they knew the rule already, only a couple actually applied it correctly before we did this activity. I primed the class with this:
[Screenshot of the warm-up exponent expressions]

Students worked in groups to apply the rules and rewrite them, and I nudged them gently toward using what they saw as motivation for rules about zero and negative exponents. From this, I introduced a new crutch as a way to show what negative exponents mean:

[Screenshot of the negative exponent crutch]

Just as the student wrote out the factors and then divided them out in the problem above, I don't mind if a student does this as a reminder of what the rule means. I find this much more productive than a simple rule that states that fractions to a negative power simply 'flip'. Hopefully I'll see the benefits of this approach moving forward.
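The 'keep dividing by the base' pattern that motivates zero and negative exponents can also be generated rather than memorized. A minimal sketch (my own illustration of the pattern we built in class):

```python
from fractions import Fraction

# Each step down in the exponent divides the previous value by the base,
# which is the pattern that motivates 2^0 = 1 and 2^-n = 1/2^n.
base = 2
for n in range(3, -4, -1):
    print(f"{base}^{n} = {Fraction(base) ** n}")
```

Running this prints 8, 4, 2, 1, 1/2, 1/4, 1/8 in sequence, making it visible that nothing special happens at zero: the pattern simply continues.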

Student Feedback on Class Notebooks

There have been a lot of great moments since I started using a OneNote class notebook as my main repository for class notes. I wrote previously about what I was doing differently, and a lot has happened since then. The blog post in which I detail those developments is coming, I promise.

To tide you over, I'll share this great note that a student wrote in a portfolio reflection before the first quarter reports about our use of OneNote. I think it pretty much sums up why it has totally been worth making a fundamental change to the normal structure of class notes in my classroom. Here's the student:

For this quarter, Mr.Weinberg made us a cool thing called OneNote. We were able to record the class notes and upload the practice problems for every section online, and share freely. As we got to make our own reviews and share with the whole class, I had more opportunities to go over again with all the tiny details to find any mistakes since I did not want my classmates to learn something incorrect because of me. Therefore, not only my friends could get better understanding, but I could have a more thorough review and have better understanding too. Last time I worked for it, I was working as a group with _______. She organized the information that we learned in class, and I did the practice questions. I used to skip many steps in between the calculation and solving process, but this time I did all the question step by step, even for the questions that looked obvious to me, just to help out my friends’ understanding.

I did feel the need to correct the record with this student that I didn't actually create OneNote. Aside from that, this is the kind of perfect validation that I'll take from students any day of the week.

MeteorPad Tutorial: GoldMine

In a unit on Meteor applications for my web design class, I wrote a series of applications to help my students see the basic structure of a few Meteor applications so that they could eventually design their own. The students had seen applications that tallied votes from a survey, compiled links, and a simple blog. This one was about being competitive, and the students were understandably into it.

This tutorial was designed to use MeteorPad due to some challenges associated with installing Meteor on student computers. The first involved permissions issues, since students are not given terminal access by default. The second involved connectivity issues to the Meteor servers from China, which, unfortunately, I never fully resolved. MeteorPad offered an awesome balance between ease of making an application and minimizing the need for the terminal. For anyone looking to get started with Meteor, I recommend trying out MeteorPad, as it doesn't require any knowledge of working in the terminal or any software installation.

I was able to take data on the students clicking away as I created new pieces of gold and put them out into the field. I've written previously about my enthusiasm for collecting data through clicks, so this was another fun source of data.

Code can be downloaded here from GitHub.

Online School Resumes with Meteor

As you may know, I've been teaching a web programming course this year. I wrote previously about the work we did at the beginning of the year making interactive websites using the Meteor framework. Since then, we've spent time exploring the use of templates, event handlers, databases, and routing to build single page applications.

The latest assignment I gave students was to create an online school resume site with a working guestbook. I frequently discuss the importance of having a positive digital footprint, and one of the most beneficial ways for students to establish one is through a site created to share their work. Students worked last week to complete this and submitted their projects. We've had connectivity issues reaching the Meteor servers from school here in China. As a result, some students used MeteorPad, which unfortunately means their sites aren't permanent.

Those that were successful at deploying, however, have persistent guestbooks that anyone can visit and comment upon. Some students added secret pages or like buttons to show that they have learned how to use the reactive features of Meteor. The students were excited when I said I would post links on my blog and have given me permission to share. Here is the set of deployed sites:

Maria's Site
Dominick's Site
Tanay's Site
Luke's Site
Steven's Site
Tiffany's Site

I'm really proud of how far these students have come since the beginning of the year. They have picked up some bad habits of copying code and not commenting their JavaScript, but I take some responsibility for not holding them accountable for this. My goal was for the focus of this course to be first on building and creating, and second on developing skills as programmers. As with many of the subjects I teach, helping students see the need for the basics is often most easily done with the end product in mind.

If anyone wants recommendations for a summer hire, let me know.

Before a Break: CCSS Math, Bogram Problems, and Peer Feedback

I spent the day in a room full of my colleagues as part of our school's official transition to using the Common Core standards for mathematics. While I've kept up to date with the development of CCSS and the roll-out from here in China, it was helpful to have some in-person explanation of the details from some experts who have been part of it in the US. Our guests were Dr. Patrick Callahan from the Illustrative Mathematics group and Jessica Balli, who is currently teaching and consulting in Northern California.

The presentation focused on three key areas. The first focused on modeling and Fermi problems. I've written previously about my experiences with the modeling cycle as part of the mathematical practice standards, so this element of the presentation was mainly review. Needless to say, however, SMP4 (Model with mathematics) is my favorite, so I love anything that generates conversation about it.

That said, one element of Jessica's modeling practice took me by surprise, particularly given my enthusiasm for Dan Meyer's three-act framework. She writes about the details on her blog here, so go there for the long form. When she begins her school year with modeling activities, she leaves out Act 3. Why?

Here's Jessica talking about the end of the modeling task:

Before excusing them for the day, I had a student raise their hand and ask, "So, what's the answer?" With all eyes on me, a quick shrug of my shoulders communicated to them that that was not my priority, and I was sticking to it (and, oh, by the way, I have no idea what time it will be fully charged). Some students left irritated, but overall, I think the students understood that this was not going to be a typical math class.
Mission accomplished.

Her whole goal is to break students of the 'answer-getting' mentality and focus on process. This is something we all try to do, but perhaps we pay it more lip service than we realize by filling that need for Act 3. Something to consider for the future.

The other two elements, also mostly based in Jessica's teaching, went even further in developing other student skills.

I had never heard of Bongard problems before Jessica introduced us to them. These involve looking at a well-defined set of six examples and six non-examples, and then writing a rule that separates the two.

Here's an example: Bongard Problem, #1:

You can find the rest of Bongard's original problems here.

In Jessica's class, students share their written rules with classmates, get feedback, and then revise their rules based only on that feedback. Before today's session, if I were doing this, I would eventually bring the class together and write an example rule with the whole group. I'm probably doing my students a disservice by taking that shortcut, however, because Jessica doesn't do this. She relies on students to do the work of piecing together a solid rule that works in the end. She has a nicely scaffolded template to help students with this process, and spends a solid amount of time helping students understand what good feedback looks like. Though she helps them with vocabulary from time to time, she leaves it to the students to help each other.

Dr. Callahan also pointed out the importance of explicitly requiring students to write down their rules, not just talk about them. In his words, this forces students to focus on clarity to communicate that understanding.

You can check out Jessica's post about how she uses these problems here:
Building Definitions, Bongard Style

The final piece took the idea of peer feedback to the next level with another template, this one for helping students workshop their explanations of process. These explanations should not be a series of sentences about procedure, but mathematical reasoning. The full post deserves a read to find out the details, because it sounds engaging and effective:

"Where Do I Put P?" An Introduction to Peer Feedback

I want to focus on one highlight of the post that notes the student centered nature of this process:

I returned the papers to their original authors to read through the feedback and revise their arguments. Because I only had one paper per pair receive feedback, I had students work as pairs to brainstorm the best way to revise the original argument. Then, as individuals, students filled in the last part of the template on their own paper. Even if their argument did not receive any feedback, I thought that students had seen enough examples that would help them revise what they had originally written.

I've written about this before, but I have trouble staying out of student conversations. Making this feedback written might be an effective way for me to provide mathematical details verbally where needed (as Jessica said she does periodically) but otherwise keep the focus on students going through the revision process themselves.

Overall, it was a great set of activities to get us thinking about SMP3 (Construct viable arguments and critique the reasoning of others) and attending to precision of ideas through use of mathematics. I'm glad to have a few days of rest ahead to let this all sink in before planning the last couple of months of the school year.

Math Caching and Immediately Useful Teaching Data

Last July, I posted a video in which I showed how to create a local, customized version of the Math Caching activity that can be found here.

I was inspired to revisit the idea last weekend while reading Dan Meyer's post about teacher dashboards. The part that got me thinking, and that stoked a fire that has been going in my head for a while, is identifying the information that is most useful to teachers. There are common errors that an experienced teacher knows to expect, but that a new teacher may not recognize as common until it is too late. Getting a measure of wrong answers, and more importantly, the origin of those wrong answers, is where we ideally should be making the most of the technology in our (and the students') hands. Anything that streamlines the process of getting a teacher to see the details of what students are doing incorrectly (and not just that they are getting something wrong) is valuable. The only way I get this information is by looking at student work. I need to get my hands on student responses as quickly as I can to make sense of what they are thinking.

As we were closing in on the end of an algebra review unit with the ninth graders this week, I realized that the math cache concept was good and fun and at a minimum was a remastering of the review sheet for a one-to-one laptop classroom. I came up with a number of questions and loaded it into the Python program. When one of my Calculus students stopped in to chat, and I showed her what I had put together, I told her that I was thinking of adding a step where students had to upload a screenshot of their written work in addition to entering their answer into the location box. She stared at me and said blankly: 'You absolutely have to do that. They'll cheat otherwise.'

[Screenshot]

While I was a bit more optimistic, I'm glad that I took the extra time to add an upload button on the page. I configured the program so that each image that was uploaded was also labeled with the answer that the student entered into the box. This way, given that I knew what the correct answers were, I knew which images I might want to look at to know what students were getting wrong.
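The labeling step is simple to sketch. The helper below is my own illustration of the idea, not the actual class program: save each uploaded screenshot with the student's submitted answer embedded in the filename, so that scanning the directory immediately shows which written work produced which answer.

```python
import os
import time

def save_labeled_upload(problem_id, answer, image_bytes, upload_dir="uploads"):
    """Save a screenshot, labeling the file with the submitted answer.

    Hypothetical sketch: names and structure are illustrative only.
    """
    os.makedirs(upload_dir, exist_ok=True)
    # Keep only filename-safe characters from the answer.
    safe = "".join(c for c in answer if c.isalnum() or c in "-_") or "blank"
    # Timestamp keeps multiple submissions of the same answer distinct.
    filename = f"{problem_id}_{safe}_{int(time.time())}.png"
    with open(os.path.join(upload_dir, filename), "wb") as f:
        f.write(image_bytes)
    return filename
```

With correct answers known in advance, any filename carrying an unexpected answer flags an image worth opening.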

This was pure gold.

[Screenshot of uploaded student work]

Material like this was quickly filling up the image directory, and I watched it happening. I immediately knew which students I needed to have a conversation with. The answers ranged from 'no solution' to 'identity' to 'x = 0' and I instantly had material to start a conversation with the class. Furthermore, I didn't need to throw out the tragically predictable 'who wants to share their work' to a class of students that don't tend to want to share for all sorts of valid reasons. I didn't have to cold call a student to reluctantly show what he or she did for the problem. I had their work and could hand pick what I wanted to share with the class while maintaining their anonymity. We could quickly look at multiple students' work and talk about the positive aspects of each one, while highlighting ways to make it even better.

In this problem, we had a fantastic discussion about communicating both reasoning and process:

[Screenshot of a student's uploaded solution]

The next step that I'd like to make is to have this process of seeing all of the responses be even more transparent. I'd like to see student work popping up in a gallery that I can browse, choosing certain responses to share with the class. Another option to pursue is to have students see the responses of their peers and offer advice.

Automatic grading certainly makes the job of answering the right/wrong question much easier. Sometimes a student does need to know whether an answer is correct or not. Given all the ways a student could game the system (some students did discuss using Wolfram Alpha during the activity), the informative part on the teaching and assessment end is seeing the work itself. This is also an easy source of material for discussions with other teachers about student work (such as with Michael Pershan's Math Mistakes).

I was blown away with how my crude hack to add this feature this morning made the class period a much richer opportunity to get students sharing and talking about their work. Now I'm excited to work on the next iteration of this idea.

Computational Thinking & Spreadsheets

I feel sorry for the way spreadsheets are used most of the time in school. They are usually used as nothing more than a waypoint on the way to a chart or graph, inevitably with one of its data sets labeled 'Series 1'. The most powerful uses of spreadsheets come from how they provide ways to organize and calculate easily.

I've observed a couple things about the problem solving process among students in both math and science.

  • Physics students see the step of writing out all of the information as an arbitrary requirement of physics teachers, not necessarily as part of the solution process. As a result, it is often one of the first steps to disappear.
  • In math, students solving non-routine problems like Three Act problems often have calculations scrawled all over the place. Even if they are written in an organized way, when one calculation turns out to be wrong, every calculation that depends on it must be redone by hand. This can be infuriating to students who might be only marginally interested in finding an answer in the first place.
  • Showing calculations in a hand written document is easy - doing so in a document that is to be shared electronically is more difficult. There are also different times when you want to see how the calculation was made, and other times that you want to see the results. These are often presented in different parts of a report (body vs. appendix) but in a digital document, this isn't entirely necessary.

Here's my model for how a spreadsheet can address some of these issues:
[Screenshot: a spreadsheet template with given information at the top, calculated values below, and columns for values, units, and descriptions]

Why I like it:

  • The student puts all of the given information at the top. Some of it may be used for subsequent calculations and some may not, but at minimum, all of the information used to solve a problem is in one place.
  • The coloring scheme makes clear what is given and what is being calculated.
  • The units column is a constant reminder that numbers usually have units. In my template, this column is left justified so that the units appear immediately to the right of the numerical column.
  • Many students aren't comfortable exploring a concept algebraically. By making calculations that might be useful easy to make and well organized, this sets students up for a more playful approach to figuring things out.
  • Showing work is easy in a spreadsheet - look at the formulas. Depending on your own expectations, you might ask for more or less detail in the description column.
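The same givens-then-calculations structure can be sketched outside a spreadsheet, too. Here's a minimal Python version of the model; the specific quantities and layout are my own illustration, not the template itself:

```python
# A sketch of the spreadsheet model: given information at the top,
# calculated values below, each row carrying value, units, and description.

# Given information: name -> (value, units, description)
givens = {
    "G":       (6.674e-11, "N m^2/kg^2", "gravitational constant"),
    "m":       (65,        "kg",         "mass of person"),
    "M_earth": (5.972e24,  "kg",         "mass of Earth"),
    "r_earth": (6.371e6,   "m",          "radius of Earth"),
}

# Calculated values reference the givens, just as cell formulas would
G, m, M, r = (givens[k][0] for k in ("G", "m", "M_earth", "r_earth"))
calculated = {
    "F_gravity": (G * m * M / r**2, "N", "force of gravity on person"),
}

# Print the "sheet": change a given above and every result updates on rerun
for name, (value, units, description) in {**givens, **calculated}.items():
    print(f"{name:10} {value:10.3e} {units:12} {description}")
```

As in the spreadsheet, changing one given and rerunning recomputes everything downstream, which is exactly the property that saves students from redoing chains of hand calculations.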

Some caveats:

    • A hand calculation should be done by someone to confirm the numbers generated by the spreadsheet are what they should be. This could be a set of test data provided by the teacher, or part of the initial exploration of a concept. Confirming that a calculation is being done correctly is an important step of trusting (but verifying, to quote Reagan for some reason) the computer to make the calculations so that attention can be focused on figuring out what the numbers mean.
    • It does take a bit of time to teach how to enter a formula into a spreadsheet. Don't turn it into a lecture about absolute or relative addressing, or about rows and columns and which is which - this will come with practice. Show how numbers in scientific notation look, and demonstrate how to get a value placed in another cell. Get straight into making calculations happen among your students and in a way that is immediately relevant to what you are trying to do. Then change a given value, and watch the students nod when all of the values in the sheet change immediately.
    • Building off of what I just said, don't jump to a spreadsheet for a situation just to do it. The structure and order should justify itself. Big numbers, nasty numbers, lots of calculations, or lots of given information to keep track of are the minimum for establishing this from the start as a tool to help do other things, not an end in and of itself.
    • Do NOT hand your students a spreadsheet that calculates everything for them. If a student wants to make a spreadsheet for a particular type of calculation, that's great - that's the student recognizing that such a tool would be useful and making the effort to build it. If you hand them a calculator for one specific application, it perpetuates the idea among students that they have to wait for someone else that knows better than them to give them the tool to use. Students should have the ability to make their own utilities, and this is one way to do it.

Example from class yesterday:

We are exploring the way Newton's Law of Gravitation is used. I asked students to calculate the force of gravity exerted by different planets in the solar system on a 65 kilogram person on Earth, with Wolfram Alpha as the source of data. Each of them used a scientific or graphing calculator, writing the numbers they used by hand (without units) on their papers with minimal consistency. They grumbled about the sizes of the numbers. When noticeable differences in magnitude arose between different students, they checked each other until they were satisfied.
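A sketch of that calculation in Python, for reference. The masses and Earth-to-planet distances below are rough average values I looked up myself, not the students' Wolfram Alpha data:

```python
# Gravitational pull of distant planets on a 65 kg person on Earth,
# via Newton's law of gravitation: F = G * m * M / d^2
G = 6.674e-11   # gravitational constant, N m^2 / kg^2
m_person = 65   # kg

# name -> (planet mass in kg, approximate average distance from Earth in m)
planets = {
    "Mars":    (6.417e23, 2.25e11),
    "Jupiter": (1.898e27, 6.29e11),
    "Saturn":  (5.683e26, 1.28e12),
}

for name, (M, d) in planets.items():
    F = G * m_person * M / d**2
    print(f"{name:8} F = {F:.2e} N")
```

The forces come out at tiny fractions of a newton (Jupiter's pull is on the order of 10^-5 N), which is exactly the kind of number students grumble about keying into a calculator repeatedly.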

I then showed them how to take the pieces of data they found and put them in the spreadsheet in the way I described above. In red, I highlighted the calculation for the magnitude of the force for an object on Earth, and then asked a student to give me her data. This was the value she calculated! I was quickly able to confirm the values that the other students had calculated as well.

I then had them calculate the weight of an object on Earth's surface using Newton's law of gravitation, which sent them on another search for Earth's vital statistics. They were surprised to see that this value was really close to what the accepted value of g = 9.8 m/s^2 predicts. I then asked them how, in their spreadsheet, they might figure out the acceleration due to gravity based on what they already knew. Most were able to figure out without prompting that dividing by the 65 kilogram mass got them there. I then had them use that idea and Newton's Law of Gravitation to figure out how to obtain the acceleration due to gravity at a given distance from the mass center of a planet. Finally, they used the spreadsheet model on their own to calculate the acceleration due to gravity on a couple of different planets, and it went really well.
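The progression from weight to surface gravity can be sketched like this. The constants are standard reference values, and the radii are approximate mean values, so take them as illustrative:

```python
# From Newton's law of gravitation, F = G * M * m / r^2. Dividing out the
# test mass m leaves the acceleration due to gravity: g = G * M / r^2.
G = 6.674e-11   # gravitational constant, N m^2 / kg^2

# name -> (body mass in kg, mean radius in m) - approximate reference values
bodies = {
    "Earth":   (5.972e24, 6.371e6),
    "Moon":    (7.342e22, 1.737e6),
    "Jupiter": (1.898e27, 6.991e7),
}

for name, (M, r) in bodies.items():
    g = G * M / r**2
    print(f"{name:8} g = {g:5.2f} m/s^2")
```

Earth's value lands right at the familiar 9.8 m/s^2, and the Moon's comes out near 1.6 m/s^2, which is the comparison that sparked the discussion described below.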

The focus from that point on was on figuring out what those numbers meant relative to Earth. Often with these types of problems, students will calculate and be done with it. This left them a bit curious about each other's answers (gravity on Jupiter compared to the Moon) and opened up possibilities for subsequent lessons. I'll write more in the future about how I have grown to view spreadsheets as indispensable computing tools in the classroom. A pure computational tool is the lowest level on the totem pole of applications of computers for learning mathematics or science, but it's a great entry point for students to see what can be done.


Spreadsheet Calculation Template

Centripetal Acceleration of the Moon - a comparison we used two days ago to suggest how a 1/r^2 relationship might exist for gravity and the moon.
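That Moon comparison can also be sketched numerically. If gravity falls off as 1/r^2, the Moon's centripetal acceleration should match surface g scaled down by (R_earth / r_moon)^2. The orbital figures below are standard approximate values, not the numbers we used in class:

```python
import math

# Moon's centripetal acceleration from its orbit: a = 4 * pi^2 * r / T^2
r_moon = 3.844e8           # mean Earth-Moon distance, m
T = 27.32 * 24 * 3600      # sidereal month, s
a_moon = 4 * math.pi**2 * r_moon / T**2

# Prediction from a 1/r^2 law: surface g scaled by (R_earth / r_moon)^2
g = 9.8                    # surface gravity, m/s^2
R_earth = 6.371e6          # radius of Earth, m
a_predicted = g * (R_earth / r_moon)**2

print(f"measured:  {a_moon:.2e} m/s^2")
print(f"predicted: {a_predicted:.2e} m/s^2")
```

The two values agree to within a couple of percent, which is the suggestive result that motivates the inverse-square relationship.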