A Response to Slate: How the recent article on technology misses the point.

Ah, summer. A great time to kick back, relax, and have time to write reactions to things that bug me.

I read through the article on Slate titled 'Why Johnny Can't Add Without a Calculator' and found it to be a rehashing of a whole slew of arguments that drive me nuts about technology in education. It also does a pretty good job of glossing over a number of issues related to learning math.

The problem isn't that Johnny can't add without a calculator. It's that we sometimes focus too much on turning our brains into one.

This was the sub-heading underneath the title of the article:

Technology is doing to math education what industrial agriculture did to food: making it efficient, monotonous, and low-quality.

The author then shares some anecdotes describing technology use and implementation:

  • An experienced teacher forced to give up his preferred blackboard in favor of an interactive whiteboard, or IWB.
  • A teacher unable to demonstrate the merits of an IWB beyond showing a video and completing a demo of an electric circuit.
  • The author trying one piece of software and finding it would not accept an answer without sufficient accuracy.

I agree with the author's implication that blindly throwing technology into the classroom is a bad idea. I've said many times that technology is only really useful for teaching when it is used in ways that enhance the classroom experience. Simply using technology for its own sake is a waste.

These statements are true about many tools though. The mere presence of one tool or another doesn't make the difference - it is all about how the tool is used. A skilled teacher can make the most of any textbook - whether recently published or decades old - for the purposes of helping a student learn. Conversely, just having an interactive whiteboard in the classroom does not make students learn more. It is all about the teacher and how he or she uses the tools in the room. The author acknowledges this fact briefly at the end in arguing that the "shortfall in math and science education can be solved not by software or gadgets but by better teachers." He also makes the point that there is no "technological substitute for a teacher who cares." I don't disagree with this point at all.

The most damaging statements in the article stem from the author's misunderstanding of good mathematical education and of learning through technology.

Statement 1: "Educational researchers often present a false dichotomy between fluency and conceptual reasoning. But as in basketball, where shooting foul shots helps you learn how to take a fancier shot, computational fluency is the path to conceptual understanding. There is no way around it."

This statement gets to the heart of what the author views as learning math. I've argued in previous posts on how my own view of the relationship between conceptual understanding and learning algorithms has evolved. I won't delve too much here on this issue since there are bigger fish to fry, but the idea that math is nothing more than learning procedures that will someday be used and understood does the whole subject a disservice. This is a piece of the criticism of Khan Academy, but I'll leave the bulk of that argument to the experts.

I will say that I'm really tired of the sports skills analogy for arguing why drilling in math is important. I'm not saying drills aren't useful, just that they are never the point. You go through drills in basketball not just to be able to do a fancier shot (as he says) but to be able to play and succeed in a game. This analogy also falls short in other subjects, a fact not usually brought up by those using this argument. You spend time learning grammar and analysis in English classes (drills), but eventually students are also asked to write essays (the game). Musicians practice scales and fingering (drills), but also get opportunities to play pieces of music and perform in front of audiences (the game).

The general view of learning procedures as the end goal of math class is probably the biggest reason people see math as something it is acceptable not to be good at. Learning math this way can be low-quality because it is "monotonous [and] efficient", which is not technology's fault.

One hundred percent of class time can't be spent on computational fluency with the expectation that one hundred percent of understanding can come later. The two are intimately entwined, particularly in the best math classrooms with the best teachers.

Statement 2: "Despite the lack of empirical evidence, the National Council of Teachers of Mathematics takes the beneficial effects of technology as dogma."

If you visit the link the author includes in his article, you will see that what NCTM actually says is this:

"Calculators and other technological tools, such as computer algebra systems, interactive geometry software, applets, spreadsheets, and interactive presentation devices, are vital components of a high-quality mathematics education."

...and then this:

"The use of technology cannot replace conceptual understanding, computational fluency, or problem-solving skills."

In short, the National Council of Teachers of Mathematics wants both understanding and computational fluency. It really isn't one or the other, as the author suggests.

The author's view of what "technology" entails in the classroom seems to be the mere presence of an interactive whiteboard, new textbooks, calculators, and software that teaches mathematical procedures. This is not what the NCTM intends the use of technology to be. Instead, the use of technology allows exploration of concepts in ways that cannot be done using just a blackboard and chalk, or pencil and paper. The "other technological tools" mentioned alongside calculators in the quote have become much more significant over the past five years, as Geometer's Sketchpad, Geogebra, Wolfram Alpha, and Desmos have become available.

Teachers must know how to use these tools for the nature of math class to change to one that emphasizes mathematical thinking over rote procedure. If they don't, then math continues as it has been for many years: a set of procedures that students may understand and use some day in the future. This might be just fine for students who are planning to study math, science, or engineering after high school. What about the rest of them? (They are the majority, by the way.)

Statement 3: "...the new Common Core standards for math...fall short. They fetishize “data analysis” without giving students a sufficient grounding to meaningfully analyze data. Though not as wishy-washy as they might have been, they are of a piece with the runaway adaption of technology: The new is given preference over the rigorous."

If "sufficient grounding" here means students doing calculations by hand, I completely disagree. Ask a student to add 20 numbers by hand to calculate an average, and you'll know what I mean. If calculation is the point of a lesson, I'll have students calculate. The point of data analysis is not computation. Just because the tools take the rigor out of calculation does not diminish the mathematical thinking involved.

Statement 4: "Computer technology, while great for many things, is just not much good for teaching, yet. Paradoxically, using technology can inhibit understanding how it works. If you learn how to multiply 37 by 41 using a calculator, you only understand the black box. You’ll never learn how to build a better calculator that way."

For my high school students, I am not focused on their understanding of how to multiply 37 by 41 by hand, though I do expect them to be able to do it. Usually when my students do get it wrong, it is because they feel compelled to do it by hand, having been taught (in my view incorrectly) that doing so is somehow better, even when a calculator sits in front of them. As with Statement 3, I am not usually interested in students focusing on the details of computation when we are learning difference quotients and derivatives. This is where technology comes in.

I tweeted a request to the author to check out Conrad Wolfram's TED Talk on using computers to teach math, and asked for a response. I still haven't heard back. I think it would be really revealing for him to listen to Wolfram's points about computation, the traditional arguments against computation, and the reasons why computers offer students new opportunities to explore concepts in ways they could not with mere pencil and paper. His statement that math is much more than computation has really changed the way I think about teaching my students math in my classroom.

Statement 5: "Technology is bad at dealing with poorly structured concepts. One question leads to another leads to another, and the rigid structure of computer software has no way of dealing with this. Software is especially bad for smart kids, who are held back by its inflexibility."

Looking at computers used purely as rote instruction tools, I completely agree. That is a fairly narrow view of what learning mathematics can be about.

In reality, technology tools are perfectly suited for exploring poorly structured concepts because they let a student explore the patterns of the big picture. The situation in which "one question leads to another" is exactly what we want students to feel comfortable exploring in our classroom! Finally, software that is designed for this type of exploration is good for the smart students (who might quickly make connections between different graphical, algebraic, and numerical representations of functions, for example) and for the weaker students that might need different approaches to a topic to engage with a concept.

The truly inflexible applications of technology are, sadly, the ones that are also associated with easily measured outcomes. If technology is only used to pass lectures and exercises to students so they can perform well on standardized tests, it will be "efficient, monotonous, and low-quality", as the author states at the beginning.

The hope that throwing calculators or computers in the classroom will "fix" problems of engagement and achievement without the right people in the room to use those tools is a false one, as the author suggests. The move to portray mathematics as more than a set of repetitive, monotonous processes, however, is a really good thing. We want schools to produce students that can think independently and analytically, and there are many ways that true mathematical thinking contributes to this sort of development. Technology enables students to do mathematical thinking even when their computation skills are not up to par. It offers a different way for students to explore mathematical ideas when these ideas don't make sense presented on a static blackboard. In the end, this gets more students into the game.

This should be our goal. We shouldn't go back to the most basic textbooks and rote teaching methods just because they have always worked for the strongest math students. There must have been a form of mathematical Darwinism at work there - the students who went on historically were the ones who could manage the methods. This is why we must be wary of the oft-made argument that because a pedagogical method "worked for one person," that method should be continued for all students. We should instead be making the most of the resources available to reach as many students as possible and give them a rich experience that exposes them to the depth and variety of true mathematical thinking.

What my dad taught me about learning.

The first time I saw the word 'Calculus', I was staring at the spines of several textbooks that sat on the bookshelf at home. I didn't think much of them; I knew they were my parents', and that they were from their college days, but had no other awareness of what the topic actually was. I did assume that the reason there were so many of them was because my parents must have liked them so much. After further investigation, I learned that they were mostly my dad's books. His secret was out: he must have loved Calculus. I believed this for a while.

When my older brother took Calculus, these books came off the shelf occasionally as a resource, though I don't know if this was his decision or my dad's. From what I knew, my brother breezed through Calculus. I know he worked hard, but it also seemed to come fairly naturally to him. I remember conversations that my parents had about not knowing where my brother got this talent from. They admitted at this point that it couldn't have been from either of them. My dad had taken Calculus multiple times and the collection of textbooks was the evidence that hung around for no particularly good reason.

This astounded my young brain for a couple of reasons. It was mind-boggling to me that my parents ever had trouble doing anything. They always seemed to know just what to do in different situations - how could they not do well in a class designed to teach them something? It was also the first time I ever remember learning that my dad was not successful in everything he tried to do. This conflicted deeply with what I understood his capabilities to be.

As I understood it, he just knew everything.

When I was nine and my parents bought me a keyboard to learn to play piano for the first time, there was no AC adapter in the box I had unwrapped only moments before. My dad scrounged around among his junk boxes and drawers and found one with the correct tip, but the polarity was wrong. I knew I wasn't going to be able to start jamming that night - it was late and a trip to the store wasn't an option. He wasn't going to accept that as a possibility - he took the adapter downstairs to the basement and had me follow him. There was soldering involved, and electrical tape. I had no idea what he was doing. Moments later, however, he appeared with the same adapter and a white label that said 'modified'. We plugged it into the keyboard and it lit up, ready for me to play and drive my parents crazy. I now understand that he switched the wires around to change the polarity - I did it myself with some students recently in robotics. At the time though, it seemed like magic. I just knew I had the smartest dad in the world.

His mantra has always been that if it can be fixed, it should be fixed, no matter the hilarity of the process. I watched him countless times take in the cast-off computers of other people who asked him if he knew how to fix them. Thinking back, I don't know that he ever specifically answered that question. His usual response was (and still is) "I'll take a look." So he would work long hours with a vacuum, various metal tools, and a gray multimeter (that I think he still has) laid out like a surgeon investigating a patient. I rarely had the patience to sit and watch. I would see the results of his work: sheets of yellow legal pad paper filled with notes and diagrams scrawled along the way. In the end, he would inevitably find a solution, though often at this point the person who had asked him to fix the item had gone and bought a new one. I don't recall ever believing my dad thought it was a waste.

In my early teens, we also worked on things together to try to get closer. We both took tests to get amateur radio licenses. I came to really enjoy learning Morse code and got the preparation books to climb the license ladder. He commented repeatedly as I zipped through the books that I was memorizing the answers rather than understanding the underlying theory of resonant circuits and antenna diagrams. That was true - at the time I just wanted to pass the tests. I didn't understand that the process of learning was the valuable part, not the end point. I just continued to believe that the tests were a means to an end, just as my thirteen-year-old brain viewed his herculean efforts to fix things as a means to getting things fixed, and nothing more.

My dad is one of the smartest people I know. As I've grown older, however, I have come to understand that it wasn't about knowing everything. He had been continuously demonstrating what real learning is supposed to be. It was never about knowing the answer; it was about finding it. It wasn't about fixing a computer; it was about enjoying figuring out how it could be fixed, however much frustration was involved. It wasn't just about saving money or avoiding a trip to the store to buy an AC adapter. It was about seeing that we can understand the tools we use on a regular basis well enough to make them work for us.

I have seen time and time again how he mentors people to make them better at what they do. I have seen it in the way he mentors FIRST robotics teams as a robot inspector at the Great Lakes regional competition in Cleveland. I have seen it in the way he has spent his time since selling the company he founded with partners years ago. He chooses to do work that matters and makes sure that others are right there to learn beside him. There were times growing up when, admittedly, I just wanted him to fix things that needed to be fixed. To his credit, he insisted on involving me in the process, even when I protested or became impatient. I didn't see it when I was younger. Knowing how to go about solving problems is among the most important skills that everyone needs. I was getting free lessons from someone who not only was really good at it, but cared enough about me to want me to learn the joy of figuring things out.

One of my students this year was really into electronic circuits and microcontrollers. He soldered 120 LEDs into a display and wanted to use an Arduino to program it to scroll text across it. The student's program wasn't working and he didn't know why. I had only been tangentially paying attention to the issues he was having, but when he was visibly frustrated, I pulled up a chair, sat next to him, and said, "Let's take a look." We went through lines of code, found some missing semicolons and incorrectly indexed arrays, and I asked him to tell me what each line did. I was only a couple of steps ahead of him in identifying the problem, but we laughed and tried making changes while speaking out loud what we thought the results would be. At one point, he said to me, "Mr. Weinberg, you're so smart. You just know what to do to fix the program."

I immediately corrected him. I didn't know what was wrong. We were able to make progress by talking to each other and experimenting. It wasn't about knowing just what to do. It was about figuring out what to try next and having strategies to analyze what was and was not working. I learned this from a master.

On this Father's day (that also happens to be the day before my dad's birthday), I celebrate this truth: much of what I do as a teacher comes from trying to channel my dad's habits while confronting big challenges. I don't want my students to memorize steps to pass tests; I want them to understand well enough to be able to solve any challenge set before them. I don't want to fix my students' problems – I want to help them learn to fix problems themselves. I don't want my students to be afraid to fail; I want them to understand through example that failure leads to finding a better way.

I am grateful for all that I have learned from him, and I try to teach my students what he has taught me about learning at every opportunity. It would be fine by me if I ever need to do Calculus for him - I'd still be in the red.

The perils of playing cards and probability. What do you assume your students know?

One of the topics taught during the first semester of my first year teaching was probability. Flipping coins and rolling dice both help kids understand how probability is used in games, but the first thing a couple of teachers told me to do once students got the basics was to move to playing cards. This seemed like a natural way to get the students excited - I figured they had seen people playing cards on the street, as I had countless times wandering around the city. There are also so many opportunities to talk about intersection and union of sets. How many cards are hearts or face cards? How many are hearts and face cards? Sounded like a good idea to me.

When I actually did this with my class the first time, a couple of really big issues came up. Being a new teacher, I wasn't as strong at preventing students from calling out answers. When I wrote up some fairly simple questions on the board (such as find P(red card) if a single card is selected), the enthusiasm of three or four students in answering them led me to believe that this small sample was representative of the class. If these four knew it (or so assumed my naive first-year-teacher brain), the rest probably knew it but just didn't feel comfortable answering. This was a ridiculously inaccurate assumption. In fact, I think it's a painfully clear example of the self-selection bias that all teachers should consider when asking any question of an entire class. Who is going to raise their hand for the purpose of establishing that he or she does not know what I am talking about?

Another issue appeared when I started walking around the room during independent work. I saw that the students were struggling both with the idea of probability AND with the details of the different types of cards. It was hard separating the two bodies of knowledge because I had framed the topic only in terms of these concepts. Students that didn't understand what the various cards were couldn't answer the questions because they couldn't figure out which cards were desired outcomes. Students that didn't get probability in general didn't understand how the sample space and the desired outcomes fit together to calculate theoretical probability. Some didn't understand either idea.

After the class, I talked to a few teachers about it. One said a phrase that makes my blood boil every time I hear it: "Come on - they really should know _______". In this case, the phrase in the blank was "the types of playing cards." The assertion that there is something wrong with a student because he or she doesn't know an arbitrary fact is not an argument we should be making. The biggest reason it is a problem is this: if your lesson predicates itself on students knowing a fact, and you haven't made any effort to establish as part of your lesson whether or not students actually know that fact, your lesson is going to backfire. Hard. It will be like pulling your own teeth while simultaneously telling your students "look, you can do it too!"

I understood more about this in talking to my mentor teacher. He pointed out that using playing cards is one of the worst ways teachers could teach probability because of the cultural bias inherent in assuming students have the required background knowledge. Reasons why:

  • Alright kids, we're going to do some probability, but make sure you know what these words mean first, because I'm going to be using them all with the assumption you do: suit, hearts, diamonds, clubs, spades, face card, king, queen, jack, ace, joker. Don't forget that there are red cards and black cards.
  • Wait, English isn't your first language? OK, so spend your time learning these words in addition to the math content terms I really want you to learn: probability, sample space, and outcome. Uh...I guess that learning this esoteric set of words will be good for you because it will help you understand spoken English better. The more words you know, the better your English, right?
  • What's that? How can you not understand that something can be a face card and a club? Face cards are jacks, queens, kings, and aces - get it? And there are four different suits, so there have to be four face cards that are also clubs - get it? Well, yes, spades are also black, but clubs are black and have the little clover shape. Yes, the symbol tells you the suit. No, the card doesn't actually say "spades" or "hearts". But it's easy because the heart is for hearts, the diamond is for diamonds, and well, you might just have to remember the others. Oh wait, the spade is shaped like a shovel - did you know shovels are sometimes called spades? That will help you remember it. Get it? [By now, the student is nodding to get you out of his face.]
  • So now that we've covered all the vocabulary, what is the probability of randomly picking a card that has a value of 10 or greater? Oh, you don't know about the value of cards? Sure, well that's just fine. Obviously the jack is above the 10 because it has a guy on the front. It has the lowest value of the face cards, because the queen and king are higher. The king is of higher value than the queen because of the patriarchal culture that has dominated the globe for, well, forever. And then there's the ace. Sometimes the ace is the highest card. Other times it has the lowest value. That's life. Who has an answer?

How much math content has actually been explored during this entire (imagined) dialogue? Furthermore, if we assume that playing cards is an engaging and authentic application of probability, shouldn't understanding the math content be made easier by the presence of all of this extra knowledge? Think about the reverse situation - should a student that knows her probability, but does not know the details of the card system, get a 50% on a quiz of this topic in a math class?

I don't know about you, but I didn't actually play cards that much as a kid. It's a dangerous assumption that all kids have this background. If you don't know whether your students have this knowledge, don't want to guess by looking at them (which is always good policy not to do), and don't want to spend class time reviewing, using playing cards probably isn't a good idea.

One of the other teachers with whom I discussed this issue gave out a reference sheet with all of the vocabulary, pictures, and cards in order of value, and let students use it for quizzes and tests that included this topic. I think that's fine. An even better solution though? Find a topic that doesn't require so much background knowledge. Flip a coin and roll a 20-sided die. Put numbers on index cards, throw them in a bag, and ask for probabilities of drawing a card that is even or prime. At least in that case, students need to use mathematical knowledge to classify the outcomes. That's what you want to assess anyway, right?
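That index-card version also happens to be easy to sanity-check with a few lines of code. Here's a minimal sketch of my own (assuming cards numbered 1 through 20, and the usual convention that 1 is not prime):

```python
from fractions import Fraction

def is_prime(n):
    """Trial-division primality check; 0 and 1 are not prime."""
    if n < 2:
        return False
    return all(n % d != 0 for d in range(2, int(n ** 0.5) + 1))

cards = range(1, 21)  # index cards numbered 1..20

even = {n for n in cards if n % 2 == 0}
prime = {n for n in cards if is_prime(n)}

# "even or prime" is the union of the outcome sets;
# "even and prime" is the intersection (just the card numbered 2)
p_even_or_prime = Fraction(len(even | prime), len(cards))
p_even_and_prime = Fraction(len(even & prime), len(cards))

print(p_even_or_prime)   # 17/20
print(p_even_and_prime)  # 1/20
```

The union and intersection of the outcome sets map directly onto the "or" and "and" questions students wrestle with, and the classification step is exactly the mathematical knowledge worth assessing.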

Making connections to background knowledge is one of the most powerful ways to help students learn. Making assumptions about what background knowledge students have is an easy way to make a lesson a dud. Assess, don't assume.

End of year reflections - SBAR analysis

I wrapped up grading final exams today. Feels great, but it was also difficult seeing students coming in to get their final checklists signed before they either graduate or move on to a different school. Lots of bittersweet moments in the classroom today.

I decided after trying my standards based grading (SBG) experiment that I wanted to compare different students' overall performances among the grading categories to their quiz percentages. In a previous post, I wrote about my experimentation doing my quiz assessments on very specific skill standards that the students were given. As I plan to change my grading to be more SBG based for the fall, I figured it would be good to have some comparison data to be able to argue the many reasons why this is a good idea.

In my geometry and algebra two classes, there are 28 total students. I removed two students from the data set that came in the last month of school, and one outlier in terms of overall performance.

The table below shows the name of each grading category, along with the weight each category carries in calculating the overall grade. Each number is the correlation coefficient between the variables listed in its row and column. For example, 0.47 is the correlation between each student's HW data and Quizzes/Standards data.

Category (weight)          HW     Quizzes/Standards   Test Average   Sem. Exam   Final Grade
HW (8%)                    1      0.47                0.41           0.36        0.48
Quizzes/Standards (12%)           1                   0.89           0.91        0.95
Test Average (48%)                                    1              0.88        0.97
Sem. Exam (20%)                                                      1           0.95
Final Grade (100%)                                                               1

Some notes before we dive in:

  •  The percentages do not add up to 100% because I am leaving out the portfolio grade (8%) and classwork grades (4%) which are not performance based.
  • Homework is graded on turning it in and showing work for answers. I collect and look at homework as a regular way to assess how well they are learning the material.
  • The empty cells are to save ink; they are just the mirror image of the results in the upper half of the matrix since correlation is not order dependent.
  • I know I only have 25 samples - certainly not ready for a research publication.
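For anyone curious where the numbers in the table come from, Pearson's r is straightforward to compute from two gradebook columns. This is a minimal sketch with made-up grades - the five students and their scores below are hypothetical, not my actual data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical percentages for five students; illustrative only
hw      = [95, 80, 100, 70, 90]
quizzes = [88, 72, 91, 65, 85]

print(round(pearson_r(hw, quizzes), 2))  # prints 0.99
```

Running one call per pair of gradebook columns fills in the whole matrix, which is all the table above really is.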

So what does this mean? I don't know that this is very surprising.

  1. Students doing HW is not what makes learning happen. I've always said that, and this data continues to support the hypothesis. Homework can help - it is evidence that students are doing something with the material between classes - but simply completing it is not enough. I'm fine with this. I get enough information from looking at the homework to create activities that address misunderstandings the next time we meet. The unique thing about homework is that it is often the first time students look at the material on their own rather than with their peers in class.
  2. My tests include some direct assessments of skills, but also include a lot of new applications of concepts and questions requiring students to explain or show things to be true. It's very gratifying to see such a strong connection between the quiz scores and the test scores.
  3. I always wonder about the students that say "I get it in class, but then on the tests I freeze up." If there's any major lesson that SBG has confirmed for me, it's that student self-awareness of proficiency is generally not great without some form of external feedback. If this were the case, there would be more data with high quiz scores and low exam scores. That isn't the case here. My students need real and correct feedback on how they are doing, and the skills quizzes are a formalized way to do this.
  4. I find it really interesting how close the quiz average and the semester exam percentages are. The semester exam was cumulative and covered a lot of ground, but it didn't necessarily hit every single skill that was tested on quizzes. There were also not quizzes for every single skill, though I tried to hit a number of key ones.

This leads me to believe that it is possible to have several key standards to focus on for SBG purposes, and also to dedicate time to work on other concepts during class time through project based learning, explorations, or independent work. It's feasible to assess these other concepts as mathematical process standards throughout the semester. This strikes a good balance between developing skills according to curriculum and not making classes a repetitive process of students absorbing procedures for different types of problems. I want to have both. My flipping experiments have moved me toward that ideal, but I'm not quite there yet.

I'll have more to say about the details of what I will change as I think about it during the summer. I think a combination of using BlueHarvest for feedback, extending SBG to my Calculus and Physics classes, and less emphasis on grading and collecting homework will be part of it. Stay tuned.

The Problem Database Project - Where do I begin?

I am really excited to be part of this incredibly cool idea - excited enough that setting up and playing around with CSS, HTML, and PHP was my afternoon yesterday. The past ten months really have been my most intense in terms of learning programming. Though I've found plenty of interesting projects to take up my time outside of teaching, the Global Physics Problem Database, and whatever elements I can contribute to it, is probably the challenge that I'm most buzzed about taking up.

For those unaware of what I'm talking about, check out these posts from John Burk (here and here) on the conversations and ideas that have been tossed around for the past couple of weeks. I routinely generate skills-based questions for my students, and if there were a reliable site, NOT blocked by the GFW, that I could easily use with them for this purpose, I'd get a lot of use out of it. Furthermore, one of the highlighted goals of the project is to document its making so that others, including those with limited experience in web applications, can learn how it is being built. I consider myself one of those novices - my web experience has been limited to a three-week effort in the Udacity course on web application engineering. Not exactly enough background to design the next ed-tech innovation to receive millions of dollars, but luckily there are many talented people in this group from whom I can learn.

The bulk of my time yesterday was spent installing the PHP framework Laravel on my laptop. Despite Andy Rundquist's excellent screencast on doing this on Windows, I had booted into my Ubuntu partition at the time and said to myself, "Why not make things difficult and install it on Linux, the structure of which you still haven't figured out?"

I learned exactly what I wanted to learn - more details about how the Linux filesystem works, where things are located, and so on. The tricks I had to pull before the Laravel default page would load:

  • In addition to placing the Laravel directory in /var/www, I had to play around with the permissions on that directory so that Apache and PHP could manipulate its files. I kept getting a "Laravel failed to open stream: Permission denied" error. The solution I found here worked, but may not have been what I should have done. Oh well - it worked, and it's a local installation, so I'm not so worried about security at this point.
  • I needed to manually install php5-mcrypt because it isn't included in the regular installation of php5. Mom, did you know that?
  • At this point the Laravel default page did pop up - success! I then played around with suggested code from the documentation but had trouble getting anything other than that page to load. I spent an hour on this issue alone, staring in disbelief at the pages of Laravel documentation and a number of tutorials (like this one) that told me how easy it was to get started, even for a beginner. Then I found a page with the trick: the localhost address needed to include index.php in order to run any functions I put into the /application/routes.php file. Oops.

Once I got to this point, I was ready to play a bit more. The dog woke me up early, and I have trouble falling back asleep once there's any light on the horizon, so I picked up where I had left off with Codecademy's courses on CSS. I'm capturing the excitement of learning this material for the first time, so for those who have known it for a long time, I apologize if what I'm about to say is either painfully obvious or painfully understated.

    CSS is the bomb.

    The fact that adding a style sheet is all it takes to turn a bare, unstyled page into a fully designed one is pretty incredible.
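To give a flavor of what I mean, a handful of rules like these (made up by me for illustration - not the actual stylesheet from the course) are enough to restyle every matching element on a page at once:

```css
/* Illustrative rules only -- my own invented example, not the course's sheet */
body {
    font-family: Georgia, serif;   /* swap the browser default font */
    background-color: #f4f1ea;     /* soften the stark white background */
    color: #333;
}
a {
    color: #b33c00;                /* restyle every link on the page */
    text-decoration: none;
}
```

One line in the HTML head pointing at a file of rules like this, and the whole page changes.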

I've also never used PHP before, but as I expected, there's enough information out on the web to get a basic idea pretty quickly with the programming knowledge I already have. While the whole Laravel framework is written in PHP itself, the thing that I really like about it is how seamlessly it integrates into HTML code. Here is the code I wrote to generate the list of options at the top of the page, as well as include a welcome message:

Why am I tickled pink by this?

The variables I defined at the top of the code are filled in by the PHP code that appears later in the HTML. If the person loading this page is somehow classified as a student by the variable 'user_type', then that person will see this:

If the user_type is instead 'teacher', the menu changes to look instead like this:

Depending on the user, the content of that menu changes, as can the functionality of each link. I had no idea how easily a page could change what it displays with the simple addition of PHP code. In the case of the physics problem database, the values of the variables corresponding to the username, the user_type, and the question_text will come from the database being designed for the project. At the moment, I don't really know how to get that information into the page through PHP, but I know Laravel was chosen in part to make this process as easy as possible.
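For readers who can't see the embedded gist, here is a minimal sketch of the pattern I'm describing - the variable names match my description, but the menu items and values are invented for illustration, not the exact code from the gist:

```php
<?php
// Illustrative sketch only -- the real code is in the gist linked below.
// In the actual project these values would come from the database.
$username  = 'jdoe';
$user_type = 'student';
?>
<p>Welcome, <?php echo $username; ?>!</p>
<ul>
<?php if ($user_type == 'student'): ?>
    <li><a href="#">Answer Questions</a></li>
    <li><a href="#">My Progress</a></li>
<?php elseif ($user_type == 'teacher'): ?>
    <li><a href="#">Write a Question</a></li>
    <li><a href="#">Class Results</a></li>
<?php endif; ?>
</ul>
```

The PHP blocks sit right inside the HTML, and only the branch matching the user's type is ever sent to the browser.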

Again, for those more experienced, I understand this might seem obvious. I've always wondered exactly how web servers serve different pages at the same address depending on whether you've logged in, what privileges you have, and so on. I remember how cool I thought I was, lurking around in web server directories and looking at HTML files you normally couldn't access unless you were logged in (which I wasn't) - this was before that was recognized as the ridiculously obvious security flaw it is. I understood the idea of cookies and caching, but never took the time to understand exactly how it all worked.
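As I now understand it, the mechanism boils down to something like this bare-bones sketch using PHP's built-in session handling (my own illustration, not anything from the project):

```php
<?php
// Bare-bones sketch of serving different content at one URL.
// The session ID travels in a cookie; the server keeps the actual state.
session_start();

if (isset($_SESSION['username'])) {
    // Logged-in visitors get the private page...
    echo 'Dashboard for ' . htmlspecialchars($_SESSION['username']);
} else {
    // ...while everyone else, at the exact same address, gets this.
    echo 'Please log in.';
}
```

Same URL, different page - the server decides what to send based on state the browser's cookie points to, which is why poking at the raw HTML files on disk used to bypass the whole thing.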

Now that I've seen behind the curtain, I'm pumped.

Enough for now, though - off to work on end-of-year comments. One week is left, consisting of classes, giving final exams, grading final exams, and wrapping up loose ends before returning to the US for some good times with family and friends. Oh, and PHP.

My PHP file can be found at https://gist.github.com/2862463 .