Monday, June 30, 2014

Constant Velocity Model -- Summary and Implementation Reflection

My current impression is that the constant velocity model is a tool for describing and solving problems about objects moving with a constant velocity.  Two common examples of such motion are much of our motion in cars between stop lights and falling objects that have reached terminal velocity.
  It includes several complementary representations of motion, from which the speed, direction, and (often) starting point of the motion can be determined:

  •  a verbal description of the motion (the object moves a constant distance in each unit of time, in a constant direction)
  • a diagram of the motion in the form of a motion map, with equally spaced position dots marking "snapshots" at equal time intervals, and equal-length velocity arrows starting at each position dot and pointing in the direction of the motion
  • a graph of position vs. time, with the vertical intercept representing the starting position of the object (relative to a designated reference point) and the slope representing the speed of the motion through its steepness and the direction of the motion through its sign
  • a graph of velocity vs. time, which, by definition for a "constant velocity model," will be a constant value whose magnitude and sign correspond to the slope of the position-time graph
  • an equation that comes from the position-time graph:  x = vt + x0, where x is the position relative to the reference point, v is the velocity of the object, t is the time elapsed since the beginning of the motion, and x0 is the starting position of the motion.
It can be developed through an inquiry activity in which students track the position of a constant-velocity vehicle from a variety of starting points as a function of elapsed time, graph that position-time data, and consider the significance of various features of the graphs and equations.  Within the modeling instruction materials, it is then translated into the motion map and velocity-time graph representations through Socratic small-group and large-group discussions.  (Is this an operational definition of creating operational definitions?)
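To make the equation concrete, here is a quick worked example (the numbers are made up, not from any particular buggy run):

```latex
% Hypothetical buggy: it starts 0.25 m from the reference point and
% moves away from it at 0.40 m/s.  Where is it 6.0 s into the motion?
x = v t + x_0 = (0.40\ \text{m/s})(6.0\ \text{s}) + 0.25\ \text{m} = 2.65\ \text{m}
```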

I have previously done parts of this model-building process with both my ninth grade Physics I and my advanced 11th grade Physics course.  In general, my students seem to get comfortable with the position-time graphs quite quickly but struggle much more with the velocity-time graphs and with working backwards from the v-t to the x-t graphs.  The idea that the area under the v-t graph gives the displacement, and the difference between average velocity and average speed, are also fairly fuzzy to many of them, at least in part because I'm not as careful about differentiating position, distance, and displacement up front as I could be.  Based on my experience in this summer's modeling workshop, I will be strongly considering the following alterations next year:
  • I will try my best to steal a (possibly shortened) version of Laura's discussion leading to operational definitions of position, reference point, distance, and displacement.
  • I've done the Buggy Lab with both my courses.  However, each time I prescribed the same two runs for every group: one forward from the reference point (ish) and one back towards the reference point from a high position.  I loved the workshop's combination of similar first runs with a broad variety of second runs for different groups (different starting positions, speeds, and directions, in various combinations) to bring interest and richness to the subsequent whiteboard discussion of the results.  I will definitely do that next year.
  • I haven't really used Predicted Graphs before -- I like the idea of asking students to try to think through the relationships between factors graphically.  I have traditionally had them make "If, then, because" hypotheses verbally, and I wonder if you lose anything by not having students write out the verbal explanation of their thinking.
  • I also finally feel like I understand the 5% rule (as I now understand it: if a linear fit's vertical intercept is less than about 5% of the largest measured value, you treat it as zero) well enough to try introducing it to my classes, although I remain concerned that it will creep into my IB students' externally moderated lab reports in places it doesn't belong.
  • I wonder about alternatives to reserving one of my school's two laptop carts every time we do a lab and the students want to be able to fit lines for 5 or 10 minutes per group.  I think I should educate myself on how they can do it with the graphing calculators many of them bring, the OpenOffice installation on the Linux desktop in my room, and maybe the free Logger Pro Lite software I could have the students with laptops download.  (Does the free version do curve fits?  I'll have to check.  For reference, I've sketched a bare-bones scripted alternative just after this list.)
  • It was very helpful to see the model development and summary throughout the unit.  I will be more conscious about explicitly generalizing and publishing the consensus of the classes on the graph and the associated equation after the buggy lab discussion next year, and about adding to it as we add further representations (motion maps, v-t graphs).
  • I've explained motion maps by first asking students to try to come up with their own system for diagramming motion, and then guiding them into inventing something similar to motion maps, illustrated with the motion maps reading.  I feel like my introduction of that was relatively effective this spring, although I love the blinking open-close kinesthetic experience.  I asked my students to imagine a flash photo taken in the dark once per second, and that worked pretty well for them, but I think adding the blinking will help clarify it.
  • I loved "walking the graph" (and think it's also helpful for walking the motion map, and probably for walking the v-t curve, too) -- wish I'd thought of this sooner.  I think it will really help make the graphs and motion maps more concrete for my students.
  • I've used most of the worksheets in Unit 1 with my students, but I've been very guilty of underutilizing their discussion and concept-clarification potential.  I think I often take for granted that my students are using language the way I hope they will, and don't ask for additional explanation often enough to realize where they're not as clear as they seem.  I've also been super guilty of glossing over the highly useful ambiguities in the worksheets ("Is away from the detector always in the positive direction?", "What happens between the dots on a position-time graph we're given without actually seeing the motion for ourselves?").
  • I try to whiteboard most of the worksheets in some form, but I usually let student groups choose which problems they do, and usually each problem is done only by one group.  I'm still getting used to the "assign two problems per group" strategy that seems to be much more common in the workshop, but I can see some advantages of it.  I'm willing to try it, although it will require getting a lot more whiteboards than I have now.
  • I've already mentioned that I have a huge crush on the "Movie Shot" end of unit lab.  I will be stealing this.  
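As promised above, here is the bare-bones scripted alternative for line fitting: a minimal sketch in Python (assuming numpy is installed), with made-up buggy data.  Any of the tools mentioned in that bullet should give the same slope and intercept.

```python
# Fit a line to hypothetical position-vs-time data from a buggy run.
import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # elapsed time (s)
x = np.array([0.22, 0.61, 1.05, 1.38, 1.83])  # position (m)

# Degree-1 polynomial fit: the slope is the velocity and the
# intercept is the starting position.
v, x0 = np.polyfit(t, x, 1)
print(f"v = {v:.2f} m/s, x0 = {x0:.2f} m")
```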

Sunday, June 29, 2014

What is physics? (And who are you asking?)

Mid-afternoon Friday, I tweeted:  "What do you do with / for the student who zones out during discussion? The one always saying, 'What did you say?'"  I don't know if it's being short on sleep, the Friday-ness of it all, or a real problem with spending most of the day whiteboarding worksheets, but I did have a really hard time focusing on Friday.  And I do have students who have an equally hard time focusing on a discussion that I and some of my students find engrossing.  I wonder what happens to those students when whole-group whiteboard discussions become a primary mode of instruction.  Maybe they make up for it in the small-group discussions and catch up with the model summary at the end of the discussion?

Reading this weekend, I found the Hammer "Two Approaches to Learning Physics" article VERY useful.  I hadn't previously thought to step back from individual physics concepts to the overall concept of what it means to learn physics and do physics.  The two example students made it really concrete for me, and it gradually became clear to me that I, Dr. Jekyll and Mr. Hyde-like, teach two different conceptions of physics.  With my ninth grade class, I have a lot of leeway on content coverage, I've deliberately and repeatedly chosen depth over breadth, and I think I do a reasonably good job of having students develop the qualitative concepts before the equations (especially fall semester, when I do CASTLE and we don't really get any equations until November or so).  With my 11th grade class, which I taught for the first time this past year, I felt very pressured to cover a large syllabus of content to prepare my students for the IB Physics exam, and I think I've slipped into physics = formulas mode much more than I like to contemplate.  The idea of pacing as a marker for how much time is allotted for independent reasoning and sense-making was a revelation to me.

As I started the second article, Mestre's "Learning and Instruction in Pre-College Physical Science," I noticed the 1991 publication date.  Then I went back and noticed the 1989 date on the Hammer article.  I remarked to my husband how depressing it was that all these amazingly true and useful ideas about teaching and learning physics were published 25 years ago and yet seem to have had no impact on the mainstream approach to teaching physics at any level.  My intuition is that this is at least in part because Mestre's list of science education reform stakeholders at the end of his article excludes some of the most important players.

At the beginning of my education program at the University of Michigan, we pulled out the Michigan High School Content Expectations for our subject area and went through them.  We were taught that the process of designing instruction starts with those standards.  Most science teachers I know would love to take a deeper, more conceptual, more student-driven approach to learning, but feel that their primary responsibility, the job on which they are being evaluated, is to teach their students as much of the HSCEs as they can in the time available.  And they feel that the traditional transmission method is their only chance of coming anywhere close to that.

You can argue that the traditional transmission method is actually less efficient at achieving real learning than a slower, more constructivist method.  I think that probably depends very highly on what instrument you're using to measure the learning.  If it's individual interviews or written explanations of scientific concepts, I believe that the constructivist approach is more productive.  If it's a very facts-driven standardized test like the MEAP, I suspect that (much as the formula-driven student did better in her physics class) the transmission method instruction gets higher scores.   But I'd love to see data that proves otherwise.

Which leads me back to Mestre's list of stakeholders who should be involved in physics education reform.  It includes teachers, scientists, and textbook publishers.  It does NOT include the state or federal departments of education, the legislature, or parents.  And yet, they are the ones who "pay the piper" through taxes and appropriations, and thereby "call the tune" by setting standards and evaluations of both student learning and teachers.   Maybe the Next Generation Science Standards relax the breadth requirements and allow more breathing space for students to construct their own understandings.  I haven't read them closely enough yet to have an opinion on that.  But I also have doubts that they will be easily adopted by Michigan, given the resistance to adopting the Common Core and replacing the MEAP in the legislature over the past year or two.

We're inviting our school administrators to come visit this workshop.  That's important.  But maybe we should also be inviting our local state senators and representatives?  And the heads of our PTOs?  Are we spreading the research-supported view of a better conception of what physics (and science) is to the right people?

Thursday, June 26, 2014

Lights, Camera, Action!

I really enjoyed the last half hour of the workshop today, when we did the constant velocity "action movie shot" practicum lab.  I've done a much less exciting "race prediction" version of this with my constant velocity buggies, and even that was fun.  But the variety of scenarios and the dramatic window dressing made it a lot more fun and funny, as well as more challenging in applying the different concepts from this unit.  I will definitely try to incorporate the "Chase Scene", "Get Away Crash", and "Blind Collision" scenarios and the "We have to get the perfect shot the first time -- if you get it, you're a Hero; if not, you're a Bum who'll never work in this town again!" framing.

Practicum labs in general seem like a cool, fun tool for application, extension, and assessment that I've underutilized.  I hope we do more of them.

Wednesday, June 25, 2014

Minecraft Babies and Concrete Canoes

One of the hardest concepts I currently tackle with my ninth graders is Buoyancy.  It's the culmination of the mass vs. volume vs. weight vs. density vs. pressure unit.   We learn to distinguish mass from weight (by thinking about what they are, how you measure them, and what they mean on the International Space Station).  We find a relationship between mass and volume (by measuring them with a balance and a graduated cylinder and plotting them against each other to find the density of some aluminum rods).  Then we think about a thumb tack and Newton's Third Law to define pressure and think about it in solid-solid interactions and fluid situations (swimming pools, airplanes).  We measure the weight reduction of prisms of the same volume when submerged in water (it's the same!  even if the prisms are different materials!).  Then we roll all that together into two derivations that we work out together, both of which blow my students' minds (some in a good way, some in a bad):  first the depth dependence of fluid pressure, then the fact that the difference in pressure on the top and bottom of a container reduces to be equal to the weight of the fluid displaced by the container.
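For reference, here is the skeleton of those two derivations in symbols (in class we build them up much more gradually):

```latex
% 1. Depth dependence of pressure in a fluid of density \rho:
P(h) = P_0 + \rho g h

% 2. Net upward force on a submerged box with horizontal faces of area A:
F_\text{net} = (P_\text{bottom} - P_\text{top})\,A
             = \rho g\,(h_\text{bottom} - h_\text{top})\,A
             = \rho g V_\text{displaced}
% ...which is exactly the weight of the fluid the box displaces.
```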

This unit does not immediately "work".  I know, because I do Standards Based Grading, and the buoyancy concept is the one I do the most additional help and retesting on (although my Measured Uncertainty, Calculated Uncertainty, and Sig Fig concepts are close on its heels).  Eventually, with further small group discussion and additional hard thinking on the part of my students, most of them internalize that for floating objects (e.g. concrete canoes) the buoyancy force must equal the gravitational force on the object.  And some of them grok the connections between the volume of the displaced liquid, how the density of that liquid functions as a conversion factor between volume and mass, and how the gravitational field strength functions as a conversion factor between mass and weight.  But it is painful.
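That floating-object end state, worked with made-up numbers (a hypothetical 120 kg of concrete canoe plus paddler in fresh water):

```latex
% Floating: the buoyant force balances the weight, so
\rho_\text{water} V_\text{displaced}\, g = m g
\quad\Rightarrow\quad
V_\text{displaced} = \frac{m}{\rho_\text{water}}
                   = \frac{120\ \text{kg}}{1000\ \text{kg/m}^3}
                   = 0.12\ \text{m}^3
```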

Every year I think about omitting buoyancy, and every year (so far) I've decided to mix things up instructionally and give it another shot, because it's such a beautiful combination of all these challenging, interrelated, but distinct concepts (volume, mass, density, pressure, force).  This comes to mind both because one of yesterday's Hestenes readings bemoans the lack of a solid understanding of buoyancy in physics students, and because so many of those tricky fundamental concepts feature in tonight's Arons reading.

All of which is to say, the weaknesses that Arons mentions are certainly ones I've seen in my students.  I see how we've incorporated counting squares for area and displacement volume measurements in the Unit 1 labs.  But do density and buoyancy come in anywhere?  A casual flip through my binder of Modeling Mechanics units doesn't turn up buoyancy.  Is there a Modeling unit that deals with these basic ideas about different ways to describe how big something is or how much of it there is (length vs. area vs. volume vs. density)?  And that addresses fluid forces?  Is it off in some Physical Sciences set of materials?

My favorite exercise for addressing the scaling issue is a warm-up I developed after the first time I read some of these sections.  I call it my Minecraft Baby warm-up because, to figure out the right answer, I guide students through sketching a Minecraft baby and then increasing each of its dimensions one at a time.  It goes as follows:
"Consider an newborn who weighs 40 N.  During a year she grows so that each dimension of her body (length, width, height) increases by 40%.  How much will she then weigh?  (Assume constant density.)"
It is accompanied by cute pictures of my daughter as a newborn and at almost-a-year.  (This may be why it is one of my favorite warm-ups.)


Would anyone like to guess the most popular answer (by far)?  And the second most popular answer?  (Hint: the answer to at least the first is in the Arons section on Scaling!)
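(For anyone who wants to check their own answer rather than guess the popular ones: at constant density, weight scales with volume, and volume scales with the cube of the linear dimensions.)

```latex
W_\text{new} = W_\text{old} \times (1.4)^3 = 40\ \text{N} \times 2.744 \approx 110\ \text{N}
```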

Tuesday, June 24, 2014

What I want to get out of this Modeling Workshop (Quantitative Version)

This morning, we were asked, "What do you want to get out of this workshop?".  My basic answer was, "to be a better physics teacher".  But I actually have numerical goals for that.

For the last three years, ever since I audited a few afternoons of this workshop in 2011 and then was fortunate enough to be added to the modeling instruction listserv, I have given my ninth-grade students the (Simplified Language) FCI as a pre-test and a post-test at the beginning and end of my force and motion units (currently based mostly on Science Curriculum Inc's Force, Motion, and Energy).  Each year, my ninth grade students' pre-test scores have averaged 27-28%, completely in line with the first year physics high school students in the Hestenes 1992 paper.  And each year, my gains in percent correct answers have grown a bit, but from a non-awesome baseline -- 10% to 11% to 14%.  I do much better with some students, but not with all of them.  I want to be the kind of physics teacher who has average gains of 35-40%.  (Figure:  "Traditional" vs. "Interactive Engagement" FCI gains from Hake (1998) vs. my data for individual students this year and overall averages for the past three years.)
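One caveat for anyone lining my numbers up against that figure: if I'm reading Hake right, his comparison is in terms of the normalized gain -- the fraction of the available improvement actually achieved -- rather than the raw change in score.  A minimal sketch of the calculation (illustrative numbers, not my class data):

```python
# Hake's normalized gain: g = (post - pre) / (100 - pre), where pre
# and post are percent-correct scores.
def normalized_gain(pre: float, post: float) -> float:
    return (post - pre) / (100.0 - pre)

# Illustrative only: a 27.5% pre-test and a 41.5% post-test.
print(normalized_gain(27.5, 41.5))  # ~0.19
```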



This year, based on some suggestions from the modeling listserv, I added the Lawson Classroom Test of Scientific Reasoning to the mix (Lawson, 1978; Lawson et al., 2000).  Although I find the Information Processing discussion in the Hestenes 1979 article way too rational and mechanical to be a very useful view of my very human students, I do think the use of the Piagetian framework for students' reasoning skills should be helpful.  When I analyzed my Lawson data, I had two pleasant surprises and one unpleasant one.  Can you tell what they were?



On the plus side, my students come to me further along the concrete-formal transition than the average ninth graders in the norms collected by O'Donnell (2011).  They also make good progress along that transition over the course of the year.  On the minus side, there's no correlation between their reasoning skills and their gains on the FCI.  I am not, as a teacher, taking advantage of their scientific thinking abilities to help them do a better job of constructing solid Newtonian physics concepts.
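For concreteness, the check behind that "no correlation" claim is just a correlation between the two lists of scores.  A minimal sketch of one way to run it (assuming scipy is installed; the numbers below are placeholders, not my data):

```python
# Correlate scientific-reasoning scores with FCI gains.
from scipy.stats import pearsonr

lawson_scores = [5, 7, 8, 10, 11, 13]             # hypothetical Lawson totals
fci_gains = [0.10, 0.05, 0.21, 0.08, 0.15, 0.12]  # hypothetical FCI gains

r, p = pearsonr(lawson_scores, fci_gains)
print(f"r = {r:.2f} (p = {p:.2f})")  # r near zero would match what I saw
```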

That's why I'm here.  I want to do better.  That first graph, while a bit depressing right now, offers hope that better is attainable.  (Although I always wonder how valid the benchmark lines are for 9th grade physics.)    And that Hestenes 1979 article tells me that, "The teacher can no more develop effective new curricula and teaching techniques on their own than he can discover ab initio the basic principles of the science he teaches.  If the profession of teaching is ever to transcend the folklore state ... it must be guided and supported by a program of profound educational research."  I'm hoping Modeling Instruction will be that guide and support for me.



Works cited  (No, my blog posts don't usually have works cited lists, but this one is borrowed and modified from the student improvement data write-up I did last week, so I already had the references handy.)

Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64-74. http://dx.doi.org/10.1119/1.18809.
Hestenes, D. (1979). Wherefore a science of teaching? The Physics Teacher, 17(4), 235-242.
Hestenes, D., Wells, M., and Swackhamer, G. (1992). Force Concept Inventory. The Physics Teacher, 30(3), 141-158.
Lawson, A. E. (1978). The development and validation of a Classroom Test of Formal Reasoning. Journal of Research in Science Teaching, 15(1), 11-24.
Lawson, A. E., Clark, B., Cramer-Meldrum, E., Falconer, K. A., Sequist, J., and Kwon, Y.-J. (2000). Development of scientific reasoning in college biology: Do two levels of general hypothesis-testing skills exist? Journal of Research in Science Teaching, 37(1), 81-101.
O'Donnell, J. R. (2011). Creation of national norms for scientific thinking skills using the Classroom Test of Scientific Reasoning (Unpublished master's thesis). Winona State University, Winona, Minnesota. Downloaded from modeling.asu.edu October 15, 2013.

Also...

I still think about this old post about Heidi, Tim, and teaching every time I watch Project Runway while "multitasking" and grading.  That must make it my favorite post ever.

Still Standing...

Three years of teaching, one baby, one move, and three new courses developed later... I'm still teaching!  Huzzah!  (Although I'm deeply hoping that after I've developed one last completely-from-scratch course next year, life will rebalance a bit away from work and towards sleep and time with my family and the occasional hobby.  And maybe time to blog regularly.)

And this year, I'm finally getting to attend a Modeling Physics workshop, which has been a goal since about the time of the last post I made on this blog.

And they want us to reflect on the day each night on our blog.  So, 15 guaranteed posts!  How exciting!