What about pseudo-data analysis?

There was a time when education was a politician’s refuge. When a candidate wanted to look caring and populist, they would visit a local school, take some photos, and proclaim that children are the future (preferably sung along with Whitney Houston) and that the US (or fill in your constituency) has the best schools in the world.

Schools received federal money through locally focused earmarks and formula-based funding initiatives; similar processes at the state level provided per-student formula funding.

Then along came No Child Left Behind and TIMSS and a desire to quantify success. Three important cultural dynamics came into play: 1) the idea that value can be concretely demonstrated, 2) a strong cultural desire to be better than everyone else, and 3) a lack of time and/or capacity to assimilate the enormous amount of information available. To satisfy these three elements, the public needs a number that tells us our kids are learning and our schools are the best. Recently, many policy makers have also decided we need a number that will tell us whether teachers are any good.

All of the ways this doesn’t work would fill a book, not a blog. The sheer number of variables makes saying anything substantial nearly impossible. Either you make assertions based on statistics that ignore important factors (lurking variables, in research terms), or you make the scope so narrow that you have said nothing.

To that end, I want to ride on Dan Meyer’s coattails and create an educational-policy version of pseudocontext: call it pseudo-data analysis. I define PDA as any use of statistics that either a) draws, or invites readers to draw, oversimplified conclusions from the data, or b) trumpets one piece of data while ignoring the glaring gaps.

Two examples:

1) A climatology professor in Washington state who has determined that math textbooks are to blame for his students failing a self-created diagnostic test.

2) The headline: “Texas’ white eighth-grade students earn second highest score in nation on science NAEP”

The data:

What conclusion should we draw from this data?  I guess “Our students are average” doesn’t make a good press release headline.


This would be a great context in which to ask students to conjecture about how Texas can be in the top ten in every subpopulation, yet 26th overall.
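That puzzle is a classic instance of Simpson’s paradox: a state can outscore another within every subgroup and still have a lower overall average, because the overall number is a weighted average and the two states weight the subgroups differently. Here is a minimal sketch in Python with entirely made-up scores and counts (none of these figures come from the actual NAEP data):

```python
def overall(groups):
    """Weighted average score across (student_count, mean_score) pairs."""
    total = sum(n for n, _ in groups)
    return sum(n * s for n, s in groups) / total

# Hypothetical (student_count, average_score) for two subgroups.
# State A beats State B within each subgroup (165 > 163, 140 > 138),
# but most of State A's students are in the lower-scoring subgroup.
state_a = [(200, 165), (800, 140)]
state_b = [(900, 163), (100, 138)]

print(overall(state_a))  # 145.0 -- loses overall
print(overall(state_b))  # 160.5 -- wins overall
```

A press release can truthfully report the subgroup rankings or the overall ranking; reporting only one while ignoring the other is exactly the kind of gap PDA exploits.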

Every teacher has seen PDA that is, at the very least, annoying. Please share any other examples you can find.


About Tim Pope

Tim Pope no longer works with Key Curriculum Press. His posts and biography have been preserved for the archives. After ten years in the classroom teaching a myriad of subjects and spending a couple of years as a principal on the Navajo reservation (oh, the stories I can tell), I have spent the last seven years doing math professional development at the Dana Center and now at Key Curriculum. My focus now is on helping schools and districts implement Key resources.
