Regents Recap — January 2013: Miscellaneous Grievances

Here is another installment in my series reviewing the NY State Regents exams in mathematics.

The January 2013 math Regents exams contained many of the issues I’ve complained about before:  lack of appreciation for the subtleties of functions, asking for middle terms,  non-equivalent equivalent expressions, and the like.

I’ve chronicled some of the larger issues I saw this January here, but there were a few irritating problems that didn’t quite fit elsewhere.  Like number 9 from the Geometry exam.

Regents 2013 January G 9

First of all, I don’t really understand why we bother writing multiple choice questions about constructions instead of just having students perform constructions.  Setting that issue aside, this question is totally pointless.

The triangle is equilateral.  Regardless of how it was constructed, the fact that AB = AC = BC will always justify calling it equilateral.  Under no circumstance could the fact that AB = AC = BC fail to justify that a triangle is equilateral.  The construction aspect of this problem is entirely irrelevant.

Next, I really emphasize precise use of language in math class.  In my opinion, in order to think clearly about mathematical ideas, you need to communicate clearly and unambiguously about them.  The wording of number 32 from the Algebra 2 / Trig exam bothers me.

Regents 2013 January AT 32

What does “the answer” mean in the phrase “express the answer in simplest radical form”?  Presumably it means “the two solutions to the equation”, but “answer” is singular.  And if it means “the set of solutions”, well, you can’t put a set in simplest radical form.

Are we trying to trick the students into thinking there’s only one solution?  Or is this just a lazy use of the word “answer”, like the way students lazily use the word “solve” to mean dozens of different things?  I understand that this is nit-picking, but this is a particular pet peeve of mine.
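The exam’s actual equation isn’t reproduced here, but any quadratic with irrational roots illustrates the point.  Here’s a sketch using a hypothetical equation, x² − 2x − 4 = 0, whose solutions in simplest radical form are 1 ± √5 — two answers, not one:

```python
import math

# Hypothetical quadratic (not the actual exam equation):
# x^2 - 2x - 4 = 0, so a = 1, b = -2, c = -4
a, b, c = 1, -2, -4

disc = b * b - 4 * a * c      # discriminant: 4 + 16 = 20
root = math.sqrt(disc)

x1 = (-b + root) / (2 * a)    # 1 + sqrt(5)
x2 = (-b - root) / (2 * a)    # 1 - sqrt(5)

# "Simplest radical form" is 1 + sqrt(5) AND 1 - sqrt(5) --
# two distinct solutions, which is why the singular "answer"
# in the question's wording is ambiguous.
print(x1, x2)
```

Any student who stops after finding one of the two roots has, arguably, produced “the answer” the question literally asked for.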

Lastly, number 20 from the Geometry exam is simply absurd.  Just looking at it makes me uncomfortable.

Regents 2013 January G 20

I’m sure we can find a better way to test knowledge of logical relationships than by promoting common mathematical errors!

Regents Recap — January 2013: Recycled Problems

Here is another installment in my series reviewing the NY State Regents exams in mathematics.

I reuse problems on tests all the time.  I’m sure every teacher does.  Sometimes I’ll change a number or two, sometimes I’ll change what the question asks for, or sometimes I’ll use the problem just as it is.

But I’m not writing tests for thousands of students state-wide, and my tests don’t determine whether or not students graduate, teachers keep their jobs, or schools remain open.

So it seems reasonable to ask if reusing problems on high-stakes exams, like the Regents, is an appropriate practice.

Compare number 38 from the January 2013 Algebra 2 / Trig exam

Regents 2013 January AT 38

with number 27 from the 2005 Math B exam.

Regents 2013 January Math B 27

And one more important difference between my tests and these standardized tests:  I don’t pay millions of dollars to educational specialists to develop my exams.


Math Quiz — NYT Learning Network

cooper union

Through Math for America, I am part of an ongoing collaboration with the New York Times Learning Network.  My latest contribution, a Test Yourself quiz-question, can be found here:

Test Yourself Math — March 6, 2013

This question is about how the Cooper Union is contemplating an end to its longstanding no-tuition policy, due in part to a current $12 million operating loss.  How much would Cooper Union have to charge in tuition to cover that loss?
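The arithmetic behind the quiz question is a simple division; the enrollment figure below is an assumed round number for illustration, not the actual count from the article:

```python
# Back-of-the-envelope version of the quiz question:
# what tuition per student would cover a $12 million operating loss?
operating_loss = 12_000_000

# Assumed round enrollment figure for illustration only;
# the actual number is in the linked article.
enrollment = 1_000

tuition_needed = operating_loss / enrollment
print(f"${tuition_needed:,.0f} per student")   # $12,000 per student
```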

Regents Recap — January 2013: Question Design

Here is another installment in my series reviewing the NY State Regents exams in mathematics.

One consequence of scrutinizing standardized tests is a heightened sense of the role question design plays in constructing assessments.

Consider number 14 from the Integrated Algebra exam.

Regents 2013 January IA 14

In order to answer this question correctly, a student has to do two things:  locate the vertex of a parabola, and correctly name the quadrant it lies in.

Suppose a student gets this question wrong.  Is it because they couldn’t find the vertex of a parabola, or because they couldn’t correctly name the quadrant?  We don’t know.
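The exam’s parabola isn’t reproduced here, but a hypothetical one, y = x² − 6x + 5, makes the two bundled skills concrete:

```python
# Hypothetical parabola (not the actual exam problem):
# y = x^2 - 6x + 5, so a = 1, b = -6, c = 5
a, b, c = 1, -6, 5

# Skill 1: locate the vertex, whose x-coordinate is -b / (2a).
vx = -b / (2 * a)
vy = a * vx**2 + b * vx + c

# Skill 2: name the quadrant of the point (vx, vy).
if vx > 0 and vy > 0:
    quadrant = "I"
elif vx < 0 and vy > 0:
    quadrant = "II"
elif vx < 0 and vy < 0:
    quadrant = "III"
else:
    quadrant = "IV"

print((vx, vy), quadrant)   # (3.0, -4.0) IV
```

A wrong answer collapses both skills into a single bit of data: we can’t tell whether the vertex computation or the quadrant naming failed.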

Similarly, consider number 21 from the Geometry exam.

Regents 2013 January G 21

This is a textbook geometry problem, and there’s nothing inherently wrong with it.  But if a student gets it wrong, we don’t know if they got it wrong because they didn’t understand the geometry of the situation, or because they couldn’t execute the necessary algebra.

Using student data to inform instruction is a big deal nowadays, and collecting student data is one of the justifications for the increasing emphasis on standardized exams.  But is the data we’re collecting meaningful?

If a student gets the wrong answer, all we know is that they got the wrong answer.  We don’t know why; we don’t know what misconceptions need to be corrected.  In order to find out, we need to look at student work and intervene based on what we see.

And what if a student gets the right answer?  Well, there is a non-zero chance they got it by guessing.  In fact, on average, one out of four students who have no idea how to solve the problem will guess the right answer on a four-choice question.  So a right answer doesn’t reliably mean that the student knows how to solve the problem.
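The guessing effect can be quantified with a hypothetical class split between students who genuinely solve the problem and students who guess at random among four choices:

```python
# Hypothetical class composition (assumed numbers for illustration):
knowers = 60    # students who genuinely solve the problem
guessers = 40   # students with no idea, guessing among 4 choices

# On average, a quarter of the guessers get lucky.
guess_correct = guessers * (1 / 4)    # 10 lucky guesses
total_correct = knowers + guess_correct

# Fraction of right answers that actually reflect understanding:
p_knows_given_correct = knowers / total_correct
print(round(p_knows_given_correct, 3))   # 0.857
```

Even in this generous scenario, roughly one in seven correct answers comes from a student who had no idea how to solve the problem.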

So what then, exactly, is the purpose of these multiple choice questions?
