Regents Recap — January 2013: Miscellaneous Grievances

Here is another installment in my series reviewing the NY State Regents exams in mathematics.

The January 2013 math Regents exams contained many of the issues I’ve complained about before:  lack of appreciation for the subtleties of functions, asking for middle terms,  non-equivalent equivalent expressions, and the like.

I’ve chronicled some of the larger issues I saw this January here, but there were a few irritating problems that didn’t quite fit elsewhere.  Like number 9 from the Geometry exam.

Regents 2013 January G 9

First of all, I don’t really understand why we bother writing multiple choice questions about constructions instead of just having students perform constructions.  Setting that issue aside, this question is totally pointless.

The triangle is equilateral.  Regardless of how it was constructed, the fact that AB = AC = BC will always justify its equilateralness.  Under no circumstances could the fact that AB = AC = BC fail to justify that a triangle is equilateral.  The construction aspect of this problem is entirely irrelevant.

Next, I really emphasize precise use of language in math class.  In my opinion, in order to think clearly about mathematical ideas, you need to communicate clearly and unambiguously about them.  The wording of number 32 from the Algebra 2 / Trig exam bothers me.

Regents 2013 January AT 32

What does “the answer” mean in the phrase “express the answer in simplest radical form”?  Presumably it means “the two solutions to the equation”, but “answer” is singular.  And if it means “the set of solutions”, well, you can’t put a set in simplest radical form.

Are we trying to trick the students into thinking there’s only one solution?  Or is this just a lazy use of the word “answer”, like the way students lazily use the word “solve” to mean dozens of different things?  I understand that this is nit-picking, but this is a particular pet peeve of mine.

Lastly, number 20 from the Geometry exam is simply absurd.  Just looking at it makes me uncomfortable.

Regents 2013 January G 20

I’m sure we can find a better way to test knowledge of logical relationships than by promoting common mathematical errors!

Regents Recap — January 2013: Recycled Problems

Here is another installment in my series reviewing the NY State Regents exams in mathematics.

I reuse problems on tests all the time.  I’m sure every teacher does.  Sometimes I’ll change a number or two, sometimes I’ll change what the question asks for, or sometimes I’ll use the problem just as it is.

But I’m not writing tests for thousands of students state-wide, and my tests don’t determine whether or not students graduate, teachers keep their jobs, or schools remain open.

So it seems reasonable to ask if reusing problems on high-stakes exams, like the Regents, is an appropriate practice.

Compare number 38 from the January 2013 Algebra 2 / Trig exam

Regents 2013 January AT 38

with number 27 from the 2005 Math B exam.

Regents 2005 Math B 27

And one more important difference between my tests and these standardized tests:  I don’t pay millions of dollars to educational specialists to develop my exams.

Regents Recap — January 2013: Question Design

Here is another installment in my series reviewing the NY State Regents exams in mathematics.

One consequence of scrutinizing standardized tests is a heightened sense of the role question design plays in constructing assessments.

Consider number 14 from the Integrated Algebra exam.

Regents 2013 January IA 14

In order to correctly answer this question, the student has to do two things:  locate the vertex of a parabola, and correctly name the quadrant it lies in.

Suppose a student gets this question wrong.  Is it because they couldn’t find the vertex of a parabola, or because they couldn’t correctly name the quadrant?  We don’t know.
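For reference, each skill is mechanical on its own.  For $y = ax^2 + bx + c$, the vertex lies at $x = -\frac{b}{2a}$.  With a made-up parabola (not the one on the exam):

$$y = x^2 - 4x + 3 \implies x = -\frac{-4}{2(1)} = 2, \quad y = (2)^2 - 4(2) + 3 = -1,$$

so the vertex is $(2, -1)$, which lies in Quadrant IV.  A wrong final answer could come from a slip at either step, and the score alone won’t say which.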

Similarly, consider number 21 from the Geometry exam.

Regents 2013 January G 21

This is a textbook geometry problem, and there’s nothing inherently wrong with it.  But if a student gets it wrong, we don’t know if they got it wrong because they didn’t understand the geometry of the situation, or because they couldn’t execute the necessary algebra.

Using student data to inform instruction is a big deal nowadays, and collecting student data is one of the justifications for the increasing emphasis on standardized exams.  But is the data we’re collecting meaningful?

If a student gets the wrong answer, all we know is that they got the wrong answer.  We don’t know why; we don’t know what misconceptions need to be corrected.  In order to find out, we need to look at student work and intervene based on what we see.

And what if a student gets the right answer?  Well, there is a non-zero chance they got it by guessing.  In fact, on a four-option question, one out of every four students who have no idea what the answer is will, on average, guess correctly.  So a right answer doesn’t reliably mean that the student knows how to solve the problem.
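To put a number on the guessing effect, here is a trivial simulation sketch in Python (my own illustration, not anything from the exam):

```python
import random

TRIALS = 100_000  # simulated students guessing blindly on one question

# Count how many blind guessers pick the right answer, assumed here
# (arbitrarily) to be choice "C" out of the four options.
correct = sum(1 for _ in range(TRIALS) if random.choice("ABCD") == "C")

print(f"fraction of guessers correct: {correct / TRIALS:.3f}")  # ~0.250
```

Over many trials the fraction settles near 1/4, which is all the argument above needs.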

So what then, exactly, is the purpose of these multiple choice questions?

Regents Recap — January 2013: Where Does This Topic Belong?

Here is another installment in my series reviewing the NY State Regents exams in mathematics.

There seems to be some confusion among the Regents exam writers about when students are supposed to learn about lines and parabolas.  Consider number 39 from the January 2013 Integrated Algebra exam:

regents january 2013 ia 39

Compare the above problem with number 38 from the June 2012 Geometry exam:

regents june 2012 g 38

These questions are essentially equivalent.  They both require solving a system of equations involving a linear function and a quadratic function by graphing.  Yet they appear in the terminal exams of two different courses that are supposed to assess two different years of learning.

When, exactly, is the student expected to learn how to do this?  If the answer is “In the Geometry course”,  the Algebra teacher can hardly be held accountable if the student doesn’t know how to solve this problem.  And if the answer is “In the Integrated Algebra course”, what does it mean if the student gets the problem wrong on the Geometry exam?  Is that the fault of the Geometry teacher or the Algebra teacher?  The duplication of this topic raises questions about the validity of using these tests to evaluate teachers.

And if that isn’t confusing enough, check out this problem from the 2011 Algebra 2 / Trig exam.

regents june 2011 at 39

Here, we see the same essential question, except now the student is required to solve the system algebraically.  These three exams (Integrated Algebra, Geometry, Algebra 2 / Trig) span at least three years of high school mathematics.  In the Integrated Algebra course, a student is expected to solve this problem by graphing.  Then, two to three years later, a student is expected to be able to solve the same kind of problem algebraically.
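For a sense of what the algebraic version demands, here is a made-up system of the same type (not the one on the exam), solved by substitution:

$$y = x^2 - 3x + 2 \quad \text{and} \quad y = x - 1$$

$$x^2 - 3x + 2 = x - 1 \implies x^2 - 4x + 3 = 0 \implies (x - 1)(x - 3) = 0,$$

so $x = 1$ or $x = 3$, giving the solutions $(1, 0)$ and $(3, 2)$.  Graphing the two curves locates exactly the same intersection points; only the required method differs.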

What does that say about these tests as measures of student growth?

Regents Recap — January 2013: What Are We Testing?

Here is another installment in my series reviewing the NY State Regents exams in mathematics.

One significant negative consequence of standardized exams for mathematics instruction is an over-emphasis on secondary, tertiary, or, in some cases, irrelevant knowledge.  Here are some examples from the January 2013 Regents exams.

First, these two problems, number 10 from the Algebra 2 / Trig exam and number 4 from the Geometry exam, emphasize notation and nomenclature over actual mathematical content knowledge.

Regents 2013 -- Tertiary knowledge 1

Rather than ask the student to solve a problem, the questions here ask the student to correctly name a tool that might be used in solving the problem.  It’s good to know the names of things, but that’s considerably less important than knowing how to use those things to solve problems.

The discriminant is a popular topic on the Algebra 2 / Trig exam:  here’s number 23 from January 2013:

Regents 2013 January AT 23

It’s good for students to understand the discriminant, but the discriminant per se is not really that important.  What’s important is determining the nature of the roots of quadratic functions.

If you give the student an actual quadratic function, there are at least three different ways they could determine the nature of the roots.  But if you give them only the discriminant, they must remember exactly what the discriminant is and exactly what the rule says.  This forces students and teachers to think narrowly about mathematical problem solving.
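To see the contrast, take a quadratic of my own choosing, say $f(x) = x^2 + 2x + 5$.  Given the full function, a student could compute the discriminant, $b^2 - 4ac = 4 - 20 = -16 < 0$, and conclude the roots are imaginary; or complete the square, $f(x) = (x + 1)^2 + 4 \geq 4$, and see that $f$ is never zero; or sketch the graph, an upward-opening parabola with vertex $(-1, 4)$ sitting above the $x$-axis.  Handed only “the discriminant is $-16$,” the second and third approaches vanish.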

In number 3 on the Algebra 2 / Trig exam, we see a common practice of testing superficial knowledge instead of real mathematical knowledge.

Regents 2013 January AT 3

Ostensibly, this is a question about statistics and regression.  But a student here doesn’t have to know anything about what a regression line is, or what a correlation coefficient means; all the student has to know is “sign of the correlation coefficient is the sign of the coefficient of x”.  These kinds of questions don’t promote real mathematical learning; in fact, they reinforce a test-prep mentality in mathematics.
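The fact being tested is easy enough to verify computationally.  A quick Python sketch, using made-up data rather than anything from the exam:

```python
import numpy as np

# Made-up data with an obviously negative trend.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([9.8, 8.1, 7.2, 5.9, 4.4, 3.0])

slope, intercept = np.polyfit(x, y, 1)  # least-squares regression line
r = np.corrcoef(x, y)[0, 1]             # correlation coefficient

print(f"regression line: y = {slope:.2f}x + {intercept:.2f}")
print(f"correlation coefficient: r = {r:.3f}")

# The slope and r always share the same sign -- which is the one fact
# the question actually requires, stripped of all the statistics.
```

A student who knows that one fact, and nothing else about regression, gets full credit.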

And lastly, it never ceases to amaze me how often we test students on their ability to convert angle measures to the archaic system of minutes and seconds.  Here’s number 35 from the Algebra 2 / Trig exam.

Regents 2013 January AT 35

A student could correctly convert radians to degrees, express the answer in appropriate decimal form, and still get only one out of two points for this problem.  Is minute-second notation really worth testing, or knowing?
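The conversion itself is purely mechanical, as a short Python sketch shows (the angle here is made up, not the exam’s):

```python
import math

def to_dms(angle_degrees: float) -> str:
    """Convert a nonnegative decimal degree measure to degrees-minutes-seconds."""
    degrees = int(angle_degrees)
    remainder = (angle_degrees - degrees) * 60
    minutes = int(remainder)
    seconds = (remainder - minutes) * 60
    return f"{degrees}\u00b0{minutes}'{seconds:.0f}\""

angle = math.degrees(2.5)       # a made-up radian measure, converted to degrees
print(f"{angle:.4f} degrees")   # 143.2394 degrees
print(to_dms(angle))            # 143°14'22"
```

If a calculator or a few lines of code can do it, it’s hard to see what withholding a point over the notation is measuring.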
