August 29, 2003
“It is literally true that you can succeed best and quickest by helping others to succeed.” – Napoleon Hill
Greetings from Wayne Central School District. Below are some items of interest.
Newsletter:
1. Math A Exam: Enclosed is a copy of Commissioner Mills’ decision about Math A. As explained last night, the panel found the test was substantially harder than the previous year’s and recommended it be rescaled. Commissioner Mills concurred. This will affect 9th and 10th graders who took the test, some of whom may now receive credit. Students in 11th and 12th grade have already received credit through class grades in lieu of the Regents exam.
2. Opening Day: Opening day for staff will be Tuesday, September 2nd. We will begin with coffee and donuts in the high school cafeteria at 7:30, followed by remarks in the Performing Arts Center at 8:00. Our guest speaker is Jay McTighe, author of the book Understanding by Design. Understanding by Design is the instructional model chosen by the district last year.
3. Carol Fisher Retirement Party: Mark your calendars! Carol Fisher’s retirement party will be held on September 25 in the new hall at Casey Park.
4. Four County School Boards: Enclosed is the response from Four County regarding the Superintendents’ resolution on passing budgets on time. As you can see, there appears to be strong support. This will be a priority for the Superintendents this year.
5. Bus Loop Question: A question was raised last night regarding SED’s interpretation of students crossing roads or parking lots to get to school. The issue was whether the new parking lot being created at the south end of the high school could be a student parking lot or whether it needed to be a faculty lot. I consulted with architect Matt Diehl this morning. SED does not want students crossing a road to get to school or to a playing field. The parking lot, as it exists now and as it is designed to be in the future, contains a walkway around the lot that meets SED requirements. The reality, however, is that students do not now, and will not, use that roundabout route; they walk across the bus loop. This information will likely play a role in Mrs. Morrin’s and Mr. Atseff’s discussion about where the student parking lot should be.
Sept. 2 – Opening Day for Staff
Sept. 6 – Opening Day for Students
Sept. 9 – OP Grade 1 Parent Information Night
Sept. 9 – FE Family Picnic
Sept. 10 – Board of Education meeting
Sept. 10 – OP Grade 2 Parent Information Night
Sept. 29 – Four County School Bds. Legislative Comm. Mtg. – Phelps Hotel – 6:30 p.m.
Dinner selections: Strip Steak, Char-grilled Chicken, Herb-grilled Whitefish
Please let Lori know if you plan to attend.
October 23-25 – NYSSBA Annual Convention – Rochester Convention Center

Athletics:
8/30 – Boys JV/V Soccer vs. McQuaid Jesuit – 5:00 & 7:00 p.m.
9/2 – Boys JV/V Soccer vs. Seneca Falls – 4:30 & 7:00 p.m.
9/2 – Boys JV/V Volleyball vs. Central Square – 5:30 p.m.
9/3 – Girls V Soccer vs. Newark – 7:00 p.m.
9/3 – Girls JV/V Volleyball vs. Aquinas – 5:00 p.m.
9/4 – Boys JV/V Soccer vs. Aquinas – 5:00 & 7:00 p.m.
9/5 – Girls JV Tennis vs. Marcus-Whitman – 4:15 p.m.
9/8 – Girls JV Tennis vs. Palmyra-Macedon – 4:15 p.m.
Attachments:
a. SED – Math A Press Release
b. Math A Review Panel Findings
c. Opening Day Agenda/Letter
d. Ken Schaumberg Memoranda
e. Carol Fisher Retirement Party Invitation
f. Four County Information
g. Town of Ontario Board Meeting Minutes – 8/11/03
FOR IMMEDIATE RELEASE, AUGUST 26, 2003
For More Information, Contact
COMMISSIONER MILLS ISSUES ORDER TO RESCORE
JUNE MATH A REGENTS EXAM, BASED ON PANEL’S RECOMMENDATION
State Education Commissioner Richard Mills directed the State Education Department today to create and issue a new scoring chart for the June 2003 Math A Regents Exam. He made his decision based on the recommendation of the independent panel appointed by the Board of Regents to review the exam.
The new scoring chart will be available before the start of school. This means the June Math A Regents scores will improve.
The Regents and the Commissioner gave the panel a broad charge that included this question and eight others: “If the June 2003 Regents Math A Exam was not of the same level of difficulty as previous Math A Exams, can the results be re-scaled appropriately and used to measure student achievement, and if so how?”
In their interim report, the Panel on Regents Math A concluded, “the June, 2003 exam was harder than the June, 2002 exam. In short, students in June, 2003 were held to a higher standard than their counterparts a year earlier.” The Panel recommended “that the scores on the June, 2003 exam be statistically adjusted, using the 9th graders as a basis, so that the June, 2003 students will receive a score similar to what they would have received had they taken the June, 2002 exam.” The Panel described how this would be done.
In keeping with the Regents’ commitment to publish the panel’s report exactly as written, the panel’s interim report is attached.
Regents Chancellor Robert M. Bennett said, “The panel noted the urgency of responding to local school districts and their students. We appreciate this first recommendation and support the panel’s good work in such a short period of time.”
Commissioner Mills said, “This decision resolves a major uncertainty facing last year’s ninth and tenth graders. I thank the panel for its work. They worked hard on their charge and rightly concentrated on the question that had to be resolved before the start of school: the status of last year’s ninth and tenth graders. I agree with their recommendation to re-scale the Regents Math A exam, and we are creating that new scale now. I look forward to the panel’s final report in October as a source of further good advice on how to ensure a sound Regents Math A exam in January.”
On June 24, Commissioner Mills announced that the panel would be appointed to study the June Math A Regents Exam after preliminary data indicated a very low success rate in comparison with previous Math A exams. The June exam should have been comparable in difficulty with the January exam and previous Math A exams. Therefore, success rates should have been generally consistent, but they were not. As Commissioner Mills said at the time, “That inconsistency indicates there was a problem in the process of creating this June exam.”
Commissioner Mills had already directed that juniors and seniors could count their local course grade in the place of an exam score. The test counts for freshmen and sophomores, and they will now get a higher score. Juniors and seniors still have the option to use this higher test score or their class grade.
The panel, which was appointed by the Board of Regents on July 18, will continue to work on the rest of its charge (see attached) and will make more recommendations this fall.
-30-
TO: Richard P. Mills
FROM: Math A Panel
William Brosnan, Stanley Chapman, Gregory Cizek,
Franco DiPasqua, Andrew Giordano, Lidia Gonzalez,
Robert Gyles, Daniel Jaye, Sophia Maggelakis,
Theresa McSweeney, Alfred Posamentier, Katherine Staltare,
Alan Tucker
SUBJECT: Interim Report
DATE: August 25, 2003
INTRODUCTION
The Panel would first like to express its appreciation to the Board of Regents and the Commissioner for our appointment. Although we come with different perspectives, we have remarked on several occasions on the positive dynamics of the group. Appreciation is extended to Tom Sheldon for his enormously successful efforts to create a strong Panel.
The Panel would also like to convey its appreciation to the leadership and staff of the New York State Education Department for providing us with substantial amounts of information and materials, and unfettered access to personnel. We are grateful for the full and enthusiastic cooperation we have encountered, which helped make our difficult task manageable and, we believe, successful to date.
This interim report focuses on Elements #4 and #8 of our charge, which are:
#4. Is the June, 2003 Regents Math A exam of the same level of difficulty as prior Math A exams? (That is, in addition to the equating included in question 2, consider the content, cognitive demand, and perceived difficulty of the exam.)
#8. If the June, 2003 Regents Math A exam was not of the same level of difficulty as previous Math A exams, can the results be re-scaled appropriately and used to measure student achievement, and if so, how?
While this report focuses on these elements, our research and findings touch on several other elements of the charge.
This interim report is a response to the urgent problem of how June, 2003 Math A test results should be applied to 9th and 10th graders as they finalize their programs of study for the coming school year. Our work will continue in September when we meet to address the remaining elements of our charge.
FINDINGS
A very detailed analysis was conducted of the June, 2003 Math A Regents exam, and an extensive comparison was made of that exam with the June, 2002 Math A Regents exam. We found:
1. An analysis of the Rasch item difficulty values generated by the Item Response Theory (IRT) method conducted on both exams shows that, while the overall difficulty of the two exams appeared comparable, disaggregation by exam parts revealed important differences. While the difficulty of Parts 1 and 2 was close for the two exams (with the June, 2003 exam being slightly easier), Parts 3 and 4 of the 2003 exam were substantially more difficult than the same parts of the 2002 exam. (See Appendix A.)
2. Appendix A also includes a comparison of the probability of success on each item as generated by the IRT analysis, and an aggregation of these probabilities for the June, 2002 exam and the June, 2003 exam. We found:
• The expected average score for the field test takers of the June, 2002 exam was 51.
• The expected average score for the field test takers of the June, 2003 exam was 46, five points lower.
(It should be noted that the expected average score for the field test of both exams was below the 65 passing level, and even below the 55 that was available as a passing score for a local diploma for some students until June, 2003.) A brief illustrative sketch of this kind of expected-score aggregation appears after the findings list.
3. While one subgroup of our Panel was working on the above-mentioned item analysis, another subgroup was looking at the wording of the items in Parts 3 and 4, and the order in which the items were presented to students. When the subgroups reconvened, we were struck by the fact that their conclusions were virtually identical. The findings of the second group are detailed in Appendix B. In addition to finding that the 2003 items were more difficult, this group found that students were presented at the beginning of Part 3 with several difficult items in a row, which is consistent with anecdotal evidence from the field that some students this year reached Part 3, faced several tough problems in succession, became frustrated, and "gave up."
4. In addition to the greater difficulty of the June, 2003 items, the Panel identified content coverage concerns in the June, 2003 exam. For example, 3 of the 35 items tested the Pythagorean Theorem, which would dramatically lower the score of a student with a weakness in that area. Yet trigonometry was not assessed at all, a tremendous source of frustration in the field, as previous Math A exams contained trigonometry items and many teachers spent substantial time teaching the topic to their students. In addition to potentially depressing the mean performance on the June, 2003 examination, such imbalances in content representation can also attenuate the content validity of the test.
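To make the mechanics of findings 1 and 2 concrete, the following is a minimal Python sketch of how Rasch item difficulties translate into per-item success probabilities and an expected average score. Every number here, the flat two-point item values, the ability distribution, and the function name are illustrative assumptions; they are not the Panel's actual Appendix A figures or procedure.

import numpy as np

def expected_raw_score(difficulties, points, thetas):
    # Rasch success probability for each (examinee, item) pair, then the
    # mean total score across the examinee (ability) sample.
    b = np.asarray(difficulties, dtype=float)[None, :]   # 1 x items
    th = np.asarray(thetas, dtype=float)[:, None]        # examinees x 1
    p = 1.0 / (1.0 + np.exp(-(th - b)))                  # Rasch model
    return float((p * np.asarray(points)).sum(axis=1).mean())

# Hypothetical illustration: raising the difficulty of a handful of
# Part 3/4 items lowers the expected average score, in the spirit of
# the 51-versus-46 comparison described above.
rng = np.random.default_rng(1)
thetas = rng.normal(0.0, 1.0, 10_000)        # assumed field-test abilities
items_2002 = np.full(35, -0.3)               # assumed 2002-like difficulties
items_2003 = np.concatenate([np.full(30, -0.3), np.full(5, 1.2)])
points = np.full(35, 2.0)                    # flat 2-point items (assumed)
print(expected_raw_score(items_2002, points, thetas))
print(expected_raw_score(items_2003, points, thetas))

Fitting actual Rasch difficulties from student response data requires an IRT estimation routine; the sketch simply takes the difficulties as given.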
The above-mentioned statistical and substantive differences lead the Panel to the conclusion that the June, 2003 exam was harder than the June, 2002 exam. In short, students in June, 2003 were held to a higher standard than their counterparts a year earlier.
This conclusion is consistent with evidence across the State about how students performed. One example is from the Marcellus School District, which had the following results on the Math A exam for those who scored a "2" on the Math 8 exam:
•
In January, 2002, 11 students passed Math A, out of 12 who had scored a 2 in Math 8.
•
In June, 2002, 16 students passed Math A, out of 19 who had scored a 2 in Math 8.
•
In January, 2003, 19 students passed Math A, out of 21 who had scored a 2 in Math 8.
•
In June, 2003, none (0) of the students passed Math A, out of 15 who had scored a 2 in Math 8.
While this evidence is anecdotal in nature, it is consistent with the experience of the K-12 educators on the Panel in their classes, schools, and districts.
In summary, the Panel is convinced there is compelling evidence that the June, 2003 Math A exam, when compared with the June, 2002 exam, was substantially more difficult for students.
OPTIONS CONSIDERED
The Panel considered several possible ways of responding to the fact that this year's exam was more difficult than the June, 2002 exam. At the outset, we agreed that any solution recommended would need to meet these criteria:
• the solution must "do no harm" to any student (that is, no student’s standing should be made worse by the application of any potential remedy);
• the solution must be understandable (that is, it should be easy to explain to parents, educators, policy makers, etc.); and
• the solution must be defensible, having the characteristics of fundamental fairness and psychometric soundness.
The Panel spent hours exploring possible approaches.
The Panel considered recommending setting aside the June, 2003 examination, as was done (and we believe very appropriately so) for the 11th and 12th graders. Several concerns were expressed about doing this for the 9th and 10th graders, a major one being that these students would be held to no objective standard. This could allow students to "slip through" and receive diplomas with weaker math skills than needed for success in their futures. Additionally, it was believed that such a precedent could serve to undermine standards-based reform efforts and implementation of important changes to the way mathematics is taught and learned in New York. A significant majority of the Panel was not in favor of this option.
The Panel considered somehow removing the items that were the most troubling. This option also generated several concerns, the most significant being that every item had some students responding correctly; by removing an item and rescaling, those students who had answered the item correctly could end up with a lower score. The Panel was unanimous in its view that this would violate our "do no harm" criterion, and the option was rejected by all.
The Panel considered "curving." That option was quickly rejected because there is no basis for doing that. "Curving" would require determining where the "average" child should score, i.e., a 65 or a 75, etc. There was no basis whatsoever for making such a determination. “Curving” would result in a total disconnect between the meaning of a student’s test score with respect to the content standards, replacing this important interpretation with a norm-referenced interpretation.
A number of other adjustments were considered. Ultimately, the Panel focused on the fact that the effect of the anomalies found in this test could be estimated with reasonable precision by looking at comparable groups of 9th graders. There are some differences between the June, 2002 population and the June, 2003 population in that, this year, more students who are struggling in math took the Math A exam because the Course I exam is no longer an option. However, struggling math students are almost invariably programmed into 10th grade; the 9th grade has included, and continues to include, only those students who are strong in math and who the teachers feel can challenge this exam at that early stage of their high school career. Thus, a comparison of the June, 2002 9th grade results and the June, 2003 9th grade results is valid. In the sample of 400 school districts for which the State Education Department has data, the June, 2003 9th graders scored 9.6 points lower than the June, 2002 9th graders. This drop of almost 10 points from June, 2002 to June, 2003 is very consistent with our review of the item statistics, as well as our qualitative review of the items.
This grade comparison led us to our recommendation.
RECOMMENDATION
The Panel recommends that the scores on the June, 2003 exam be statistically adjusted, using the 9th graders as a basis, so that the June, 2003 students will receive a score similar to what they would have received had they taken the June, 2002 exam. The adjustment the Panel is recommending has these three steps (a brief illustrative sketch follows the list):
1. Draw matched samples of 9th graders from the June, 2002 exam and the June, 2003 exam that are as representative as possible of the entire State.
2. Conduct an equi-percentile equating of the two distributions.
3. Establish the conversion of raw scores to scaled scores and put the June, 2003 scores on the June, 2002 scale.
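The following is a minimal Python sketch of how the equi-percentile equating in steps 2 and 3 could be carried out. It is not the Department's actual conversion procedure: the samples are synthetic, and the maximum raw score of 84 and the function name are assumptions for illustration only.

import numpy as np

def equipercentile_conversion(raw_2003, raw_2002, max_raw):
    # Map each possible raw score on the June, 2003 form to the June, 2002
    # raw score holding the same percentile rank in its own matched sample.
    s03 = np.sort(np.asarray(raw_2003, dtype=float))
    s02 = np.sort(np.asarray(raw_2002, dtype=float))
    table = {}
    for raw in range(max_raw + 1):
        # Percentile rank of this raw score within the 2003 sample.
        pct = np.searchsorted(s03, raw, side="right") / len(s03)
        # The 2002 score at the same percentile is the equated score.
        table[raw] = int(round(float(np.quantile(s02, pct))))
    return table

# Synthetic matched 9th-grade samples: the 2003 group scores about
# 9.6 points lower, mirroring the drop in the 400-district sample.
rng = np.random.default_rng(0)
sample_2002 = np.clip(rng.normal(68.0, 12.0, 5000), 0, 84)
sample_2003 = np.clip(rng.normal(58.4, 12.0, 5000), 0, 84)
table = equipercentile_conversion(sample_2003, sample_2002, max_raw=84)
print(table[55], table[65])  # 2002-scale equivalents of 2003 raw scores

A production equating would typically also smooth the two score distributions before matching percentiles, but the core idea is the same: raw scores with equal percentile ranks in the matched samples are treated as equivalent.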
The Panel is recommending this adjustment because it holds this year's students to the same standards as the ones to which the June, 2002 students were held; it essentially corrects for the anomalies of the June, 2003 exam. Any adjustment less than this would be unfair because it would not correct for the full effect of the anomalies found in the exam. Any adjustment greater than this would result in an overcorrection, which means the students who took the June, 2003 exam would end up being held to a lower standard than last year's students; the majority of the Panel felt this would not be a fair result.
The recommendation for the score adjustment detailed above is endorsed by the entire membership of this Panel present for the two-day session on August 19 and 20.
THE PANEL’S WORK CONTINUES
This Panel has been asked to review in great depth the status of Math A. As we have explored the June, 2003 exam, we have identified a number of issues about which we have serious concerns, including, but not limited to, the content standards, the performance standards, the lag in full implementation of curriculum and instruction aligned with the Math A frameworks, and the infrastructure to support the move to higher standards. We view the adjustment we are recommending for the June, 2003 exam as an interim step to take care of the students who took the exam last June and to enable them to be appropriately scheduled for the fall semester. We intend to consider the broader issues at our upcoming meetings on September 10, 11, and 19, and to make recommendations to the Board of Regents on October 8. It is our hope that these broader recommendations, if accepted, will result in changes beginning with the January, 2004 exam.