Whether or not it’s fair, the U.S. News & World Report’s “Best High Schools” list is viewed by citizens all over the nation. The magazine, which has built a reputation on its comprehensive university ranking system, created a college readiness index (CRI) to rank the top 500 high schools in the nation. “The rankings, often taken at face value, are very important to the general public,” Interim Acting Principal Jie Zhang said. Given the weight these rankings carry, many were disappointed to see Stuyvesant ranked as just the 66th best high school in the nation. However, a closer examination of the ranking techniques reveals exactly how Stuyvesant fell so low.
The May 2012 rankings, which used data from the class of 2010, revealed twenty-six schools with CRIs of 100.0, implying that every student at each of those schools was college-ready by the magazine’s standards.
According to its Best High Schools methodology, U.S. News “evaluated nearly 22,000 public high schools in 49 states and the District of Columbia.” Essentially, the rankings were determined through a three-step process:
Step one: Determine whether each school’s students were performing better than the average student in the state using high school proficiency tests.
Step two: For schools that passed the first step, U.S. News determined whether the school’s disadvantaged students, who include minorities and low-income students, were performing better than the average for similar students in the state, again using high school proficiency tests. Twenty-two percent of evaluated schools moved on to the final step.
Step three: Finalist schools were judged on the CRI. This was calculated from the Advanced Placement (AP) or International Baccalaureate (IB) participation rate, weighted 25 percent, and the quality-adjusted AP or IB participation rate, weighted 75 percent. The participation rate is the proportion of students who took at least one AP exam, and the quality-adjusted participation rate is the proportion who passed (received a score of 3 or higher on) at least one AP exam.
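The weighting in step three reduces to simple arithmetic. As a rough sketch of the stated 25/75 formula (the function name and the treatment of both rates as 0–100 percentages are our assumptions, not U.S. News’s actual code):

```python
def college_readiness_index(participation_rate, quality_adjusted_rate):
    """Combine the two AP/IB rates per the 25/75 weighting U.S. News describes.

    Both arguments are percentages (0-100); the weights come from the
    methodology summarized above.
    """
    return 0.25 * participation_rate + 0.75 * quality_adjusted_rate

# A hypothetical school where 60 percent of students take at least one
# AP exam and 40 percent pass at least one would score:
print(college_readiness_index(60, 40))  # 0.25*60 + 0.75*40 = 45.0
```

Because the pass-based rate carries three times the weight of the raw participation rate, the formula rewards passing, but as the next sections show, participation alone can still carry a school a long way.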
The U.S. News’s technical appendix reports, “The purpose of the college readiness index is to avoid creating an incentive for high schools to improve their ranking by offering more AP and/or IB courses and exams, regardless of whether their students are prepared to succeed in them.”
However, sophomore Wilbur Zhao said, “It doesn’t stop the schools. If a student doesn’t take any APs, the school will lose out on both participation in and passing at least one AP.” To a certain extent, U.S. News creates a considerable incentive for high schools to offer more AP classes. The median CRI in the U.S. News rankings was 16.3. If a school mandated that its entire student body take one AP, the school would have a 100 percent AP participation rate. That alone would give the school a CRI of 25.0, enough to place within the top ten percent of public high schools in the nation under the U.S. News ranking methodology.
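Zhao’s point can be checked directly. In the hypothetical worst case (not a real school), every student sits one AP exam and no one passes, yet the 25/75 weighting still yields a CRI well above the 16.3 median:

```python
# Hypothetical school: every student takes one AP exam, none pass.
participation_rate = 100.0    # percent taking at least one AP exam
quality_adjusted_rate = 0.0   # percent passing at least one AP exam

cri = 0.25 * participation_rate + 0.75 * quality_adjusted_rate
print(cri)  # 25.0 -- above the 16.3 median, from participation alone
```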
“APs are a legitimate measure of college readiness, but U.S. News’s methodology is not comprehensive enough. Students need only a score of three on an AP to count toward quality-adjusted rates. There is no distinction between a three and five on an AP exam. There is little penalty for failed exams,” said Zhang.
The School of Science and Engineering Magnet in Dallas, Texas is an exemplary case study of these points. Each test taker took an average of 12.6 AP exams over four years, yet the school had only a 51 percent exam pass rate. Even after failing almost half their exams, the students passed an average of more than six AP exams each, an exceptional quality-adjusted exam rate. As a result of its quantity-over-quality methodology, U.S. News & World Report ranked the school third in the nation.
Newsweek’s methodology is quite different from that of U.S. News. It includes more factors: graduation rate, college matriculation rate, and AP/IB/Advanced International Certificate of Education (AICE) tests taken per student, which each count for a quarter of the score. Other factors include average SAT/ACT scores and average AP/IB/AICE scores, which each make up 10 percent of the score, and AP courses offered per student, which makes up five percent.
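Newsweek’s six weights can be laid out as a simple weighted sum. This is only a sketch: the factor names are ours, and the assumption that each factor is first normalized to a common 0-to-1 scale is ours as well, since Newsweek does not publish its scaling at that level of detail.

```python
# Newsweek's stated weights; factor names and 0-1 normalization are
# our assumptions for illustration.
WEIGHTS = {
    "graduation_rate": 0.25,
    "college_matriculation_rate": 0.25,
    "ap_ib_aice_tests_per_student": 0.25,
    "avg_sat_act_score": 0.10,
    "avg_ap_ib_aice_score": 0.10,
    "ap_courses_offered_per_student": 0.05,
}

def newsweek_score(normalized_factors):
    """Weighted sum of factors, each already scaled to [0, 1]."""
    return sum(WEIGHTS[name] * value
               for name, value in normalized_factors.items())

# The six weights account for the entire score:
print(sum(WEIGHTS.values()))  # 1.0
```

Note that fully half the score (tests per student at 25 percent, courses offered at 5 percent, plus test-score averages at 20 percent) is tied to AP/IB/AICE activity, which is what lets heavy exam volume outweigh mediocre results in the cases below.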
Newsweek ranked the School of Science and Engineering Magnet first and fourth based on the 2010 and 2011 school years, respectively. In 2010, the school had 17.2 AP exams offered per graduate. In 2011, the methodology changed so that it compared “total AP tests taken per student” rather than “total AP tests taken per graduate.” The school’s average student took 3.8 AP exams per year, so by senior year, each student had taken roughly 15 exams. However, the school’s average AP scores were 2.6 and 2.5 for the two respective years, and its average SAT scores were 1786 and 1742.
Newsweek also ranked The Gatton Academy of Mathematics and Science in Kentucky first on its 2012 list. The school had a high average SAT score of 2010 and an average AP score of 4.3. Despite its top ranking, these scores were no better than those of other high-performing schools. It was the school’s 4.7 AP exams per student that set it apart from the rest. However, this stems from the fact that the school is composed only of juniors and seniors. Sophomores and first-term juniors apply for admission, with each grade composed of 128 students. These students have largely taken introductory courses already, so naturally they can take more AP classes, which is reflected in the statistics. This skews the AP/IB tests-per-student metric in the school’s favor: yet another scenario in which ranking methodology benefits schools that push more AP classes on students.
In comparison, Stuyvesant also had an average AP score of 4.3, along with an average SAT score of 2090. Though Stuyvesant did significantly better on these exams than many other schools, it was ranked low by U.S. News because the average Stuyvesant student had taken only 0.8 APs, mostly due to a lack of resources that restricts the number of students who can take an AP exam. Thus, it seems that these lists are based on the quantity of challenging classes taken, as opposed to success in those classes, the antithesis of the usual meaning of “college readiness.”
The bottom line is that the number of AP exams taken is given unjustified weight, while the number of AP exams failed is treated as a triviality. Some students have the ability to pass more AP exams and classes but are never given the opportunity to do so, especially in a school as populous as ours. In other schools, students who do not really have the ability to take APs are forced into doing so anyway, which results in poor scores but high rankings. Ranking reports often have a difficult time drawing the line, but “Best High Schools” will continue to mislead the masses as long as there are numbers to be analyzed and publishers to be paid.