Updated June 26, 2014
This document reflects school assessments as of the above date. For the most current assessments, please reference the main Scorecard page.
The American Medical Student Association (AMSA) launched the first PharmFree Scorecard in 2007; the 2014 Scorecard is its seventh iteration. Since its inception, the AMSA Scorecard team has been committed to objectively evaluating conflict-of-interest policies and curricula at the 161 allopathic and osteopathic medical colleges in the United States and Puerto Rico.
The Scorecard is an evolving tool that uses letter grades to assess schools’ performance in fourteen potential areas of conflict of interest. It offers a comprehensive look at the changing landscape of conflict-of-interest policies across U.S. medical education, as well as in-depth assessments of individual policies that govern industry interaction between students, faculty, and the pharmaceutical and medical device industries.
While relationships between academic physicians and industry can benefit medical research and treatment, they can also bias medical education in favor of specific products. Studies have shown that industry influence—whether in the form of gifts, commercially supported education, or simply visits with pharmaceutical representatives—can lead to more expensive and less evidence-based prescribing practices. Medical schools and academic medical centers have been leaders in setting new standards for policies on potential conflicts of interest, as supported by strong guidelines set by the Association of American Medical Colleges (AAMC) in 2008 and the Institute of Medicine in 2009. In 2012, The Pew Charitable Trusts continued this tradition by convening an expert task force of leaders from academic medical centers to create a new set of recommendations for best practices in conflict-of-interest policies at academic medical centers.
AMSA revised its Scorecard to reflect many of the recommendations of this task force. These changes represent an intentional strengthening of the standards of excellence against which policies are evaluated and scored. With the impending release of a federal database disclosing all financial relationships between physicians and industry, greater scrutiny of the appropriateness of these industry activities is particularly critical.
The new Scorecard adds three categories not present in prior years: ghostwriting, extension of COI policies, and enforcement and sanctions.
In addition, site access for pharmaceutical sales representatives and site access for medical device representatives are now scored as two separate domains.
Some institutions will notice a decline in their grades this year, because the new scorecard envisions an ideal, conflict-free medical education environment—one that will not be easy for all medical schools, even those with good policies, to achieve. Through its Scorecard, AMSA seeks to encourage schools to adopt aspirational standards in conflict of interest regulation.
As of June 28, 2014, 77 medical schools had submitted policies to AMSA for scoring, a 48% response rate. This year's relatively low participation rate is likely due to the first major change in methodology since the Scorecard's creation. In prior years, schools were merely asked to submit policy updates; this year, because of new domains and changes to old ones, we requested entirely new policy submissions. As in the past, we expect participation rates to increase significantly over subsequent years. To continue our tradition of serving as a comprehensive database of U.S. medical schools, we nevertheless scored non-reporting schools by searching for publicly available documents online (see Methodology for more details).
Of the 161 U.S. medical schools, 27 received "A"s (17%), 81 "B"s (50%), 25 "C"s (16%), and 26 Incompletes (16%). This year the grades of "D" and "F" were removed because we found that assigning these grades to the small number of schools at the lower end of the spectrum involved difficult and potentially arbitrary decisions. In general, schools' conflict-of-interest policies fall into three major categories of policy quality:
In addition, 26 schools (16%) received an "I" for Incomplete. These were schools that had not submitted policies and for which our web search yielded so little information that grading them was unlikely to be a meaningful measure of policy strength. While in prior years such schools would have been assigned Ds or Fs, this likely overestimated the number of schools with poor, as opposed to incomplete, policies. These institutions are encouraged to share updated policy information with AMSA for a possible re-grade. After a one-year period, institutions that have not updated their information will receive a grade of "D".
Consistent with the more stringent criteria of this year's Scorecard, fewer schools received grades of A or B this year than last: in 2013, 72% of schools received the top two grades, whereas in 2014 this proportion dropped slightly to 67%. This overall decrease was entirely due to a drop-off in the number of A grades (40 in 2013 versus 27 in 2014); in fact, the number of schools with solid B policies increased from 74 in 2013 to 81 this year.
Changes in the Scorecard this year make it difficult to compare scores to years past. Nonetheless, of the 27 institutions receiving "A" scores, three deserve particular attention for submitting newly drafted strong policies that improved their grades from C, D, and "In Process": Edward Via College of Osteopathic Medicine, Pacific Northwest University of Health Sciences College of Osteopathic Medicine, and Hofstra North Shore-LIJ College of Medicine.
“B” policies on the verge of excellence
Many of the solid B schools' policies would reach "A" status with relatively minor improvements. For example, nine "B" schools with particularly strong policies narrowly missed model status because only six, rather than seven (half), of their COI domains had perfect scores: Albert Einstein College of Medicine-Yeshiva University, East Tennessee State University James H. Quillen College of Medicine, New York University School of Medicine, Southern Illinois University School of Medicine, The Commonwealth Medical College, University of Arizona College of Medicine – Tucson, University of Oklahoma College of Medicine, Virginia Commonwealth University School of Medicine, and Yale University School of Medicine.
Depending on the school, one of the following relatively minor policy adjustments would raise its score from a B to an A:
Domains with similar criteria to prior Scorecards
Two domains were essentially unchanged from prior years and therefore allow year to year comparisons: industry-funded promotional speaking and site access for pharmaceutical sales representatives.
Domains with more stringent standards this year
Several domains have more stringent criteria this year, resulting in fewer schools receiving perfect scores. The one exception was disclosure, where scores improved across institutions.
Scholarships perfect scores 2014: 3, down from 123 in 2013.
Industry funding of CME perfect scores 2014: 5, down from 28 in 2013.
Gifts perfect scores 2014: 79, down from 93 in 2013.
Meals perfect scores 2014: 24, down from 93 in 2013.
Attendance of industry-sponsored promotional programs perfect scores 2014: 25, down from 101 in 2013.
Consulting and advising relationships perfect scores 2014: 25, down from 71 in 2013.
Disclosure perfect scores 2014: 51, up from 41 in 2013.
COI curriculum perfect scores 2014: 34, down from 79 in 2013.
Domains that are new this year:
Ghostwriting perfect scores 2014: 105.
Medical device representatives perfect scores 2014: 91.
Extension of COI policies perfect scores 2014: 50.
Enforcement and sanctions perfect scores 2014: 126.
Together, the figures above summarize the 2014 domains and the number of schools receiving perfect scores in each.
The Scorecard serves not only to measure the strength of policies but also to provide a valuable resource for institutions developing and refining new policies. The inclusion of full-text policies on the website (where permission has been given) will facilitate this goal.