Many hospital-based adult specialties are routinely ranked annually by US News & World Report. Its 2014-2015 rankings included cancer; cardiology and heart surgery; diabetes and endocrinology; ear, nose, and throat; gastroenterology and gastrointestinal surgery; geriatrics; gynecology; nephrology; neurology and neurosurgery; ophthalmology; orthopedics; pulmonology; psychiatry; rehabilitation; rheumatology; and urology.1 Only teaching hospitals that saw a high volume of patients were included in the rankings. Ophthalmology, psychiatry, rehabilitation, and rheumatology were ranked based on reputation among specialists; the others were ranked based on objective data (eg, hospital mortality rates, nursing ratios) in addition to a reputation score from a survey of physicians.1
Dermatology has never been included in the annual US News & World Report rankings. In the past, dermatology residency programs have been ranked solely according to the amount of annual funding received from the National Institutes of Health (NIH).2 Wu et al3 expanded the scope by creating an algorithm to rank dermatology residency programs based on scholarly achievement. Their algorithm incorporated the number of NIH grants as well as 4 other factors deemed important by the authors: publications in 2001-2004, Dermatology Foundation (DF) grants from 2001-2004, faculty lectures delivered at national conferences in 2004, and number of full-time faculty members on the editorial boards of the top 3 US dermatology journals and the top 4 subspecialty journals.3
The current study refines the prior algorithm by creating a weighted ranking algorithm and by using criteria that the authors considered more meaningful than the original criteria.3 Specifically, the current study considered the amount of NIH and DF funding received rather than the number of grants received, and it gave less weight to DF funding and to the number of faculty members on editorial boards relative to the other criteria. We used publicly available data from Web searches to conduct this study.
The overall ranking algorithm was modeled on the methodology used by the Institute of Higher Education, Shanghai Jiao Tong University, to rank universities in the annual Academic Ranking of World Universities, which uses a weighted ranking algorithm that includes academic and research performance factors to evaluate universities worldwide.4
The names of all dermatology residency programs in the United States were obtained as of December 31, 2008, from FREIDA Online using the search term dermatology; the names of all full-time faculty members and the number of residents at these programs also were obtained by searching the programs’ Web sites.
For another related study investigating the relationship between residency program characteristics and residents pursuing a career in dermatology, the following data were obtained: total number of full-time faculty members at the program; total number of residents; amount of NIH funding received in 2008 (http://report.nih.gov/award/index.cfm); amount of DF funding received in 2008 (http://dermatologyfoundation.org/pdf/pubs/DF_2008_Annual_Report.pdf); number of publications by full-time faculty members in 2008 (http://www.ncbi.nlm.nih.gov/pubmed/); number of faculty lectures given at annual meetings of 5 societies in 2008 (American Academy of Dermatology, the Society for Investigative Dermatology, the American Society of Dermatopathology, the Society for Pediatric Dermatology, and the American Society for Dermatologic Surgery); number of full-time faculty members who were on the editorial boards of 6 dermatology journals (Journal of Investigative Dermatology, Archives of Dermatology [currently known as JAMA Dermatology], Journal of the American Academy of Dermatology, Dermatologic Surgery, Journal of Cutaneous Pathology, and Pediatric Dermatology); and whether a program was housed within an institution’s department of dermatology or division of internal medicine.
The data were summed for all faculty members at a given program. To avoid duplicate faculty publications, collections for each residency program were created within PubMed (ie, if 2 authors from the same program coauthored an article, it was only counted once toward the total number of faculty publications from that program).
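The effect of these per-program PubMed collections can be sketched as pooling unique article identifiers across a program's faculty, so that a coauthored article contributes only once to the program total. A minimal illustration (the faculty names and PubMed IDs below are hypothetical, not taken from the study data):

```python
def program_publication_count(faculty_pubs):
    """Count unique articles across a program's faculty.

    An article coauthored by 2 faculty members from the same program
    appears under both names but is counted only once.
    """
    unique_ids = set()
    for pmids in faculty_pubs.values():
        unique_ids.update(pmids)  # set union discards duplicates
    return len(unique_ids)

# Hypothetical PubMed IDs; article 18000001 is shared by 2 coauthors
faculty_pubs = {
    "Faculty 1": {18000001, 18000002},
    "Faculty 2": {18000001, 18000003},
}
total = program_publication_count(faculty_pubs)  # 3 unique articles, not 4
```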
Several dermatology residency programs were excluded from this analysis: University of Texas at Austin, University of Texas Medical Branch, and the University of Connecticut, which were started in 2008, as well as Kaiser Permanente Southern California, which was started in 2010. The combined Boston University and Tufts University dermatology residency program was established prior to 2008 and therefore was counted as a single combined program in the analysis. The program at Harbor-UCLA Medical Center was excluded because it was reestablished in 2004 and we could not assume that the prior program had similar attributes. Military residency programs also were excluded because residents are assigned to faculty positions upon graduation. The NIH residency program also was excluded because it is not a traditional 3-year residency program.
There were 5 factors that were deemed by the authors to be the most reflective of academic achievement among dermatology residency programs: number of faculty publications in 2008; amount of NIH funding received in 2008; number of faculty lectures given at 5 society meetings in 2008; amount of DF funding received in 2008; and number of faculty members who were on the editorial boards of 6 dermatology journals in 2008. The 6 journals were selected to represent a broad range of dermatology subspecialties (eg, medical, surgical) and because they had the highest impact factors at the time for general dermatology and dermatopathology.
Each residency program was assigned a score from 0 to 1 for each of these factors. For the publication factor, for example, the program with the highest number of faculty publications was assigned a score of 1 and the program with the lowest number of publications was assigned a score of 0; the programs in between were assigned scores from 0 to 1 in proportion to their number of publications relative to the program with the most publications.
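The scoring described above can be sketched as follows, reading it as min-max scaling (highest count maps to 1, lowest to 0, the rest proportionally in between). The program names and publication counts are hypothetical, and this is an illustration of the scoring scheme, not the authors' actual code:

```python
def scale_scores(counts):
    """Scale raw factor counts to the interval [0, 1].

    The highest count maps to 1, the lowest to 0, and the
    remaining programs fall proportionally in between.
    """
    lo, hi = min(counts.values()), max(counts.values())
    span = (hi - lo) or 1  # guard against division by zero if all counts are equal
    return {program: (n - lo) / span for program, n in counts.items()}

# Hypothetical 2008 publication counts for 3 programs
pubs = {"Program A": 120, "Program B": 45, "Program C": 10}
scores = scale_scores(pubs)
# Program A scores 1.0, Program C scores 0.0, Program B falls in between
```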
A weighted ranking scheme was used to rank programs based on the relative importance of each factor. The authors decided that NIH funding, number of faculty publications, and number of lectures at society meetings were relatively more important than the other factors; thus, each of these factors was given a weight of 1.0. The remaining factors—DF funding and number of faculty members on editorial boards of journals—were each given a weight of 0.5. Values were totaled and programs were ranked based on the sum of these values.
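Putting the weights together, the final ranking reduces to a weighted sum of the 5 scaled factor scores per program, sorted in descending order. A minimal sketch, using the weights stated above; the factor names, program names, and scaled scores below are hypothetical:

```python
# Weights reflecting the relative importance assigned by the authors:
# 1.0 for NIH funding, publications, and lectures; 0.5 for the rest
WEIGHTS = {
    "nih_funding": 1.0,
    "publications": 1.0,
    "lectures": 1.0,
    "df_funding": 0.5,
    "editorial_boards": 0.5,
}

def weighted_total(scaled_scores):
    """Sum a program's 0-1 scaled factor scores, applying the weights."""
    return sum(WEIGHTS[factor] * score for factor, score in scaled_scores.items())

# Hypothetical scaled scores (each already on the 0-1 scale) for 2 programs
programs = {
    "Program A": {"nih_funding": 1.0, "publications": 0.8, "lectures": 0.6,
                  "df_funding": 0.5, "editorial_boards": 1.0},
    "Program B": {"nih_funding": 0.4, "publications": 1.0, "lectures": 0.9,
                  "df_funding": 1.0, "editorial_boards": 0.2},
}
ranked = sorted(programs, key=lambda p: weighted_total(programs[p]), reverse=True)
```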
Data were analyzed using SAS 9.2 software. This study was approved by the institutional review board at Kaiser Permanente Southern California.