Graduate Council Advisory Committee
HUC Board Room
Wednesday, October 14, 2009

Members present: John Johnstone, Gregg Janowski, Jim Collawn, Bryan Noe, Jeff Engler, Stephen O’Connor, Erica Pryor, Mary MacDougall, Rosalyn Weller

Guest: Leslie McClure

Staff: Thomas Harris, Cyndi Ballinger

    1. The first agenda item was the NRC document describing the methodology used to evaluate the multiple sources of information gathered to assess doctoral programs. Data were collected during the 2006–2007 academic year. One goal of the NRC is to provide program rankings to prospective students. Dr. Leslie McClure, Biostatistics, presented an overview of the methodology (PowerPoint Presentation Link). The NRC survey sought to provide summary measures of overall program quality. The overall measure is a weighted average of three sub-dimensions: research activity of the faculty, student support and outcomes, and diversity of the academic environment. The rankings consist of four domains:
          1. Research Activity - contribution to research, assessed by publications, citations, percentage of faculty with research grants, and recognition of scholars through awards and honors.
          2. Student Support and Outcomes - percentage of students fully funded in their first year, time to degree, placement in academic positions including postdocs, and ultimate employment outcomes.
          3. Diversity of the Academic Environment - percentage of faculty and students from underrepresented groups, percentage of female students, and percentage of international students.
          4. Overall Rankings - reflect which variables are weighted as most important to the overall quality of a program.
       Fields of study were selected on two criteria: at least 500 PhDs in the field awarded in the five years prior to the 2005–2006 academic year, and at least 25 universities offering the degree. The universities selected internally which programs would participate in the survey. There were five questionnaires: Institutional, Program, Faculty, Student, and Ranking.

      Two significant complications with the data collection were that the data come from only one year and that the rating survey was completed by only a subset of faculty in each field. Weights of importance were based on data from programs across the nation in the same field. The direct weights were determined from faculty responses on the faculty questionnaires. The regression weights took the data from all the questionnaires about a program and predicted the national ratings. The combined weights identify the factors most important in describing a program. The research domain contributed most to the overall score; diversity of the environment contributed least. The weights were applied to each program's data, and random half-samples of the data were run 500 times to determine the program rankings. The result is a range of rankings relative to programs across the nation; programs should compare themselves to programs with similar ranges.
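The half-sampling procedure described above can be sketched in code. This is a minimal illustration, not the NRC's actual implementation: the program names, dimension weights, and raw observations below are all hypothetical, and the weights are chosen only so that research contributes most and diversity least, as the minutes describe. Each of 500 runs scores every program on a random half of its data and ranks the programs, so each program ends up with a range of ranks rather than a single number.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Hypothetical raw data: 20 observations per program per dimension.
# Names, means, and weights are illustrative, not actual NRC data.
DIMS = ["research", "support", "diversity"]
WEIGHTS = {"research": 0.6, "support": 0.3, "diversity": 0.1}

programs = {
    name: {d: [random.gauss(mu, 0.1) for _ in range(20)]
           for d, mu in zip(DIMS, mus)}
    for name, mus in {
        "Program A": (0.9, 0.7, 0.5),
        "Program B": (0.6, 0.8, 0.7),
        "Program C": (0.8, 0.6, 0.9),
    }.items()
}

def half_sample_score(data):
    """Weighted average of dimension means over a random half of the observations."""
    score = 0.0
    for d in DIMS:
        half = random.sample(data[d], len(data[d]) // 2)
        score += WEIGHTS[d] * (sum(half) / len(half))
    return score

def rank_ranges(programs, n_runs=500):
    """Repeat the half-sampling n_runs times; return each program's
    (best, worst) rank across runs -- a range, not a single number."""
    ranks = {name: [] for name in programs}
    for _ in range(n_runs):
        scores = {name: half_sample_score(data)
                  for name, data in programs.items()}
        ordered = sorted(scores, key=scores.get, reverse=True)
        for pos, name in enumerate(ordered, start=1):
            ranks[name].append(pos)
    return {name: (min(r), max(r)) for name, r in ranks.items()}

ranges = rank_ranges(programs)
for name, (best, worst) in ranges.items():
    print(f"{name}: rank range {best}-{worst}")
```

Because each run sees a different half of the data, two programs of similar quality will have overlapping rank ranges, which is why programs are advised to compare themselves to programs with similar ranges rather than to a single point estimate.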

    2. The second item on the agenda was discussion of a method to evaluate faculty scholarly output. The UAB Graduate School has contracted with Academic Analytics (AA), whose database produces a Faculty Scholarly Productivity (FSP) index, one parameter used to evaluate the quality of graduate programs. Handouts showed the AA results for the UAB Pathology program, which, using the 2007-2008 faculty list, ranked #3 in the country. The expectation is that AA data can be used by departments and programs as a self-improvement tool. Peer comparisons are also available through the AA database. Data are collected annually, a distinct advantage over the NRC approach. Program Directors will be sent Graduate Faculty lists for updating in January 2010. Programs will be encouraged to list only those faculty who contribute constructively to the program.
    3. The last agenda item was the Fiscal 2011 Graduate Fellowship Allocations. The application process will not change for the upcoming year. Allocations will be based on the results of application review by impartial committees. Pre-populated spreadsheets containing prior-year data will be provided by the Graduate School; programs will only need to enter current-year data. Solicitation of applications will be sent to Program Directors on October 15, and the deadline for all applications will be November 16.


The meeting adjourned at 5:15 p.m.