
Inside the UAB School of Education, a group of researchers is evaluating reform initiatives and programs run by P-12 school districts, universities, nonprofits and state and local government agencies across the country.

The researchers work for the UAB Center for Educational Accountability (CEA), a research center in the School of Education that is dedicated to evaluating the effectiveness of public and private reform programs. CEA Director Scott Snyder, Ph.D., an associate professor in the UAB Department of Human Studies, says the CEA’s primary purpose is to help administrators and leaders make data-based decisions for their programs.

He and other researchers at the CEA provide a number of services. They include program evaluations, database design and development, research design, statistical analysis, instrument and survey development, and grant writing for research and program evaluations.

“Part of what we do is to work with program personnel to identify metrics that are good indicators of the degree to which outcomes have been met,” says Snyder. “We then map across time the level of achievement of the outcomes and the degree to which the activities were implemented as planned.”

The CEA got its start when UAB and the University of Alabama Board of Trustees established the Center in 1995. The initial funding for it came from a Mayer Electric Foundation grant.

Center for Educational Accountability Faculty and Staff (L-R):
Lawrence Moose, Rachel Cochran, Ph.D., Scott Snyder, Ph.D.,
Marcia O’Neal, Ph.D., and Jason Fulmore, Ph.D.
Today, the CEA serves clients across Alabama, Mississippi, Georgia, Tennessee, Louisiana and Illinois. The clients have included the Alabama Department of Education’s Office of School Readiness, Birmingham City Schools, various units within UAB and numerous nonprofit organizations. CEA researchers have also worked on projects funded by the National Institutes of Health, the U.S. Department of Education, the National Science Foundation, the Centers for Disease Control and Prevention and the Cancer Research Foundation of America.

One of the CEA’s latest clients has been the Woodlawn Foundation. The nonprofit group is developing initiatives to address poverty, blight and low academic achievement in Birmingham’s Woodlawn community.

In 2013, CEA researchers surveyed Woodlawn residents to measure the level of interest and support for a proposed early learning center for the community, according to CEA Assistant Director Rachel Cochran, Ph.D.

“Now, we’re developing a database for them to collect information on graduation rates, student achievement scores, participation in pre-K and after-school programs and programs like Better Basics and any of the other partners that the Foundation is working with to impact student achievement,” Cochran says.

CEA researchers not only generate and interpret data, they also advise clients on how to improve their programs. Currently, they are analyzing data for a Birmingham City Schools project called the Woodlawn Innovation Network (WIN). WIN is an effort to improve learning in the five elementary and middle schools that feed into Woodlawn High School.

“The information we’ve shared back with WIN has helped them make decisions on where to put their resources and where to focus their attention,” says Jason Fulmore, Ph.D., a CEA research methodologist and evaluator. “I think that’s our role, to serve as a partner in helping them make those decisions.”

In addition, the researchers are working on projects for the Birmingham Education Foundation, an organization focused on improving education and educational outcomes for Birmingham students.

The CEA also hosts training sessions for school and agency personnel on how to conduct their own evaluations and analyze the data they have available. With such knowledge, leaders can answer their own applied questions, Snyder says.

“One of the challenges we’ve seen is that sometimes an organization will lack baseline data that can be used to measure how successful a program is at meeting its goals,” Snyder says. “Baseline data shows how well people are doing before a program begins. The CEA helps administrators figure out what baseline data to collect in order to demonstrate the need for the program and determine whether goals are being met by the program.”

Another issue they often see involves program initiatives that have unintended consequences. An example, says Snyder, would be a case where a school’s science scores fall because a reform initiative encourages teachers to spend more time on literacy and math instruction, leaving less time for teaching science.

“And sometimes program activities have benefits that were not expected,” says Snyder, “such as children doing better on word problems in math because a reading intervention has improved their comprehension. A good evaluation will catch unintended consequences, both positive and negative.”

Along with Snyder, Cochran and Fulmore, the CEA staff also includes Marcia O’Neal, Ph.D., a methodologist and evaluator, and CEA Program Director Lawrence Moose.

“I think that because of our excellent staff,” says Snyder, “the CEA has done excellent work in helping programs gather and use data to make important improvements across time.”

Contact the UAB Center for Educational Accountability at 205-975-5388 or visit the website at www.uab.edu/education/cea.