Online Education and Metrics

Questioner: Anonymous

Posed to: David S. Szymanski, UNF President

Once again UNF scored at the bottom of 11 schools on the Board of Governors (BOG) State University System Performance Metrics, and once again the metrics on which UNF scored the lowest included student retention and student graduation rates. UNF’s bottom ranking means the loss of tens of millions of dollars in performance-based funding. Correctly, the new president has chosen to embrace the metrics and is working to, among other things, increase student retention and reduce the time it takes students to graduate. However, is anyone in the new administration giving thought to the prospect that a major cause of UNF’s low scores in student retention and graduation rate may be UNF’s decision to increase the percent of students in online courses?

Under the metrics, each university is permitted to select one goal on which to be evaluated (the others are chosen by the BOG). Under the past administration, UNF selected the percent of undergraduate students in online courses as its chosen goal. However, a growing body of education research documents that student retention rates are much lower in online courses (50-70% lower), and lower retention rates mean longer times to graduation. It should therefore not be surprising that UNF’s choice to increase online courses as one of its metrics is having an adverse effect on the metrics of retention and time to graduation. Isn’t it time that UNF reevaluate its choice to increase online courses as a metric goal? While online classes are an increasing part of the educational landscape and technology-enhanced teaching should not be avoided, no other Florida school has ever selected online courses as its chosen metric. Although an important decision, UNF’s choice to increase online courses was never a strategic one.
According to one high-level administrator involved with the decision, the online course metric was chosen because it was considered “low-hanging fruit” that UNF could accomplish quickly to improve its metric standing. The decision was also questionable in the face of UNF’s “high-touch” brand of “no one like you and no place like UNF.” Is anyone in the new administration giving thought to the apparent contradiction between UNF’s choice to increase online courses and the consequences of that choice for student retention rates and time to graduation? Has research been conducted to check whether student dropout rates at UNF are higher in online courses, as the published research documents? Has research been conducted to determine whether UNF’s low retention rates and longer time to graduation are adversely affected by students taking online courses at UNF?

Response from the Floor by President Szymanski

I do not have an answer to all of these questions; however, we are strategically pursuing answers to these questions through our Institutional Research office.

Committee to Analyze SUS Data

December 1, 2016

Questioner: Anonymous

Posed to: Earle Traynham, Provost & Vice President, Academic Affairs

It has been established that UNF finished at or near the bottom of the SUS [State University System] with regard to the state performance metrics, and that based upon the current metrics our movement upward isn’t likely. Could Academic Affairs appoint a committee of individuals with appropriate expertise to participate in data interpretation, formula creation, and causal versus correlative relationships?

Response from the Floor by FA President Radha Pyati:
A group of faculty worked with Vice Provost Jay Coleman this summer on metrics in various colleges and departments. Their (PowerPoint) presentation will be attached to this response. (Analytics Initiatives by Jay Coleman)

10-Step Plan to Improve UNF Ranking

October 13, 2016

Questioner: Anonymous

Posed to: John Delaney, President, University of North Florida and Earle Traynham, Provost & Vice President, Academic Affairs

What is the 10-step plan to lead UNF to a ranking in the middle of the other state universities?

Response from the Floor by Vice President Tom Serwatka:

The answer to this question depends on what is meant by the middle of the metrics. If one looks at each of the 10 metrics and UNF’s score on each, UNF scores either above the middle or at the middle of each metric. What UNF has not done is receive growth scores under the metrics. For example, when looking at graduation rates and employment rates, we are at or above the center score. The place where we are below center is cost per degree. There is no need for UNF to be ashamed of our performance on the metrics. Please see the attached addendum submitted by Dr. Serwatka regarding the metric scores.

UNF Pride and SUS Rankings

October 13, 2016

Questioner: Anonymous

Posed to: John Delaney, President, University of North Florida

The President recently put out a note that encourages us to brag about UNF; this is of course much easier done if those to whom we brag do not then remind us of our position vis-à-vis the state metrics. Can the President assure us that we will not continue to be at the bottom of the rankings next year or in subsequent years?


At the October 6 meeting of the Faculty Association, I promised to send a copy of the Excellence Points earned by each university on the performance funding metrics this past year. This report follows below.

As can be seen in this chart, UNF falls among the three lowest-performing schools on only one metric: cost per degree. On two metrics, we score as the fourth best performing (percent of graduates employed and lowest percentage of graduates with excess hours). On one, we score in fifth place (median wages for new graduates). On graduation rates, we score in seventh place, and we score in eighth place on four metrics (second-year retention; bachelors’ and masters’ degrees earned in strategic areas of emphasis; and percentage of students with Pell grants). All schools tied for number one on the metric their board of trustees selected.

The only reason we fell into the overall bottom three is that we didn’t score improvement points, not our placement in Excellence Points.

SUS Excellence Points Chart

Tom Serwatka

Ratio of Faculty to Administrators

Questioner: Anonymous

Posed to: John Delaney, President, University of North Florida

“A recent widely published study indicates that 3 to 1 is the ideal ratio of faculty to administrators at a university. Data on the SUS website indicate UNF’s ratio is 2 to 1, if we include non-tenure-track faculty, and that our ratio is worse than many others in the SUS. What can UNF do to improve these numbers, operate more efficiently, and reduce the administrative bloat?”

Written response from Dr. Thomas Serwatka, Vice President & Chief of Staff to the President:

Unfortunately, there are two different problems with the data that the anonymous faculty member was using as the basis for this question.


The data on the SUS website were drawn from IPEDS data submitted by each institution. In most cases, the data we are required to send to IPEDS are very standardized and precisely prescribed. There are, however, some elements that require university judgment.


The IPEDS directions for this data element state that managerial-level people should be included in the executive/administrative category if the individuals spend at least 80 percent of their time on administrative tasks, as opposed to providing direct service. When UNF enters its data, we do so based on title and do not measure the amount of time spent on direct services. Our submission includes directors and assistant directors while other institutions’ do not. In some cases, our directors are appropriately included in this category: for example, the director of the Florida Institute of Education. In many other instances, the individuals who hold the title of director spend considerably more than 20 percent of their time providing direct services. And it is highly unlikely that any of our assistant directors even comes close to the 80 percent criterion, yet we include them under this code. We need to work on our system of reporting to make it more consistent with the definition and/or the practices across the state.


One other reason for the significant differences you see in the tables presented by the SUS may be that UNF includes its auxiliary services employees in this count. We have reason to believe that some other institutions are counting only E&G-funded positions.


Before any meaningful comparisons across institutions can be made, we need to standardize coding for this element. Our Office of Institutional Research has been asked to see if the system can address these discrepancies in reporting data. When we get this resolved, it is likely that our data will more accurately reflect the national norms.