Measuring Societal Interaction

The performance of the two basic tasks of universities is measured and monitored in many ways. University rankings are based on indicators that predominantly measure features of these tasks. What is measured unavoidably, and to some extent unfortunately, becomes more visible in universities' policy discourses. A university leadership that takes the third task seriously should therefore make an effort to measure and monitor the performance of that task.

It is easy to enumerate indicators that measure some aspects of the third mission, and I provide some examples below. One may nevertheless claim that measuring societal interaction is a difficult task whose results can easily be misleading, for a variety of reasons.

First, as described above, the range of the manner and means available for achieving worthwhile interaction is so broad as to be virtually limitless. The definition of the third mission has no clear boundaries, and comprehensive measurement is therefore impossible. However, if universities, as indicated above, have attempted to strategically single out certain forms of interaction as their core third mission activities, finding some indicators for these forms of interaction might prove beneficial. For example, if universities emphasize the particular task of fostering the social responsibility of students (as described in contributions to the IECHE Conference 2013), the impact of community services on the attitudes of students can be measured, and has indeed been measured at many universities. It is important to recognize, however, that such indicators only measure a part, and probably only a minor part, of all the third mission activities universities are engaged in, and want to be engaged in.

Secondly, multidisciplinary universities face an additional challenge because different disciplines tend to interact with society in markedly different ways. As mentioned above, faculties of law, medicine, humanities, and science legitimately have different patterns of societal interaction. Any indicators chosen are almost necessarily more advantageous for some faculties and departments than for others. It is therefore difficult to use such indicators for comparing the third mission performance of different disciplines, or the performance of universities with different ranges of disciplines.

Thirdly, many of the possible indicators are either wanting in reliability or extremely time-consuming to collect. Unlike some research and education indicators, many third mission indicators require dedicated data collection from researchers, or separate studies or questionnaires. Good universities try not to place excessive bureaucratic data-collection obligations on the shoulders of their personnel.

In spite of these considerations, measuring and monitoring of societal interaction take place in numerous universities and in many countries. There are many examples of indicators used as proxies of interaction. For Swedish legislation-drafting purposes I collected the indicators used by Finnish universities to measure third mission activities (as defined by each university). To illustrate the variation that occurs even within one country, I give as an example the indicators chosen by the largest multidisciplinary Finnish universities:

– University of Helsinki: employment ratio of graduates, number of active alumni, and fundraising results.

– Aalto University: stakeholder groups, non-academic funding, employees and turnover of start-ups, and important positions of trust.

– University of Turku: outside participation, open forums, patents, business funding, textbooks, development projects, and important positions of trust.

– University of Oulu: employment of graduates, master's theses produced for external parties, invention notifications, and study credits at the Open University.

– University of Eastern Finland: reputation barometer, external expert assignments, start-ups, strategic partners, and visibility in science news and discussion.

The recent societal interaction barometer of the University of Jyväskylä has often been referred to and contains several items: education services to stakeholders, publications to stakeholders, expert services, stakeholder events, applied research and development activity, student theses for stakeholders, industrial property rights, spin-offs, employment ratio, internships, and infrastructure cooperation.

Such third mission indicators tend to measure the interaction between research or academic expertise and society. Many of them therefore relate to the innovation issue analyzed more closely in Chapter “Excellence in Innovation and Knowledge Economy”. Indicators on the interaction between education and society are less frequent, but they do exist, as the previous examples show. When focusing in particular on the social responsibility of universities to promote certain values and to ensure that students internalize them, comprehensively measuring the complete picture is probably impossible, but the development of students' attitudes can certainly be measured, as mentioned above.

Even though a large variety of indicators of the above type are used, and they have their natural place among the instruments available to university leadership, such measurements only provide a partial picture. They are most useful for monitoring the development over time of particular activities within particular departments. A comprehensive and trustworthy understanding of the role of universities and their components, and a reliable and credible comparison of the performance of those components at a particular point in time, cannot be achieved through the use of indicators alone.

A strategic vision of how universities should fulfill their third mission, and efficient measures to support development toward that vision, require, in addition to the indicator-based, purely quantitative assessment of the societal interaction of universities, more qualitative means of assessment. Only such qualitative assessment, preferably performed at regular intervals, can adequately take into account the varying opportunities, traditions, and applied usefulness of different disciplines. An understanding of the states of affairs under consideration can be reached through regular qualitative assessments without overburdening organizations with the continuous collection of large amounts of data.

From the point of view of international excellence, such broad assessment is of particular importance. As described above, the indicators used to measure third mission activities are often not very effective at pinpointing activities that reflect universities' global responsibilities.
