An Enabling Environment

Report
Johann Mouton, CREST
HESA Conference on Research and Innovation
4 April 2012


Given the different institutional histories, missions and capacities, a high degree of differentiation in terms of key research production dimensions is only to be expected.
The differentiation constructs and associated indicators presented and discussed here are not independent of each other (in statistical terms there are multiple "interaction effects").
We still need a proper conceptualisation of the notion of research differentiation. As a first attempt, I distinguish the following six types or categories of differentiation in terms of:
 Volume of research production
 Shape of research production (differences in the distribution of output by scientific field)
 Site of publication (comparative journal indexes)
 Research collaboration
 Research impact (high or low visibility or recognition)
 Demographics (differences in the distribution of output by gender, race, qualification and age)
Dimension             Indicators
Volume                Absolute nr of papers in peer-reviewed journals (institutional level);
                      normalized output (nr of papers in peer-reviewed journals divided by
                      the size of the permanent academic staff; institutional level)
Shape                 Total nr of papers by scientific domain/field
Site of publication   Total nr of papers by journal index (ISI, ISI-SA, IBSS, SA, Scopus)
Collaboration         Nr of single-institution papers; nr of nationally co-authored papers;
                      nr of internationally co-authored papers
Impact                Journal-normalized citation score (institutional level);
                      field-normalized citation score (institutional level)
Demographics          Nr of papers by demographic category (gender, race, age intervals,
                      highest qualification)
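The two "Volume" indicators above can be sketched in a few lines. This is a minimal illustration; the institutions and figures are made up, not the actual CREST data.

```python
# Sketch of the two "Volume" indicators: absolute output and output
# normalized by the size of the permanent academic staff.
# All names and numbers below are illustrative.

papers = {"Univ A": 1250, "Univ B": 320}          # papers in peer-reviewed journals
permanent_staff = {"Univ A": 980, "Univ B": 640}  # headcount of permanent academics

def normalized_output(papers, staff):
    """Nr of papers divided by permanent academic staff (institutional level)."""
    return {u: round(papers[u] / staff[u], 2) for u in papers}

print(normalized_output(papers, permanent_staff))
# {'Univ A': 1.28, 'Univ B': 0.5}
```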
Since the introduction of a national research subsidy scheme in 1987, university research production initially remained quite stable (ranging between 5 000 and 5 500 article units between 1988 and 2003), BUT then increased dramatically to more than 8 000 units in 2010. The best explanation for this dramatic increase is the new research funding framework introduced in 2003 (which came into effect in 2005): it attached a much more significant financial reward to research units and clearly gave institutions a huge incentive to increase their output.
[Chart: annual article units produced by SA universities, 1988–2010]
But the increase in absolute output in recent years has not affected the institutional distribution. The huge differences between the most productive and least productive universities that were evident 25 years ago have remained mostly unchanged. A few universities have managed to improve their position in the ranking (UWC is a good example), but the vast inequalities in knowledge production between the top and bottom universities have not disappeared.
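"Mostly unchanged" can be quantified with a Spearman rank correlation between the institutional output rankings of two periods (rho = 1 means an identical ordering). The output figures below are illustrative, not the actual subsidy data.

```python
# Spearman rank correlation between two periods' output rankings.
# The figures are illustrative, not the actual subsidy data.

def spearman(xs, ys):
    """Spearman rho via 1 - 6*sum(d^2)/(n(n^2-1)); assumes no tied values."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
        r = [0] * len(values)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

output_1988 = [900, 750, 600, 200, 50]      # article units, five universities
output_2010 = [2400, 1900, 2000, 450, 300]  # same universities, later period

print(round(spearman(output_1988, output_2010), 2))  # 0.9: near-identical ranking
```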
INSTITUTION                                 Total nr of Research        Column %
                                            Publication Equivalents
University of Pretoria                            17981.52                14.6%
University of KwaZulu-Natal                       16246.66                13.2%
University of Cape Town                           15757.88                12.8%
University of the Witwatersrand                   15595.61                12.6%
University of Stellenbosch                        13251.65                10.7%
University of South Africa                         8786.84                 7.1%
University of the Free State                       6938.24                 5.6%
University of Johannesburg                         6264.10                 5.1%
North-West University                              5268.50                 4.3%
Rhodes University                                  4306.78                 3.5%
University of the Western Cape                     2895.35                 2.3%
Nelson Mandela Metropolitan University             2742.50                 2.2%
University of Limpopo                              2017.29                 0.8%
Tshwane University of Technology                   1037.38                 0.8%
University of Zululand                             1017.38                 0.8%
University of Fort Hare                             939.04                 0.8%
Cape Peninsula University of Technology             695.07                 0.6%
Walter Sisulu University                            465.26                 0.6%
Durban University of Technology                     415.65                 0.3%
Central University of Technology                    317.28                 0.3%
University of Venda                                 225.81                 0.2%
Vaal University of Technology                       177.72                 0.1%
Mangosuthu Technikon                                 35.21                 0.0%
THE TOP FIVE

INSTITUTION                                 Total nr of Research        Column %
                                            Publication Equivalents
University of Pretoria                            17981.52                14.6%
University of KwaZulu-Natal                       16246.66                13.2%
University of Cape Town                           15757.88                12.8%
University of the Witwatersrand                   15595.61                12.6%
University of Stellenbosch                        13251.65                10.7%
Total                                             78833.32                63.9%

Rule: universities producing more than 10% of total university output
INSTITUTION                                 Total nr of Research        Column %
                                            Publication Equivalents
University of South Africa                         8786.84                 7.1%
University of the Free State                       6938.24                 5.6%
University of Johannesburg                         6264.10                 5.1%
North-West University                              5268.50                 4.3%
Rhodes University                                  4306.78                 3.5%
University of the Western Cape                     2895.35                 2.3%
Nelson Mandela Metropolitan University             2742.50                 2.2%
Total                                             37202.31                30.2%

Rule: universities producing at least 1% of total sector output
INSTITUTION                                 Total nr of Research        Column %
                                            Publication Equivalents
University of Limpopo                              2017.29                 0.8%
Tshwane University of Technology                   1037.38                 0.8%
University of Zululand                             1017.38                 0.8%
University of Fort Hare                             939.04                 0.8%
Cape Peninsula University of Technology             695.07                 0.6%
Walter Sisulu University                            465.26                 0.6%
Durban University of Technology                     415.65                 0.3%
Central University of Technology                    317.28                 0.3%
University of Venda                                 225.81                 0.2%
Vaal University of Technology                       177.72                 0.1%
Mangosuthu Technikon                                 35.21                 0.0%
Total                                              7343.09                 5.3%
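The grouping in the tables above follows a simple threshold rule on each university's share ("Column %") of total sector output. A sketch of that rule:

```python
# Threshold rule for grouping universities by % share of total output,
# as stated in the "Rule:" lines of the tables.

def classify(share_pct):
    """Assign a university to a group by its % share of total output."""
    if share_pct > 10.0:
        return "top"     # top five: more than 10% of total university output
    if share_pct >= 1.0:
        return "middle"  # middle seven: at least 1% of total sector output
    return "bottom"      # bottom eleven: under 1% of total sector output

shares = {"Pretoria": 14.6, "UNISA": 7.1, "Fort Hare": 0.8}
print({u: classify(s) for u, s in shares.items()})
```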
[Chart: % share of total university output by institutional group and period]

                 1990 - 1994   1995 - 1999   2000 - 2005   2005 - 2009   Total period
Top five            66.7          63.1          65.5          61.4          63.9
Middle seven        30.0          30.6          28.2          31.5          30.2
Bottom eleven        3.3           6.2           6.6           7.0           5.3
[Chart: research publication units in 1993, 1998, 2003 and 2008]
The statistics presented thus far on institutional output refer only to absolute output and have not been normalized for the size (i.e. academic capacity) of institutions. The following two graphs first present the rankings in terms of research output normalized for the number of permanent staff, and then the rankings in terms of knowledge output (Masters and Doctoral graduates included), also normalized for the size of the academic staff. A comparison of the two rankings reveals some interesting shifts (most notably for NMMU, UNISA and some of the UoTs), but the overall difference in normalized output between the top and the bottom universities remains huge.
      University   Headcount of        Research Publication     Per capita
                   permanent staff     Units accrued            output
 1.   UCT                  982              1 253.03               1.28
 2.   US                   917              1 034.70               1.13
 3.   RU                   321                325.33               1.01
 4.   WITS                 997                936.14               0.94
 5.   UKZN               1 403              1 146.51               0.82
 6.   UP                 1 676              1 187.46               0.71
 7.   UJ                   884                610.90               0.69
 8.   UFS                  795                496.49               0.62
 9.   NWU                1 086                585.94               0.54
10.   UWC                  509                266.82               0.52
11.   UNISA              1 404                734.60               0.52
12.   UFH                  292                142.22               0.49
13.   NMMU                 574                255.51               0.45
14.   UZ                   253                 66.66               0.26
15.   UV                   321                 76.76               0.24
16.   TUT                  820                188.06               0.23
17.   CPUT                 749                155.26               0.21
18.   CUT                  260                 39.56               0.15
19.   VUT                  322                 44.73               0.14
20.   UL                   770                 93.25               0.12
21.   WSU                  608                 51.85               0.09
22.   DUT                  574                 48.45               0.08
23.   MUT                  152                  7.57               0.05
Rank      University           Average annual          Average annual
                               weighted output         normed output
                               2007 – 2009             for 2007 – 2009
 1 (2)    STELLENBOSCH              1833                    177
 2 (1)    CAPE TOWN                 1926                    166
 3 (3)    RHODES                     550                    140
 4 (4)    WITWATERSRAND             1609                    131
 5 (6)    PRETORIA                  2216                    110
 6 (7)    JOHANNESBURG               847                    107
 7 (5)    KWAZULU-NATAL             1768                    103
 8 (13)   NELSON MANDELA             482                     99
 9 (9)    NORTH WEST                1110                     94
10 (8)    FREE STATE                 898                     94
11 (10)   WESTERN CAPE               505                     82
12 (16)   TSHWANE UT                 277                     75
13 (18)   CENTRAL UT                  74                     61
14 (11)   SOUTH AFRICA               938                     60
15 (17)   CAPE PENINSULA UT          184                     60
16 (12)   FORT HARE                  199                     53
17 (14)   ZULULAND                   146                     49
18 (19)   VAAL UT                     43                     33
19 (22)   DURBAN UT                   82                     30
20 (20)   LIMPOPO                    243                     25
21 (15)   VENDA                       80                     24
22 (22)   WALTER SISULU               24                      6
23 (23)   MANGOSUTHU                   2                      4
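In the "Rank" column above, an entry such as "8 (13)" appears to give the rank on normed output with the previous per capita publication rank in brackets, matching the ranking shifts discussed earlier. A small parser for the movement between the two rankings (rows copied from the table; the interpretation of the bracketed number is my reading of the slide):

```python
import re

# Parse rows like "8 (13) NELSON MANDELA": normed rank 8, previous rank 13.
# Positive shift = the university climbed the ranking.

def rank_shift(row):
    """Return (university, places moved between the two rankings)."""
    m = re.match(r"(\d+)\s+\((\d+)\)\s+(.+)", row)
    new, old, name = int(m.group(1)), int(m.group(2)), m.group(3)
    return name, old - new

for row in ["1 (2) STELLENBOSCH", "8 (13) NELSON MANDELA", "14 (11) SOUTH AFRICA"]:
    name, shift = rank_shift(row)
    print(f"{name}: {shift:+d}")  # NELSON MANDELA +5, SOUTH AFRICA -3
```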
SA universities vary hugely in terms of the "shape" of their knowledge production. The big differences in the scientific field profiles of the different universities are clearly a function of institutional histories (e.g. having a medical school or a faculty of theology) and institutional missions (research-intensive universities versus more teaching-oriented universities and ex-technikons).
[Chart: distribution of research output by field (Engineering, Natural sciences, Health Sciences, SSH) for UNISA, UWC, NWU, UFH, SU, UFS, UP, UKZN, WITS and UCT]
The distribution of research output by journal index (ISI, IBSS and "SA") varies hugely. The differences between the universities on this dimension are mainly a function of the shape of knowledge production at each university, but clearly also of other factors such as institutional histories, language of publishing, and so on. One of the immediate results of these differences is their impact on university rankings.
[Chart: distribution of research output by journal index for NMMU, UNISA, NWU, UWC, Rhodes, UFS, UP, SU, UFH, UKZN, WITS and UCT]
University research output has become significantly more "international" and "collaborative" over the past 10 to 15 years. South African academics collaborate much more than before; in the post-apartheid, post-sanctions period this was always to be expected. But we also collaborate more in fields (such as infectious diseases) where international teams attract huge international funding. Interestingly, there is nothing in the funding framework that actively encourages collaborative research; on the contrary. But one has to add immediately that this "negative" feature of the framework is offset by the positive effects of collaborative publishing, as demonstrated in higher citations and greater visibility.
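The tables that follow label each paper SI (single-institution), NC (nationally co-authored) or IC (internationally co-authored). A minimal sketch of that classification from author affiliations; the affiliation format here is an assumption, not the actual CREST coding scheme.

```python
# Classify a paper as SI / NC / IC from its authors' affiliations.
# The (institution, country) format is an assumed, illustrative input.

def collaboration_type(affiliations):
    """affiliations: (institution, country) pairs for one paper's authors."""
    institutions = {inst for inst, _ in affiliations}
    countries = {country for _, country in affiliations}
    if len(countries) > 1:
        return "IC"   # at least one foreign co-author
    if len(institutions) > 1:
        return "NC"   # several institutions within one country
    return "SI"       # all authors from a single institution

paper = [("UWC", "ZA"), ("UCT", "ZA"), ("Leiden", "NL")]
print(collaboration_type(paper))  # IC
```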
Collaboration profiles by publication year, 1996–2007 (SI = single-institution papers, NC = nationally co-authored, IC = internationally co-authored):

UWC        96   97   98   99   00   01   02   03   04   05   06   07   Total
SI         22   27   28   29   23   31   33   36   27   26   41   39     381
NC         14   19   25   31   14   19   27   18   19   30   38   35     295
IC         19   14   28   25   20   37   36   48   53   66   94   82     536

UCT        96   97   98   99   00   01   02   03   04   05   06   07
SI        316  277  274  292  242  261  259  233  224  276  289  285
NC        195  181  171  168  160  157  175  163  212  233  262  246
IC        176  214  196  265  260  299  286  324  388  462  546  588

NWU        96   97   98   99   00   01   02   03   04   05   06   07   Total
SI         30   31   27   27   43   32   43   31   40   44   46   57   484 (33.2%)
NC         26   26   34   31   20   23   32   32   23   39   38   31   379 (26.0%)
IC         21   39   31   28   34   21   37   63   64   91   72   74   593 (40.7%)

UP         96   97   98   99   00   01   02   03   04   05   06   07   Total
SI        220  190  186  190  204  197  199  229  224  255  260  251   2781 (39.7%)
NC        111  119  144  135  154  145  184  160  156  196  221  196   2015 (28.8%)
IC         74   83  109  121  145  166  179  205  204  268  287  291   2202 (31.5%)

UNISA      96   97   98   99   00   01   02   03   04   05   06   07   Total
SI         38   36   37   36   31   27   31   28   30   27   37   47   399 (49.1%)
NC         17   15   14   12   12   10   10   14    8   13   23   22   328 (20.2%)
IC         20   27   34   23   23   20   12   15   27   28   16   14   266 (30.7%)

Total:   SI 3529    NC 2522    IC 4168
Col %:     31.4%      24.3%      44.2%
Col %:     34.5%      24.7%      40.8%
The impact of SA’s research production has
increased significantly over the past 15 years –
mostly because of collaborative publishing (in
high-impact journals) – and possibly also
because of increased research in highly visible
research areas. This is true at the country level,
but with very different impact levels at the
institutional level.
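The citation charts that follow use field-normalized scores of the MNCS kind: each paper's citations are divided by the world average for its field (and year), and the ratios are averaged, so a score of 1.0 means impact at the world average. A minimal sketch; the paper records and baselines below are made up.

```python
# Mean normalized citation score (MNCS-style) over a small made-up paper set.

papers = [
    {"citations": 12, "field_baseline": 8.0},  # cited above its field's average
    {"citations": 3,  "field_baseline": 6.0},  # cited below its field's average
    {"citations": 5,  "field_baseline": 5.0},  # exactly at the field average
]

def mncs(papers):
    """Average of citations / field-and-year world average, per paper."""
    ratios = [p["citations"] / p["field_baseline"] for p in papers]
    return sum(ratios) / len(ratios)

print(mncs(papers))  # 1.0: exactly world-average impact for this set
```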
[Chart: SA annual publication output, 2001–2010]
Source: Robert Tijssen (CWTS, Leiden University, Netherlands); CWTS WoS database
[Chart: MNCS and deficit/surplus of top 20%, top 10% and top 1% highly cited papers, rolling periods 2000-2003 to 2007-2010]
Source: Robert Tijssen (CWTS, Leiden University, Netherlands); CWTS WoS database
[Chart: CPP/FCSm for Univ. Cape Town, Stellenbosch Univ., Univ. Witwatersrand, Univ. KwaZulu-Natal and Univ. Pretoria, rolling periods 95-98 to 04-07]

[Chart: CPP/FCSm for North-West Univ., Univ. Johannesburg, Rhodes Univ., Univ. Fort Hare, Univ. Western Cape, Nelson Mandela Metropolitan Univ., Univ. Free State and Univ. Limpopo, rolling periods 99-02 to 04-07]
[Chart: distribution of papers by normalized citation-score band (<0.50, 0.50-0.79, 0.80-1.19, 1.20-1.49, 1.50-1.99, 2.00-5.00) for UCT, SU, UW, UKZN, UP, NWU, UWC, UJ and NMMU]


International trends: the "demands" created by international rankings
National steering instruments: the revised funding scheme, the expanded SA presence in ISI, and the NRF rating system, which have led to:
 an increase in research output
 an increase in ISI production
Institutional capacities (Merton and cumulative advantage theory)
Institutional histories and structures
Institutional strategies (overleaf)
We have seen how the institutional differences in research-productive capacity have remained pretty much unchanged for the past 20 years. But how have the most productive universities (the top 5 to 7) managed to increase their absolute output so much more than some of the weakest institutions? How have some universities managed to increase their international visibility and impact so much more significantly than others?
There are at least two plausible (and complementary) explanations; both relate to the human capital base. The first is evidence showing that the top universities are not necessarily more productive at the individual level: they simply manage to broaden the active research base within the institution (cf. next slide). The second is the very persuasive evidence of a very strong correlation between the proportion of staff with doctorates and per capita research output (cf. following slide).
Total articles per year and number of authors responsible for those articles, WITS, UCT and UKZN (– = not available):

Publication   Total WITS   Nr of WITS   Total UCT   Nr of UCT   Total UKZN   Nr of UKZN
year          articles     authors      articles    authors     articles     authors
1990             895          939          823         973         557          591
1991            1160          975          880         886         554          537
1992            1164          974          892         926         573          609
1993            1177          958          960         897         615          608
1994            1270          984          917         953         577          584
1995            1204          907          930         939         636          661
1996            1090          983          974         923         660          621
1997            1106          966          842         950         748          708
1998            1026          974         1047         902         651          653
1999            1127          961         1070         992         721          702
2000            1120          750         1166         926         735          688
2001            1078          809         1242         981         744          716
2002            1077          932            –        1018         799          745
2003             947          898            –         851         985          725
2004             911          870            –        1186        1023          717
2005             927          877            –        1155        1140          726
2006             901          945            –        1211        1086          632
2007             932          939            –        1205           –            –
Percentile breakdown       Nr of authors                Mean nr of article equivalents
of authors                 WITS     UCT    UKZN         WITS      UCT     UKZN
91-100% (top 10%)           462     652     411        13.66    11.41    13.61
71-90%                      939    1302     823         3.38     2.40     3.22
51-70%                      931    1302     823         1.26     0.86     1.13
31-50%                      937    1302     823         0.59     0.43     0.56
11-30%                      857    1302     823         0.32     0.25     0.36
1-10% (bottom 10%)          542     652     411         0.16     0.12     0.19
Total                      4668    6512    4114         2.48     1.95     2.43
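The percentile breakdown above sorts an institution's authors by article equivalents and reports the mean output within each band. A sketch using the same bands; the author outputs below are illustrative, not the WITS/UCT/UKZN data.

```python
# Mean article equivalents per percentile band of authors, using the same
# bands as the table (1-10%, 11-30%, ..., 91-100%). Outputs are illustrative.

BANDS = [(0, 10), (10, 30), (30, 50), (50, 70), (70, 90), (90, 100)]

def band_means(outputs, bands=BANDS):
    """Mean article equivalents per percentile band (bottom band first)."""
    ranked = sorted(outputs)  # ascending: least productive authors first
    n = len(ranked)
    means = {}
    for lo, hi in bands:
        segment = ranked[lo * n // 100 : hi * n // 100]
        means[f"{lo + 1}-{hi}%"] = sum(segment) / len(segment)
    return means

outputs = [0.1, 0.2, 0.3, 0.5, 0.6, 0.9, 1.2, 2.5, 4.0, 14.0]
print(band_means(outputs))  # the top decile dominates, as in the table above
```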
[Scatter plot: average publications in accredited journals per permanent academic staff member (2009) versus % of permanent academic staff with a PhD (2008), for UCT, SU, RU, WITS, UKZN, UP, UNISA, UJ, NMMU, UFH, UFS, UWC, NWU, UZ, CPUT, TUT, DUT, VUT, UL, WSU, MUT, CUT and UNIVEN]
We undoubtedly have a highly differentiated university sector when assessed in terms of key and relevant indicators.
 Some of the "causes" of these differences reflect the path-dependency of historical factors, missions and structures. Other differences are the result of more recent institutional responses to international and national policies, strategies and incentives.
 I have argued that the trends presented show that there are identifiable enabling mechanisms and drivers of greater productivity and international impact, even within a differentiated system.

