The metric wave


(1)

The metric wave

SPARC Europe Member’s meeting

LIBER 2016, Helsinki, June 29th

(2)
(3)

Metrics are much used

• Impact

– Prestige
– Citations
– IF
– H-index
– Rankings

• Evaluation

– Often uses metrics as a “stand-in” or proxy for research quality
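Of the metrics listed above, the h-index has a simple definition: the largest h such that at least h of an author's papers have at least h citations each. A minimal sketch, assuming only a list of per-paper citation counts (the numbers below are made up for illustration):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Note that the result depends entirely on which citation database supplies the counts, which is part of the problem described on the next slide.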

(4)

Overlooks

• The many problems of metrics, e.g.

– Citations measure only science’s own re-use of science

– A highly skewed and small sample of sources

Google H-index ≥ Scopus H-index ≥ WoS H-index

– Much manipulated

By authors, journals and publishers

– Cannot always be accurately reproduced
– Faulty mathematics

IF is an average in an extremely skewed distribution
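Why an average misleads here can be shown in a few lines. A minimal sketch with made-up citation counts for ten articles in one hypothetical journal (real citation distributions are similarly skewed: a few highly cited papers, many barely cited ones):

```python
from statistics import mean, median

# Made-up per-article citation counts; one outlier dominates.
citations = [0, 0, 1, 1, 2, 2, 3, 4, 10, 120]

print(mean(citations))    # 14.3 -- the "IF-style" average
print(median(citations))  # 2.0  -- what a typical article actually gets
```

The IF-style average (14.3) describes almost none of the articles; the typical article sits near the median (2).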

(5)

Results in

• Overhiring, promoting and financing mediocre research

• Overlooks excellent research

• Metrinecrosis

– The slow death of science by metrics poisoning

Image: D. A. Warrell [CC BY 2.5 (http://creativecommons.org/licenses/by/2.5)], via Wikimedia Commons

(6)

And in

• Richer publishers

– They own the high-ranking journals

• Poorer science

– Science pays the publishers’ profits
– TA (toll access) makes science work less efficiently

• Literature access costlier ⇒ access to less literature

• Science less valuable for society

(7)

So what do we tell the researchers?

• The other way lies TA

– A job, promotion and tenure

– Research financing
– Respect from colleagues
– Reality …

• One way lies OA

– Solidarity with poorer researchers

– Making a better society and better science

– Fulfilling the function of science

– The right thing to do!

(8)

Time to mend our ways!

• Research evaluation means that!

– Evaluating the research

• That’s hard work

– Not finding out where it was published

• Anyone could do that …

• We need to change all kinds of evaluation processes into actual evaluation, not a looking-up of arbitrary numbers!

(9)

Impact factor vs actual citations

• Studied for one author over 17 years

– 70 articles

• Correlation between IF and actual citations was 0.016

«As responsible scientists we should insist on the same quality standards for scientific evaluation as we require of the scientific work itself.»

Seglen, P.O. 1989. «From bad to worse: evaluation by Journal Impact». Trends in Biochemical Sciences 14(8), 326–327. http://doi.org/10.1016/0968-0004(89)90163-1

(10)

Evaluation

• Evaluation means assessing the value of content

• Evaluation can be informed, but not replaced, by various metrics

– Not the IF; it is not a content or author metric

– And content quality causes IF, IF does not cause quality

• Alternative metrics for wider impact and societal interest

– An evolving field

• DORA – The San Francisco Declaration on Research Assessment

http://www.ascb.org/dora/

(11)

Thank you for listening

I’ll be even happier if you actually do something about how you evaluate!

Jan Erik Frantsvåg
SPARC Europe Chair
Open Access Adviser, UiT The Arctic University of Norway
jan.e.frantsvag@uit.no

(12)

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License

http://creativecommons.org/licenses/by/4.0/
