
1 Scope and scale of the evaluation

1.4 Data and review process

The evaluation draws on a comprehensive set of data. The economics panel based its assessment on the written self-assessments submitted by the institutions and a qualitative assessment of the submitted publications. Bibliometric data from the analysis by DAMVAD Analytics, Denmark, commissioned by the Research Council, together with data on the funding of social science, were used to contextualise and/or confirm the panel’s qualitative evaluation. The panel chair met with the institutions, primarily to supplement and clarify the information provided in the self-assessments.

Building from the bottom, the assessments of individual scientific outputs fed into the evaluations of the research groups and the research area, while the self-assessment reports for the research groups fed into the institutional research evaluation and the assessment of the research area. The self-assessments from the institutions contributed to the assessment of the research area within the institution. The report on personnel and bibliometrics (publications) was considered at the research group level, the institutional level and the national research area level. Societal impact cases were considered at the group and area levels. The research area evaluations were used by the field panels to build a picture of national performance within the research field covered by the panel reports.
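
This bottom-up flow of evidence can be pictured as a simple roll-up from publications to research groups, institutions and the national research area. The Python sketch below is purely illustrative: the unit names and the plain averaging rule are assumptions made for the example, and the panels’ actual judgements were qualitative rather than computed in this way.

    # Illustrative sketch only: a hypothetical roll-up of evaluation evidence.
    # The panels' real assessments were qualitative and not calculated like this.
    from statistics import mean

    # Hypothetical publication scores (1-5) per research group and institution.
    evidence = {
        "Institution A": {"Group 1": [4, 5, 4], "Group 2": [3, 4]},
        "Institution B": {"Group 3": [5, 4, 4]},
    }

    # Publication-level scores feed into group-level views ...
    group_scores = {
        inst: {grp: mean(pubs) for grp, pubs in groups.items()}
        for inst, groups in evidence.items()
    }

    # ... which in turn feed institutional and national research-area views.
    institution_scores = {inst: mean(g.values()) for inst, g in group_scores.items()}
    research_area_score = mean(institution_scores.values())

    print(group_scores)
    print(institution_scores)
    print(round(research_area_score, 2))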

The panels also based their assessment on data on funding and personnel, as well as information from earlier institutional and disciplinary evaluations from the Research Council and policy documents from the Government.

See Appendix H for information on time frames for assessments and bibliometric data.

Institutional self-assessment reports

Reports were submitted by all the research-performing units. They included quantitative and qualitative information at the institutional level (called level 1 in the self-assessment template), and at the level of the disciplines/research areas corresponding to the panels (called level 2 in the self-assessment template).

19

The following were enclosed with the self-assessment report from each unit:

• A list of the 10 most important publications for each research area;

• A list of 10 dissemination activities;

• Societal impact cases for each discipline showing important dissemination and knowledge exchange results (the impact cases were optional);

• An analysis of strengths, weaknesses, opportunities and threats (a SWOT analysis);

• A form (number 2): Target audience for scientific publications;

• A form (number 3): Research matching the priorities set out in the Norwegian Government’s Long-term plan for research and higher education and in other relevant policy documents;

• An overview of study programmes.

The templates for institutional self-assessments and publications are attached to the report as Appendices C and J.

Self-assessment reports for research groups

The institutions were given the opportunity to include research groups in the evaluation. The reviews by the research panels were based on self-assessments and other documentation. The documentation included quantitative data on group members and funding, qualitative information on various aspects of the research activities, and CVs for all the members of the groups. In addition, each group had the option of submitting one copy of a scientific publication for each member included in the evaluation, as well as case studies of the societal impact of their research.

The templates for research groups are attached to the report as Appendices E and K.

Societal impact cases

Reflecting the novel approach of including societal impact in the evaluation (cf. 1.2.1), the institutions were invited to include case studies documenting the broader, non-academic societal impact of their research. Participation was optional.

Bibliometric report

The Research Council of Norway (RCN) commissioned an analysis of publications and personnel dedicated to social science research for the evaluation: https://www.damvad.com/uploads/Publications/Report%20%20Social%20Science%20in%20Norway%20v2.3.pdf.

DAMVAD Analytics conducted the analysis, mainly basing its work on data from the following sources: the Norwegian Centre for Research Data (NSD), the Current Research Information System in Norway (CRIStin) and the National Researcher Register, for which NIFU is responsible. DAMVAD Analytics added bibliometric data from Elsevier’s Scopus database and Google Scholar to enhance the analysis of the internationally published scientific material.

The RCN defined the framework for DAMVAD’s analysis and decided to include the following elements:

• The total scientific output within social science for Norway;

• The institutions involved in social science in Norway;

• The research personnel engaged in social science in Norway.

For an overview of publishing in economics, please see Appendix F: DAMVAD fact sheet for economics.
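
As a rough illustration of how records from a national register such as CRIStin can be combined with data from an international database such as Scopus, the Python sketch below matches records by DOI and attaches citation counts. The field names, records and matching rule are assumptions made for the example; they do not reproduce DAMVAD Analytics’ actual procedure.

    # Illustrative sketch only: merging hypothetical national register records with
    # hypothetical international database records by DOI. Field names are assumed and
    # do not reflect DAMVAD Analytics' actual data or methodology.
    cristin_records = [
        {"doi": "10.1000/abc", "title": "Paper A", "institution": "Institution A"},
        {"doi": "10.1000/def", "title": "Paper B", "institution": "Institution B"},
    ]
    scopus_records = [
        {"doi": "10.1000/abc", "citations": 12},
    ]

    # Index the international records by DOI for quick lookup.
    by_doi = {rec["doi"]: rec for rec in scopus_records}

    # Enrich each national record with citation data where a match exists.
    merged = [
        {**rec, "citations": by_doi.get(rec["doi"], {}).get("citations")}
        for rec in cristin_records
    ]

    for rec in merged:
        print(rec["title"], rec["institution"], rec["citations"])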

Funding data

Data and information on financial resources and funding (cf. 2.2) are based on:

• Report on Science and Technology Indicators for Norway (Norges forskningsråd, Det norske forsknings- og innovasjonssystemet – statistikk og indikatorer, Norges forskningsråd, Lysaker, 2016) [in Norwegian]; https://www.forskningsradet.no/prognett-indikatorrapporten/Home_page/1224698172612

• NIFU, Norwegian Research and Development (R&D) statistics and indicators, https://www.nifu.no/en/statistics-indicators/nokkeltall/

• The Research Council of Norway, The Project Databank, https://www.forskningsradet.no/prosjektbanken/#/Sprak=en

• The Research Council of Norway, Social sciences research in Norway 2010–2016: Funding streams and funding instruments. Report submitted to the principal committee for the Research Council’s evaluation of the social sciences (SAMEVAL); unpublished report for internal use by the SAMEVAL evaluators (cf. page 1, first section), undated (2017), 11 pages.

In addition, section 2.2 draws on:

• The Research Council of Norway, Report on Science and Technology Indicators for Norway 2017, The Research Council of Norway, Lysaker, 2017; https://www.forskningsradet.no/prognett-indikatorrapporten/Science_and_Technology_2017/1254031943643

Other relevant publications provided by the Research Council

Earlier evaluations commissioned by the Research Council

• Relevant disciplinary evaluations (please see the reference list for details)

• The Research Council of Norway: Evaluation of the Humanities in Norway. Reports from the panels and the principal evaluation committee.

• Evaluation of the Social Science Institutes. Panel Report, January 2017, the Research Council of Norway, Lysaker.

National plans and strategies for research policy

• The Research Council of Norway, Research for Innovation and Sustainability. Strategy for the Research Council of Norway 2015–2020, 2015.

• Kunnskapsdepartementet, Meld. St. 7 (2014–2015), Langtidsplan for forskning og høyere utdanning 2015–2024 [The Royal Norwegian Ministry of Education and Research, Long-term plan for research and higher education 2015–2024], 2015 [in Norwegian].

Official reports on the status of higher education

• Kunnskapsdepartementet, Meld. St. 18 (2014–2015). Melding til Stortinget. Konsentrasjon for kvalitet. Strukturreform i universitets- og høyskolesektoren, 2015 [White paper no. 18 (2014–2015), Concentration for quality. Structural reform in the university and university college sector, The Royal Norwegian Ministry of Education and Research, Oslo, 2015] [in Norwegian].

• Kunnskapsdepartementet, Tilstandsrapport for høyere utdanning 2017, Rapport, 2017 [The Royal Norwegian Ministry of Education and Research, Status Report for Higher Education 2017, Report, 2017] [in Norwegian].


1.4.1 Process and assessment tools

The Research Council set up a SharePoint site (a Microsoft Office 365 application), where all background material and other data and documents were deposited. The panel shared files and work in progress in SharePoint.

The Research Council commissioned the Nordic Institute for Studies in Innovation, Research and Education (NIFU), Oslo, Norway, to provide scientific and project management support to the panels.

Research Professor Vera Schwach acted as scientific secretary for the economics panel.

Panel meetings and work

The economics panel held three one-day meetings: in May and September 2017, and in January 2018.

In addition, the panel chair of economics joined the other panel chairs for two one-day panel chair meetings, held in April and September 2017.

The chair carried out the interviews with the 18 institutions on behalf of the panel over four days in late October 2017 (see the section ‘Meetings with the institutions’ below). The scientific secretary wrote minutes of the interviews. Between meetings, the panel members kept in contact by email.

The panel divided the assessments and the writing among its members. The secretariat took the main responsibility for providing the fact sheets, as well as for chapters one and two of the report.

Assessment tools

To ensure that all the dimensions were covered and that the evaluation was uniform across the six research areas, the secretariat at NIFU provided the panels with assessment tools.

These were:

• A template for research and scientific quality: numerical grading, see Table 1 below;

• A template for assessments of the units: institutions and research groups, see Appendix I;

• A template for assessment of the ten most important publications listed by the institutions, see Appendix J;

• A template for assessment of the publications of listed members of research groups, see Appendix K.

The panels used the following description as the basis for their scoring of scientific quality.

Table 1 Scientific quality, numerical scale

Scale and criteria:

5 – Excellent: Original research at the international forefront. The unit has a very high productivity. The unit [the institution/research group] undertakes excellent, original research and publishes it in outstanding international channels for scientific and scholarly publications. Its researchers regularly present ongoing research at recognised international scientific conferences.

4 – Very good: Research with a high degree of originality and a scientific profile with a high proportion of publications in high-quality channels for scientific and scholarly publications. The unit has a high productivity. The researchers regularly participate in international scientific conferences. The research is highly relevant to knowledge production in the field internationally.

3 – Good: Research of a good international standard. The unit has an acceptable productivity and contributes to the development of its field. The researchers participate in scientific conferences.

2 – Fair: Research of an acceptable but moderate standard. The productivity of the unit is modest, with few original contributions to the field internationally.

1 – Weak: Research of insufficient quality and with a meagre scientific publication profile. The productivity is low.
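
For illustration only, the numerical scale in Table 1 can be represented as a simple lookup table, as in the Python sketch below. The structure and the helper function are assumptions made for the example and were not part of the panels’ assessment tools.

    # Illustrative sketch only: the 1-5 scientific quality scale as a lookup table.
    # The helper function is an assumed convenience, not part of the evaluation tooling.
    QUALITY_SCALE = {
        5: "Excellent",
        4: "Very good",
        3: "Good",
        2: "Fair",
        1: "Weak",
    }

    def quality_label(score: int) -> str:
        """Return the verbal label for a numerical quality score (1-5)."""
        if score not in QUALITY_SCALE:
            raise ValueError(f"score must be between 1 and 5, got {score}")
        return QUALITY_SCALE[score]

    print(quality_label(4))  # Very good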

Meetings with the institutions

The panels supplemented the written documentation and data with information provided by the institutions in interviews. The meetings took place at the Hotel Park Inn, within walking distance of Oslo Airport, Gardermoen. The six panel chairs conducted the interviews. Each institution was interviewed individually. The panels had prepared the questions beforehand and sent the lists to the institutions two weeks in advance. The lists contained both general and panel-specific questions. The interviews allowed for elaboration and discussion of issues of importance to the panels’ assessments.

The panels’ secretaries took extensive minutes of the meetings. The minutes were shared with all panel members.

Fact checking by institutions

The institutions were given the opportunity to fact-check the assessment texts after the panels’ assessments were completed. The check did not include the grades or final evaluations, as the institutions were asked only to correct any factual errors. New or updated information was not included.