
A Comment on Peer Review Evaluation

I don't have as well prepared a critique as my colleague, but I do have some remarks. First of all, nothing quite like what Lars Gidefeldt has presented is going on in the US. The Swedish Natural Science Research Council is doing a much more intensive evaluation of a small number of projects than is possible in the US. If you consider how large the US is, and how many projects and universities there are in a given field, it would be quite difficult to cover everything like that.

I did mention that the Academy of Sciences and the Academy of Engineering did some particular analyses of the status of science in the same fields, looking at where the advances were being made and where they thought things might be going, but they did this without assessing particular individuals or groups.

As I mentioned before, I have just come from consulting with the Hungarian Government. They are setting up their research fund. They had one under their Communist regime; it started about five years ago under their Academy of Sciences, which follows the Eastern model of governmentally controlled institutes and so forth, and which made grants for research. They have now made this an independent agency. Their parliament has disassociated it from their Academy of Sciences, and I was asked to come in and help them establish their procedures and work through a lot of issues and questions.

The money allocation issue is a very difficult one. You said it hasn't changed much here. It is very hard to change in any area. It is especially hard when you are starting out in a country where they don't really know where to begin in making allocations among the programs. But, with Dr. Gidefeldt's permission, I am going to send his report on to the head of that agency, because I think it would be very useful for them to have outside reviewers come in and assess their strengths and weaknesses in the various fields. They have some of the problems of being a small country, with high-quality science in many areas. But they have immense problems in terms of the universities versus the Academy, and the established people versus those who had not been established, and so forth. And so in setting up their system they are very hesitant to critique or review proposals. Everybody is very tentative about being too critical, so perhaps teams that could come in from the outside and look at particular areas could be useful in giving them a better base.

Secondly, I do want to mention something along these lines that is carried out by NSF, where we have site visits by groups of reviewers. These are generally in connection with very big projects and proposals. I mentioned we have about 28,000 proposals a year, and most of those are for support of, say, one professor or two people with some students for a few years' work. But two or three hundred are for very large projects, generally involving big facilities: supercomputers, telescopes and so forth.

Also, in the past several years we have tried to establish what are called Science and Technology Centers, or in our engineering area, Engineering Research Centers. The term 'centers' is a very fluid one; these are multi-field projects with a team of several investigators from different areas of research. Their mission is to collaborate in areas that need integration; we are trying to overcome the disciplinary structure and work across fields in a multi-disciplinary way. In engineering we have established about twenty-five of these, and in the non-engineering areas, which started later, only a couple of years ago, there are about twenty so far in various multi-disciplinary areas.

The original ones in engineering were set up for five years at about five million dollars a year apiece. In making those awards our director and board decided that each center should be reviewed in three years' time to see if it should be continued. So we have a process similar to the one that was mentioned here, of getting site visitors who have no connection with a particular project, but we have a problem in that these teams must represent various fields, so they must include a spokesman for biology, a spokesman for computer science, and so forth.

Frequently people from government laboratories and industry are invited too. These site visit teams have the same sort of organization as the Swedish teams: a chairman and people picked from the outside and so forth. Ours have a problem in that when they go to visit one of these laboratories or centers, the scientists there wish to spend the entire time demonstrating how wonderful the science is and putting on slide shows and presentations, so the review team consequently has to press to have its own time to ask its own questions and write its own report. So sometimes it gets a little bit out of control in that respect.

But the reports of these committees are very powerful. Six years ago we made the first grants for six engineering research centers. When the time came for their third-year review, two of them were discontinued, because these teams made reports to our director and then to our board which said they were not proceeding as well as they could be and were not integrating the science in the ways that had been promised in the proposals. Although they had good people doing good work, they were not achieving the kind of integration or cross-disciplinary work intended, and they were discontinued. There is quite a message in that. These reviews are now continuing; each year's group is now being reviewed.

I know of one other type of assessment like the Swedish model, and this was not […] education and research board to look at all the programs that had been specified in legislation. Over the years they had accumulated a great many of these, where it said, for example, this university should have a military history department, this university should have such and such in physics, this university should have such and such in mathematics. There were some forty-eight different programs. They could only assess them one by one, though, because they couldn't match them with each other. But they asked the same sort of questions, had the same sort of reviews, the same kind of statements and reports. And as a result several programs were discontinued by the legislature, because they felt that they weren't performing well enough, and others were increased in funding. But this was not in one particular scientific discipline; this was across a wide range of incompatible programs.

Anthony F. J. van Raan

Bibliometric Indicators as Research