Who decides?  What do we think of peer assessment?

Convener(s): Emma Stenning 

Participants: Ric Watts, Jo Crowley, Jenny Brown, Lyn Gardner, Syeera, plus others (names not recorded)

Summary of discussion, conclusions and/or recommendations:

Does ACE have the authority to make judgments on artistic quality? It was felt that ACE and some of its staff members might not have the expertise and knowledge to be able to make key decisions based on artistic achievement, innovation and quality. 

We talked about how ACE staff come to the organization – Emma offered that many of her London team have worked closely with the sector, but nationally and across artforms this isn’t always the case.  

This is why people talk of ‘peer assessment’ – a lack of trust that ACE has the expertise to make these calls. However, Emma worked in the sector for years – why is she no longer considered a ‘peer’?  It is as if she has crossed to ‘the dark side’.

This feeling might stem from intrinsic tensions in ACE’s role.  It is both ‘partner’ and ‘the hand that feeds’.  These can be contradictory functions, which cause friction between ACE and the sector.

It was noted that, since the abolition of peer assessment in the form of Drama Panels, ACE officers now have more power, more responsibility and more influence.  Is this taken into account in recruitment and training?  ES explained that recruitment at ACE is difficult – people don’t want to work for ACE.

There was discussion about the need for change within the funding system.  Historical precedent cannot be a reason to continue funding an organization.  Room needs to be made for new artists and innovation.  That was clear.  The problem lies in the decision-making process – how are these decisions taken?

Emma Stenning went through the process that led to the recent funding recommendations.  It started in Aug 06, and has seen a number of drafts of ‘the portfolio’ created by each unit, which were then debated by senior staff.  There is then assessment at a regional level, looking at the wider picture across artforms in that region, and then assessment nationally, looking at strategy for each artform.

This is the first time, however, that cuts have not only been made to failing companies – instead, strategic vision and shifting priorities have fed into the decisions.  This made the funding round tougher.

The point was raised that this seems problematic – employing a more strategic approach (where a cut company can be successful but just not in vogue) in a year when there is no lead-in for disinvestment due to the delayed CSR.  It makes the decision harder to swallow for those companies disinvested with little notice, for reasons of changing priorities.

We talked about what impact Peer Assessment would then have had on this process – would it have improved it?  Would decisions have been different? Would ACE be getting less criticism for its decisions?

Peer Assessment gives legitimacy to the decision-making process.  It means that decisions are taken collectively, and in collaboration with the sector.

There was concern about previous ‘peer panels’, which consisted of the great and the good. [ADDENDUM: Not enough!]  Any new model of peer assessment needs to be made up of a range of members, representing a cross-section of the industry, and possibly rolling or democratically elected.  These panels should involve more than just artists and audiences, and embrace all with a stake in the arts – journalists, academics etc.

It was generally felt this would be a positive step towards a more transparent process and restoring faith in funding decisions.