Directors' Blog

Te Pūnaha Matatini Call for Associate Investigators

Te Pūnaha Matatini – the meeting place of many faces – is a New Zealand Centre of Research Excellence that carries out interdisciplinary research into complex systems and networks. We have a particular focus on complex data analytics, economic and social systems, as well as natural ecosystems.

Te Pūnaha Matatini is currently looking for up to ten new associate investigators who can complement our existing research programme or enable us to develop new research programmes. Associate investigators are not eligible for direct funding, but will be supported to attend investigator meetings and engage in Te Pūnaha Matatini research projects. Associate investigators should be researchers who are New Zealand-based, and who have demonstrated the ability or potential to conduct research in areas of relevance to Te Pūnaha Matatini.

Those interested in becoming an associate investigator are invited to submit a cover letter and academic CV in the standard NZ Government RS&T format to Kathryn Morgan, Te Pūnaha Matatini’s Coordinator (kathryn.morgan@auckland.ac.nz), by 9am on Monday 24 July 2017.

The cover letter should outline applicants’ intentions to contribute to Te Pūnaha Matatini’s research programme, either through existing projects or proposed projects for 2018-2020, including the supervision of students or research assistants. Applicants are also advised to include descriptions of their existing and/or future contributions to Te Pūnaha Matatini’s broader strategic goals in a) public engagement and outreach, b) industry and stakeholder engagement, c) contribution to evidence-based policy or decision-making, d) research sector leadership, e) diversity and equity in New Zealand, and f) recognising the distinctive contribution of Māori. Applicants should also note any collaborations they have with existing investigators.

Applications will be considered by Te Pūnaha Matatini’s Research Committee, taking into account the merit of the application, the strategic fit of the proposed collaboration, and the resources available within the Centre for accommodating the investigator.

Te Pūnaha Matatini is committed to increasing diversity in the research community, and applicants from under-represented groups are strongly encouraged to apply. Te Pūnaha Matatini is also a family-friendly Centre of Research Excellence, with a commitment to rewarding merit relative to opportunity, and to applying best practice in enabling opportunity.

 

Role of Associate Investigators

Te Pūnaha Matatini’s associate investigators will typically be researchers at an academic institution, Crown Research Institute or independent research institute, with an interest in complex systems or networks, data analytics, economic and social systems, and natural ecosystems.

Associate investigators are responsible for:

  • Actively collaborating on Centre research projects and/or
  • Actively collaborating on Te Pūnaha Matatini’s broader strategic goals
  • Conducting their research and other activities in accordance with Te Pūnaha Matatini’s ethical and professional policies.

It is expected that associate investigators be involved, or have the potential to be involved, in at least one Centre project, research or otherwise. Associate investigators are also requested to use their Te Pūnaha Matatini affiliation in all contexts relevant to the Centre, and are expected to contribute to annual reporting.

Associate investigators are eligible to:

  • apply for Centre funds to support PhD students and post-docs
  • apply for Centre funds to support activities that support Te Pūnaha Matatini’s strategic goals
  • attend Centre workshops and the annual investigator Centre forum.

The Research Committee will review the role of associate investigators annually.

Taking a stand for science and society

The world has taken a horrible turn for the worse in the last few weeks. Following a rather bizarre inauguration weekend that featured a running dispute over the size of Trump’s crowd and then the inspiring Women’s March, the Trump Administration unsettled the world with a flurry of controversial orders, some likely illegal and most certainly immoral. I did not expect to see things go this wrong, this fast.  

The travel bans were probably the most frightening in intent, blocking people from travelling to the US on the basis of their place of birth, including existing legal residents as well as refugees who were on the verge of making a new start. The orders themselves were drafted to avoid the overt racism and hostility towards Muslim society that characterised the Trump campaign, but failed in this because they were constructed on the flimsiest of pretexts. It was a dog-whistle that everybody could hear.

The impact on many people, including those fleeing wars in Syria and Yemen, was immediate. Around the world students, refugees, and others in vulnerable circumstances were turned away or detained at airports, separated from their families, or saw their hopes for escaping deadly conflict destroyed.

Te Pūnaha Matatini is the meeting place of many faces, and amongst our diverse community are friends and colleagues who have been directly affected by this ban. Just as we would not tolerate a researcher who refused to work with people on the basis of their place of birth, their gender, or their religious beliefs, we also have no tolerance for politicians who craft policies that use this as a basis to single out vulnerable people for harm.

What can we do? At Te Pūnaha Matatini HQ we’ve been writing to people in positions of power to urge them to take a stand. If you’d like to know how we’ve been going about this, then drop us a line.

We will also be supporting the March for Science, which will take place on Earth Day, April 22. The focal point for the March will be Washington DC, but there will also be marches in cities in New Zealand. You can follow @ScienceMarch_NZ on Twitter if you would like to join in here in New Zealand.

This march is not just for scientists. The Trump administration has already sent strong signals that it is willing to hinder the science community’s ability to speak to the public and it is highly likely that cuts to science funding will follow. Climate change and the degradation of the environment will affect everyone, however, and it is the already marginalised who stand to lose the most. And none of the crises that society faces today are solvable unless we also address social injustice.   

I’ve seen and heard comments that politics has no place in science, but these remind me of what I heard when New Zealanders marched against the Springbok Tour in 1981. One must acknowledge that science flourishes under some political arrangements, while it fails under others. When politicians abandon tolerant discourse and respect for others, and dismiss the value of evidence, science is in trouble. Whose job is it then to ensure the public understands this, if not the science community’s?

I’ve also seen pleas from some scientists to be left alone in their labs to get on with their science; some high profile scientists have argued that by addressing intersectionality, the march organisers are also attempting to put politics into science. We know that in science itself there are inequities and power structures that prevent or make it harder for some groups of people to become scientists in the first place. Only the most privileged in our society have labs in which to hide. There is no equivalence between the politics of Trump that seeks to exclude, and the efforts of many scientists to make science accessible to everyone.


Shaun Hendy is Director of Te Pūnaha Matatini. In 2016 he authored the book Silencing Science, which explores the public obligations of scientists and instances where scientists have been prevented from speaking out.

Science for Policy: Part I

A good deal of the research we do at Te Pūnaha Matatini is intended to inform government policy. But it is one thing to do the research, and quite another to have that research influence policy. This is why there is a growing interest in the relationship between research and policy, although there are still many different points of view on what form this relationship should, let alone does, take. Over the next month or so we are going to post a series of blogs that discuss some of the issues that face researchers who wish to influence policymakers and policymakers who wish to use research.

In this first post, I am going to review aspects of this issue that are touched on in my recent book, Silencing Science. There I discuss the reasons why so few scientists seem to be prepared to engage with the public on subjects that are politically contested (tl;dr? Try this article from the Education Review). There are lots of motivations for avoiding contentious debates in public, but one concern that scientists have is the risk of damaging their relationship with policymakers, with the consequent implications for funding and their jobs. Understanding this relationship is important if we want to improve the use of research in policy.

The model I used to analyse this relationship in Silencing Science was borrowed from Roger Pielke, based on the analysis in his book The Honest Broker. He identifies four modes in which scientists can legitimately engage with policymakers: the pure scientist, the science arbiter, the issues advocate, and the honest broker of policy alternatives. As I wrote:

“The first two modes operate when a scientist provides advice on issues with policy options around which there is political consensus. The pure scientist simply summarises the state of knowledge in a particular field without reference to policy options. If a scientist is asked by a policy-maker to weigh in on the evidence for or against the effectiveness of a specific policy option, they adopt the role of science arbiter. In both cases, the scientist can claim to be sticking to the science, and can put themselves at arm’s length from the politics of the day.”

I would argue that these two modes dominate the approach that New Zealand scientists take to engaging with government. These are the silent scientists; they may engage behind the scenes with policy-makers, but they generally don’t make an effort to inform the public other than through very passive channels (e.g. see the Royal Society of New Zealand’s report on water fluoridation). Pielke argues that these modes are appropriate when the policy implications are not politically divisive, but that when policies have serious political ramifications, a different approach is needed.

From Silencing Science again:

“The situation is more complex for the science advisor when providing advice on policies that are politically divisive. In this case, Pielke argues that the roles of the pure scientist or the science arbiter are poor choices. By standing back from politics, Pielke says, scientists risk becoming pawns in a contested public debate. When scientists claim they are sticking to the science on hotly contested issues, their scientific authority can be hijacked by special interests.”

The recent inquiry into the Ministry for Primary Industries’ (MPI) failure to prosecute over illegal fish discards illustrates this. The inquiry found emails from an MPI senior manager in 2014 that revealed serious concerns about the way illegal fish discards were being monitored:

“discarding is a systemic failure of the current system and something we have not been able to get on top of from day 1 of the QMS [Quota Management System]. FM [Fisheries Management] can’t quantify the tonnages involved but we suspect they are significant to the point that they are impacting on stocks.”

Yet in May 2016, prior to the release of these emails, the same senior manager was quoted in an MPI press release saying:

“Science is the bedrock of our approach to fisheries management and New Zealand invests $22.5 million each year to ensure our fisheries science is up-to-date and accurate.”

This response makes me very uncomfortable. The Ministry is using the authority of science to deflect criticism and legitimate public scrutiny of the strengths and weaknesses of its management systems.

In this type of situation, Pielke suggests that scientists do better to approach the issue as an advocate or an honest broker. The advocate takes sides in a policy debate, openly going beyond the science to grapple with the policy implications that may stem from it. Indeed, the fisheries story and the inquiry itself were sparked by University of Auckland researcher Glenn Simmons arguing for much stronger monitoring of discards:

“… the future sustainability, traceability and certification of fisheries will depend on how government addresses the under-reporting problems, which have long been evident and which should be a cause of concern. Unreported catches and dumping not only undermine the sustainability of fisheries, but result in a suboptimal use of fishery resources and economic waste of valuable protein.”

Simmons’ role in the debate is not something that many scientists would relish. He has been subject to criticism by the Ministry and has had his work critiqued in the media by his peers – while peer critique is a crucial part of science, scientists are not always comfortable when it takes place in the public eye. Nonetheless, advocates like Simmons play a crucial role in getting issues onto the policy agenda.

The trick to pulling this off, according to Pielke, is to avoid using your science to mask a hidden agenda. An advocate must be explicit about where the science ends and values take over, acknowledging that scientific evidence alone is not sufficient to make a policy decision.

The fourth option is that of the honest broker. In this role, the honest broker, like the advocate, acknowledges the gap between science and policy. Rather than trying to weigh in on a particular side of a policy debate, though, the honest broker attempts to consider a range of possible policy options, perhaps even using their expertise to introduce new solutions that were not yet on the table.

The honest broker is perhaps the most difficult stance for an individual researcher to attempt. Individuals are very rarely in a position where this is practical, as it requires the synthesis of the expertise of a wide range of colleagues and a diverse set of political viewpoints. In Silencing Science, I single out the Parliamentary Commissioner for the Environment as an example of an honest broker, but the Commissioner is supported in that role by a large team and is in a position to take a significant amount of time and care in weighing in on issues. As Pielke has pointed out, honest brokers are almost never a single individual. More typically this is a role for committees or panels.

In New Zealand we have several bodies that might be in a position to take on honest broking. The Royal Society of New Zealand “produce papers, convene panels and hold events to provide expert advice to policy-makers and contribute to public debate.” Generally this advice is delivered in the pure scientist or science arbiter mode: a recent advice paper on sugar and health, for instance, almost entirely avoids policy recommendations, focussing instead on summarising evidence linking sugar consumption and health, despite the intense debate around policy options such as sugar taxes and mandatory labelling.

The other significant body is the Network of Science Advisors chaired by Sir Peter Gluckman, the Prime Minister’s Chief Science Advisor. The terms of reference and membership of this group are not readily available to the public, so it is difficult to comment on the way it operates. We are going to be discussing this group in a later post, together with some recommendations about how we think it could be utilised more effectively.

While Pielke’s model is a useful entry point into this discussion, it does have a number of shortcomings. Over the next few weeks we’ll be discussing this further in the New Zealand context.

Shaun Hendy

Te Pūnaha Matatini Call for Associate Investigators

Te Pūnaha Matatini – the meeting place of many faces – is a New Zealand Centre of Research Excellence that carries out interdisciplinary research into complex systems and networks. We have a particular focus on complex data analytics, economic and social systems, as well as natural ecosystems.

Te Pūnaha Matatini is currently looking for up to ten new associate investigators who can complement our existing research programme or enable us to develop new research programmes. Associate investigators should be researchers who are New Zealand-based, and who have demonstrated the ability or potential to conduct research in areas of relevance to Te Pūnaha Matatini.

Associate investigators are not eligible for direct funding, but will be supported to attend investigator meetings and engage in Te Pūnaha Matatini research projects. In 2017, Te Pūnaha Matatini will be seeking to appoint new principal investigators. Associate investigators who have already established strong relationships with the Centre will be well placed to take up these positions.

Application details

Anyone interested in becoming an associate investigator should send a cover letter and an academic CV to Sarah Hikuroa, Te Pūnaha Matatini’s Coordinator (s.hikuroa@auckland.ac.nz), by end of business on Tuesday 19 July 2016. The letter should outline their intentions to contribute to Te Pūnaha Matatini’s research programme, either through existing projects or through developing new projects. It should also mention any broader contributions that the investigator could make, as well as any existing collaborations with Te Pūnaha Matatini investigators. Applications will be considered by Te Pūnaha Matatini’s Research Committee, taking into account the merit of the application, the strategic fit of the proposed collaboration, and the resources available within the Centre for accommodating the investigator.

Te Pūnaha Matatini is committed to increasing diversity in the research community, and applicants from under-represented groups are strongly encouraged to apply. Te Pūnaha Matatini is also a family-friendly Centre of Research Excellence, with a commitment to rewarding merit relative to opportunity, and to applying best practice in enabling opportunity.

Role of Associate Investigators

Te Pūnaha Matatini’s associate investigators will typically be researchers at an academic institution, Crown Research Institute or independent research institute, with an interest in complex systems or networks, data analytics, economic and social systems, and natural ecosystems.

Associate investigators are responsible for:

  • Actively collaborating on Centre research projects and/or
  • Actively collaborating on Te Pūnaha Matatini’s broader strategic goals
  • Conducting their research and other activities in accordance with Te Pūnaha Matatini’s ethical and professional policies.

It is expected that associate investigators be involved, or have the potential to be involved, in at least one Centre project, research or otherwise. Associate investigators are also requested to use their Te Pūnaha Matatini affiliation in all contexts relevant to the Centre, and are expected to contribute to annual reporting.

Associate investigators are eligible to:

  • apply for Centre funds to support PhD students and post-docs
  • apply for Centre funds to support activities that support Te Pūnaha Matatini’s strategic goals
  • attend Centre workshops and the annual investigator Centre forum.

The Research Committee will review the role of associate investigators annually.

Jaffe on Marsden Fund

Back in May, after some criticism of the Marsden Fund processes made it into the media, I wrote about Te Pūnaha Matatini investigator Adam Jaffe’s study of the Marsden Fund. Adam presented his preliminary findings at our Launch workshop in February, and today they were released as a Motu working paper.

There is a short media release here, but the upshot is that receiving a Marsden grant leads to higher productivity and impact, at least in terms of papers published and citations received. This won’t surprise many, but it is very exciting to see the benefits of Marsden funding quantified for the first time.

In fact I think this is a watershed study. It is the first rigorous evaluation of a New Zealand research funding process ever undertaken and it has thrown up some fascinating insights. It also demonstrates the benefits of the sustained collection and retention of science and innovation data, and the Marsden Fund should be commended for its commitment to doing so.

Unfortunately, the Ministry of Business, Innovation and Employment and its predecessors have done a poor job of curating their data since New Zealand moved to a contestable funding system in the early 90s, which means that much of our funding system remains opaque. I understand, however, that the Ministry is working on a plan to put in place systems and practices that will enable these sorts of evaluations to be made in coming decades.

What sort of data do you need? The difficulty in evaluating contestable research funding is that funding agencies go to great lengths to select the best projects and the best applicants. You can’t just compare the performance of those applicants who got funded to those who didn’t, because any difference in performance might just be a sign that your application process is doing its job in separating performers from non-performers, rather than an effect of the funding itself.

One way to avoid this selection bias would be to allocate funding randomly, but few funding agencies are willing to do this. And even if we decided that a randomized control trial was a good idea, we’d still have to wait a decade or so to acquire data for the study.

Instead, Adam and his team have made use of the panel scores that are used to rank applicants’ projects in the second round of the Marsden Fund. These scores can be used to estimate the selection bias in the performance data, enabling you to back out the effect of the funding itself. The Marsden Fund has kept the panel scores for both successful and unsuccessful projects for a number of years, and these have been matched with bibliometric data for applicants to measure subsequent performance.
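
Jaffe’s actual analysis is more sophisticated than this, but to make the general idea concrete, here is a toy sketch on simulated data: regress post-application output on a funded indicator while controlling for the panel score. Everything below (the variable names, the 10% funding cut-off, the simulated effect sizes) is invented for illustration and is not the study’s model or data.

```python
# Toy illustration of backing out a funding effect while controlling for
# panel scores. All data and parameters below are simulated and invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
panel_score = rng.normal(size=n)  # panel ranking of each proposal
funded = (panel_score > np.quantile(panel_score, 0.9)).astype(float)  # top ~10% funded

# Simulated post-application output (e.g. papers): driven by funding,
# deliberately independent of the panel score itself.
papers = 2.0 + 1.5 * funded + rng.normal(scale=1.0, size=n)

X = sm.add_constant(np.column_stack([funded, panel_score]))
fit = sm.OLS(papers, X).fit()
print(fit.params)  # coefficient on 'funded' recovers the funding effect while
                   # holding the panel score (a proxy for selection) fixed
```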

The most interesting finding from this data is that the expert panels that evaluate Marsden Fund proposals do not seem to have a selection bias! You see a jump in performance for those applicants who were funded, but otherwise the subsequent performance of applicants seems to be independent of their ranking by the panel. Panels are not able to pick winners, but those that they do give the money to go on to win.

As a panelist myself, I seldom felt that we were making meaningful selections at the second round – almost all the proposals we looked at seemed eminently fundable. This inability to pick winners does not necessarily mean that the panels are redundant. I expect that there might still be benefits that accrue from encouraging researchers to develop research plans that can stand up to scrutiny from these panels. It does suggest, though, that we should be cautious about using success in Marsden as a proxy for research quality, particularly when it comes to career advancement.

Perhaps the best news for researchers is that the study suggests that there would be no diminishing returns if we were to double or treble the size of the Marsden fund. If we could fund all second round applicants, we would be unlikely to see any decrease in the quantity and impact of the research carried out, just a step change in performance across the research sector.

There are some caveats to the study, so it is well worth reading in its entirety (here it is again). For instance, the lift in performance measured could be indirect. If winning a Marsden grant increases your chances of getting funding from other sources, then some of the boost in performance might come from other funding rather than Marsden. If we had good data from MBIE, we might be able to tell …

It is also worth noting that the Marsden Fund is there to do more than generate papers and citations. Ultimately we would like to be able to measure impacts in other ways. The sort of study that might come next would be to look at the subsequent careers of Marsden-funded PhD students. Does working at the cutting edge of science set you up for a successful career?

Declaration: I was a Principal Investigator on two Marsden-funded projects during the period that this study covers (in 2006 and 2008), and I was on the Physics, Chemistry and Biochemistry Panel from 2010-2012.

Some thoughts on Going Public

Some thoughts on Going Public

An abridged version of this blog post appeared in the University of Auckland’s Uninews on June 4th 2015.

Two years ago, towards the end of my term as President of the New Zealand Association of Scientists, a journalist asked me an astonishing question. News of the possible detection of harmful bacteria in a batch of Fonterra’s milk powder had just broken, yet the media was struggling to find any experts who would speak about the tests. The journalist who called me wanted to know whether scientists had gone quiet because the government was muzzling them.

I was surprised. The government has no ability to silence the science community. Although the government owns the Crown Research Institutes, they are not subject to the restrictions on communicating with the media that the public service faces. And the responsibility of academic scientists to speak out is set out in Section 162 of the Education Act, which makes clear that we have a role as the “critic and conscience of society”. For the most part, scientists are free to speak out as they choose.

In practice, things are not so simple. In the case of the milk powder scare, many scientists who did have the expertise felt conflicted through their relationships with Fonterra, the Ministry of Primary Industries, or AgResearch. The silence that resulted meant that uninformed, fringe voices began to get airtime.

One of the few experts who did speak out was University of Auckland microbiologist Dr Siouxsie Wiles. Writing on her well-known blog, Infectious Thoughts, she provided one of the very few scientific perspectives on the story and debunked some of the wilder theories being aired in the media. With this, Dr Wiles quickly became the go-to person for the media, and finally some sound science started appearing in news reports.

Like many researchers who have stepped up in a crisis, Dr Wiles asked herself: if she didn’t do it, who would? Yet, as she noted in her address to the New Zealand Association of Scientists Conference, Going Public, in April, the reaction from many of her colleagues was far from positive. To some, it seemed that she had spoken out of turn. And sadly, as we learned from other delegates at the conference, her experiences are far from unique. It seems that the scientific community can be its own worst enemy.

Communicating with the public, whether through the media or otherwise, is often seen as a less than serious pursuit for scientists – something best left for the twilight of one’s career, or to be attempted in the lead-up to that crucial funding round. Time on Twitter is time away from the lab, a trade-off that no true scientist – god forbid, one early in their career – should be prepared to make. And when an articulate young scientist upstages us in the media, it can ruffle our greying feathers.

Yet communication is a skill, and working with the media requires a great deal of commitment. The scientists we hear from in public are those who have chosen to work hard at these skills and who have put in the sustained effort needed to build relationships with journalists. It is difficult work, made more difficult at times by the lack of recognition from colleagues or institutions.

It is also important to understand that the media has changed. The business model that supported public interest journalism for centuries is on the brink of collapse. Only Radio New Zealand can support specialist science reporters these days. If you are not pro-active in working with the media, they will often not have the time or resources to come knocking. As Fiona Fox, head of the UK Science Media Centre, puts it “the media will do science better when scientists learn to do the media better.”

If we want a better informed public in New Zealand – and dare I say it, a public prepared to put more tax dollars into university research – then the least we can do is support our colleagues who are working hard to bring this about.

Should the Marsden Fund be restructured?

By Shaun Hendy

The MacDiarmid Institute’s Kate McGrath created a bit of a stir on Friday with a blog post that took a critical look at the Marsden Fund’s decision to invite 15% fewer proposals into the second round of its funding process than it did last year. The Marsden Fund runs a two-round process: applicants submit a one-page expression of interest in February, with about a fifth of these then invited by expert panels to submit a more comprehensive proposal in May.

In 2014, 20% of first-round applicants were invited to submit a second-round proposal. This year the figure fell to just 17%. Below, I’ve plotted the fraction selected to submit second-round proposals since 1998. After a spike in 2003, the fraction selected has returned to the levels seen fifteen years ago. So, while not unprecedented, a 17% throughput at the first round is something not seen for a while.

[Figure 1: Fraction of first-round applicants invited to submit second-round proposals, 1998–2015]

The advantage of running a two-round system is that it can reduce the burden placed on the sector in the writing of comprehensive proposals. If we had a one-round system, everyone who wanted to apply would have to write a comprehensive proposal that was detailed enough for peer review. The more proposals that get through to the second round, the more work there is for the researchers who write them and the selection panels that evaluate them. The Marsden Fund Council chose to reduce the number of proposals selected for the second round this year because it wanted to reduce the workload for the sector.

McGrath is conscious of this issue, having blogged about compliance costs in late 2013: “The compliance costs of everyone applying for everything is costing our country real money that could be used producing real outcomes from the scientists utilising their time to ply their trade; scientific research rather than applying for funding that invariably they will never get.” In other words, the amount of effort we spend writing proposals should not be allowed to overwhelm the effort we put into our research.

Expectations

A slightly different way to look at this is to compute the expected value of writing a second-round proposal, i.e. the expected pay-off in funding from submitting one. You can compute this by multiplying the likelihood that a second-round proposal will be funded by the average amount awarded. I’ve plotted this quantity below from 2001 to 2014, with a fantasy figure for 2015 based on the Marsden Fund’s indication that it will fund 90-95 proposals this year.

[Figure 2: Expected value of a second-round Marsden proposal, 2001–2015]

By reducing the number of proposals at the second round, the Marsden Fund Council can increase the expected value of second-round proposals to applicants. A standard second-round proposal (as opposed to a fast-start) this year is expected to return $315,000-$319,000, compared to $287,000 in 2014. As there is less money available overall this year, if the Council had approved a similar proportion of second-round proposals to 2014, the expected value of a second-round proposal would have fallen to around $250,000, well below the long-run average of $290,000.
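
For concreteness, here is a minimal sketch of that calculation in Python. The numbers plugged in at the end are placeholders for illustration only, not the actual figures for the 2015 round.

```python
# A minimal sketch of the expected-value calculation described above.
# The inputs used in the example call are hypothetical placeholders.

def expected_value(n_funded, n_second_round_proposals, average_award):
    """Expected pay-off of writing a second-round proposal: the probability
    of being funded multiplied by the average amount awarded."""
    return (n_funded / n_second_round_proposals) * average_award

# e.g. 92 funded proposals out of 220 second-round invitations, with an
# average award of $750,000 (all three numbers made up for illustration):
print(f"${expected_value(92, 220, 750_000):,.0f}")  # -> $313,636
```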

However, in her recent post, McGrath says that it is now “more difficult to get your foot in the door of arguably the most prestigious grants in the country”. Her concern is that panels might not be expert enough to make good decisions at the first cut, so the country might be missing out on funding the best science. The assumption McGrath makes is that decisions made at the second round, when panels have access to expert peer review, will be of significantly higher quality than those made at the first round.

Picking winners

Yet recent work by Te Pūnaha Matatini investigator Adam Jaffe suggests that Marsden panels are not very effective at predicting the eventual success of the second-round proposals they view. Jaffe’s findings (not yet published, but commissioned by the Marsden Fund Council) are broadly consistent with the findings of many similar studies in other countries: panels and peer reviewers are not very good at picking winners (although see this recent article).

These findings are not to say that panels aren’t doing their best. Rather, the evidence suggests it is simply an almost impossible challenge to predict the outcomes of cutting-edge scientific research. It is also far from clear that letting more proposals through to the second round would improve matters. In my opinion, when we know that panels have such a difficult job to do, it is better to acknowledge their limited effectiveness and reduce the burden on the system accordingly, rather than to double down as McGrath suggests.

One-round system?

One of the key disadvantages of a two-round process is that, all things being equal, it will have a much lower success rate than a one-round process. More people will be prepared to chance their arm in round one if they only need to complete a short application. Filling out a single-round application for an Australian Research Council Discovery Grant, for instance, is considerably more onerous than the preparation of even a second-round Marsden proposal. The Australians are rewarded for the extra work involved with a success rate of 18%.

Despite this, I still prefer our system. If expert panels are poor at picking winners, then it is better to have a system that minimises transaction costs. And as I noted in a blog post in March, I think we also need to acknowledge the increased research intensity in universities that has driven the oversubscription of the Marsden Fund.

I think the Marsden Fund Council is well aware of the trade-offs described above and has chosen a sensible course. In fact the Council should be commended for commissioning work to investigate the effectiveness of its decision-making process. The real fix for the system – increasing the Marsden Fund to match the increased level of research activity in our universities – is not within the Council’s control.

The 2014 Marsden round

I have been keeping track of the Marsden fund for a few years now over on A Measure of Science at Sciblogs New Zealand. As we wait for the results from the first round in 2015, let’s reflect on last year’s results. 2014 was another tough year for applicants, with the success rate falling to just 8.3% – well below the long run success rate of 10%. As the figure below shows, this was the fifth year in a row that the success rate has been below 10%.

[Figure: Marsden Fund success rate and number of applications by year]

The figure also shows that this is because the number of applications has risen considerably in these last five years, while below, we see that the total funding awarded has not kept pace. There was a big injection of new funding in 2009 after the National government came to power, but this only seems to have coincided with a large increase in the number of applications.

[Figure: Total Marsden funding awarded by year]

So if there is no more funding available, why are we writing more applications? The plot below shows that this increase in preliminary applications has come from the universities, while the number of applications coming out of our Crown Research Institutes has actually declined. This growth has not been driven by growth in university research staff. Statistics New Zealand’s R&D survey suggests that these numbers have, if anything, declined over the last decade: in the 2004 survey, the universities reported 3300 FTE researchers, while in 2014, they reported just 3100.

[Figure: Preliminary applications by sector (universities and Crown Research Institutes)]

There certainly seems to have been a change in behaviour amongst university staff in the last few years. A decade ago there was one preliminary application for every five FTE researchers from universities, while last year this had risen to one for every three FTE researchers. It is tempting to suggest that this is in response to the increase in funding in 2009, but then one might have expected a similar response by CRI researchers, whereas the opposite has happened.
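
As a rough back-of-the-envelope check, here is what those ratios imply when combined with the FTE figures quoted above (a sketch only; the implied counts are approximate, not official application statistics).

```python
# Back-of-the-envelope check using the approximate figures quoted above.
# The implied counts are rough, not official Marsden Fund statistics.
fte_researchers = {"2004": 3300, "2014": 3100}  # university FTEs (Stats NZ R&D survey)
apps_per_fte = {"2004": 1 / 5, "2014": 1 / 3}   # preliminary applications per FTE

for year in ("2004", "2014"):
    implied_apps = fte_researchers[year] * apps_per_fte[year]
    print(f"{year}: roughly {implied_apps:.0f} university preliminary applications")
# -> 2004: roughly 660; 2014: roughly 1033
```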

[Figure: University preliminary applications per FTE researcher]

I think that this may have something to do with the pressure that university researchers are now under to measure up for the Performance-Based Research Fund. University researchers are expected to be applying for external research funding, and for many, the Marsden Fund is the only option. While we might welcome the growth in research activity in universities, this is evidently placing a significant strain on an already stretched science and innovation system.

Finally, the other feature that caught my eye in the 2014 round was a small drop-off in the proportion of funds awarded to fast-start applicants. The share of funds awarded to fast-starts has levelled off since 2011. This had to happen at some point after steady growth in its share since the fast-start scheme was introduced in 2001.

[Figure: Share of Marsden funds awarded to fast-start proposals]

Fast-start applicants have a slightly higher chance of success than standard applicants, with a long-run success rate of 13% (cf. 9% for standard applicants). Last year, though, this dropped to 11%, in line with the fall in the success rate for standard proposals (7% in 2014).

[Figure: Success rates for fast-start and standard proposals]

No turning back

What a week! It was nearly two years ago that I first sat down with Dion O’Neale, over a beer in Wellington, to discuss the possibility of establishing a Centre of Research Excellence in complex systems. And finally, last Wednesday, in the University of Auckland’s Fale Pasifika, the Vice-Chancellor Stuart McCutcheon declared Te Pūnaha Matatini open for business.

Pierre Roudier, Geoff Wilmott, Dion O’Neale, Ken Quarrie

Siouxsie Wiles, Rhian Salmon, Ed Abraham, Rebecca Priestley

Mark Gahegan, Laurie Knight, Shaun Hendy

Te Pūnaha Matatini is now starting to seem rather real. Te Pūnaha Matatini HQ has also been humming with visiting investigators hot-desking in our new wāhi hui, new students arriving, and a couple of top-secret projects. We ordered business cards on Monday, we launched our website on February 13, and @punahamatatini has been tweeting now for 292 days. But what really made it tangible for me was the two-day research symposium we held last Thursday and Friday.

Logo

Over two days we asked our investigators to give three-minute ‘lightning’ talks (or lightning strikes as Peter Davis called them), one slide each, on the most exciting aspect of their research right now. This was a real success – it turns out there is a lot you can say in three minutes if you put your mind to it. We also heard from our project leaders on their plans and the PhD projects that will be offered, and heard talks from several outside organisations that we are keen to work with.

Alexei’s Slide

Have a look at the @punahamatatini and @TPMwhanau Twitter feeds if you want to see the range of topics we covered.

One of the talks we didn’t tweet about was the one given by Lillian Grace from Wiki New Zealand, who gave us a sneak preview of the new Wiki site and the engine behind it. Lillian is on a mission to increase data literacy in New Zealand – to democratize data, as she puts it – by bringing together disparate streams of government data in one place and making the data sets easy to work with and present. The new site went live on Tuesday so you can go see what we were so impressed by at wikinewzealand.org. It is also worth mentioning that there is more to come – they will be rolling out several exciting new features over the next few months.

But what made Te Pūnaha Matatini feel most concrete for me last week was watching Rachelle Binny and her Whānau committee get together in the wāhi hui for a meeting after our symposium had finished. Rachelle is the first chair of our early career researcher network, our whānau, and it has been really quite wonderful to watch the way she has started to inspire and lead this group of young, talented scientists. Te Pūnaha Matatini is no longer an idea owned by me, by our investigators, or by any of our institutions. It has been adopted and made real by the next generation of researchers.

Whānau

Compete to cooperate

The first conference I attended as a PhD student was held at a postcard-perfect alpine château up in the Canadian Rockies. The organisers had brought together particle physics, the physics of the very smallest things we can detect, with cosmology, the physics of the entire universe. These two bodies of knowledge both become important in the very early stages of the universe, when everything we can see in the night sky, even the most distant objects visible only with our most powerful telescopes, was compressed into an unimaginably tiny volume.

The talks would start early in the morning and run through until midday, when the participants would disperse to ski at the local resort, or to go skating on a frozen lake that was overlooked by the château. While physicists often style themselves as accomplished outdoor types, the reality is that they spend too much time at their desks or in the lab – by the end of the week the hotel had run out of crutches and knee braces. Every afternoon, after the rescue teams had retrieved our fallen, we would reconvene for talks and discussions that would continue long into the night.

The hot topic of the week was the two competing measurements of the rate at which the universe was expanding. This is a crucial quantity if you want to know the age of the universe, and if and when it might end. There were two different research groups at the conference, each claiming to have made the first reliable measurement of this rate. The discussion was running hot because the measurement of one group was almost twice that of the other.

The fate of the universe quite literally depended on who was right. To an outside observer, each group seemed supremely confident in their own measurement and equally confident that the other group had made a mistake. Someone was wrong.

Today, we know that both groups were equally mistaken. Over the next few years, each group refined their measurements to bolster their case. As new data was collected and analysed, the two measurements started to creep closer together. A few years later they met in the middle.

Science doesn’t always result in compromises like this, but when science works well, it often does so via a mix of competition and cooperation.

From a certain perspective competition between teams of researchers might seem wasteful. Why fund two research groups to make the same measurement? But in this case, the existence of a competing group eliminated any complacency that might have set in. The overconfidence of each team in their own approach was tempered by their rivalry with the other group.

The cooperation is less explicit in the narrative, but was nonetheless crucial. The teams cooperated by openly sharing their results and methods in an effort to persuade the rest of the research community that they had the right answer.

New Zealand has experimented over the years with mechanisms for driving cooperation and competition in our research system, yet we still seem to be incapable of striking the right balance. The National Science Challenges eliminated competition altogether, or at least banished it behind closed doors, and led many to conclude that they had been captured by an old boys’ network.

In my opinion, the Centres of Research Excellence have struck the right balance. They are selected through a highly competitive process, yet one which encourages large teams of researchers to come together and create something greater than the sum of its parts. And in the 2014 round we saw that incumbency was no guarantee of success. Only two of the existing CoREs were selected for further funding and four new entrants were given their chance, including Te Pūnaha Matatini.

We now need to ensure that the other side of the ledger is balanced: openness, sharing of best practice, and collaboration between the CoREs will ensure that science emerges as the winner in the long run.