Knowledge for all – Open access to scientific research

Scientific papers are at the very heart of our student lives. They cause nightmares as they feature on the seemingly endless reading lists for our seminars, and they inspire dreams as we strive to see our own names in a list of authors. Still, few students spare a thought for the business side of scientific publishing. Unjustly so, as the field may undergo radical changes in the coming years, with far-reaching consequences for academia.

The source of the potential upheaval is a European initiative for open-access science publishing. Under the code name “Plan S,” the European Commission and the national research organisations of twelve European countries demand that all work resulting from publicly funded research be made accessible free of charge by 2021. In concrete terms, the plan stipulates that research worth €7.6 billion must be published in open-access journals. This demand pits them against publishing houses, which fear a severe disruption to their existing business model.

A monopoly on knowledge

As the bankrollers of most research in their countries, national research organisations take a reasonable interest in reforming a system that absurdly overcharges them for bringing the results of that research to the public. In the current system, publishing houses receive the manuscripts of publicly financed researchers free of charge. The manuscripts are in turn checked by peer reviewers – most of whom are also employed at universities. At the end of the production chain, publishers sell the resulting journals to university libraries. Collectively, publicly funded institutions therefore buy back the fruits of their own labour.

Of course, publishers also incur certain costs, such as for administrative tasks, marketing, layout, printing and, perhaps most importantly, the administration of the peer review process. But these costs can by no means explain the immense increases in journal prices observed over the last decades. From 1984 to 2005, the average price charged for academic periodicals in the US increased sixfold, while the overall price level rose by a factor of less than two (see Figure).

University libraries are increasingly unwilling or unable to pay. Couperin, a consortium representing 250 French educational institutions, announced last year that its negotiations with Springer had come to nought and that it would no longer subscribe to the publisher’s journals. However, giving up access to top journals is hardly an option for universities. Researchers must stay up to date with the latest findings in their fields, and students, whether they like it or not, need to get through their reading lists.

It follows that publishing houses are in a quasi-monopoly position with nearly unrestricted pricing power. This is evident not only from the price increases for journals, but also from the profits that the three biggest publishers – Springer, Elsevier and Wiley-Blackwell – regularly amass. Elsevier, for example, chalked up a profit margin of 37% in 2018. In comparison, the average listed company in the S&P 500 index had a margin of only 10% that year.


Science without borders

The deficiencies of the current system raise the question of an alternative model. One answer is provided by open access, meaning the free provision of research results online. This can take two forms. The first is “green open access,” where an article continues to be published in a paid journal; in addition, after an embargo period of six to twelve months, the authors upload the article to their institution’s website for self-archiving. The second is called “golden open access” and refers to publication in journals that are themselves accessible free of charge. The main difference between the two models concerns how the journal covers the remaining publication costs. In the green model, the reader continues to pay the journal for the privilege of early access. In the golden model, the costs are covered by “publication fees” settled by the authors, who usually pass them on to their funder – e.g. their university or grant provider.

With the advent of open access at the beginning of the century, many predicted the end of the existing payment model. And indeed, open access has made some inroads – including the Public Library of Science and BioMed Central journals, as well as arXiv, an online repository for scientific manuscripts. Many students will also be familiar with Sci-Hub, a website hosting papers without regard to copyright. Through legal channels, however, the expected open access revolution never fully materialised. Today, only a quarter of scientific articles are made freely available, most of them via green open access.

Now Plan S intends to radically accelerate the transition. It responds to calls for greater transparency and cost efficiency in the use of public money. Further, it is expected to accelerate the pace of discovery. As science advances through cross-fertilisation between projects, any barriers such as paywalls or embargo periods necessarily slow it down. Instantly uploading manuscripts, even before the protracted peer-review process, could serve as a catalyst of scientific progress.

Moreover, extending the diffusion of scientific knowledge to a less affluent audience renders science more equitable and encourages diverse thinking in academia. Finally, open access may shift the focus away from publishing exclusively significant results and give the research community insights into “failed” studies that may be equally valuable. One study claims that the results of half of all clinical trials in the US go unpublished (Riveros et al., 2013). Without knowing about these, researchers may end up pursuing dead ends that their colleagues have already explored.


S for Short-Sighted?

In the eyes of sceptics, however, the sweeping changes of Plan S risk undermining the quality of research by severely hurting high-class journals. A particularly contentious demand of Plan S is a proposed cap on publication fees. This would be particularly hard to meet for journals with high rejection rates: since they also incur expenses for the peer review of rejected articles, they face significantly higher costs per published article. Nature, for example, estimates its publication costs at $40,000 per article – many times the limit contemplated by the backers of Plan S.

Renowned journals pride themselves on their selectivity, as it grants their articles a quality seal that open access journals could struggle to replicate. Critics fear that, in the extreme case, open access can end in the practice of “predatory journals,” which accept any article for the sole purpose of cashing in the authors’ publication fees. A survey by the Nature Publishing Group shows that almost half of authors express doubts about the quality of open access journals.

The main worry about Plan S is therefore that, rather than reforming the publishing system worldwide, it could create a parallel system for European research. If the top journals do not go along with the proposed changes, nationally funded researchers would be restricted to less reputable open access outlets. In the worst case, this could even lead to an exodus of scientific talent to countries or funders without open-access requirements. Recognising the risks of an abrupt implementation, the consortium behind Plan S has postponed its introduction by a year – it was initially supposed to start in 2020 – and suggested a two-year transition period. Even after that delay, it remains far from clear whether the plan will indeed materialise or remain the pipe dream of open-access advocates.


In the current system, publishers use monopoly power to demand exaggerated prices from university libraries without compensating those who contributed to the research. Open access promises to upend this practice and extend the insights of scientific research to a much broader range of people, without financial barriers. But as its advancement has stalled, new political support is required to maintain the momentum. Plan S could potentially provide this boost. Its success, however, depends on whether it can create mechanisms that preserve rigorous peer review and uphold quality. If it does, the plan could inspire other countries to pursue open-access initiatives. Otherwise, it will founder as a quixotic undertaking, aspiring to a world with free, unlimited knowledge for all.

By Stefan Preuss



References

CSI Market, 2019.

Couperin, 2018.

Dingley, B., 2005. US Periodical Price Index 2005.

Kimball, M.S., 2017.

RELX Annual Results, 2019.

Riveros, C., Dechartes, A., Perrodeau, E., Haneef, R., Boutron, I., Ravaud, P., 2013.

The Economist, 2018.





Interview with Julien Grenet from PSE


Julien Grenet is a researcher at the CNRS, an Associate Professor at the Paris School of Economics, and one of the founders of the Institut des Politiques Publiques. He specialises in education economics, public economics and market design. He is known to the general public for his participation in the public debate and his popularisation of economic concepts in media outlets such as France Culture.


He agreed to talk to the magazine about his work as a researcher, the importance for economists of being involved in the public debate, and the issues that the French educational system is facing today.


Why did you create l’Institut des Politiques Publiques? What are its specificities?

We created l’Institut des Politiques Publiques – IPP – with Antoine Bozio in 2011. It followed a six-year period that Antoine spent in London working for the Institute for Fiscal Studies – IFS – which is our main inspiration for the IPP. What was lacking in France was an institute that evaluates public policy and tries to bring together the insights of academic research, translating them into policy briefs targeting a broader audience of policymakers, journalists and citizens. We felt that very good academic research on public policy existed in France, but most of the results were not really conveyed to the general debate, which is, in my opinion, quite unfortunate. The IFS was a good model to import to France. We started small but have grown ever since, trying to cover a broad range of topics that matter for the public debate, such as tax policy, education, housing, pensions and the environment. We also work on health issues.

What is your opinion, as a researcher, on the role of economists in the public debate?

I do not want to be judgmental about what we should or should not do. There are different ways to contribute to the public debate. From my point of view, you do so through the academic output you produce, which then spills over into the public debate. You should also try to meet policymakers. The important thing is to participate in the debate on topics that you know, and only on those. Unfortunately, this is not always the case, and that sort of attitude may damage the reputation of economists. I personally try to restrict my interventions to questions on education or housing, since those are the topics I have worked on.

Why did you choose to study education, and more specifically social segregation and selection processes, as your main topic?

I started to study education because it was the topic of my Master’s thesis. What drove me to it is that I come from a family of teachers whose upward social mobility was entirely due to school. I was shocked, in a way, by the fact that through the education system my family managed to climb up the social ladder. Today, we sometimes have the impression that school no longer plays this role, and we wonder what is wrong with our educational system. I think the tools of economists have a lot to say here: what we can learn with economics is how to improve the efficiency of the educational system.

I went into it for personal reasons; afterwards, the topics that I have addressed are more random. I started working on the returns to education, which is a very classic question. Then, since I was sharing an office with Gabrielle Fack, who was working on housing, we thought about working on something in between our two fields of interest. We started working on the effect of school zoning (“la carte scolaire” in French) on housing prices. We thought that this system was one way to assign students to schools, but we actually found out there were many others. We started reading about school choice mechanisms and got interested in them. It is a very dynamic field in economics: how to assign students to schools? How to assign teachers to schools? How to direct students to higher education programmes?

In France, there has been a lot going on on the subject lately, and this matters for the public debate. We heard a lot about Admission Post Bac and Parcoursup; those are, in my opinion, important technical tools with real policy effects. Empirically, we know quite little about their effects in the real world. I think this is where we, as economists, can contribute: by improving these tools.

According to the OECD, France is one of the most unequal countries in terms of climbing up the social ladder. What is your analysis?

I think that there are many reasons for it; yet, we can hardly identify them all. What the OECD has shown is that at the age of 15, your performance is more strongly determined by your social background in France than in almost any other country. France is typically in the top three countries where social determinism at school is the strongest.

One reason is that our educational system, especially the middle school system – for children between 11 and 15 years old – is highly segregated. From research, we know that ghetto schools harm the students studying there over and above the effect of their social background. This segregation in the school system increases inequalities. It might be due to different things: the level of residential segregation is very high in France, and the way we assign students to schools is far from optimal. As we assign students to their local school, if the neighborhood is segregated, then the school is going to be segregated too.

There are many other ways to assign students that we could use. For instance, there is what we call “controlled school choice,” which tries to achieve a balance in the social composition. We could also redesign the school boundaries, or “school catchment areas,” so that they would be more diverse in their student intake. That is one important topic to be addressed: can we reduce segregation in schools by using different methods of assignment?

There is also a problem with how teachers are assigned to schools. Typically, young, inexperienced teachers are assigned to the most deprived schools in France, which is obviously a problem. We know that teachers make their biggest efficiency gains during their first few years of teaching. Hence, students in deprived schools are less likely to benefit from the most effective teaching.

There is also an issue with the educational system itself. The French system is very good at selecting an elite, and the whole system is built to detect the students who make their way up to “classes préparatoires,” “grandes écoles” and so on. However, it is not as good at helping as many students as possible succeed. We have a very strong elite but, in the meantime, we lose a lot of students along the way. France has a high drop-out rate: many students leave school with no qualification. Another problem with the system is that vocational courses are seen as a personal failure, unlike in many other countries. Therefore, a lot of students who follow this path feel that they have failed their studies.

Your research focuses on assignment algorithms. What consequences did you find of such algorithms on students’ choices?

France is a very centralised country; hence, it is more inclined than other countries to use algorithms to assign students and teachers. Yet there has been very little involvement of researchers and economists in designing these algorithms. In fact, a lot of research on assignment mechanisms comes from the U.S. It is a branch of mechanism design theory which received a lot of visibility thanks to the Nobel prize awarded to Alvin Roth and Lloyd Shapley in 2012. They really transformed the landscape in many dimensions: for example, the assignment of students to schools in the U.S. has been completely redesigned in many cities using these algorithms. Kidney exchanges now rely on them, and there are many new applications, such as social housing allocation.

In France, in my opinion, the main problem is that there is not enough transparency about these algorithms. They exist in order to produce the best possible matching between students and schools – to try to maximise satisfaction while respecting several priority rules. The problem is that the way the algorithms and the priority rules work is not well known. This has led many people to reject the whole idea of assigning people with algorithms because they feel there is a black box, like a lottery, when in fact an algorithm is just a tool.

What really matters is the way you design priorities. If two students apply to a school and there is only one seat left, which student has priority over the other is a political decision that depends on which criteria you promote – students with better grades, students who live closer to the school, students from a lower social background, … This is not sufficiently explained and democratically decided. The issue today is to bring research into these algorithms, so that there are more discussions and a better understanding of the way they work.
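The mechanics described here – students ranked by school-specific priority scores, seats filled while respecting those priorities – correspond to the deferred acceptance algorithm of Gale and Shapley, which underlies many of the assignment systems discussed in this interview. Below is a minimal Python sketch; the student names, grades, seat counts and the size of the low-income bonus are all invented for illustration and are not the actual Parisian parameters.

```python
def deferred_acceptance(prefs, priority, capacity):
    """Student-proposing deferred acceptance.
    prefs: student -> ordered list of schools;
    priority: (school, student) -> score (higher = higher priority);
    capacity: school -> number of seats."""
    next_choice = {s: 0 for s in prefs}          # next school each student proposes to
    held = {school: [] for school in capacity}   # tentatively accepted students
    free = list(prefs)                           # students not yet (tentatively) placed
    while free:
        student = free.pop()
        if next_choice[student] >= len(prefs[student]):
            continue                             # exhausted list: stays unmatched
        school = prefs[student][next_choice[student]]
        next_choice[student] += 1
        held[school].append(student)
        # keep only the highest-priority students up to capacity
        held[school].sort(key=lambda s: priority(school, s), reverse=True)
        while len(held[school]) > capacity[school]:
            free.append(held[school].pop())      # lowest-priority student is rejected
    return {s: sch for sch, students in held.items() for s in students}

# Hypothetical example: a low-income bonus lifts Bea's priority at both schools.
prefs = {"Ana": ["Lycee_A", "Lycee_B"], "Bea": ["Lycee_A", "Lycee_B"],
         "Carl": ["Lycee_A", "Lycee_B"]}
grades = {"Ana": 15, "Bea": 11, "Carl": 13}
low_income = {"Bea"}

def priority(school, student):
    # Illustrative priority rule: grades plus an invented bonus of 6 points
    return grades[student] + (6 if student in low_income else 0)

capacity = {"Lycee_A": 1, "Lycee_B": 1}
match = deferred_acceptance(prefs, priority, capacity)
```

With the bonus, Bea's priority score exceeds both Ana's and Carl's, so she obtains her first choice; without it, her grades would have left her rejected everywhere. This is, in miniature, how the design of a priority bonus – a political choice, as noted above – can reshape who gets which seat.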

You are currently working on a project on social mix. Why is it a topic of interest? What are your preliminary results and your analysis?

We have already said that the lack of social mix is one of the reasons why there is so little upward mobility in France. The question is how to address this problem. We have several potential ways of doing it: we could use controlled school choice, we could redesign the school catchment areas, or we could close some schools and send their students to schools away from their original assignment – in the city center, say, rather than in a suburban area.

We do not have many empirical results telling us in which case we should use this or that tool, nor do we know the actual effect of some tools on segregation. Moreover, these effects are mitigated by the behavior of parents: if they decide to send their child to a private school, we might not get as much social mix as we initially wanted. Therefore, we are trying to evaluate different ways of assigning students to schools in order to create social mix, and to evaluate their effects. To do so, we are using several experiments that were launched across the country, and we compare their effects on social mix.

The reason why we want to increase social mix is that we believe it is going to reduce inequalities. We are interested in the effect of social mixing on both students’ performance and their non-cognitive aptitudes: their self-confidence, their social fatalism and the way they perceive others – the perception of difference. What we are trying to exploit here is the fact that some of these experiments produced a large change in the social mix of the schools involved.

We try to evaluate this through surveys conducted in schools. We are now carrying out the second wave; two more waves are to come. What we try to evaluate is how a change in a school’s social composition affects individual students, through their performance in school and their non-cognitive outcomes. If we look at the literature, there is little evidence on this, especially on non-cognitive aptitudes, because they cannot really be measured with administrative data. We need to go into the schools and ask students questions directly. That is our contribution to the literature: trying to answer one of these questions.

Finally, which results of your research surprised you?

I did not anticipate that students’ assignment mechanisms would have such a big impact on the composition of schools. I started to work on these mechanisms by looking at several high schools in Paris. In 2013, the educational authority of Paris adopted an algorithm to replace the manual procedure. As part of the algorithm, it created a bonus for low-income students that increases their priority. As a result, social segregation across high schools in Paris went down by 30% in only two years, which is huge. This had not been anticipated by the local education authority, which did not realise that the way the bonus was calibrated would make it so large: low-income students were almost automatically granted their first choice. This completely changed the landscape of Paris, which was the most segregated area in France. That is no longer the case.

By working on these data, I realised that such tools can in fact be more powerful than headline reforms. For instance, the “assouplissement de la carte scolaire” relaxed school catchment areas so that students could apply to schools further from their homes. In reality, this had very little effect on social composition, whereas school choice algorithms, like the one implemented in Paris, had a huge impact with very little media coverage. The numbers shown in the graph are telling: low-income students now have a bigger set of choices than before. This is one of the surprises of research in economics: just because something is not examined by researchers or does not get any attention does not mean it does not exist. You can be like an archaeologist: you can dig up results that were unknown until now, and they can change the way you see and understand the educational system.


By Thomas Séron

Should we use new economic methods to assess the impact of collusion on welfare in vertical markets? The example of the “Yoghurt case”


Céline Bonnet is a director of research at INRAE within TSE


While the literature has widely covered collusion in horizontal markets, it has paid less attention to collusion in vertical markets – and, more precisely, to how to properly evaluate the impact of cartels on total welfare. As we observe convictions for collusion among prominent manufacturers, economists are trying to advise authorities on new approaches that better account for the strategies of retailers and better assess the impact of collusion on manufacturers and retailers, as well as on consumers.




A concentrated market which has become the scene of anti-competitive practices

Over the past 30 years in France, the retail sector has seen successive mergers that strengthened the bargaining power of big retailers against manufacturers. The food retail sector, for example, is dominated by eight major groups, including Carrefour and Leclerc, which together account for about 40% of total sales. To counteract this concentration trend, manufacturers in the food industry also engaged in a consolidation movement in the early 2000s. The increase in concentration among both retailers and manufacturers has led to higher prices for consumers.

Despite that trend, retailers have kept searching for innovative strategies to differentiate themselves and be more competitive on the market. Big retailers have pursued the strategy of Private Labels – PLs: they sell store-owned brands, such as la Marque Repère at Leclerc. PLs are then sold alongside National Brands – NBs, established manufacturer brands – giving retailers advantages in both horizontal and vertical markets. They can differentiate themselves from other retailers who might sell the same NBs, and they gain bargaining power against NB manufacturers, which lose market share to PLs if they charge prices that are too high. Indeed, PL products can substitute for NB products, and are often sold at a relatively low price.

The concentration of manufacturers, along with increasing selling prices, also facilitated collusion and other anti-competitive practices. This can be illustrated by the “yoghurt case.”

In 2015, French authorities charged 10 major PL producers in the French dairy desserts sector – such as Yoplait and Lactalis – with having colluded from 2006 to 2012. Indeed, even though PLs are retailer-owned brands, one PL manufacturer may produce for several retailers at the same time. This gives PL producers incentives to collude: if the price proposed by a retailer is too low, they can reduce their presence in that retailer’s stores and sell somewhere else. Retailers suffer from this strategy, as they need PL products to differentiate themselves and to bargain. Hence, the bargaining power of PL producers increases with collusion.


The traditional method for estimating collusion effects has become outdated

To assess the variation in welfare caused by the collusion, the French competition authorities used a traditional economic approach, which mainly focuses on the horizontal collusion and holds the retailers’ response fixed. The flaw of this method is that it does not take into account the vertical relations between PL producers and retailers, and hence neglects the strategic response of the retailers. It also ignores the potential “umbrella effect,” which arises when an increase in the wholesale prices of PL products diverts demand to the substitute products (NBs) and thus distorts the wholesale prices and market shares of NB products. A forthcoming paper (C. Bonnet, Z. Bouamra-Mechemache, “Empirical methodology for the evaluation of collusive behaviour in vertically-related markets: an application to the ‘yogurt cartel’ in France”) addresses this issue and applies a new methodology to the “yoghurt case.”


A new economic method to assess the impact of a cartel on welfare, applied to the “yoghurt case”

The idea is to model a competitive setting – or non-collusive counterfactual – to obtain the prices and quantities that would have been observed in such an environment, and then compare them with the prices and quantities currently observed on the market. This new method differs from the traditional one in that the negotiation of wholesale prices is modelled as a Nash bargaining game, not as a unilateral decision by manufacturers that retailers have to accept. The paper concludes that there was profitable collusion among PL manufacturers. It also shows that the profit variation for retailers was quite ambiguous, and that PL producers were not necessarily the only winners from the cartel.
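To see what modelling wholesale prices as a Nash bargaining game means in practice, here is a deliberately stripped-down numerical sketch in Python – a toy with one manufacturer, one retailer and linear demand, not the paper's full structural model. All parameter values (demand intercept, cost, bargaining weights) are invented for illustration.

```python
# Toy Nash bargaining over a wholesale price: the manufacturer and the
# retailer split the channel surplus by maximising a Nash product,
# instead of the manufacturer dictating the price unilaterally.
# All parameter values below are illustrative assumptions.

A, C = 10.0, 2.0            # demand intercept (q = A - p) and manufacturer cost

def retail_price(w):
    # The retailer maximises (p - w)(A - p), giving p = (A + w) / 2.
    return (A + w) / 2

def nash_wholesale(b, steps=4000):
    """Wholesale price maximising (manufacturer profit)^b * (retailer profit)^(1-b),
    where b is the manufacturer's bargaining weight, found by grid search."""
    best_w, best_val = C, -1.0
    for i in range(steps + 1):
        w = C + (A - C) * i / steps
        p = retail_price(w)          # retailer's best response to w
        q = A - p                    # resulting demand
        pi_m = (w - C) * q           # manufacturer profit
        pi_r = (p - w) * q           # retailer profit
        val = (pi_m ** b) * (pi_r ** (1 - b))
        if val > best_val:
            best_val, best_w = val, w
    return best_w

# A stronger manufacturer (higher b) negotiates a higher wholesale price:
w_weak = nash_wholesale(0.3)     # weak bargaining position
w_strong = nash_wholesale(0.7)   # strong bargaining position (e.g. colluding PLs)
```

In this toy model, raising the manufacturer's bargaining weight from 0.3 to 0.7 pushes the negotiated wholesale price up (from about 3.2 to about 4.8 here), which is the mechanism behind the claim that collusion raises PL producers' bargaining power; the counterfactual exercise then amounts to recomputing prices under the lower weight.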


In the competitive setting, with lower wholesale prices for PL products, we would expect the market share – and hence the wholesale and retail prices – of NB products to fall, owing to a drop in NB demand. Indeed, in the yoghurt market we observe an asymmetric substitution between the two types of products: NB products are more sensitive to a change in the prices of PL products than the other way around. The simulation did show a decrease in the market share and wholesale prices of NB products – the “umbrella effect” at work, as the fall in PL wholesale prices pulls NB wholesale prices down – but, strikingly, no decrease in their retail prices. NB and PL manufacturers clearly lose profit in the competitive setting compared to collusion.

The novelty, then, is to take into account the optimal strategy of the retailer, which is actually to slightly increase the retail price of NB products: customers will be attracted by the low prices of PL products, and retailers will extract a maximum of surplus from the consumers who still want to buy NB products. The retailer thus gains on PL products but, because of the asymmetric substitution, loses from the increase in NB prices. The overall result varies from one retailer to another: for some, the negative effect on NB products exceeds the positive effect on PL products; for others it does not.

Hence, both PL and NB manufacturers are better off under collusion, while the results for retailers are mixed. The study also found that consumers are worse off under collusion, but the loss is relatively small – less than 1% of consumer surplus. Overall, total welfare increased on the yoghurt market.


The “yoghurt case” is an example of how variations in welfare can be wrongly estimated when the strategies of all players of the game are not taken into account. With this new methodology – considering both inter- and intra-brand competition, as well as a supply model that includes the vertical linkages between manufacturers and retailers – competition authorities can better evaluate profit sharing between suppliers and sellers. In the “yoghurt case,” more precise information on the suppliers of each seller would have made it possible to estimate the exact impact of the collusion on each supplier.


By Céline Bonnet