One day in April 2014, I found myself in a conference room on the top floor of a brand-new building at a top European university. I was speaking to a group of PhDs, doctors and professors from some of Europe’s most prestigious research institutions. I was part of this group because the company I worked for was a (very minor) partner in a research project with all these major scientists. I gave my presentation. I explained the hypotheses we had made and how we had designed an experiment to test them. I explained how the results of the experiment, along with on-the-ground observation, showed that some of the assumptions of the project were probably wrong.
That’s when something incredible happened. The most senior researcher around the table, with more diplomas than you can count, head of several groundbreaking projects in his discipline, mentor to dozens of students from around the world, spoke up. “Can’t you tell something positive at all?” he said. He didn’t ask about the methodology or about the interpretation of the results. He didn’t read our conclusions. Instead, he asked us to change our results.1 He was afraid our findings would reflect badly on him. It’s only normal: this person gets his power from the number of publicly-financed projects he manages to obtain and with which he pays his staff. Donors love it when projects have very positive results - and they hate trouble. That’s why this person, whose titles made him look like an eminent scientist, engaged in the most unscientific behavior there is: distorting facts for personal gain.
How did we get here? How did universities, once centers of knowledge, come to reward dissimulation? That’s what I tried to find out. In this essay, I’ll take you through a millennium of university history, from the Middle Ages all the way to the 20th century, when things got out of hand. If you make it to the very end, you’ll see that I offer solutions, too. Buckle up, here we go.
From centers of knowledge to suppliers of human capital
The role of universities is to “supply human capital and incubate start-ups”, said a report authored by several academics in 2012.2 Parliamentarians in France put it in a slightly less blunt way in 2007 when they wrote that everyone agreed the main mission of universities was to “integrate graduates into professional life”.3 In their report, these French parliamentarians wrote a short history of universities in Europe, starting in the 12th century. For them, medieval universities trained high-ranking professionals and civil servants. It’s nice that they rewrote history to make it look like their new law was the continuation of a millennial tradition, but there’s a caveat: it’s not true.
When Charles IV created the University of Prague in 1348, for instance, his desire was to “satisfy the hunger of locals for knowledge”.4 Notice how he didn’t mention he was “providing the economy with efficient tradesmen”. Of course, reading documents that old is bound to provoke anachronisms. It remains that universities existed for the sake of seeking the truth. The truth, at the time, was the Bible.5 The curriculum was organized so that students would first spend a few years studying grammar, Latin and other minor topics until they were ready, at the very end, to study theology.
Of course, some of the skills needed to study theology, along with the prestige of knowledge, made medieval and early modern colleges great places for the powerful to send their young ones to get an education.6 Many high-ranking civil servants went through university, even though the most common way to receive an education was to hire a (university-educated) preceptor. In business and in all trades, education was carried out within corporations or guilds, where a master was responsible for an apprentice. Starting in the 16th century, university became a nice-to-have experience for the higher-ups and the bourgeoisie, alongside the more traditional ways of training for a job. Not to forget that banking and finance were operated by Jews (Christians did not have the right to charge interest), who could not attend university.7
In their first 900 years or so, universities were not businesses. They were full of poor boys whom rich sponsors sent to become theologians. At times, some colleges realized that life was easier when they replaced poor, sponsored students with richer ones, but the gist of university remained: students were there to come closer to the truth. Money or employability had nothing to do with it. In 1452, for instance, the ideal college principal at the University of Paris was expected to “eschew all forms of profit or interest”.8
The act of studying, of trying to come closer to the truth, was so special that university staff and students had their own courts. At Paris, this exceptional jurisdiction lasted roughly 500 years, until Louis XIV decided the time had come for universities to bow to political power. In England, the privilege lasted until the end of the 19th century.9
The advent of modernity, in the 18th and 19th centuries, transformed universities. For one thing, the truth to be sought changed. Truth was not to be found in the Bible anymore, but through science (which had emerged in European universities as part of the lesser arts). For another, universities integrated into society. Jews and women were allowed in, for instance.10 Governments started to build state universities as prestige projects.11 They also policed them more closely. Not only did they revoke their privileges, they also did not hesitate to close them down for political gain. Tsar Nicholas I of Russia, for instance, closed down the University of Warsaw, which had been founded just 15 years earlier, in an attempt to destroy Polish nationalism in 1830.12 (Two hundred years later, authoritarians keep attacking academia to cement their power, as the ongoing closure of the Central European University in Budapest and the closure of 14 universities in Turkey show.13)
Despite the dramatic changes that modernity brought in the 19th century, the purpose of universities remained the same: to seek the truth. There was often a utilitarian twist to it. The state with the best science could build the best military, hence the need to promote research. In the whirlwinds of the Enlightenment, one state, France, took the radical decision of closing all universities for good, in 1793. They were replaced with rational institutions where students learned skills that were directly useful to the economy. Faculties of theology were out, schools of engineering were in. After a century of back and forth, French politicians made a remarkable discovery: total state control over academics did not help scientific progress. In 1885, the French government encouraged academics to become more independent. Their reasoning was that more independent scientists would produce more and better knowledge (and improve military capabilities).14 They went back to the idea of universities as truth-seekers.
The integration of universities into the economy
Over the course of the 20th century, universities underwent the largest transformation in their 1,000-year history. The massification of higher education, most of it at public universities, meant that universities took on the responsibility of training the largest share of the workforce. From centers dedicated to seeking the truth, whether theological or scientific, they became training operations. Students stopped going to university primarily to acquire knowledge; they went there to acquire skills that would land them a job. Governments stopped looking at universities primarily as producers of truth and instead started viewing them as “providers of human capital”.
This inversion of priorities had tremendous consequences. As soon as they accepted the role of trainers of future employees, universities took on a new responsibility: training had to be adequate. The responsibility for the rise of unemployment in the 1970s necessarily fell, at least in part, on the institution that was supposed to provide the population with useful skills. Universities became a problem to fix and, as is often the case when you make the wrong diagnosis, the cures designed by politicians did more harm than good.
Politicians identified two main problems. First, universities were out of touch with the real world and could not identify the priorities of the economy. To solve this, they made funding to academics (apart from their salaries) conditional. Instead of receiving money to pursue their research, scientists must apply for grants, the topics of which are decided by administrative staff, who, in turn, receive advice from corporations.15 The second problem identified by politicians was that universities were inefficient. They were paid for by taxpayers and had to provide them with a good return on investment, the thinking went. The solution was thus to make universities more businesslike, tearing apart the 1452 document quoted above, which forbade university principals to think of profit.
The consequences of this two-pronged approach were disastrous. So disastrous, in fact, that the very concept of truth in our society was shaken.
Corruption in research
Science requires critical thinking. A theory cannot be proven right, it can only be proven wrong. As obvious as they seem, these two statements are not compatible with grants awarded by donors after a tender. A donor, whether public or private, wants to obtain results. The more sensational the results, the better. Some donors even demand to proofread the results before they are published. The researcher who receives a grant but fails to provide results might reduce her chances of obtaining a grant next time. The direct consequence of this system is the abolition of critical thinking and the refusal to communicate negative findings, as I experienced in 2014.
Research driven by grants encourages loyalty, which is not compatible with critical thinking.16 A scientist’s loyalty should be to the search for the truth, not to a donor. Even if the direct falsification of results remains relatively rare, research is corrupted in many ways. No academic will take the risk of producing a negative finding, so experiments are rarely retested and some false theories linger on. Scientists are ready to abandon their area of expertise to follow the topics dictated by the grant-givers.17
Several factors compound the moral impossibility of serving both the truth and grant-givers. Grant-givers use a series of metrics to assess the success of a grant or the efficiency of a program. The number of publications is one, so academics publish a lot of low-quality papers. The amount of press coverage can be another, pushing universities to sensationalize research or distort findings. Peer review, the traditional gate-keeping mechanism of academia, has failed to address these problems.18
This table by Marc Edwards and Siddhartha Roy, two American researchers famous for having made clear the magnitude of the Flint water crisis, summarizes how some incentives corrupt academic work.19
The rise of the authoritarian bootlickers
The corruption of research is the most dangerous consequence of the wrong diagnosis politicians made. It is also the one that has received the most attention. Another problem, more dangerous in the long run, concerns the people who staff academia. By rewarding loyalty over critical thinking, universities automatically promote boot-licking stooges over actual scientists. These persons, who are able to obtain money from grant-givers, control who gets what (not everyone working in academia is a public servant; many positions, especially doctoral students, depend on grant money). The mechanism is self-sustaining: these connectors can secure governing positions in more projects because of the success of previous ones. Once in this position, they can co-author more scientific papers and rise up the ladder of the academic hierarchy.
Because the success of a project depends on the positivity of the results, the person who controls the money must make sure that none of the scientists under their control publishes - or, better: finds - results that do not go in the direction the donor intended. To be successful, these persons must act in an authoritarian manner and prevent the academics under them from thinking critically.
The grant system puts loyalty and obedience before intellectual honesty and critical thinking. This enables authoritarian-minded persons to access positions of power and, in the meantime, weeds out those who do care about scientific research. While this mechanism might not be very visible yet (because Nobel-Prize-level academics made most of their careers under a different system), it is extremely obvious to people under 40. A great many would-be scientists left academia after they realized that they had to swear allegiance to authoritarian anti-scientists; their absence ensures that the overall quality of academia will drop sharply in the next decades.
Corruption in the institution
Forced by governments to be more efficient, most universities have taken to implementing business practices to maximize profit. To do so, they logically increased the number of high-paying students. In Europe, this meant enrolling as many non-Europeans as possible. International students pay roughly 15,000€ a year in fees, when locals pay four or five times less.20 More subtle tricks involve setting up license agreements in countries with a high demand for European degrees. The Sorbonne university (Paris 4), for instance, set up shop in Abu Dhabi and cashes in 15% of the fees students pay there (a whopping 50,000€ for a Bachelor’s degree).21
Putting money before research comes at a cost. In Abu Dhabi, for instance, the Sorbonne remained silent when one of its lecturers was arrested and sentenced to two years in prison for having expressed an opinion that displeased the country’s leaders.22 In many European institutions, lecturers and professors are pressured to give good grades to high-paying students.23 Some were even charged with corruption after asking students for bribes.24
While this last point does not pertain to the activities of researchers in particular, I believe it has a tremendous impact on academic life. The honesty required to carry out good scientific work cannot survive in a system plagued with corruption. The value of a university’s reputation only makes things worse: because a student’s degree or a professor’s career depends on her university’s reputation, the incentives to become a whistle-blower are even lower in academia than in the corporate world.
The disappearance of the truth
The systematic corruption of research, the promotion of loyal bootlickers over good scientists and the corporate corruption of universities did more than provoke the collapse of the university as an institution. As I’ve argued above, universities were privileged institutions where students and academics could seek and preserve the truth.25 With this mission gone, truth has nowhere to hide and lies can run freely in the public discourse.26
Even academics who still carry out scientific work cannot - physically - devote time to defending the truth in public. In France, for instance, nine out of ten academic staff said they did not have the time to work properly.
Of course, academia in decades past did not have a version of the truth locked up in a vault, like gold bullion in a bank. Quite the contrary: all academics supported ideas that have since been proven false, and a few of these ideas did a lot of harm. Factual truth is constantly evolving. A fact is a fact as long as it’s not been proven wrong. It’s a safe bet to say that the vast majority, if not all, of today’s knowledge will be tomorrow’s superstitions. Truth is a process, not a fixed body of scientific papers, and it is this process of creation of factual truth that’s broken.
As a result, academia could not upgrade its defenses when the mechanisms by which truth was attacked evolved in the second half of the 20th century. The tobacco industry was the first to develop a new way to obfuscate the truth, in the early 1960s. Instead of denying a scientific finding, it cast doubt upon it, saying that more research was needed before any action was taken. It also funded research in other directions in order to shift the blame and the attention onto another industry. The strategy was so successful that the sugar industry used it to blame fat for obesity, and the hydrocarbon giants used it again, with even greater success, to prevent any action on global warming. The clearest sign of the failure of academia to rise to this challenge is that it took over 50 years for scientists to identify these mechanisms.28
These new techniques, combined with the systemic inability of academia to confront them (how many academics still don’t understand why climate change deniers think the way they do?), lowered the bar for superstitions looking to make a comeback in the public discourse. Creationism was an early candidate. Even though it was quashed in the United States in the 1960s, it has been creeping back ever since.29 Anti-vaccination movements are another example of a superstition which, although once recognized as wrong, is benefiting from the collapse of academia. The credibility of academia is so low, its means of action so ridiculous, that any theory can now prosper, however laughable. Chemtrails and the idea that cell phones cause cancer gained a wide following in the past decade, and they’re far from the most irrational of these new superstitions.30
Totalitarian systems of the 20th century attacked academics as part of their war on truth. In the 21st, the collapse of truth under the weight of systemic contradictions within academia is paving the way for renewed totalitarianism. The economics of social media might have been one cause of the segmentation of the population into alternate realities.31 Had universities been strong and forward-thinking, they would have set up shop on social networks first (after all, they had e-mail in the early 1990s) and would have learned how to use them efficiently before their enemies did. The ultimate cause of the segmentation into alternate realities lies in the loss of a common platform where truth is created, debated and shared. Wikipedia was not enough to replace universities.
End conditional funding for research
Grants were put in place to prevent researchers from doing nothing. Back in the 1980s, in France, you could literally do nothing with your allocation of research money, and nobody could do anything about it.32 Now, with grants, things have changed a great deal: you have to fake time sheets and publish bogus papers in order to still do nothing with grant money. And in the process, a large chunk of academia was destroyed. Was it worth it? Probably not.
The first step towards the resuscitation of academia should be the end of conditional funding, starting with grants. All researchers should receive a lump sum with which to finance their research. This would kill the complex, expensive and utterly useless grant-management administration, and it would let researchers investigate what they feel is worthy of their time, not what politicians and their corporate friends think is.33
Privatize higher education
The economy of the second half of the 20th century required that a great many people develop skills that go beyond secondary education, just like the economy of the first half of the 20th century required that a great many people have skills that go beyond primary education. Governments turned to universities as if they were the simple continuation of secondary schools. But they aren’t. Universities are fundamentally different in that their main mission is research and thinking, not instruction. By turning universities into “suppliers of human capital”, governments destroyed the foundations of one of the oldest European institutions and ensured their collapse.
Universities are not a place for higher education. They are a place for research. If private sector companies or public administrations want to train their future workers, they should do it themselves. The distinction between universities and higher schools for specialists (engineers, painters etc.) worked fairly well in France between 1885, when universities were revived, and the moment universities were tasked with the instruction of millions of French youth, in the 1960s.
Today, universities are so bad at training that private companies are setting up their own educational systems. Apple opened an “academy” in Naples in 2016, where students can learn programming in a year-long curriculum.34 More interestingly, French billionaires of the dotcom bubble joined forces to create 42, a higher-education institution where students become programmers in three years.35 In such schools, the curriculum is tailored to the needs of the businesses that stand behind them, ensuring that students find jobs very easily.
That higher education be run by private corporations is not new. In journalism, many of the most prestigious schools belong to news organizations. The Henri-Nannen-Schule, in Hamburg, was built to provide Gruner+Jahr, Die Zeit and Der Spiegel with new recruits. In Madrid, Unidad Editorial (which publishes El Mundo) has its own school, too. These examples show that a school run by a corporation need not be inferior to one run by a university.
Absent a radical change by European (and other) governments, higher education will probably diverge into universities that fail to train adequately and institutions run by corporations or administrations. The latter help in getting a job but can restrict the opportunities of students. At 42, for instance, students do not receive ECTS credits, preventing them from enrolling in a master’s program later in their lives. At the Apple Academy, it’s even worse: the only programming language taught is developed by Apple, ensuring that students are locked in for the rest of their professional lives.
Universities should be left to do what they were meant to do, which is to look for the truth. The drive to privatize them or make them more efficient should stop. Corporations and administrations should be put in charge of what they need, which is to train their future employees. And governments should let both spheres collaborate, so that students and professionals can move from one to the other.
1. To protect the privacy of those involved, some factual elements have been changed, but the story is true. If you want to access the actual research, email me. And worry not, we did not bow to pressure.
2. Quote from the introduction chapter of Universities in Transition, edited by Bo Göransson and Claes Brundenius (2012).
4. See Littera fundationis Universitatis Carolinae Pragensis, which I read automatically translated from Czech. If you have a translation in English, French or German, let me know.
5. Arabs developed universities before Christians did (the first Christian university opened at Bologna, followed by Paris, around 1150). All I’ve read on the topic points to the total absence of influence of Muslim universities on the European system, but it might be due to the bias of historians. If you have a source on the issue, pass it on.
6. On the social outcome of university studies in the Middle Ages, I found Walter Rüegg’s introduction to A History of the University in Europe very helpful.
8. This quote, and much of what I learned about European universities in the 16th century, comes from Aux sources de la pédagogie des jésuites. Le «Modus parisiensis», by Gabriel Codina Mir (1967).
9. I couldn’t find a comprehensive history of university privileges, so I may be wrong. The information about Cambridge comes from Archives of the University of Cambridge: An Historical Introduction, page 58 ff.; the one about Paris is from Mir (op. cit.).
12. In the interest of fairness: he also closed down the University of Wilno/Vilnius, which had been founded much earlier.
14. See L’Enseignement français de la Révolution à nos jours, by Pierre Chevallier, Bernard Grosperrin and Jean Maillet, page 131 (1968).
15. The assumption that corporations know what the needs of the economy are - as opposed to what their own needs are - would require another essay.
16. For a great development of this opposition between the morals of research and the morals of government, read Chapter 3 of Jane Jacobs’s Systems of Survival.
17. As the headline of a French article about Nobel Prize Laureate Jean-Pierre Sauvage in 2016 made clear, he, by current standards, ‘did everything wrong’ because he refused to follow the whims of grant-givers. Younger scientists do not have this opportunity. Read Le Nobel de chimie 2016 avait tout faux.
18. On the failings of peer review, read Who’s Afraid of Peer Review?, an investigation published in Science which showed that many journals accepted blatantly ridiculous results. On the incapacity of most news outlets to figure out what makes a good study, the story told in I Fooled Millions Into Thinking Chocolate Helps Weight Loss. Here’s How. makes a good point.
20. In 2008, after my experience at the University of East Anglia, where I landed in a Chinese-only class, I wrote a text, pompously titled The economics of postgrad degrees in the post-student fee era. I still stand by it.
23. Alas, no one I talked to over the last ten years on the topic agreed to go on the record. You’ll have to take my word for it.
24. At the University of Toulon, for instance (read L’ancien président de l’université de Toulon jugé pour trafic d’inscriptions reconnaît avoir fait des erreurs), or at Zagreb (read ‘Die Armen müssen halt lernen’). In Germany, the sale of PhDs was common practice (read ‘Das Ansehen unserer Abschlüsse steht auf dem Spiel’), but I think this is not due to transformations of the university system; rather, it is a consequence of the value locals place on having a ‘Dr.’ title appended to one’s name.
26. As I’ve argued before, journalists can play a very small role, but they won’t save much. Librarians are probably the best second line of defense, but their situation is as critical as universities’.
28. On the topic, read Golden Holocaust by Robert Proctor, though I suspect that Merchants of Doubt by Naomi Oreskes and Erik Conway (which I haven’t read) is an easier read. Both were published in the early 2010s.
29. The decisions by which the United States Supreme Court fought creationism were Epperson v. Arkansas (1968) and Edwards v. Aguillard (1987). Creationists resorted to the toolbox of Big Tobacco by offering alternative theories, notably ‘intelligent design’, that they put on an equal footing with evolution.
30. Studies of conspiracy theorists show that those who tend to believe such superstitions are the least integrated into the economic fabric. These beliefs are obviously a social problem, not a scientific one. However, the lack of respect for academics, and the lack of reaction from them, intensifies the underlying issue.
31. The phrase is from Caitlin Dewey’s 2015 piece, What Was Fake on the Internet This Week: Why This Is the Final Column.
32. This comes from direct testimonies.
33. Obviously, this system could be gamed. Making payments to researchers directly (as opposed to laboratories or groups of researchers) would give a veto right to any researcher in a team - which would not make things manageable. Science is complex; I’m not pretending to offer tailor-made solutions here, I’m just pointing out that grants worsened the problems they were designed to solve.
34. Read Silicon Valley comes to Naples: Apple prepares to open Italian academy for the background.