There is no shortage of people who want us to get to Mars, but less noticed among them are those who ask, “Should we go?” While the reasons that might compel us to go to Mars are many and substantial, there is a vital question that must be considered first.
What if there is already life on Mars? Should we still go if there is?
One of the greatest advocates of space exploration in the 20th century, Carl Sagan, said no. In the chapter ‘Blues for a Red Planet’ in his book Cosmos, he addresses the idea directly: “If there is life on Mars, I believe we should do nothing with Mars. Mars then belongs to the Martians, even if the Martians are only microbes.” When he wrote that in 1980, the Viking landers’ inconclusive evidence for life on Mars made his statement all the more pressing.
While we might consider the idea of life on Mars less likely than he did, the question he tries to answer remains important, especially as mankind moves beyond Mars and toward other candidate life-harboring worlds such as Europa, Titan, or Enceladus.
Sagan was opposed to anthropocentrism, the belief that humans are exceptional or otherwise supremely significant, both on Earth and in the universe. In defending the right of Martian microbes to their planet, Sagan suggests that all life has great value and that our ability to get to Mars does not constitute a right to colonize it. His position was reflected here on Earth in his service as faculty advisor to the Cornell Students for the Ethical Treatment of Animals.
On the other hand, the argument could be made that the life we are most likely to find on Mars, if any, would be microbial and thus perhaps not entitled to any special rights. After all, who cares for the rights of microbes on Earth?
Consider a few methods of deciding who has certain rights in modern philosophy. They tend to run into the issue of anthropocentrism, and find it difficult to protect the rights, if any, of non-human life. Microbes get left out entirely.
James Griffin’s idea of human rights is based on the notion of Normative Agency, our ability to design and act on a plan of life, and could only be fully applied to humanity at this time. Even if you were to scale rights down with the ability of an animal to use said agency, microbes would be left with nothing.
Kant’s notion of Autonomy is also difficult to apply to anything less intelligent than a human. While it would be possible to support a maxim that respects all life rather than just humanity using his ethics, Kant still relies on the ability of a moral agent to reason, which we generally don’t consider microbes to be able to do. John Rawls, in his Kantian masterpiece A Theory of Justice, excludes animals from having “rights” because of this. In all of his works, he never suggests we owe non-human life anything more than a guarantee to avoid cruelty. His concern for microbes could be considered minimal.
John Stuart Mill’s utilitarianism perhaps demands that we do invade Mars, as he places the happiness of a human at a much higher value than that of a lower life form, stating in his classic text Utilitarianism: “it is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied. And if the fool, or the pig, are of a different opinion, it is because they only know their own side of the question.” The promotion of total happiness, then, calls for us to ignore the risks to microbial life on Mars.
Of course, the opponents of anthropocentrism could point out, perhaps accurately, that humans have a tendency to place high value on traits that they have a monopoly on. When happiness is discussed, we even claim access to a higher form of it. Perhaps we are biased towards our own virtues and value. After all, wouldn’t intelligent eagles value the power of flight and keen vision over having thumbs?
So, should we go to Mars? Or does Mars belong to the Martians, if there are any? The question of whether there is, or ever was, life on Mars must be resolved first. Even once it is, we still face the issue of life on the other worlds we hope to travel to. Should we invoke the Prime Directive of the Federation when exploring the heavens, or brush aside the bacteria of the cosmos? The philosophical jury is still out.
So says the Spanish philosopher Francisco Jarauta. In today’s world, nothing is as it was before. It is a laboratory where all political, ethical, and moral models need to be rethought.
Few contemporary philosophers know as well as Francisco Jarauta, professor of philosophy at the University of Murcia, anthropologist, and specialist in the history of art, the challenges and complexities of a globalized society in which, he says, “all the certainties of the past were blown apart at the same time.”
He believes that, if the great utopias have died, we are still left with “the small utopias, those that help us live our daily lives,” such as being able to live in a neighborhood where everyone knows and supports one another, or the pursuit of free time and creative silence. The small utopia of friendship, or the one capable of turning our work and our dreams into enjoyment rather than nightmares and slavery.
For the philosopher, the emergence of the Other is becoming a laboratory where “our political, ethical, and moral models need to be rethought.”
As director of the Scientific Council of the European Institute of Design (IED) in Spain, and a member of the International Council of the IED Group, now also present in Brazil with campuses in Rio and São Paulo, Jarauta, who teaches at universities around the world, draws on the experience of these thousands of students, who form a cosmopolitan kaleidoscope of the trends forging the new civilization.
Jarauta surprises and stimulates these young people, who are graduating in industrial and graphic design and as new fashion designers, with his metaphors and paradoxes. As when he tells them that “we are our own questions,” only to add that “the intensity of events condemns those questions to silence.”
A scrutinizer of the path the new civilization is taking, Jarauta is also an intellectual who likes to take the pulse of humanity. After a recent trip to a refugee camp in Greece, he told the newspaper O Globo that the most urgent task today is “to rebuild the heart of humanity.”
To do so, he says, “one must walk the journey of life,” where there is pain, cruelty, and blindness to tragedies like those of refugees and immigrants, “these new pariahs of history.”
In our conversation, resumed after many years of interruption since our meetings at the IED in Madrid, he stresses that “we are in the era of the ‘post’: post-truth, post-democracy, post-politics, post-modernity, post-identity. Nothing is as it was yesterday.”
He says “we have to lean out of the window of the world to see what is happening and what is coming.” We live not only in a world of speed, but in one where “geographies have shifted and spaces and distances have disappeared.” A young student from Singapore can meet one from Rio within a few hours, as if they were from the same neighborhood. Globalization makes them contemporaries.
This leads to an “inevitable process of cultural miscegenation,” to the new “network society,” in which we all communicate, mix, influence, and recreate one another. That is where the real future lies.
What we call a crisis really means that all the old definitions of knowledge and culture are dying, along with the old professions. The new science is already being created outside the classical universities, many of which still have a “medieval format.”
Nothing is petrified or prefigured anymore. The definitions of the past are blown apart. What is philosophy today, or politics, or art? What is design in a post-industrial society? “It is no longer the mere manufacture of objects for consumption and the market,” says the philosopher.
It is much more: “Design has become, for example, one of the basic instruments for defining the new forms of culture. It belongs, in its own right, to the world of the project, capable of transforming tastes, ways of perceiving things, and people’s new needs.”
To try to grasp a world in ferment, Jarauta says, the study of trends becomes ever more relevant. We must learn, like the dowsers of old, to detect the springs flowing beneath our feet. We have to be “divers of the new.”
Young people, more than anyone, the philosopher says, need today to nourish themselves on the certainty that the new world awaiting them will not only not be hostile, but will “allow them to take part in giving it a name and a meaning.”
A respected thinker points the way ahead. This book is nothing short of a state-of-the-world address, delivered by a scholar uniquely suited to the task. Immanuel Wallerstein, one of the most prominent social scientists of our time, documents the profound transformations our world is undergoing. With these transformations, he argues, come equally profound changes in how we understand the world.
Wallerstein divides his work between an appraisal of significant recent events and a study of the shifts in thought influenced by those events. The book’s first half reviews the major happenings of recent decades: the collapse of the Leninist states, the exhaustion of national liberation movements, the rise of East Asia, the challenges to national sovereignty, the dangers to the environment, the debates about national identity, and the marginalization of migrant populations. Wallerstein places these events and trends in the context of the changing modern world-system as a whole and identifies the historical choices they put before us.
The second half of the book takes up current issues in the world of knowledge: the vanishing faith in rationality, the scattering of knowledge activities, the denunciation of Eurocentrism, the questioning of the division of knowledge into science and humanities, and the relation between the search for the true and the search for the good. Wallerstein explores how these questions have arisen from larger social transformations, and why the traditional ways of framing such debates have become obstacles to resolving them. The End of the World As We Know It concludes with a crucial analysis of the momentous intellectual challenges to social science as we know it and suggests possible responses to them.
Immanuel Wallerstein is Distinguished Professor of Sociology and director of the Fernand Braudel Center at Binghamton University. Among his numerous books are The Modern World-System (1974, 1980, 1989), Unthinking Social Science (1991), and After Liberalism (1995).
A translation of The End of the World as We Know It, with a special introduction for the Brazilian edition. A work of unquestionable value and indispensable reading for anyone interested in an assessment of the meaning of the most important events of recent decades. It offers a diagnosis of the current crisis of the capitalist system, presenting it as being in “terminal crisis.”
Unsettling, Wallerstein’s thought gives us ample grounds for deep reflection and questioning. The book also includes a previously unpublished text by Luis Fernando Verissimo, who curiously calls Immanuel Wallerstein a “springboard author”: “There are many authors who are worth more for the thoughts they provoke, or for the novel way of thinking they exemplify, than for their ideas themselves.”
About the author: Immanuel Wallerstein is today one of the leading names in social science worldwide. He offers a panorama of the meeting of the social sciences with the most advanced thinking in the so-called exact sciences, the science of complex systems, seeking to integrate all human knowledge into a single whole. He founded and directs the Fernand Braudel Center for the Study of Economies, Historical Systems, and Civilizations, located in Binghamton. He is a researcher at Yale University and publishes articles in numerous newspapers and websites. For four years he was president of the International Sociological Association (ISA).
It’s been 134 years since Friedrich Nietzsche declared: “God is Dead”, giving philosophy students a collective headache that’s lasted from the 19th century until today. It is, perhaps, one of the best known statements in all of philosophy, well known even to those who have never picked up a copy of The Gay Science, the book from which it originates. But do we know exactly what he meant? Or perhaps more importantly, what it means for us?
Nietzsche, an atheist for all of his adult life, didn’t mean that there was a God who had actually died, but rather that our idea of one had. After the Enlightenment, the idea of a universe governed by physical laws and not by divine providence had become reality. Philosophy had shown that governments no longer needed to be organized around the idea of divine right to be legitimate, but rather around the consent or rationality of the governed, and that large and consistent moral theories could exist without reference to God. Europe no longer needed God as the source of all morality, value, or order in the universe; philosophy and science were capable of providing that for us. This increasing secularization of thought led the philosopher to realize that not only was God dead, but that we had killed him with our own desire to better understand our world.
The death of God didn’t strike Nietzsche as an entirely good thing. Without a God, the basic belief system of Western Europe was in jeopardy, as he put it in Twilight of the Idols: “When one gives up the Christian faith, one pulls the right to Christian morality out from under one’s feet. This morality is by no means self-evident… Christianity is a system, a whole view of things thought out together. By breaking one main concept out of it, the faith in God, one breaks the whole.”
Nietzsche thought this would be a good thing only for some people, saying: “… at hearing the news that ‘the old god is dead’, we philosophers and ‘free spirits’ feel illuminated by a new dawn.” With the old system of meaning gone, a new one could be created, but that came with risks. Nietzsche believed that the removal of this system put most people at risk of the despair of meaninglessness. What could the point of life be without a God? Even if there were one, the West now knew that he hadn’t placed us at the centre of the universe, and was learning of the lowly origins from which man had evolved. The universe wasn’t made for us anymore. Nietzsche feared that this understanding of the world would lead to pessimism, “a will to nothingness” that was antithetical to the life-affirming philosophy he promoted.
His fear of nihilism, and of our reaction to it, shows in The Will to Power, where he wrote: “What I relate is the history of the next two centuries. I describe what is coming, what can no longer come differently: the advent of nihilism. . . . For some time now our whole European culture has been moving as toward a catastrophe.” He would not have been surprised by the events that plagued Europe in the 20th century. Communism, Nazism, nationalism, and the other ideologies that made their way across the continent in the wake of World War One sought to provide man with meaning and value, as a worker, as an Aryan, or as something else, much as Christianity could provide meaning as a child of God and give life on Earth value by its relation to heaven. While he would have rejected those ideologies, he no doubt would have acknowledged the need for the meaning they provided.
Of course, having seen this coming, Nietzsche offered us a way out: the creation of our own values as individuals, the creation of a meaning for life by those who live it. The archetype of the individual who can do this has a name that has also reached our popular consciousness: the Übermensch. Nietzsche, however, saw this as a distant goal for mankind, and one that most would not be able to reach. The Übermensch, who he felt had yet to exist on Earth, would create meaning in life by their will alone, understanding that they are, in the end, responsible for what they choose. As he put it: “For the game of creation, my brothers, a sacred yes is needed: the spirit now wills his own will.” Such a bold individual would not be able to point to dogma or popular opinion to justify what they value.
Having suggested the rarity and difficulty of creating the Übermensch, Nietzsche described an alternative response to nihilism, one he saw as more likely to be chosen: the Last Man. A “most contemptible thing” who lives a quiet life of comfort, without thought for individuality or personal growth: “‘We have discovered happiness,’ — say the Last Men, and they blink.” Much to the disappointment of Zarathustra, Nietzsche’s mouthpiece, the people to whom he preaches beg him for the lifestyle of the Last Man, underscoring Nietzsche’s pessimism about our ability to handle God’s death.
But you might ask, if God has been dead for so long and we are supposed to be suffering for knowing it, where are all the atheists? Nietzsche himself provided an answer: “God is dead; but given the way of men, there may still be caves for thousands of years in which his shadow will be shown.” Perhaps we are only now seeing the effects of Nietzsche’s declaration.
Indeed, atheism is on the march, with near-majorities in many European countries and newfound growth across the United States. But unlike when atheism was enforced by the communist nations, there isn’t necessarily a worldview backing this new lack of God; there is only the lack. Indeed, the British philosopher Bertrand Russell saw Bolshevism as nearly a religion unto itself, fully capable of and willing to provide meaning and value to a population on its own. That source of meaning without belief is gone.
As many atheists know, lacking a god, without some additional philosophical structure providing meaning, can be a cause of existential dread. Are we at risk of becoming a society struggling with its own meaninglessness, a society at risk of nihilism? Are we more vulnerable now to ideologies and con men who promise to do what God used to do for us and for society? While Americans are increasingly pessimistic about the future, the non-religious are less so than the religious. It seems Nietzsche may have been wrong, in the long run, about our inability to deal with the idea that God is dead.
As Alain de Botton suggests about our values, it seems we have dealt with the death of God better than Nietzsche thought we would. We are not all Last Men, nor have we descended into a situation where all morality is seen as utterly relative and meaningless. We have managed to create a world where the need for God is reduced, at least for some, without falling into collective despair or chaos.
Are we as individuals up to the task of creating our own values, of creating meaning in life by ourselves, without aid from God, dogma, or popular opinion? Perhaps some of us are, and if we understand the implications of the death of God we stand a better chance of doing so. The despair of the death of God may yet give way to new meaning in our lives; as Jean-Paul Sartre suggested, “life begins on the other side of despair.”
Abrams, Daniel, Haley Yaple, and Richard Wiener. “A Mathematical Model of Social Group Competition with Application to the Growth of Religious Non-affiliation.” arXiv:1012.1375v2. Web. 4 Aug. 2016.
“Americans Overwhelmingly Pessimistic about Country’s Path, Poll Finds.” McClatchy DC. Web. 4 Aug. 2016.
“America’s Growing Pessimism.” The Atlantic. Atlantic Media Company, 10 Oct. 2015. Web. 4 Aug. 2016.
Cass, Connie (Associated Press). “Gloom and Doom? Americans More Pessimistic about Future.” Las Vegas Review-Journal, 3 Jan. 2014. Web. 4 Aug. 2016.
“CNN/ORC Poll: 57% Pessimistic about U.S. Future, Highest in 2 Years.” CNN. Cable News Network. Web. 4 Aug. 2016.
Nietzsche, Friedrich Wilhelm, and Walter Arnold Kaufmann. “The Meaning of Our Cheerfulness.” The Gay Science: With a Prelude in Rhymes and an Appendix of Songs. New York: Vintage, 1974. Print.
Russell, Bertrand. Bolshevism: Practice and Theory. New York: Arno, 1972. Print.
The August 2016 issue of Cultural Anthropology included the research article “Practicing Uncertainty: Scenario-Based Preparedness Exercises in Israel,” by Limor Samimian-Darash, who is Senior Lecturer at the Federmann School of Public Policy and Government at the Hebrew University of Jerusalem. What follows is a lightly edited transcript of an interview that contributing editor Ned Dostaler conducted with Samimian-Darash about the article, her conceptual influences and engagements, and her next project on global scenarios.

Ned Dostaler: When did you first learn about the Turning Point scenario-based exercises that you describe in your article, and why did you decide to take them up as an object of anthropological inquiry? Is this work part of a larger project?
Limor Samimian-Darash: My interest in scenarios grew out of my concern with analyzing and conceptualizing uncertainty. I was first drawn to the issue of uncertainty during my research on pandemic flu preparedness in Israel. Although literature in both sociology and anthropology thoroughly discusses the concept of risk, including its cultural perceptions and its roots in modernity, I found that uncertainty was understudied, at best.
In preparing for pandemic flu, Israeli authorities were forced to deal with what I term potential uncertainty. This type of uncertainty differs conceptually from what I call possible uncertainty and reflects a distinctive perspective on the future, the present, and the relations between the two. Whereas possible uncertainty derives from experience-based knowledge, that is, information based on past events, potential uncertainty derives from events that emerge from the virtual realm—from situations unaccounted for by known possibilities.
The authorities charged with preparing for pandemic influenza brought a variety of technologies to bear on the problem. During my research, I observed the application of three such technologies, two of which approached the problem from a risk-based perspective. The third, the syndromic surveillance system, seemed to be bringing something new to the table in terms of its underlying conception of uncertainty and its mode of operation: it was attempting to work “with” uncertainty, rather than against it. My encounter with this new mode of governing uncertainty inspired my interest in governing technologies that operate on principles other than risk, which, in turn, led me to scenarios.
This interest is consistent with my belief that, as anthropologists, we should not limit ourselves to constructing general or grand narratives of contemporary social processes. Important though such efforts are, we also need to look at the actual mechanisms that drive those processes. For me, that means examining existing mechanisms that govern risk and uncertainty, considering how new mechanisms emerge, and how—through their use—they mold conceptualizations of uncertainty in contemporary societies. Rather than focus on the appearance of new risks in the world, which is the basis of the risk-society approach, or on the impossibility of calculating these risks, a fundamental point of science and technology studies, I tackle uncertainty through the empirical study of the techniques that govern it, through their extraction and rigorous conceptualization.
Since heightened security consciousness is a fact of life in Israel, it was perhaps natural that my interest in the imagination and construction of future uncertainty would come to focus on security preparedness. Israel’s annual Turning Point exercise provided me with a tailor-made opportunity to explore the design and operation of mechanisms intended to govern security-related uncertainty. As the entire exercise is scenario-based, it offered me a means of studying the scenario as a practice of imagining the future, of tracing the role of the scenario in emergency preparedness and perception, and of understanding how it envisions and translates future uncertainty more generally.
The Turning Point exercise is part of Israel’s overall program of preparedness for war and disasters. Previous studies of scenarios have focused on relatively small-scale (e.g., table-top) exercises. Turning Point offered a novel view of a national-scale exercise within which multiple scenarios enacted at varying organizational levels are incorporated into one master narrative event. I was extremely excited to get access to such a field site and to be able to comprehensively explore the scenario exercises through ethnographic fieldwork. My study enabled me to follow the months of preparations for the annual exercises and all the fronts on which they take place.
The flu scenarios I had studied earlier were merely scripts and thus drew heavily on past events—the story lines were very limited and were not put into actual practice. Turning Point gave me a chance to fully explore the scenario as a form of thought and practice. I was interested in understanding whether the practice of scenarios opens up something new in thought and in actuality, whether it promotes an emerging and a becoming beyond the already known and existing. It seemed likely to me that both the content of a narrative and the particular way in which it is enacted affect how the scenario works as an uncertainty-based technology.
ND: I am interested in understanding how and why scholars choose their analytical interlocutors and concepts. Thus, I was wondering if you might be able to say something about who you consider to be your closest or most influential interlocutors.
LSD: My work draws analytically and methodologically on the anthropology of the contemporary as developed by Paul Rabinow, on certain of Michel Foucault’s studies and his elaboration of modes of thought and experience, and on the philosophical approach of Gilles Deleuze and Félix Guattari. Disparate as these thinkers are, they have all provided me with valuable analytical approaches and tools.
The primary influence within anthropology on my research is Paul Rabinow’s work on the contemporary: its problem focus, analytical mode, and anthropological ethos. Rabinow’s work profoundly shapes how I approach anthropological inquiry, identify problems, and formulate concepts.
Foucault’s writing on governmentality, as presented in the three forms of sovereignty, discipline, and the biopolitical security apparatus, provides me with both an analytical framework highlighting the heterogeneous structure of power and a methodological blueprint for identifying problematizations. His three governmental forms emerged historically in response to specific problems, each enacted with a certain aim and through certain practices. They are not mutually exclusive, however, and the emergence of one does not imply the disappearance of another. The biopolitical security apparatus, a technology of governing the population through normalization (of circulation and freedom), has been especially relevant to my work.
I initially took up the concept of the security apparatus during my analysis of Israel’s 2001–2002 smallpox vaccination project. During that project vaccination, a security prevention measure, became a new technique of governing, of preparedness, enacted through the novel temporality of a new problem. Whereas, historically, the governmental means and end of the security apparatus was to constitute and manage the population in relation to actual events, in the contemporary form of preparedness, a problem of temporality emerges, the need to operate on both present and future risk simultaneously and thus to govern through time. In following Rabinow’s work, I am investigating contemporary forms of governing beyond those specific (historical) forms presented by Foucault. Thus the task is not to literally re/present what Foucault has extracted as part of a particular historical contingency, but to grasp his effort as an analytical approach that enables us to observe other forms of experience in the contemporary.
The philosophy of Deleuze both complements Rabinow’s work on concepts and helps to bring Foucault’s (historical) constructs into the contemporary. One can plug into Deleuze and Guattari’s work from many different angles: their discussion of events versus accidents (The Logic of Sense), the virtual and the actual, difference and repetition (Difference and Repetition), the rhizome, assemblages (A Thousand Plateaus), and the idea of concept (What is Philosophy?). In adopting a Deleuzean approach, the researcher has a dual task: to establish virtual events (concepts, problems) from current events (solutions to existing problems) and to show how, in their actualization, problems are not swallowed up or suppressed by the solutions given to them. In other words, the researcher’s goal is not only to explore the solutions created for particular historical problems (as in a Foucauldian problematization) but also to pull the problem out of the solutions, from actual events. Doing so constitutes counteractualization, the process whereby the investigator establishes a virtual event. The idea of counteractualization or countereffectuation appears in various forms in Deleuze and Guattari’s writing, especially when they discuss the creation of concepts. I draw generally on these ideas in my own conceptual work. In particular, Deleuze’s differentiation between subjective and objective uncertainties in referring to two forms of the event has been useful to me in formulating the concept of potential uncertainty and in distinguishing it from risk.
ND: In thinking through the Turning Point scenario-based exercises, your article describes a shift that “move[s] us from one mode of governing, via biopolitical security apparatuses and risk-based technologies, to another mode, of preparedness and uncertainty-based technologies.” Do you have a sense of how that shift is experienced by the subjects of the latter mode of governance that the article traces?
LSD: I get at subjects’ experience largely through the atmosphere I observed during the exercise, especially in the emergency situation rooms. To really understand their experience of the Turning Point exercise as an uncertainty-based technology, I need to bring another concept into the discussion: affect. Different types of technologies, my research has shown me, can be distinguished in terms of their relationship (or lack thereof) to affect.
Turning Point involves preparations by governmental ministries, local municipalities, and the population at large for a wide array of threats. As far as the exercise’s bureaucrats are concerned, the main experience is one of uncertainty, reflecting an intrinsic aspect of the scenarios they devise and are charged with managing. Israeli citizens, however, do not deal with scenarios but are mainly involved in simulation-like practices, which entail a different mode of experience.
Whereas scenario technology, as a practice of uncertainty, gives rise to alertness and urgency among participants, the simulation, through the practice of repetition and order, leaves no room for uncertainty to develop. The simulation elicits a known, predictable set of reactions that constitutes the event as one of certainty, one with prescribed solutions. Participants’ task is to precisely follow given instructions or to repeatedly practice the same actions, never deviating from protocol. The scenario, by contrast, requires participants to face something new as it is emerging and to adapt to that emerging reality.
A simulation involves the realization of a problematic future possibility and the implementation of known solutions to that problem in order to routinize responses to it. A scenario, by contrast, is triggered by the threat of the unforeseen and involves the practice of uncertainty. Since a simulation is a closed event that aims to prevent uncertainty and disorder, it cannot generate affect (but can manage emotions); affect, following Brian Massumi, can only occur through activation of an uncertainty-based mechanism such as the scenario. In that context, affect is produced not only by the identification of an external threat or uncertainty but also, and mainly, by the spontaneous creation of uncertainty during the exercise. Scenario technology produces an affective situation among its participants precisely because of its inherent capacity to create uncertainty.
ND: Finally, what can you tell us about your next project?
LSD: Scenario techniques are varied, have proliferated in many areas of practice and research (e.g., military, policy, energy, and, more recently, health and business), and are in widespread use around the world. However diverse, they share a common mode of thought and practice: they commonly present or perform “stories about the future aimed at helping people break past their mental blocks and consider ‘unthinkable’ futures” (Ringland 1998, 12). Although scenarios have grown in popularity over the past several decades, their social-scientific study remains limited.
My next research project tackles global scenarios. I will be examining how scenario thinking has emerged historically in relation to other responses to the problem of the future, such as prediction. I want to look at scenarios as enacted in three fields—health, cybersecurity, and business—and analyze them from national, international, and global perspectives. This global focus is, I believe, essential to furthering understanding of the scenario form, and forms of imagination more generally, especially as studies thus far have been limited in scope—usually focused on one field of activity within one country. Moreover, the three fields I want to examine also emphasize different temporalities in their scenario thinking and planning. As part of this effort, I plan to explore the particular relationship between the past and the future that characterizes temporal thinking in each of the three field sites.
Ringland, Gill. 1998. Scenario Planning: Managing for the Future. Chichester, U.K.: Wiley.
The Google Books N-gram corpus contains an enormous volume of digitized data which, to the best of our knowledge, sociologists have yet to fully utilize. In this paper, we mine these data to shed light on the discipline itself, conducting the first empirical study to map the disciplinary advancement of sociology from the mid-nineteenth century to 2008. We analyse the usage frequency of the most common terms in five major sociology categories: disciplinary advancement, scholars of sociology, theoretical dimensions, fields of sociology, and research methodologies. We also construct an overall index derived from all sociology-related key words using the principal component method to demonstrate the overall influence of sociology as a discipline. Charting the historical evolution of the examined terms provides rich insights into the emergence and development of sociological norms, practices, and boundaries over the past two centuries. This novel application of massive content analysis, using data of unprecedented size, helps unpack the transformation of sociocultural dynamics over a long-term temporal scale.
The emergence of big data has opened many research opportunities and topics for the field of social science. As a lens on human culture (Aiden and Michel, 2013), big data offer enormous possibilities to detect historical trajectories, human interactions, social transformations and political practices with rich spatial and temporal dynamics. Forecasting the next five decades of social science research, King (2009: 91) has predicted a ‘historic change’ in which the profusion of gigantic databases and their investigation will promote ‘our knowledge of and practical solutions for problems of government and politics to grow at an enormous rate’.
One particularly promising new tool for massive content analysis is the Google N-gram corpus, a digitized books repository containing enormous volumes of digitized data. Michel et al. (2011) have described the construction of the first edition of the Google N-gram Corpus with approximately 5 million books and examined the usage frequency of words in order to quantitatively analyse human culture trends in ways unimaginable even a decade ago. Following this seminal study, the Google N-gram corpus has been used to explore the politics of disaster (Guggenheim, 2014), the language of contention (Tarrow, 2013), the transformation of economic life (Bentley et al., 2014; Roth, 2014), patterns of poverty and anti-poverty policy (Ravallion, 2011), linguistic and written language development (Twenge et al., 2012), and the psychology of culture (Greenfield, 2013; Zeng and Greenfield, 2015).
Notwithstanding this recent profusion of academic work employing digitized texts, sociologists have yet to fully explore the possibilities offered by this new dataset. Almost a decade ago, the 'coming crisis of empirical sociology' was linked to sociologists' failure to engage with the vast proliferation of social data (Savage and Burrows, 2007); today, sociologists still need to think seriously about the challenges and opportunities posed by big data. As Burrows and Savage recently point out (2014: 2):
Sociologists generally used and refined rather familiar methods, talked mainly to each other about esoteric theoretical pre-occupations, and had not caught up with the fact that sociology was no longer an avant-garde discipline which had attracted legions of critical students and scholars in the 1960s and 1970s but had become fully part of the academic machine.
This absence is particularly striking given that the establishment, expansion, and influence of sociology rely on words and phrases rather than on figures, functions, equations, or other mathematical expressions, far more than any natural science does. Books serve as one of the most telling embodiments of a society's knowledge over time, and the majority of sociology's most canonical achievements have been published in book form. It seems only appropriate, then, to seize the opportunity provided by the Google N-gram corpus to identify and examine the long-term trends and themes that have characterized the field of sociology itself.
Sociology, as one of the core disciplines of the social sciences, is 'like a caravansary on the Silk Road, filled with all sorts and types of people and beset by bandit gangs of positivists, feminists, interactionists, and Marxists, and even by some larger, far-off states like Economics and the Humanities, all of whom are bent on reducing the place to vassalage' (Abbott, 2001: 6). Yet, notwithstanding this statement on the complexities of the disciplinary advancement of sociology, there is virtually no empirical sociological research that can attest to the development of these different 'sorts and types' of sociological norms, practices and boundaries. In the current study, we conduct the first empirical analysis, to our knowledge, in the field of sociology to use the corpus of digitized books. We analyse the evolution of the usage of the most common words and phrases in terms of disciplinary advancement, sociology scholars, sociology theories, sociology fields and sociology research methodologies between the 1850s and 2008. We also employ the data extracted from the corpus to quantitatively test theories of the development of sociology. Our results show that the annual usage frequency of a particular term not only gives clues as to the historical emergence and progress of sociology – indicating, for example, the longevity or popularity of a particular sociology field or method – but also sheds light on the linkage between the development of sociology and broader sociocultural dynamics over centuries.
Data and method
Since 2004, Google has been engaged in digitizing books printed as early as 1473 and representing 478 languages, drawn from 40 top universities worldwide (Michel et al., 2011). The first edition of the Google corpus consists of about 5 million books published between 1550 and 2008, excluding journals and serial publications (around 40 per cent of all scanned publications), which represent a different aspect of culture than do books. To avoid duplication, the Google corpus team converted billions of book records from over 100 sources of metadata provided by libraries, retailers, and publishers into a single non-redundant database of book editions (Michel et al., 2011, Supplementary Online Material).
Following the same procedure described in Michel et al. (2011), the second edition of the Google corpus (2012) consists of about 8 million books, representing 6 per cent of all the books printed from the 1500s onward (Lin et al., 2012). Compared to the first edition, the 2012 corpus has a larger underlying book collection and higher-quality digitization (Lin et al., 2012). The English corpus alone comprises 4.5 million books and around half a trillion words (Table 1).
The Google Books corpus provides information about how many times per year an 'n-gram' appears in all the books included in the corpus. A 1-gram is a string of characters uninterrupted by a space: a single word, for example, 'sociology', or a number such as '1.234'. An n-gram is a sequence of n 1-grams, such as the phrases 'sociology theory' (a 2-gram) and 'field of sociology' (a 3-gram). Punctuation and capitalization are preserved in the dataset. By searching the Google corpus for a key word or phrase, one can obtain the annual occurrences of that key word or phrase for a given time period. Although the absolute percentage of any individual word is, of necessity, small, the rise and fall of such words can help index the most robust sociocultural trends over a long timeline.
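Aggregating these per-year occurrence counts is straightforward in code. The sketch below assumes the tab-separated layout of the publicly released n-gram files (n-gram, year, match count, volume count); the function name and the sample rows are illustrative, not drawn from the corpus itself.

```python
import csv
import io

def yearly_counts(tsv_text: str, target: str) -> dict[int, int]:
    """Sum annual match counts for one n-gram from raw tab-separated rows.

    Assumed row layout: ngram <TAB> year <TAB> match_count <TAB> volume_count.
    """
    counts: dict[int, int] = {}
    for ngram, year, match_count, _volumes in csv.reader(
        io.StringIO(tsv_text), delimiter="\t"
    ):
        if ngram == target:
            counts[int(year)] = counts.get(int(year), 0) + int(match_count)
    return counts

# Hypothetical rows for two 1-grams.
sample = (
    "sociology\t1900\t120\t15\n"
    "sociology\t1901\t150\t18\n"
    "biology\t1900\t999\t40\n"
)
print(yearly_counts(sample, "sociology"))  # {1900: 120, 1901: 150}
```

The same loop works unchanged for 2-grams or 3-grams, since each n-gram occupies a single field in the row.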
In the present analysis, we focus on the English-language books corpus. We also analyse some specific terms in both American English and British English books to make a further comparison across different social contexts.1 In terms of time frame, we restrict our research to the period between the mid-1850s and 2008 (inclusive) for two reasons. First, sociology emerged as a scholarly discipline in the early part of the nineteenth century and only really started to flourish in the mid-1850s,2 when Karl Marx, Herbert Spencer, and other first-generation scholars began to publish their works in the field (Boudon, 1989). Second, digitization of written texts is a cumulative process. Contemporary holdings of books published in the early 1800s are often incomplete and scant, meaning that information extracted from books before the 1850s could come from a biased sample. At the other end of the timeline, books published after 2008 are still being digitized and added to the Google Books corpus; thus far, there is no data beyond the year 2008 (Lin et al., 2012).
This language and year restriction can substantially alleviate the potential problem of data accuracy because more than 98 per cent of words are correctly digitized for modern English books (Michel et al. 2011, Supplementary Online Material). Still, two concerns may be raised regarding the representativeness of the Google corpus analysed in the present paper.
First, the corpus was constructed using OCR (optical character recognition) technology. As Michel et al. (2011) mention, books with poor OCR quality (due to size, paper quality, or physical condition) were filtered out, which could introduce a sampling problem. Second, the corpus is likely to be biased towards recent books, since more books are published in more recent years, which would skew raw word-usage counts. Regarding the first issue, however, books filtered out due to poor OCR quality accounted for only around 4 per cent of all scanned volumes (Michel et al., 2011, Supplementary Online Material) – a small fraction. As for the second concern, we normalized the total number of appearances of a key word using the frequency of 'the' in the same year rather than the total number of all words.3 Thus, we obtained the normalized annual frequency of word usage of our search terms as:
R_it = C_it / C_t, where R_it denotes the word usage of the key word i in year t, C_it represents the total number of appearances of the word i in year t, and C_t is the total number of occurrences of 'the' in all books published in year t. Conceptually, a higher R_it indicates a higher frequency of word usage and thus greater cultural and social influence for the time period in question.
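As a minimal sketch, this normalization (dividing each year's count of a key word by the same year's count of 'the') can be computed directly from two annual count series; the function name and figures below are hypothetical, chosen so the arithmetic is easy to check.

```python
def normalized_frequency(
    word_counts: dict[int, int], the_counts: dict[int, int]
) -> dict[int, float]:
    """Normalize annual counts of word i by the annual counts of 'the'."""
    return {
        year: count / the_counts[year]
        for year, count in word_counts.items()
        if year in the_counts
    }

# Hypothetical annual counts: the raw count grows, but usage relative
# to 'the' stays flat, so the normalized frequency is constant.
word = {1950: 200, 1951: 260}
the = {1950: 1_000_000, 1951: 1_300_000}
print(normalized_frequency(word, the))  # {1950: 0.0002, 1951: 0.0002}
```

Normalizing by 'the' rather than by the total word count keeps the denominator comparable across years even as the corpus grows.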
Drawing on various sociology textbooks, including A Dictionary of Sociology (Scott and Marshall, 2009) and Sociology (Giddens and Sutton, 2013), we conducted a panoramic search of the disciplinary advancement of sociology in five major categories: academic significance, masters of sociology, theoretical dimensions, fields of sociology, and analytical methodologies. 'Academic significance' refers to the historical position of sociology in human knowledge as a subject related and compared to other subjects; the key words here are 'sociology' and 'sociological'. For 'masters of sociology', sociologists' full names serve as the search terms, and the goal is to chart key figures' rise to fame and their academic reputations. The key words for 'theoretical dimensions' are the names of relevant sociological theories and schools; 'fields of sociology' focuses on the sub-branches of sociology and popular research topics; and 'analytical methodologies' focuses mainly on the comparison of qualitative and quantitative research methodologies in sociology. Finally, we constructed an overall index derived from all sociology-related key words using the principal component method to demonstrate the overall sociocultural influence of sociology across two centuries of books.
Academic significance of sociology
We first counted the appearances of the key word 'Sociology' in the corpus since 1850. As a control group, we also ran similar searches on the four subjects 'Philosophy', 'Economics', 'Anthropology' and 'Psychology'. It is worth noting that we did not run a test on 'Political Science' because 'Political' or 'Politics' can be interpreted in numerous ways and would thus likely pull non-academic materials into the results.
The x-axis of Figure 1 shows the years from 1850 to 2008, while the y-axis shows the word frequency statistics for the different subjects. From Figure 1, one can observe that the word 'Philosophy' accounts for approximately 0.007 per cent of the total word count. Compared to other subjects, phrases associated with 'Philosophy' appeared earlier and more frequently. However, around the turn of the twentieth century, the curve for 'Philosophy' plunged drastically and did not rise again until the early twentieth century. This finding corresponds with the collapse of classical German philosophy, especially the Hegelian school (Solomon, 1988). It is noteworthy that from 1890 to 1920, as the word frequency curve for 'Philosophy' dropped, the respective curves for the other subjects rose.
In fact, the word frequency statistics for 'Sociology', 'Economics' and 'Anthropology' rose steadily between the mid-to-late nineteenth century and the 1930s. 'Economics' saw the most drastic uptick in frequency, developing a wide lead over 'Sociology', 'Psychology' and 'Anthropology'.
Our analysis yields interesting insights regarding the impact of major world events. For example, during World War I (1914–1918), the statistics for 'Sociology', 'Psychology' and 'Economics' did not drop, but during World War II (1939–1945) they dropped dramatically and only began to increase again with the end of the war. This seems to indicate that WWII had a much greater impact on these disciplines than did WWI. The pattern was reversed, however, in the case of 'Anthropology', which saw no decline during WWII; indeed, if anything, its statistics rose slightly. We believe this can be linked to the expansion of conflict beyond Europe to include Asia, Africa and Oceania, which increased states' demand for strategic knowledge about non-Western countries. On the one hand, the broader war secured government funding for anthropology for strategic purposes, such as the study of nationalism, internationalism, racial supremacy and anti-totalitarianism; on the other, anthropologists themselves were able to shift their research horizons from traditional subjects such as African and Indian tribes to Eastern Europe and Southeast Asia (Price, 2002). Anthropologist Ruth Benedict's 1946 study of Japan, The Chrysanthemum and the Sword, stands as arguably one of the best-known examples of such state-driven academic research.
The curves for ‘Sociology’, ‘Economics’, ‘Psychology’ and ‘Anthropology’ all peaked during the 1970s and 1980s, then began another round of slow descent in the 1990s. The descent for each subject might simply represent the dilution of knowledge in a constantly expanding corpus: with the total amount of knowledge possessed by human beings constantly on the rise, the percentage increase year to year for each subject or field might understandably be decreasing. However, for ‘sociology’, the decreasing word frequency does not necessarily mean the decline of the importance of sociology as a discipline. We will analyse this further in a later section.
We conducted searches for the full English names of 30 major Western sociologists in the Google N-gram corpus. Figure 2 illustrates the top 12 sociologists in word frequency statistics.4 They are (chronologically): Karl Marx, Herbert Spencer, Max Weber, Emile Durkheim, Georg Simmel, Herbert Marcuse, Talcott Parsons, Erving Goffman, Zygmunt Bauman, Jürgen Habermas, Pierre Bourdieu and Anthony Giddens. From Figure 2, we draw three major findings.
Dilution effect: From Karl Marx to Anthony Giddens, it seems that each new sociologist is destined never to surpass his predecessors' academic significance. This does not mean that no individual sociologist can surpass a predecessor. For instance, the influence of Pierre Bourdieu after the 1980s exceeded that of his predecessors Georg Simmel and Emile Durkheim, reaching 0.00005 per cent around 2003, next only to Karl Marx and Max Weber. However, if we categorize sociologists into generational groups, we can see that the later generation, represented by Talcott Parsons, peaked at 0.00008 per cent in the 1970s, and that none of its successors has ever passed that point, let alone reached the statistics of earlier sociologists like Herbert Spencer and Karl Marx. Thus conceived, it is almost impossible for later generations of sociologists to surpass the fame of the earlier ones.
This phenomenon is due to the explosive growth in the total amount and categories of human knowledge. In other words, sociology constituted a bigger share of given knowledge during the nineteenth century, as that body of knowledge was still being amassed. When it comes to the twentieth and twenty-first centuries, in contrast, though sociology itself has continued to develop and more and more people have become professional sociologists, the discipline’s relative influence in human knowledge has decreased – not unlike the dilution of a substance mixed with ever larger quantities of water. To the extent that Talcott Parsons appears to be the last sociologist with the same level of influence as the generations that came before him, this may well have as much to do with the changing size of the ‘reservoir’ of all human knowledge as it does with Parsons’ work itself.
Exogenous effect: Compared to other sociologists, the word frequency curves with the highest average upward slope were those of Herbert Spencer and Karl Marx. In other words, Spencer and Marx enjoyed the most rapid ascent to positions of authority within the field in terms of influence. The speed of their rise, however, was supported by strong exogenous forces other than academic factors. Herbert Spencer was a generalist – a combination of philosopher, biologist, anthropologist, sociologist, political theorist, and a classic man of letters. He interacted with social elites throughout his life and was connected to many important ideologists and dignitaries. Spencer utilized his high-status social network to gain authority and audience as a generalist, enabling him to become extremely influential in the late nineteenth century, when the total amount of knowledge was still limited. Karl Marx, in comparison, did not enjoy such success in his lifetime; instead, his influence peaked between the 1920s and 1940s, and then again in the 1960s to the 1970s – precisely when Marxism and Communism were becoming influential beyond the academic world and actually changing the course of twentieth-century history.
Acceleration effect: Whereas most of the first generation of sociologists only enjoyed their fame posthumously, twentieth-century sociologists have become influential much earlier in their careers. With the exception of Herbert Spencer, all of the great names of sociology born in the nineteenth century reached their greatest renown after their deaths. Karl Marx became most famous some 20 years after his death; Max Weber's name began to rise immediately after his death in 1920; and, likewise, none of Emile Durkheim, Georg Simmel or Herbert Marcuse lived to see the years in which their numbers truly blossomed. In contrast, sociologists born in the twentieth century were much luckier. For instance, when Talcott Parsons began to gain fame in the 1940s, he was no more than 40 years old. Anthony Giddens became famous at the same age. Jürgen Habermas and Pierre Bourdieu became highly influential slightly later, but both began their ascent in their fifties, around the 1980s–1990s, and Habermas is still alive today.
This acceleration effect can be ascribed to the development and standardization of sociology as a subject. In the late nineteenth century, as the discipline was still being established, there were fewer scholars and academic standards were, if not lower per se, at the very least less formalized, with greater room for flexibility. Sociology, too, was still in the process of legitimating its claim as a science. All these factors contributed to a longer 'wait time', so to speak, for a sociology scholar to reach notable fame. Today, both the discipline and the academic field in general are well established, enabling sociologists to make use of better disciplinary infrastructure and pre-existing channels to increase their influence.
The contribution of sociology towards human knowledge lies in a series of inspiring and explanatory concepts and theories. As such, we conducted key word searches for classic theories of sociology in order to explore their relative impact. Because most nineteenth-century sociological works are more general in nature – concerned as they were with establishing the basic parameters and goals of the discipline – we focused on the most famous, more specific sociological theories of the twentieth century. As Figure 3 illustrates, we concentrated on the ten most famous sociological theories: Conflict Theory, Social Exchange Theory, Structural Functionalism, Structuration Theory, Symbolic Interactionism, Rational Choice Theory, Ethnomethodology, Neo Functionalism, Strength of Weak Ties, and Structural Holes.
Lifetime trajectory of a theory: We noticed that each theory, from its birth to maturity, from its peak popularity to its point of diminishing returns, has its own life trajectory. In the mid-to-late twentieth century, the majority of the theories reached a peak in their growth rate and usage about 30–40 years after their introduction; after that point, their influence began to diminish. Interestingly, even though the sample of theories is relatively small, this life-cycle pattern fits that found for words more generally by researchers in linguistics. For example, Petersen et al. (2012) have identified universal growth-rate fluctuations in the birth and death rates of words: new words reach a pronounced peak about 30–50 years after they originate, after which point they either enter the long-term lexicon or fall into disuse.
The metabolism of a theory: We also noticed that the influence of earlier theories was superseded by that of newer theories. For instance, the growth rate of Structural Functionalism began to decrease in the mid-1990s while the usage of Structural Holes, a theory 20 years younger, superseded the former. Ethnomethodology and Symbolic Interactionism also appear to be on their way out. Meanwhile, Rational Choice Theory is still increasing in frequency, but now at a slower rate. Furthermore, when we grouped Strength of Weak Ties and Structural Holes together, we found that their total influence had already surpassed that of Structuration Theory and Social Exchange Theory around 2008. In other words, the cultural influence and academic significance of newly developed social capital and social network approaches has already gone beyond that of ‘classical’ sociological theories. Whether they will continue this growth, however, remains to be seen.
Explanatory scale of a theory: Generally speaking, a grand theory possesses stronger generalization ability and a larger scale of utilization. Yet, we found that since at least the mid-twentieth century, the theoretical world is no longer dominated by grand theories. For instance, Anthony Giddens’ Structuration Theory and Talcott Parsons’ Structural Functionalism have fallen significantly below Ethnomethodology, Symbolic Interactionism and Rational Choice Theory, all of which focus on micro-level interactions in society rather than large-scale macro functions of societal structures and institutions. Moreover, as time progresses, there seems to be less and less room reserved for grand theories: theories that thrived after the 1970s, such as Strength of Weak Ties and Structural Holes, all adopt micro or meso perspectives in order to understand human behaviour. While the relative pros and cons of ‘micro’ versus ‘macro’ theories are still the subject of much debate today, we speculate that the ambitious nature of grand theories may have, over time, become a disadvantage, actually limiting their appeal for contemporary theorists. Indeed, it may well be as many postmodern theorists have already declared, that sociology has entered a ‘post grand theories’ era.
Fields of sociology
Sociology is subdivided into many specialized fields and these fields are constantly changing over time. For this analysis, we looked at the shifting pattern of these fields in sociology in order to capture the larger discipline’s related social change. We conducted a key word search for eight of the most prominent fields, namely: Educational Sociology (Sociology of Education), Rural Sociology, Urban Sociology, Political Sociology, Economic Sociology, Sociology of Law, Sociology of Religion and Historical Sociology.
A few interesting findings can be observed in Figure 4. First, Educational Sociology emerged early as the most prominent field, but was replaced by Sociology of Education in the late 1960s. The shift was not merely semantic. Educational Sociology focused primarily on the social and cultural factors affecting relatively smaller social groups, thus neglecting larger societal influences on education in the post-industrial period. The Sociology of Education, on the other hand, turns its interest to the social function of education and thus investigates the role of education as a social institution (Shimbori, 1972). Second, after the 1990s, both the Sociology of Religion and Historical Sociology progressed at a relatively aggressive pace, particularly compared with all the other fields, which showed signs of decline. Third, Rural Sociology emerged as a sub-field of the discipline in the early twentieth century and exhibited a very high growth rate from the 1950s to the 1980s. This reflects the fact that Rural Sociology is the earliest and most prominent sub-discipline of American sociology, an outgrowth of responses to the pronounced differentials between rural and urban social organization in the late nineteenth century, with its development peaking around the 1950s and 1960s (Brunner, 1957; Nelson, 1969).
In addition to the various fields within sociology, we were also interested in shifts in substantive research topics: which subjects were deemed 'hot', and when. In Figure 5 we compare eight representative terms from the areas of social stratification and mobility and of social capital and networks: Social Identity, Social Movement, Social Mobility, Social Stratification, Social Capital, Social Network, Social Class and Social Strata.
From Figure 5, we can observe that the growth-rate fluctuation of Social Mobility and Social Stratification peaked around 1975 and then started to decline. The popularity of Social Network rose rapidly from the late 1980s and surpassed Social Mobility around 1997. As Freeman (2004) argues, with the development of desktop computers and computer programs to manage network data, social network research finally took off from the mid-1980s onwards, shifting from ‘network as metaphor’ to ‘network as a mathematical expression’. Around the same time, research on Social Capital exceeded Social Mobility and finally surpassed Social Class around 2003. In other words, research on each of Social Capital and Social Networks is currently on the rise, while research on each of Social Mobility and Social Stratification is declining. Meanwhile, research on Social Movements started proliferating around the mid-1960s when waves of new movements organized around race and gender emerged in both America and Western Europe (Kriesi et al., 1995; Lovenduski, 1986).
Research methodologies of sociology
Which methods are used most by sociologists – quantitative or qualitative methods? To answer this question, we focus on shifts in the relative balance between the two major research methodologies in sociology over the past century.
We first calculated the average annual frequency of each method in the quantitative group and in the qualitative group from 1950 to 1980. Then we normalized the two groups of average scores into z-scores and used Z_QN − Z_QL (quantitative minus qualitative) to obtain an index of quantitative analysis for each year. Figure 6 plots this index.
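The index construction described here, z-scoring each group's average frequencies and then differencing them, can be sketched in a few lines; the frequency series below are invented for illustration.

```python
from statistics import mean, stdev

def zscores(series: list[float]) -> list[float]:
    """Normalize a series to zero mean and unit (sample) standard deviation."""
    m, s = mean(series), stdev(series)
    return [(x - m) / s for x in series]

def quantitative_index(
    quant_freq: list[float], qual_freq: list[float]
) -> list[float]:
    """Z_QN - Z_QL per year: positive values mean quantitative terms lead."""
    return [zq - zl for zq, zl in zip(zscores(quant_freq), zscores(qual_freq))]

# Hypothetical average annual frequencies over five years: quantitative
# terms rise while qualitative terms fall, so the index climbs.
quant = [0.8, 0.9, 1.1, 1.3, 1.4]
qual = [1.4, 1.3, 1.1, 0.9, 0.8]
print(quantitative_index(quant, qual))
```

Because each series is standardized separately, the index compares relative momentum rather than absolute usage, which differs in scale between the two vocabularies.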
From Figure 6, we can see that the two methods took turns 'in the lead' across different periods. From 1950 to 1980, qualitative methods were more prominent, while quantitative methods surpassed qualitative approaches in the 1980s and 1990s, except for a short period around 1995–1997. Since 2000, quantitative methods have dominated the majority of scholarship. It is noteworthy that scholars who use qualitative methods are also more likely to publish their research in book form, whereas quantitative researchers are more likely to publish in journals and other formats; therefore, if anything, our calculation likely underestimates the 'lead' of quantitative over qualitative methods.
An overall index: influence of sociology
In this section we use the word usage of relevant sociology-related key words in the above categories (except for methodology) to generate an overall measure for the sociocultural influence of sociology in millions of books. We carry out a Principal Components Analysis (PCA) to extract as much information as possible from the corpus while preserving degrees of freedom. We prefer the PCA method to applying the average score of normalized annual frequencies because PCA can ‘concentrate’ much of the sociological signals into the first few factors by ‘screening’ the later factors that are dominated by noise. This is important given that we generate the list of sociology-related words without establishing any theory about how closely the selected signals capture the meaning of ‘sociology’. The factor-predicted score S is calculated by:
S = Σ_{j=1}^{m} w_j F_j, where m denotes the number of factors with eigenvalues larger than 1, F_j is the predicted score of factor j, and w_j is the proportion of variance explained by factor j; the cumulative proportion of explained variance is larger than 90 per cent.
We report the factor loadings, variances, and correlations of the signals in Table 2. The KMO measure of sampling adequacy and the SMC between each signal and all other signals strongly suggest that these signals pick up sociology-related dynamics in the corpus. The first three principal components account for around 91 per cent of the variance. Using the first three factors and their respective proportions of variance, we can predict the index of the influence of sociology.
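Under the reading that the index weights each retained factor's predicted score by its share of explained variance, the combination step reduces to a weighted sum. The sketch below follows that reading; the factor scores and variance shares are invented for illustration, not taken from Table 2.

```python
def overall_index(
    factor_scores: list[list[float]], variance_shares: list[float]
) -> list[float]:
    """Combine retained factor scores (eigenvalue > 1) into one index:
    S_t = sum_j w_j * F_jt, with w_j the variance share of factor j."""
    return [
        sum(w * f for w, f in zip(variance_shares, scores))
        for scores in factor_scores
    ]

# Three hypothetical retained factors explaining ~91 per cent of variance.
shares = [0.62, 0.18, 0.11]
scores_by_year = [
    [-1.2, 0.4, 0.1],   # e.g. an early year with low sociology signal
    [0.3, -0.2, 0.5],   # a middle year
    [1.8, 0.9, -0.4],   # a late year with high sociology signal
]
index = overall_index(scores_by_year, shares)
```

Weighting by variance share lets the dominant first factor drive the index while the noisier later factors contribute proportionally less.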
Table 2. Factor loadings on and correlations of sociology signalsa
Notes: KMO reports the Kaiser–Meyer–Olkin measure of sampling adequacy, and SMC reports the squared multiple correlation between each signal and all other signals.
Factors with an eigenvalue less than 1 are not presented.
a In a robustness check we added more sociology-related words to the list (e.g., middle class, working class, social status, etc.) and obtained almost identical PCA results.
Signals listed in Table 2 include Social Exchange Theory, Rational Choice Theory and Strength of Weak Ties.
In Figure 7, we further present the time series of the z-score equivalents of the overall index for sociology, as well as the time series of the word usage of ‘sociology/sociological’. As the figure shows, the influence of sociology as a discipline took off in the 1970s. Although the word usage of ‘sociology/sociological’ began to decline in the 1980s, the overall usage of sociological terms, including sociological theories and topics, began to skyrocket in all other respects. This reflects the extent to which sociology has come to penetrate and influence other domains and disciplines. For example, theories of weak ties and structural holes have been widely applied in the study of business management, while social capital has become a popular topic in research on economic development, political participation and public health. Further, we believe that the impact of sociology will continue to expand in the foreseeable future.
A research case beyond description
With the help of the Google corpus, we are able to conduct more substantial research into the development of sociology beyond simply describing the rise and fall of the usage of sociology-related words. We use the case of the early development of sociology in the USA as an example to illustrate how the data extracted from the Google corpus can be used to conduct a quantitative study.
Upon the creation of American sociology as a professional discipline circa the 1890s (Cortese, 1995; Young, 2009), the tenets of the social gospel movement made sociology an acceptable course of study in many American denominational colleges. This has led to considerable debate among students of the history of sociology regarding the nature of the connection between sociology and social gospelism (Henking, 1993; Morgan, 1969; Williams and MacLean, 2012). As Morgan (1969: 42) has indicated, ‘the Social Gospel and early sociology were often indistinguishable in terms of both ideas and leading personnel. This close parallelism is seen as a major factor in the early acceptance of sociology as an academic discipline in the nineteenth century universities.’ Research on this question, however, has only looked at individual case studies and thus lacks the support of hard data.
Digitized written texts provide a statistical solution to this dilemma. We searched using the key words ‘Sociology’, ‘Social Gospel’ and ‘Hull House’,5 with ‘Anthropology’ as a control group, and compared the results from the American English corpus and the British English corpus. As demonstrated in Figure 8, ‘Social Gospel’ and ‘Sociology’ both show signs of growth from 1890 to 1930 in America, with their respective growth rates close to each other; meanwhile, ‘Anthropology’ shows no visible signs of growth. By contrast, the correlation between the growth of ‘Sociology’ and ‘Social Gospel’ was far less obvious in England.
The above findings based on visual inspection of the data provide only preliminary evidence of the effects of the social gospel movement on the development of sociology in America. We thus proceed to use the time series (1890–1930) of ‘Sociology’, ‘Social Gospel’, ‘Hull House’ and ‘Anthropology’ to perform a Granger causality test to formally test the proposed connection between sociology and social gospelism. In the language of time series analysis, X is the Granger-cause of Y in the sense that Y can be better predicted using the histories of both X and Y than it can be predicted using the history of Y alone.
Using time series with the persistence of a unit-root process in a standard ordinary least squares equation can lead to spurious correlations. Therefore, we first performed stationarity tests for all four time series using the Dickey–Fuller Generalized Least Squares (DFGLS) method and the Phillips–Perron (P-P) method, and found that all of them are integrated of order one. We therefore used their first differences to fit a vector autoregressive (VAR) model to examine the relationships among them. The results from the American English corpus in Table 3 clearly show that ‘Social Gospel’ is the Granger-cause of ‘Sociology’ at the 0.05 alpha level, and ‘Hull House’ is the Granger-cause of ‘Sociology’ at the 0.09 alpha level. In addition, the identified time lag suggests that the social gospel movement of the preceding four years affects the development of sociology at any given time. However, neither of the two terms is the Granger-cause of ‘Anthropology’ even at the 0.1 alpha level. Furthermore, results from the British English corpus demonstrate no Granger-relationship among the time series of ‘Social Gospel’, ‘Hull House’, ‘Sociology’ and ‘Anthropology’ at all. In general, our findings based on time series analyses lend support to the argument that there was a close relationship between the early development of sociology and the social gospel movement in the USA.
Table 3. Granger causality tests for the potential connections between sociology and the social gospel movement using two different corpora
Notes: The lag length was chosen according to the Schwarz Bayesian information criterion (SBIC), the Hannan–Quinn information criterion (HQIC), and the Akaike information criterion (AIC).
Sg does not Granger cause Soci
Hull does not Granger cause Soci
Anthr does not Granger cause Soci
Sg does not Granger cause Anthr
Hull does not Granger cause Anthr
Soci does not Granger cause Anthr
This paper is the first of its kind to use the Google Books N-gram corpus, perhaps the largest electronic corpus yet constructed, to map out the disciplinary advancement of sociology in terms of the discipline in general and its major scholars, theories and research fields from the mid-nineteenth century to 2008. The intention of this research is in no way to suggest an evaluative ranking of the theories, scholars, schools, or methodologies that make up sociology. Instead, our goal has been to respond to Back and Puwar’s (2012) call for a ‘live sociology’ to deal with ‘lively data’, or the challenge posed by big data, the knowledge economy and the digitization of everyday life. As such, the aim and, it is hoped, the contribution of this study has been to show that massive content analysis from digitized books can provide rich insights regarding the historical evolution of professional disciplines and long-term sociocultural changes at a macro level.
Conceptually, examination of the high-frequency use of a specific term in a representative sample of written texts is particularly important because it helps ‘identify the dynamics of historical emergence, decline, and comparative significance of a political concept’ (Hassanpour, 2013: 299). This gives corpus methodology significant advantages over traditional survey methods, in which the sheer quantity and availability of data are limited (Beer and Burrows, 2013; Lin et al., 2012). The use of newspaper data from one or more localities also tends to produce validity and reliability problems, and there is no standard solution to correct for potential description and selection bias (Earl et al., 2004; Oliver and Myers, 1999). So far, however, the use of corpus data analysis has barely started among sociologists. With the exploding scale of digitization, more and more materials will be included in the historical corpus in the years to come. This will fundamentally change our scope of research and open avenues for sociologists to employ new and creative approaches to social research.
Of course, there is still room for improvement in the present research. First, the full dataset analysed here only accounts for around 6 per cent of all books ever published from 1500 onwards. This means that it may be biased relative to the ensemble of all surviving books. The scanned and digitized books in particular were mainly borrowed from university or public libraries, retailers and publishers, and thus the composition of the corpus reflects the acquisition practices of the participating institutions. Although the assembled collections of books from various participating institutions could still be argued to be representative, the results here are tentative and should be treated with some caution.
Second, there are far more searchable sociology terms than we could include, and we cover only a small proportion of the sociologists, theories and research fields. Therefore, the phenomena and patterns observed might not be fully representative. For example, we have only addressed some classic, traditional and established research fields of sociology such as social class, social movements or social capital; other important newer fields such as globalization, migration, gerontology, gender, and race or ethnicity are increasingly popular among contemporary sociologists but may be under-represented here. The goal of this study has been to use novel data and visualization methods to shed light on the history of sociology itself, not by any means to summarize over a hundred years of sociological research.
Third, the advanced search function of the database was still limited and, therefore, the accuracy of the search results was far from perfect. For instance, different names may be attached to the same sociological terminology and words are sometimes used in ways that do not convey the same single sociological concept as the one intended in the analysis. Even though we have used Google search engines as a control group and chose the version with the highest level of representation, the accuracy of the results may still be lacking.
Despite its drawbacks, our research strategy is sufficient to show that written literary data can help reinvigorate a sociological imagination able to extrapolate the historical trajectory of sociological practice. Michel et al. (2011) proposed the concept of ‘culturomics’ to refer to the use of high-throughput digitized resources to study sociocultural trends and the human cultural genome. Similarly, we suggest opening up a new field – ‘socialomics’ – to study the current state of a dynamic, fluid social world through massive digitized data collection and analysis. The value of establishing such an energetic and forward-thinking approach lies in the fact that the amount of human knowledge accessible to sociologists via physical reading is, in fact, very limited. This glass ceiling of academic research could result in a form of myopia, blinding us to the development of social science within and across media and forums not limited to the book format. With ‘genetic’ analysis of word-frequency usage in a digitized era, we are likely to gain theoretical inspiration and academic knowledge that the early generations of sociologists could not even have imagined.
1We also examine whether the pattern we find in the main analysis can be applied to the narrative-of-event corpora of newspapers. We searched the same key words in the field of sociology in the corpus of the New York Times and the results show similar general trends. Results of the relevant tests are available from the authors upon request.
2Although sociology’s exact timeline as a field/profession/discipline remains contested, this general time period works for the purposes of the current paper.
3Here we follow Bentley et al. (2014) and Acerbi et al. (2013), both studies that use this strategy. According to Acerbi et al. (2013), the word ‘the’ stably accounts for around 6 per cent of all words per year, and is thus a good representative of real writing and real sentences.
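The normalization strategy in note 3 can be illustrated with a toy example; the counts below are invented, and the function name is ours.

```python
# Toy illustration of the normalization in note 3: indexing a keyword's
# annual count against the count of 'the'. All counts here are invented.
counts = {1950: {"sociology": 120, "the": 60000},
          1951: {"sociology": 180, "the": 61000}}

def relative_usage(year, word):
    """Keyword occurrences per occurrence of 'the' in that year's books."""
    return counts[year][word] / counts[year]["the"]

series = {year: relative_usage(year, "sociology") for year in counts}
```

Because ‘the’ tracks the volume of genuine running prose, dividing by it corrects for year-to-year changes in how much text was scanned.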
4The curves for the other 18 sociologists all lay beneath that of Jürgen Habermas. They are: Herbert Blumer, Charles Cooley, Alfred Schutz, George Mead, Harold Garfinkel, Max Horkheimer, Niklas Luhmann, György Lukács, C. Wright Mills, Robert Merton, Ralf Dahrendorf, Gerhard Lenski, Peter Blau, Randall Collins, Jeffrey Alexander, James Coleman, Immanuel Wallerstein and Norbert Elias.
5Hull House was the most famous ‘good-neighbor’ centre in the social gospel movement. Its founder Jane Addams later won a Nobel Peace Prize.
Please quote the article DOI when citing SR content, including monographs. Article DOIs and “How to Cite” information can be found alongside the online version of each article within Wiley Online Library. All articles published within the SR (including monograph content) are included within the ISI Journal Citation Reports® Social Science Citation Index.
Science fiction likes to depict robots as autonomous machines, capable of making their own decisions and often expressing their own personalities. Yet we also tend to think of robots as property, and as lacking the kind of rights that we reserve for people.
But if a machine can think, decide and act on its own volition, if it can be harmed or held responsible for its actions, should we stop treating it like property and start treating it more like a person with rights?
What if a robot achieves true self-awareness? Should it have equal rights with us and the same protection under the law, or at least something similar?
These are some of the issues being discussed by the European Parliament’s Committee on Legal Affairs. Last year it released a draft report and motion calling for a set of civil law rules on robotics regulating their manufacture, use, autonomy and impact upon society.
Of the legal solutions proposed, perhaps most interesting was the suggestion of creating a legal status of “electronic persons” for the most sophisticated robots.
The report acknowledged that improvements in the autonomous and cognitive abilities of robots makes them more than simple tools, and makes ordinary rules on liability, such as contractual and tort liability, insufficient for handling them.
For example, the current EU directive on liability for harm by robots only covers foreseeable damage caused by manufacturing defects. In these cases, the manufacturer is responsible. However, when robots are able to learn and adapt to their environment in unpredictable ways, it’s harder for a manufacturer to foresee problems that could cause harm.
The report also raises the question of whether sufficiently sophisticated robots should be regarded as natural persons, legal persons (like corporations), animals or objects. Rather than lumping them into an existing category, it proposes that a new category of “electronic person” would be more appropriate.
The report does not advocate immediate legislative action, though. Instead, it proposes that legislation be updated if and when robots develop greater behavioural sophistication. If this occurs, one recommendation is to reduce the liability of “creators” in proportion to the autonomy of the robot, with a compulsory “no-fault” liability insurance scheme covering the shortfall.
But why go so far as to create a new category of “electronic persons”? After all, computers still have a long way to go before they match human intelligence, if they ever do.
Artificial intelligence is also starting to live up to its moniker. Alan Turing, the father of modern computing, proposed a test in which a computer is considered “intelligent” if it fools humans into believing that the computer is human by its responses to questions. Already there are machines that are getting close to passing this test.
If this progress continues, it may not be long before self-aware robots are not just a product of fantastic speculation.
The EU report is among the first to formally consider these issues, but other countries are also engaging. Peking University’s Yueh-Hsuan Weng writes that Japan and South Korea expect us to live in a human-robot coexistence by 2030. Japan’s Ministry of Economy, Trade and Industry has created a series of robot guidelines addressing business and safety issues for next generation robots.
If we did give robots some kind of legal status, what would it be? If they behaved like humans we could treat them like legal subjects rather than legal objects, or at least something in between. Legal subjects have rights and duties, and this gives them legal “personhood”. They do not have to be physical persons; a corporation is not a physical person but is recognised as a legal subject. Legal objects, on the other hand, do not have rights or duties although they may have economic value.
Assigning rights and duties to an inanimate object or software program independent of their creators may seem strange. However, with corporations we already see extensive rights and obligations given to fictitious legal entities.
Perhaps the approach to robots could be similar to that of corporations? The robot (or software program), if sufficiently sophisticated or if satisfying certain requirements, could be given similar rights to a corporation. This would allow it to earn money, pay taxes, own assets and sue or be sued independently of its creators. Its creators could, like directors of corporations, have rights or duties to the robot and to others with whom the robot interacts.
Robots would still have to be partly treated as legal objects since, unlike corporations, they may have physical bodies. The “electronic person” could thus be a combination of both a legal subject and a legal object.
The European Parliament will vote on the resolution this month. Regardless of the result, reconsidering robots and the law is inevitable and will require complex legal, computer science and insurance research.
Outraged headlines erupted when students launched a campaign to challenge the great western philosophers. We went to the source of dissent – London’s School of Oriental and African Studies – to investigate
“They Kant be serious!”, spluttered the Daily Mail headline in its most McEnroe-ish tone. “PC students demand white philosophers including Plato and Descartes be dropped from university syllabus”. “Great thinkers too male and pale, students declare”, trumpeted the Times. The Telegraph, too, was outraged: “They are said to be the founding fathers of western philosophy, whose ideas underpin civilised society. But students at a prestigious London university are demanding that figures such as Plato, Descartes and Immanuel Kant should be largely dropped from the curriculum because they are white.”
The prestigious London university was the School of Oriental and African Studies (Soas). It hit the headlines last month when journalists discovered that students, backed by many of their lecturers, had set up a campaign to “Decolonise Our Minds” by transforming the curriculum. So shocking did the idea seem of a British university refusing to teach Plato, Locke or Kant that the story was picked up by newspapers across the globe. BBC2’s Newsnight debated whether “universities should eschew western philosophers”. This predictably generated more outraged headlines when one of the guests, sociologist Kehinde Andrews, denounced Soas as a “white institution” and the Enlightenment as “racist”.
For academics and students at Soas, the press coverage itself is the cause of outrage. “When the report came out that we were trying to take white men off the table, it was just bewildering because we had no intention of doing that,” says Sian Hawthorne, a convenor of the undergraduate course World Philosophies, the only philosophy degree that Soas provides. “Our courses are intimately engaged with European thought.”
“We’re not trying to exclude European thinkers,” says a second-year doctoral student, and a member of the Decolonising Our Minds group. “We’re trying to desacralise European thinkers, stopping them from being treated as unquestionable. What we are doing is quite reasonable.”
So what is the truth behind the headlines? Will philosophy students at Soas really not be taught Aristotle and Kant? Do the students and academics have a point that the curriculum is “too white”? And what should be the place of European philosophy, and European philosophers, in an age of globalisation and of a shifting power balance from west to east?
I went to Soas to talk to students and academics. “That’s the one thing,” one student told me, “that no journalist has so far done.”
The School of Oriental and African Studies was founded in 1916 “to secure the running of the British Empire”, as historian Ian Brown puts it in his history of the institution. Its aim was to provide “instruction to colonial administrators, commercial managers, and military officers, but also to missionaries, doctors and teachers”. Soas taught them the local languages as well as providing “an authoritative introduction to the customs, religions, laws of the people whom they were to govern”.
Today, of the more than 6,000 students at Soas, almost half come from abroad, from 130 countries, and more than half are black or minority ethnic. Far from teaching students how to administer the empire, the school now helps develop independent, postcolonial societies. It sees its mission also as providing a critique of empire, and of its continuing legacies, a view that extends to the very top of Soas management. “Our minds are colonised, absolutely,” says Deborah Johnston. Johnston is no student, nor even a mere academic, but the pro-director of learning and teaching, one of the most senior management figures at Soas. She continues: “In most UK universities there has been a dominance of European thought. That’s why we need to do work to decolonise the curriculum, and our minds.”
For some, such views emanating from the very top of the institution entrench the belief that, in the words of an academic at another London college, “Soas is the most politicised of British universities”. Others, however, see the problem not as one of an institution that is too politicised but as one that has not yet rid itself of the ghosts of empire. The curriculum, such critics claim, is still too rooted in a colonial view of the world, too stuffed with European thinkers, and too blind to African, Asian and Latin American thinkers.
Neelam Chhara is a third-year politics student at Soas, and the Student Union officer for “equality and liberation”. “On my course in political theory,” she says, “we discussed 26 thinkers. Just two were non-European – Frantz Fanon and Gandhi.”
Such “frustrations with our curriculum” led students to set up the Decolonising Our Minds group. “We thought: why not show what an alternative curriculum could look like by hosting thinkers and academics that didn’t centre on Europe like our curriculum was doing.”
Meera Sabaratnam laughs when I tell her about Chhara’s reading list. “That’s two more non-Europeans than when I was taught political theory in my undergraduate PPE at Oxford.” Sabaratnam is a lecturer in international relations at Soas. As an institution, it is, she says, much better than most universities. For instance, 39% of academic staff are of black or minority ethnic background – more than three times the figure for British universities overall. Nevertheless, she supports the Decolonising Our Minds campaign. “It is necessary to talk about colonial legacies and to look at how colonialism and racism impact the institution.”
The argument for a more diverse curriculum seems reasonable, indeed unquestionable. After all, philosophers and thinkers come not just from Europe. There are great non-European intellectual traditions, myriad philosophical schools from China, India, Africa and the Muslim world, many of which have shaped European philosophy. Three years ago I wrote a book on the global history of ethics, called The Quest for a Moral Compass, which drew not just on European philosophers, but also on the works of Mo Tzu and Zhu Xi, Ibn Rushd and Ibn Sina, Anton Wilhelm Amo and Frantz Fanon, Sarvepalli Radhakrishnan and Fung Yu Lan. All these different thinkers, I wanted to show, can be woven into a single but complex narrative through which we can rethink global history.
And yet, the debate about a “diverse curriculum” is not as straightforward as one might imagine. Few would contest the idea that European thinkers should not be on the curriculum simply because they are European. But of the major European philosophers that often dominate reading lists – such as Plato, Aristotle, Descartes, Locke, Hobbes, Kant, Rousseau, Nietzsche, Arendt or Sartre – how many are there simply because they are European rather than because their ideas merit study?
Sabaratnam acknowledges the problem. “Framing a course is primarily about content: what are the issues that need to be taught, and who can speak interestingly about those issues? How many European thinkers you include and the balance between European and non-European thinkers is an academic decision. If you want to understand political theory, you can’t avoid engagement with Kant, Hegel and so on.”
“But,” she adds, “that can’t be the be-all-and- end-all.” There has, she insists, “to be a parallel debate about diversity and representation. There is value in having non-European thinkers and women on those reading lists.”
If European thinkers should not be on reading lists simply because they are European, should non-Europeans be included just because they are non-European, solely for the value of increased diversity? Kwame Anthony Appiah, professor of philosophy and law at New York University, and last year’s Reith lecturer on Radio 4, is sceptical. He teaches a course on global ethics, which includes European, Chinese, Arab and Indian thinkers. The key question for him, however, is not “Is the curriculum sufficiently diverse?” but “Is any particular thinker worth studying?”
“If they were uninteresting or unimportant,” he observes, “it would not be much of a defence to say, ‘They are Arab or Chinese and make the course more diverse.’”
The difficulties in thinking about a diverse curriculum can be seen in the founding statement of the Decolonising Our Minds campaign. It does not say: “We need to expand our curriculum to include philosophers from across the globe”. Rather, it insists (under the heading “Decolonising Soas: Confronting the White Institution”) that, “If white philosophers are required, then to teach their work from a critical viewpoint.” This suggests that not having white philosophers should be the default position. This might not quite be “students demanding white philosophers be dropped from university syllabus”, as the newspapers claimed, but it’s not that far off.
“When you put it to me like that,” says Sian Hawthorne, “yes, I think that is problematic. However, I take a more generous reading of that statement as saying whomever is taught, whoever’s work is drawn on, it must always be dealt with critically. That is one of the first principles of a university education.”
The students themselves told me that they had not realised what the statement actually said, and would change it.
Do we need to be particularly critical of white philosophers, I asked Hawthorne. Yes, she replied, because “whiteness has been engaged in perpetuating forms of oppression and marginalisation and exclusion”. Does she think that all European philosophy is tainted by racism and colonialism? “Yes. There’s plenty of evidence to demonstrate this.”
But by insisting that the work of all white philosophers, from Aristotle to Arendt, from Socrates to Sartre, should be seen as tainted by racism, is she not confusing ideas and identity? Is she not falling into the same trap as racists, suggesting that because one possesses a particular identity, so one’s ideas are necessarily distinct, and linked to that identity? A philosopher is white so his or her ideas are contaminated.
Hawthorne rejects the criticism, and uses as an analogy the way that academics look upon the work of the German philosopher Martin Heidegger. Heidegger was one of the most influential 20th-century philosophers, having shaped the ideas of a host of thinkers such as Hannah Arendt, Jean-Paul Sartre and Jacques Derrida. He was also a Nazi with repulsively antisemitic views. The discovery of Heidegger’s nazism and antisemitism has led to much debate about how to treat his philosophical ideas.
“Do we deal with Heidegger?” asks Hawthorne. “I think we must. But we must do so in the understanding that he was a Nazi. We don’t not read his texts. But we read them carefully. That should also be the case with white philosophers. Just because they’re white doesn’t mean that they’re written off. But we need to be careful.”
This, though, is a false analogy. What concerns many about Heidegger is not his skin colour or his identity but his political views. Asking whether Heidegger’s Nazi views should affect the way that we understand his philosophical ideas is different from insisting that, because Aristotle or Kant or Arendt were white, we should be careful in the way we read their writings.
“Whiteness is not a useful category when talking of philosophy,” says Appiah. “When people speak, they speak ideas, not identity. The truth value of what you say is not indexed to your identity. If you’re making a bad argument, it’s a bad argument. It’s not bad because of the identity of the person making it.”
Perhaps the fiercest debate about European thought emerges in the battle over the Enlightenment, that sprawling intellectual, cultural and social movement that spread through Europe during the late 17th and 18th centuries, and was the harbinger of intellectual modernity. There is no period of history that has been more analysed, celebrated and disparaged. Unlike, say, the Renaissance or the Reformation, the Enlightenment is not simply a historical moment but one through which debates about the contemporary world are played out. From the role of science to the war on terror, from free speech to racism, there are few contemporary debates that do not engage with the Enlightenment, or at least with what we imagine the Enlightenment to have been. Inevitably, then, what we imagine the Enlightenment to have been has become a historical battleground.
“It’s become familiar to think of the Enlightenment as special,” Hawthorne suggests, “because it’s a constitutive narrative for how the west understands itself.” The Enlightenment, in her view, provides a myth, a creation story, that the west tells itself about what makes it more civilised and the rest of the world more barbaric.
Yet, for much of the past two centuries, the Enlightenment was seen as central to the values of the left, and of those challenging western imperialism and injustice. As the late Marxist historian Eric Hobsbawm put it, “All progressive, rationalist and humanist ideologies are implicit in it, and indeed come out of it.”
More recently, however, many on the left have argued that the Enlightenment, far from being a resource for those challenging colonialism, is itself a colonial project. Enlightenment universalism, such critics argue, is racist because it seeks to impose western ideas of rationality and objectivity on other peoples. “The universalising discourses of modern Europe and the United States,” Edward Said argued in his book Culture and Imperialism, “assume the silence, willing or otherwise, of the non-European world.” It is an argument central to the Soas campaign.
Soas academics and students argue that Enlightenment thinkers had a highly restricted notion of freedom; freedom as “the property of propertied white men”, as Meera Sabaratnam puts it. John Locke is widely regarded as having provided the philosophical foundations of modern liberal conceptions of tolerance. Yet he was a shareholder in a slaving company. Immanuel Kant, often seen as the greatest of Enlightenment philosophers, clung to a belief in a racial hierarchy, insisting that “Humanity is at its greatest perfection in the race of the whites” and that “the African and the Hindu appear to be incapable of moral maturity”.
“Enlightenment philosophers make arguments about knowledge and reason setting us free, and laud the values of liberty,” Hawthorne observes, “at the very moment that colonial enterprises and the slave trade are expanding. Those very same arguments are summoned to justify Europe’s so-called civilising mission and make claims about European superiority.”
The British historian Jonathan Israel, now professor of modern European history at Princeton University, is perhaps the most important contemporary scholar of the Enlightenment. Over the past decade he has published an extraordinary trilogy of books, Radical Enlightenment, Enlightenment Contested and Democratic Enlightenment. The size of Israel’s labours is eye-catching. Each work in the trilogy runs to almost 1,000 pages; in total there must be close to 2m words here. There are few who better understand the Enlightenment.
Like many before him, Israel lauds the Enlightenment as that transformative period when Europe shifted from being a culture “based on a largely shared core of faith, tradition and authority” to one in which “everything, no matter how fundamental or deeply rooted, was questioned in the light of philosophical reason”. Yet, Israel is also deeply critical. At the heart of his argument is the insistence that there were actually two Enlightenments. The mainstream Enlightenment of Locke, Voltaire, Kant and Hume is the one of which we know, and of which most historians have written. But it was the Radical Enlightenment, shaped by lesser-known figures such as d’Holbach, Diderot, Condorcet and, in particular, the Dutch philosopher Baruch Spinoza, that provided the Enlightenment’s heart and soul.
The two Enlightenments, Israel suggests, divided on the question of whether reason reigned supreme in human affairs, as the Radicals insisted, or whether reason had to be limited by faith and tradition – the view of the mainstream. The mainstream’s intellectual timidity constrained its critique of old social forms and beliefs. By contrast, the Radical Enlightenment “rejected all compromise with the past and sought to sweep away existing structures entirely”.
I talked to Israel about the Soas debate. The argument that the Enlightenment is racist, he suggests, comes from a one-eyed view, the selective picking and choosing of certain individuals and quotes. Such critics see only the more conservative mainstream figures, such as Locke, Kant and Hume, and ignore the thinkers of the Radical Enlightenment, an approach that Israel calls “seriously obtuse”. The Radical Enlightenment, he observes, “was condemned by all European governments and by all churches, because in principle it insisted on the universal and equal rights of men and the full emancipation of the black population”.
In 1770 a remarkable polemic against colonialism and slavery called Histoire philosophique des deux Indes (The Philosophical History of the Two Indies) was published. Written by a number of Radical thinkers including Raynal, Diderot and d’Holbach, it was both a study of Europe’s relations with the East Indies and the New World and an encyclopedia of anti-colonialism. Arguing that “natural liberty is the right which nature has given to everyone to dispose of himself according to his will”, the book both prophesied and defended the revolutionary overthrow of slavery: “The negroes only want a chief, sufficiently courageous to lead them to vengeance and slaughter… Where is the new Spartacus?”
The Histoire was astonishingly successful, published in more than 50 editions in at least five languages over the following 30 years. But it was only one of many such radical tracts, including d’Holbach’s Système social, Tom Paine’s Rights of Man, and the works of Condorcet and Diderot. “This current,” Israel argues, “was totally at odds with all forms of imperialism, colonialism and racial discrimination or prejudice.”
The Radical Enlightenment was “without question the starting point for the anti-colonialism of our time”. In Israel’s view, what he calls the “package of basic values” that defines modernity – toleration, personal freedom, democracy, racial equality, sexual emancipation and the universal right to knowledge – derives principally from the claims of the Radical Enlightenment.
Israel is sympathetic to the demand that university curricula be diversified. “There is a strong case for studying non-European traditions as an essential part of any philosophy teaching course.” But, he points out, such a global view began in the Radical Enlightenment itself. “Many radical enlighteners believed their anti-Christian naturalism had powerful roots in medieval Islamic philosophy. They also had strong affinities with Chinese Confucianism. They were free of the Eurocentrism that marked the mainstream Enlightenment of Voltaire, Montesquieu, Hume and Smith.”
“I wouldn’t want to go up against Jonathan Israel,” laughs Sian Hawthorne. “He is probably the foremost thinker on the Enlightenment. All I would say in response is that there is no single thing that you can point to and say ‘That’s the Enlightenment’.”
That, however, is a view that fits more comfortably with Israel’s notions of the two Enlightenments, the mainstream and the Radical, than it does with the claim that “the Enlightenment is racist”.
Hawthorne is right, however, to point to Locke’s failure to challenge slavery and to Kant’s racial anthropology. Such views do seem shocking today. But they seem shocking because of the transformation in consciousness brought about in large part by the Enlightenment itself. In most societies and traditions, European and non-European, the kind of ethnocentrism expressed by many mainstream Enlightenment thinkers was the norm. The Enlightenment helped change that. “I don’t know where you’d get the powerful tools for criticising European colonialism if you did not have the Enlightenment,” observes Appiah. “The modern idea of equality, the modern critique of inequality – much of the materials for that idea and for that critique come from that period.”
One does not have to rely on historians like Israel or philosophers like Appiah to make that point. It was made also by the very people who suffered under the yoke of European colonialism and sought to cast it off.
Today, most people know of the French and American revolutions, two great social tumults whose reverberations we still feel. Few know of the other great revolution of the 18th century – the one in Haiti that began in 1791 and culminated with independence in 1804.
In 1791, a mass insurrection broke out among Haiti’s slaves, upon whose labour France had transformed Saint-Domingue, as it called its colony, into the richest island in the world. It was an insurrection that turned into a revolution, a revolution that defeated the three greatest armies of the age – the French, British and Spanish – to become the first successful slave revolt in history, a revolution that was to shape history almost as deeply as those of 1776 and 1789.
The slaves were led by Toussaint L’Ouverture, a self-educated former slave, deeply read, highly politicised and possessed of a genius in military tactics and strategy. He was the “Spartacus” for which the European radicals who wrote the Histoire philosophique des deux Indes had pined.
Toussaint’s greatest gift, perhaps, was his ability to see that while Europe was responsible for the enslavement of blacks, nevertheless within European culture lay also the political and moral ideas with which to shatter the bonds of enslavement. The French bourgeoisie might have tried to deny the mass of humanity the ideals embodied in the Declaration of the Rights of Man. But Toussaint recognised in those ideals a weapon more powerful than any sword or musket or cannon.
From Toussaint L’Ouverture to Nelson Mandela, for two centuries those battling against European power and racial oppression looked to Enlightenment ideals as fuel for their struggles. Today, most of those struggles and movements have disappeared. As a result the meanings of “radicalism” and “decolonisation” have withered, and come to mean something very different and much more tame than they did half a century or a century ago. Shorn of the social movements that gave Enlightenment values their radical edge, those values have lost much of their meaning. That today so many should so easily dismiss the Enlightenment in the name of “decolonisation” tells us more about the shaky foundations of contemporary radicalism than it does about the Enlightenment.
The one word that Sian Hawthorne returns to again and again is “dialogue”. “We’re not used to seeing the world as the world. We keep cutting things up and segmenting them. Too often we don’t see the entanglements between European and non-European philosophies. What’s missing is dialogue.”
“Dialogue” is one of those words, like “diversity”, that can mean all things to all people. It is often used to define shallow, skating-on-the-surface conversations which give the impression of an exchange but which touch upon nothing substantive. It can also mean proper, dig-deep contestations through which we test each other’s ideas and in which we show ourselves willing to be uncomfortable as we ourselves are tested. In universities, and in society at large, there is today too little of the latter and too much of the former; too little real engagement and too great a desire to stay within our comfort zones.
There is much on which I disagree with the Decolonising Our Minds approach. I disagree with its concept of “whiteness”, with the characterisation of the Enlightenment as “racist”, with the understanding of what “European thought” constitutes, with what it means to “decolonise”. What I admire, though, is the openness to have this debate, and to engage in the kinds of conversations I had with both students and academics. I spent an afternoon discussing, debating and disagreeing with Meera Sabaratnam. At the end, she said: “The discussion that we’re having now is exactly the kind of discussion that it should be possible to have at universities.” On that, I could not agree more.
A different philosophy: six key texts
1. Mo Tzu, Basic Writings
(Columbia University Press)
Most people know of Confucius. They should know of Mo Tzu. Though he lived a century after Confucius, he has a claim to be China’s first true philosopher. Unlike Confucius, Mo Tzu engaged in an explicit reflective search for moral standards and gave tightly reasoned arguments for his views. He defended a universalist vision, arguing that the moral interests of strangers are as important as those of our tribe. He proposed a form of what we now call “consequentialism”, the idea that an act should be judged primarily by its effects, which was remarkably sophisticated for its time. The conservatism of Confucianism, and its cultivation of the moral character necessary to rule, to administer and to follow, won the favour of the Chinese state. The radicalism of Mo Tzu was forgotten and suppressed. Only fragments of his writing remain.
2. Ibn Rushd, The Decisive Treatise
(University of Chicago Press)
The Andalusian Muslim Ibn Rushd (1126-1198), often known in the west as Averroes, was the last of the great classical Islamic philosophers. Through his commentaries on Aristotle, he became more influential on western philosophy than on Islamic thought. Central to Ibn Rushd’s work was the relationship between philosophy and religion and the insistence on the compatibility of reason and faith. Perhaps his two most important works are The Incoherence of the Incoherence and The Decisive Treatise. The first is a response to the great theologian al-Ghazali and his attack on reason in his book The Incoherence of the Philosophers. The second is a defence of the role of reason in a community of faith, in which Ibn Rushd argues that it is God who commands humans to employ reason and not just faith.
3. Abu’l ‘Ala al-Ma’arri, The Book of al-Ma’arri
(New Humanity Books, 2015)
Today, we have become used to thinking of the Islamic world as insular, hostile to reason and freethinking and with a single, unquestioned view of God and the Qur’an. But in the first half-millennium of its existence, especially during the Abbasid period (750-1258), there was within the Islamic empire an extraordinary flourishing of philosophical debate and of freethinking. The most important of the freethinkers was Abu’l ’Ala al-Ma’arri, an 11th-century Syrian poet and philosopher, renowned for his unflinching religious scepticism:
“They all err – Muslims, Jews,
Christians, and Zoroastrians:
Humanity follows two world-wide sects:
One, man intelligent without religion,
The second, religious without intellect.” [from The Two Universal Sects]
There are very few English translations of his work. There is NYU Press’s recently published edition of his Epistle of Forgiveness (sometimes compared to Dante’s Inferno) and this short selection of his poetry.
4. Jonathan Israel, Radical Enlightenment
(Oxford University Press)
A groundbreaking study of the “other Enlightenment”, not the Enlightenment of Locke, Hume, Voltaire and Kant, but that of Spinoza, Bayle, d’Holbach and Diderot, a half-underground movement whose radicalism, according to Israel, has deeply shaped modern conceptions of freedom, liberty, equality and tolerance.
5. CLR James, The Black Jacobins
Trinidadian CLR James was one of those towering figures of the 20th century who is all too rarely recognised as such. Novelist and orator, philosopher and cricket lover, historian and revolutionary, Pan-Africanist and Trotskyist – few modern figures can match his intellectual depth, cultural breadth or sheer political contrariness. The Black Jacobins tells the story of the Haitian revolution and of its tragically flawed leader, Toussaint L’Ouverture. Decades before historians such as EP Thompson began producing “history from below”, CLR James told of how the slaves of Haiti had not been passive victims of their oppression but active agents in their own emancipation. It is a work of biography and social history, not of philosophy, but central to the narrative is the importance of ideas, especially the ideas of the Enlightenment, as weapons for social transformation.
6. Frantz Fanon, The Wretched of the Earth
(Penguin Modern Classics)
A classic of the anti-colonial struggle, The Wretched of the Earth has since become the bible of postcolonial literature. Fanon’s admirers see him as giving succour to the view that European thought is destructive of non-European peoples and cultures. His critics focus on his celebration of violence as redemptive. Fanon’s work is in fact more subtle than either allows. Born in Martinique, Frantz Fanon was a psychiatrist and revolutionary and a key figure in the Algerian struggle for independence. The Wretched of the Earth was written when he was dying of leukaemia and is a searing indictment of the dehumanising trauma of colonialism on the colonised individual, culture and nation. KM