The Routledge Handbook of the Sociology of Arts and Culture offers a comprehensive overview of the sociology of art and culture, focusing especially – though not exclusively – on the visual arts, literature, music, and digital culture. Extending, and critiquing, Bourdieu’s influential analysis of cultural capital, the distinguished international contributors explore the extent to which cultural omnivorousness has eclipsed highbrow culture, the influence of age, gender and class on cultural practices, the character of aesthetic preferences, the contemporary significance of screen culture, and the restructuring of popular culture. The Handbook critiques modes of sociological determinism in which cultural engagement is seen as the simple product of the educated middle classes. The contributions explore the critique of Eurocentrism and the global and cosmopolitan dimensions of cultural life. The book focuses particularly on bringing cutting-edge ‘relational’ research methodologies, both qualitative and quantitative, to bear on these debates. This handbook not only describes the field, but also proposes an agenda for its development which will command major international interest.
‘…women are women and men are men. And muxes, well, they are muxes.’
The indigenous Zapotec communities of southern Mexico have long acknowledged ‘muxes’ as a third gender of people who are assigned male at birth, but eventually become drawn to traditionally female roles. This can include dressing in feminine attire, taking on ‘women’s work’ and engaging in romantic relationships with men. Anthropologists believe the culture’s acceptance of gender-mixing predates European contact, and has survived the strict gender dichotomy imposed by Spanish Catholic colonisers. The director Ivan Olita’s short documentary Muxes sketches the lives of several muxes living in the town of Juchitán de Zaragoza, where, once heavily discriminated against by society at large, they’ve made significant strides towards acceptance and respect over the last decade.
Limor Samimian-Darash: My interest in scenarios grew out of my concern with analyzing and conceptualizing uncertainty. I was first drawn to the issue of uncertainty during my research on pandemic flu preparedness in Israel. Although literature in both sociology and anthropology thoroughly discusses the concept of risk, including its cultural perceptions and its roots in modernity, I found that uncertainty was understudied, at best.
In preparing for pandemic flu, Israeli authorities were forced to deal with what I term potential uncertainty. This type of uncertainty differs conceptually from what I call possible uncertainty and reflects a distinctive perspective on the future, the present, and the relations between the two. Whereas possible uncertainty derives from experience-based knowledge, that is, information based on past events, potential uncertainty derives from events that emerge from the virtual realm—from situations unaccounted for by known possibilities.
The authorities charged with preparing for pandemic influenza brought a variety of technologies to bear on the problem. During my research, I observed the application of three such technologies, two of which approached the problem from a risk-based perspective. The third, the syndromic surveillance system, seemed to be bringing something new to the table in terms of its underlying conception of uncertainty and its mode of operation: it was attempting to work “with” uncertainty, rather than against it. My encounter with this new mode of governing uncertainty inspired my interest in governing technologies that operate on principles other than risk, which, in turn, led me to scenarios.
This interest is consistent with my belief that, as anthropologists, we should not limit ourselves to constructing general or grand narratives of contemporary social processes. Important though such efforts are, we also need to look at the actual mechanisms that drive those processes. For me, that means examining existing mechanisms that govern risk and uncertainty, considering how new mechanisms emerge, and how—through their use—they mold conceptualizations of uncertainty in contemporary societies. Rather than focus on the appearance of new risks in the world, which is the basis of the risk-society approach, or on the impossibility of calculating these risks, a fundamental point of science and technology studies, I tackle uncertainty through the empirical study of the techniques that govern it, through their extraction and rigorous conceptualization.
Since heightened security consciousness is a fact of life in Israel, it was perhaps natural that my interest in the imagination and construction of future uncertainty would come to focus on security preparedness. Israel’s annual Turning Point exercise provided me with a tailor-made opportunity to explore the design and operation of mechanisms intended to govern security-related uncertainty. As the entire exercise is scenario-based, it offered me a means of studying the scenario as a practice of imagining the future, of tracing the role of the scenario in emergency preparedness and perception, and of understanding how it envisions and translates future uncertainty more generally.
The Turning Point exercise is part of Israel’s overall program of preparedness for war and disasters. Previous studies of scenarios have focused on relatively small-scale (e.g., table-top) exercises. Turning Point offered a novel view of a national-scale exercise within which multiple scenarios enacted at varying organizational levels are incorporated into one master narrative event. I was extremely excited to get access to such a field site and to be able to comprehensively explore the scenario exercises through ethnographic fieldwork. My study enabled me to follow the months of preparations for the annual exercises and all the fronts on which they take place.
The flu scenarios I had studied earlier were merely scripts and thus drew heavily on past events—the story lines were very limited and were not put into actual practice. Turning Point gave me a chance to fully explore the scenario as a form of thought and practice. I was interested in understanding whether the practice of scenarios opens up something new in thought and in actuality, whether it promotes an emerging and a becoming beyond the already known and existing. It seemed likely to me that both the content of a narrative and the particular way in which it is enacted affect how the scenario works as an uncertainty-based technology.
ND: I am interested in understanding how and why scholars choose their analytical interlocutors and concepts. Thus, I was wondering if you might be able to say something about who you consider to be your closest or most influential interlocutors.
LSD: My work draws analytically and methodologically on the anthropology of the contemporary, as developed by Paul Rabinow, on certain of Michel Foucault’s studies and his elaboration of modes of thought and experience, and the philosophical approach of Gilles Deleuze and Félix Guattari. Disparate as these thinkers are, they have all provided me with valuable analytical approaches and tools.
The primary influence within anthropology on my research is Paul Rabinow’s work on the contemporary: its problem focus, analytical mode, and anthropological ethos. Rabinow’s work profoundly shapes how I approach anthropological inquiry, identify problems, and formulate concepts.
Foucault’s writing on governmentality, as presented in the three forms of sovereignty, discipline, and the biopolitical security apparatus, provides me with both an analytical framework highlighting the heterogeneous structure of power and a methodological blueprint for identifying problematizations. His three governmental forms emerged historically in response to specific problems, each enacted with a certain aim and through certain practices. They are not mutually exclusive, however, and the emergence of one does not imply the disappearance of another. The biopolitical security apparatus, a technology of governing the population through normalization (of circulation and freedom), has been especially relevant to my work.
I initially took up the concept of the security apparatus during my analysis of Israel’s 2001–2002 smallpox vaccination project. During that project, vaccination, a security prevention measure, became a new technique of governing, of preparedness, enacted through the novel temporality of a new problem. Whereas, historically, the governmental means and end of the security apparatus was to constitute and manage the population in relation to actual events, in the contemporary form of preparedness a problem of temporality emerges: the need to operate on both present and future risk simultaneously and thus to govern through time. Following Rabinow’s work, I am investigating contemporary forms of governing beyond those specific (historical) forms presented by Foucault. Thus the task is not to literally re/present what Foucault extracted as part of a particular historical contingency, but to grasp his effort as an analytical approach that enables us to observe other forms of experience in the contemporary.
The philosophy of Deleuze both complements Rabinow’s work on concepts and helps to bring Foucault’s (historical) constructs into the contemporary. One can plug into Deleuze and Guattari’s work from many different angles: their discussion of events versus accidents (The Logic of Sense), the virtual and the actual, difference and repetition (Difference and Repetition), the rhizome, assemblages (A Thousand Plateaus), and the idea of the concept (What is Philosophy?). In adopting a Deleuzean approach, the researcher has a dual task: to establish virtual events (concepts, problems) from current events (solutions to existing problems) and to show how, in their actualization, problems are not swallowed up or suppressed by the solutions given to them. In other words, the researcher’s goal is not only to explore the solutions created for particular historical problems (as in a Foucauldian problematization) but also to pull the problem out of the solutions, from actual events. Doing so constitutes counteractualization, the process whereby the investigator establishes a virtual event. The idea of counteractualization or countereffectuation appears in various forms in Deleuze and Guattari’s writing, especially when they discuss the creation of concepts. I draw generally on these ideas in my own conceptual work. In particular, Deleuze’s differentiation between subjective and objective uncertainties in referring to two forms of the event has been useful to me in formulating the concept of potential uncertainty and in distinguishing it from risk.
ND: In thinking through the Turning Point scenario-based exercises, your article describes a shift that “move[s] us from one mode of governing, via biopolitical security apparatuses and risk-based technologies, to another mode, of preparedness and uncertainty-based technologies.” Do you have a sense of how that shift is experienced by the subjects of the latter mode of governance that the article traces?
LSD: I get at subjects’ experience largely through the atmosphere I observed during the exercise, especially in the emergency situation rooms. To really understand their experience of the Turning Point exercise as an uncertainty-based technology, I need to bring another concept into the discussion: affect. Different types of technologies, my research has shown me, can be distinguished in terms of their relationship (or lack thereof) to affect.
Turning Point involves preparations by governmental ministries, local municipalities, and the population at large for a wide array of threats. As far as the exercise’s bureaucrats are concerned, the main experience is one of uncertainty, reflecting an intrinsic aspect of the scenarios they devise and are charged with managing. Israeli citizens, however, do not deal with scenarios but are mainly involved in simulation-like practices, which entail a different mode of experience.
Whereas scenario technology, as a practice of uncertainty, gives rise to alertness and urgency among participants, the simulation, through the practice of repetition and order, leaves no room for uncertainty to develop. The simulation elicits a known, predictable set of reactions that constitutes the event as one of certainty, one with prescribed solutions. Participants’ task is to precisely follow given instructions or to repeatedly practice the same actions, never deviating from protocol. The scenario, by contrast, requires participants to face something new as it is emerging and to adapt to that emerging reality.
A simulation involves the realization of a problematic future possibility and the implementation of known solutions to that problem in order to routinize responses to it. A scenario, by contrast, is triggered by the threat of the unforeseen and involves the practice of uncertainty. Since a simulation is a closed event that aims to prevent uncertainty and disorder, it cannot generate affect (but can manage emotions); affect, following Brian Massumi, can only occur through activation of an uncertainty-based mechanism such as the scenario. In that context, affect is produced not only by the identification of an external threat or uncertainty but also, and mainly, by the spontaneous creation of uncertainty during the exercise. Scenario technology produces an affective situation among its participants precisely because of its inherent capacity to create uncertainty.
ND: Finally, what can you tell us about your next project?
LSD: Scenario techniques are varied, have proliferated in many areas of practice and research (e.g., military, policy, energy, and, more recently, health and business), and are in widespread use around the world. However diverse, they share a common mode of thought and practice: they commonly present or perform “stories about the future aimed at helping people break past their mental blocks and consider ‘unthinkable’ futures” (Ringland 1998, 12). Although scenarios have grown in popularity over the past several decades, their social-scientific study remains limited.
My next research project tackles global scenarios. I will be examining how scenario thinking has emerged historically in relation to other responses to the problem of the future, such as prediction. I want to look at scenarios as enacted in three fields—health, cybersecurity, and business—and analyze them from national, international, and global perspectives. This global focus is, I believe, essential to furthering understanding of the scenario form, and forms of imagination more generally, especially as studies thus far have been limited in scope—usually focused on one field of activity within one country. Moreover, the three fields I want to examine also emphasize different temporalities in their scenario thinking and planning. As part of this effort, I plan to explore the particular relationship between the past and the future that characterizes temporal thinking in each of the three field sites.
Ringland, Gill. 1998. Scenario Planning: Managing for the Future. Chichester, U.K.: Wiley.
Lord Martin Rees is an astrophysicist and the former master of Trinity College, Cambridge. He sat down with The WorldPost for a wide-ranging interview, which has been edited for clarity and brevity.
Alexander Görlach: Out of all great transformations we are going through, from climate change to artificial intelligence to gene editing, what are the most consequential we are about to witness?
Martin Rees: It depends on what time scale we are thinking about. In the next 10 or 20 years, I would say it’s the rapid development in biotechnology. We are already seeing that it’s becoming easier to modify the genome, and we heard about experiments on the influenza virus to make it more virulent and transmissible. These techniques are developing very fast and have huge potential benefits but unfortunately also downsides.
These techniques are easily accessible and easy to handle. It’s the kind of equipment that’s available at many university labs and many companies. And so the risk of error or terror in these areas is quite substantial, while regulation is very hard. It’s not like regulating nuclear activity, which requires huge special-purpose facilities. Biohacking is almost a student-competitive sport.
I am somewhat pessimistic, because even if we do have regulations and protocols for safety, how would we enforce them globally? Obviously we should try and minimize the risk of misuse by error or by design of these technologies and also be concerned about the ethical dilemmas they pose. So my pessimism stems from feelings that what can be done, will be done ― somewhere by someone ― whatever the regulations say.
Görlach: Do you fear that this could happen not only in the realm of crime ― if we think of so-called “dirty bombs,” for example ― but could also be used by governments? Do we need a charter designed to prevent misuse?
Rees: I don’t think governments would use biotech in dangerous ways. They haven’t used biological weapons much, and the reason for that is that the effects are unpredictable.
‘Over the next 10 or 20 years, the greatest transformation we are likely to live through is the rapid development in biotechnology.’ Lord Martin Rees
Görlach: That brings recent Hollywood blockbusters like “Inferno” to mind, where one lunatic tries to sterilize half of mankind through a virus.
Rees: Several movies have been made about global bio-disasters. Nevertheless, I think it is a realistic scenario, and I think it could lead to huge casualties. Disasters such as the one in “Inferno,” as well as natural pandemics, could spread globally. The consequences of such a catastrophe could be really serious for society. We have had natural pandemics in historic times ― the “black death,” for example. The reason that governments put pandemics ― natural or artificially produced ― high on their risk register is the danger of societal breakdown. That is what worries me most about the possible impact of pandemics. This is a natural threat, of course. But the threat is aggravated by the growing possibility that individuals or small groups could manufacture a more lethal virus artificially.
Görlach: So when speaking of the age of transformation, aspects of security seem paramount to you. Why is that?
Rees: We are moving into an age when small groups can have a huge and even global impact. In fact, I highlighted this theme in my book Our Final Century, which I wrote 13 years ago. These new technologies of bio and cyber ― as we know ― can cause massive disruption. We have had traditional dissidents and terrorists, but there were certain limits to how much devastation they could cause. And that limit has risen hugely with these new bio and cyber-technologies. I think this is a new threat, and it is going to increase the tension between freedom, security and privacy.
Görlach: Let’s look at another huge topic: artificial intelligence. Is this a field where more uplifting thoughts occur to you?
Rees: If we stay within our time frame of 10-20 years, I think the prime concerns about A.I. are going to be in the realm of biological issues. And everyone agrees that we should try and regulate these. My concern is that it will be hard to make effective regulations. Beyond the biological issues, in the long term we of course need to worry about A.I. and machines learning too much.
In the short term, we have the issue of the disruption of the labor market due to robotics taking over ― not just factory work but also many skilled occupations. I mean routine legal work, medical diagnostics and possibly surgery. Indeed, some of the hardest jobs to mechanize are jobs like gardening and plumbing.
We will have to accept a big change in the way the labor market is deployed. And in order to ensure we don’t develop even more inequality, there has got to be a massive redistribution. The money earned by robots can’t only go to a small elite ― Silicon Valley people, for instance. In my opinion, it should rather be used to fund dignified, secure jobs, preferably in the public sector ― carers for the young and old, teaching assistants, gardeners in public parks, custodians and things like that. There is unlimited demand for jobs of that kind.
‘Some of the hardest jobs to mechanize are jobs like gardening and plumbing.’ Lord Martin Rees
Görlach: But robots also potentially could take on the work of a nurse, for that matter.
Rees: True, they could do some routine nursing. But I think people prefer real human beings, just as we’ve already seen that the wealthiest people want personal servants rather than automation. I think everyone would like that if they could afford it, and everyone in old age would like to be cared for by a real person.
Görlach: In your opinion, what mental capacities will robots have in the near future?
Rees: I think it will be a long time before they have the all-round ability of humans. Maybe that will never happen. We don’t know. But what is called generalized machine learning, made possible by the ever-increasing number-crunching power of computers, is a genuinely big breakthrough. These structures of machine learning are a big leap, and they open up the possibility that machines can really learn a lot about the world. It does raise dangers, though, which people may worry about. If these computers were to get out of their box one day, they might pose a considerable threat.
Görlach: In your opinion, what sparks new innovation and ideas? Will A.I. and machines foster these processes?
Rees: Moments of insight are quite rare, sadly. But they do happen, as documented cases suggest (laughs). There is a great saying: “Fortune favors the prepared mind.” You have got to ruminate a lot before you are in a state to have one of these important insights. If you ask when the big advances in scientific understanding happen, they are often triggered by some new observation that in turn was enabled by some new technological advance. Sometimes that happens through a combination of people crossing disciplines and bringing new ideas together; sometimes through luck; sometimes through a special motivation that causes people to focus on some problem; sometimes by people focusing on a new problem that was previously deemed too difficult and therefore didn’t attract attention.
‘Fortune favors the prepared mind.’
Görlach: Would you say a collective can have an idea or that only individuals have ideas?
Rees: Many ideas may have depended on the collective to even emerge. In soccer, one person may score the key goal. That doesn’t mean the other 10 people on the team are irrelevant. I think a lot of science is very much like that: the strength of a team is crucial to enable one person to score the goal.
Görlach: Do natural sciences and humanities have the capability to tackle the challenges occurring from these transformations?
Rees: The kinds of issues we are addressing in Cambridge involve social sciences as well as natural sciences. As I said before, because of the societal effect, the consequences of a pandemic now could be worse than they were in the past, despite our more advanced medicine. Also, if we are thinking of ecological problems like food shortages, the issue of food distribution is an economic question, as well as a question of what people are ready to eat. All these things involve fully understanding people’s social attitudes. Are we going to be satisfied eating insects for protein?
Görlach: With the rising amount of aggregated data, it becomes increasingly difficult for the humanities to keep up with natural sciences. How can we synchronize the languages of different academic fields in this era of big data?
Rees: Great question! There are impediments caused by disciplinary boundaries, and we have to encourage people to bridge these. I am gratified that we have some young people who are of this kind: philosophers who are into computer science or biologists who are interested in system analysis. All these things are very important. I think here in Cambridge, we are quite well-advantaged because we traditionally have the college system whereby we have small academic groups in each college. Each of these colleges is a microcosm, so all disciplines cross somewhat. It is therefore particularly propitious as a location for the development of cross-disciplinary work.
‘How can we synchronize the languages of different academic fields in this era of big data?’
Görlach: The blessings of modern innovation seem to be ignored by many policymakers; we see a retreat from globalization and a retreat from digitalization. Is there a disconnect between science and the rest of society?
Rees: The misapplication of science is a problem, of course, as is the fact that science’s benefits are unevenly distributed. There are some people who don’t benefit, such as traditional factory workers. If you look at the welfare of the average blue-collar worker and their income in real terms ― in the U.S. and in Europe ― it has not risen in the last 20 years; in many respects, their welfare has declined. Their jobs are less secure, and there is more unemployment. But there is one aspect in which they are better off: information technology. IT spread far quicker than expected and has led to advantages for workers in Europe, the U.S. and Africa.
Görlach: But surely globalization made many poor people less poor and a few rich people even richer.
Rees: Sure, I guess this statement can be made after 25 years of globalization. But it must also be said that we are now witnessing a significant backlash in many places, in the form of Brexit or the U.S. presidential election.
Görlach: How drastically do you think these developments will affect science, the attitude toward it and its funding?
Rees: Many of the people who use modern information technology, such as cellphones, aren’t aware of the immense technological achievements behind it. These developments can be traced back to scientific innovations made decades ago, which were mainly funded by either the military or the public. People may not be aware of the science, but they appreciate it. So it’s unfair to say people are anti-science. They are worried about science because there is indeed a risk that some of these technologies will run ahead faster than we can control and cope with them. So there are reasonable grounds for some people to be concerned ― for example, about biotech and A.I.
But we also have to bear in mind that for technology to be developed, it’s necessary ― but not sufficient ― for a certain amount of science to be known. There are areas of technology in which we could have forged ahead faster but haven’t because there was no demand. Take one example: it took only 12 years from the first Sputnik to Neil Armstrong’s small step on the moon ― a huge development in 12 years. The motivation for the Apollo program was a political one, and it came at huge expense. Or take commercial flying ― today, we fly in the same way we did 50 years ago, even though in principle we could all fly in supersonic aircraft.
These are two examples where the technology exists but there hasn’t been a motive ― neither political nor economic ― to advance these technologies as fast as possible. In the case of IT, there was the obvious demand, which exploded globally in an amazing way.
‘There are areas of technology in which we could have forged ahead faster but haven’t because there was no demand.’ Lord Martin Rees
Görlach: Living in a so-called post-factual era, what are “facts” to you as a scientist?
Rees: In the United Kingdom, those who voted for Brexit voted that way for a variety of reasons. Some wanted to give the government a bloody nose; others voted blatantly against their own interest. The workers in South Wales, for example, benefited hugely from the European Union. There is a wide variety of motives, but I don’t think people would say that they voted against technology.
Görlach: Still, there is this ongoing narrative about the fear of globalization and digitalization, and that would also imply the fear of technology.
Rees: Sure, but that is oversimplified. We can have advanced technology on a smaller scale. I don’t think you can say that technology is always correlated with larger-scale globalization. It allows for robotic manufacturing, and it allows for more customization to individual demand. The internet has allowed a lot of small businesses to flourish.
Görlach: But there seems to be an increasing disconnect in many societies regarding the consensus on which facts matter and how facts are perceived.
Rees: To understand this attitude you are expressing, we have to realize that there aren’t many facts that are clear and relevant in their own right. In most cases, I think people have reason to doubt. Most economic predictions, for example, have pretty poor records, so you can’t call them facts.
In the Brexit debate, there were a lot of valid arguments on both sides, and you can’t blame the public for being skeptical. This is also true for the climate debate. It is true that some people deny what is clear. But the details of climate change are very uncertain. Even those who agree on all the facts will differ in their attitudes toward the appropriate policy. That depends on other things, including ethics. In a lot of recent debates, people agreed about the science but disagreed about the appropriate policies deriving from those facts. For instance: how much constraint are we willing to exercise, in order to facilitate the life of generations to come? Opinions differ hugely.
‘In the Brexit debate, there were a lot of valid arguments on both sides, and you can’t blame the public for being skeptical.’ Lord Martin Rees
Görlach: But how then do you judge the developments we now see in many Western societies?
Rees: I think these developments are partly caused by new technologies that have led to new inequalities. Another point is: even if it hasn’t increased, people are now more aware of inequality. In sub-Saharan Africa, people see the kind of life that we live, and they wonder why they can’t live that kind of life. Twenty-five years ago, they were quite unaware of it. This understandably produces more discontent and embitterment. There is a segment of society, a less-educated one, that feels left behind and unappreciated. That is why I think a huge benefit to society will arise if we have enough redistribution to recreate dignified jobs.
Görlach: What political framework do you think of as an ideal environment for science?
Rees: In the Soviet Union, they had some of the best mathematicians and physicists, partly because the study of those subjects was fostered for military reasons. People in those areas also felt that they had more intellectual freedom, which is why a bigger fraction of the top intellectuals went into math and physics in Soviet Russia than probably anywhere else before or since. That shows you can have really outstanding scientists surviving in that sort of society.
Görlach: So the ethical implication is not paramount to having “good” science after all?
Rees: I think scientists have a special responsibility to be concerned about the implications of their work. Often an academic scientist can’t predict the implications of his work. The inventors of the laser, for instance, had no idea that this technology could be used for eye surgery and DVD players but also for weaponry. Among the most impressive scientists I have known are the people who returned to academic pursuits after the end of World War II with relief but remained committed to doing what they could to control the powers they had helped to unleash.
In each case, those scientists had supported the making of the bomb in the context of the time. But they were also concerned about proliferation and arms control. It would have been wrong for them not to be concerned.
To make an analogy: if you have a teenage son, you may not be able to control what he does, but you sure are a poor parent if you don’t care about what he does. Likewise, if you are a scientist and you created your own ideas, they’re your offspring, as it were. Though you can’t necessarily control how they will be applied, you nonetheless should care, and you should do all you can to ensure that the ideas you have helped to create are used for the benefit of mankind and not in a damaging manner. This is something that should be instilled in all students. There should be ethics courses as part of all science courses at university.
‘How much constraint are we willing to exercise, in order to facilitate the life of generations to come? Opinions differ hugely.’ Lord Martin Rees
Görlach: What, then, is your motivation as a scientist?
Rees: I feel very privileged to have consistently, over a career of nearly 40 years now, played a part in debates on topics that I think are writing the history of science in this period. As we make great, collective, scientific progress, we are able to confront new mysteries that we couldn’t even have addressed in the past. Many of the questions that were being addressed when I was young have now been solved, while today’s pressing questions couldn’t even have been posed back then.
Of course the science I do is very remote from any application, but it’s of great fascination and a very wide audience is interested in these questions. It certainly adds to my satisfaction that I can actually convey some of these exciting ideas to a wider public. I would get less satisfaction if I could only talk about my work to a few fellow specialists, so I am glad that these ideas can become part of a broader culture.
Görlach: What is the best idea you ever had?
Rees: I don’t have any sort of singular idea, but I think I have played a role in some of the ideas that have gradually formed over the last 20 or 30 years about how our universe has evolved from a simple beginning to the complex cosmos we see around us that we are a part of. For me, the social part of science is very important ― many ideas emerge out of discussion and cooperation and, of course, out of experiments and observations.
The symbiosis between science and technology goes two ways; the old idea that science simply leads, eventually, to an application is far too naïve. Advances in academic science are themselves facilitated by technology. We only advanced beyond Aristotle by having much more sensitive detectors and being able to explore space in many ways. If we didn’t have computers or ways of detecting radiation, we would have made no progress, because we are no wiser than Aristotle was.
Görlach: Lord Rees, thank you very much for your time.
The appropriate reaction is to be happy when people share their good news, right? Like when that friend from graduate school emails to say he’s just won a tenure-track professorship at [insert name of elite institution]. You’re not supposed to think: “Damn, that position should be mine.” Or when you run into a colleague in the corridor who is perky because her book manuscript has just been accepted by a major scholarly publisher, you force a smile but you’re really thinking, “I deserve a contract. What have I got to show for my hard work?” Any of this sound familiar? To paraphrase John Lennon, maybe you’re just a jealous researcher.
According to Sybil L. Hart, editor of the Handbook of Jealousy: Theory, Research, and Multidisciplinary Approaches: “The word jealousy stems from the Latin zelus, meaning passion.”
Passion can be great, but not when it comes to jealousy. The word “jealous” implies a profound sense of bitterness toward someone who has something you want — and, perhaps, something you feel you deserve.
Very little has been written on jealousy in academic life, and yet, anecdotal evidence suggests that it is prevalent in our profession. This is unsurprising. As Chronicle readers are well aware, academe today is a place of increasingly precarious employment conditions — where the “publish or perish” mantra is more relevant than ever and the pressure to win grant money has reached fever pitch.
In such an environment, it’s little wonder that jealousy can take hold. I’ve certainly felt my share (and I herewith apologize for privately cursing those of you who got positions and/or book contracts that I wanted). Jealousy may come with the academic turf but that’s rarely a good thing. So what can we do to better manage our envy at all stages of the academic career?
Accept that you’re jealous. Don’t blame your envious feelings on a bad day or a bad week (though you may well be having one or both of those). Most important, don’t pretend those feelings aren’t there, and that everything’s fine.
Just say the words — “I’m jealous” — and you’ll be in a position to tackle the rest of these suggestions.
Recognize the state of the industry. Academe is competitive. That is as true in Australia (where I’m based) as it is everywhere else. Hundreds of seriously bright, seriously hard-working and seriously determined scholars can apply for a single position. The days of graduate students completing a doctorate and walking into a tenure-track position without working up a sweat are gone. (Did they ever exist?)
And let’s face it, academe has never been a level playing field. Consider factors such as nepotism and ideological bias. Consider, too, the myriad forms of discrimination (sexism, racism, homophobia, ableism). The mere existence of campaigns such as “Why is my curriculum white?” (in Britain) suggests that even the more “progressive” parts of the academy still have some serious work to do.
Recognizing that academe is competitive and riddled with inequalities won’t (and shouldn’t!) make you smile. Nevertheless, it should help alleviate any fixation you might have on a person or persons who’ve reaped the rewards that you so fervently desire. You’re one of countless people around the world who are trying to develop (or enhance) their research careers — there’s not just one or two of us.
Focus on you. Seems obvious, right? We all know that progress in an academic career is all about putting your best foot forward. You constantly need to argue why you are the best person for this professorship, or why your project should receive funding above all others.
Remind yourself regularly: It’s unhelpful to direct resentment, envy, and anxiety at someone who has achieved some measure of success that you desire. When you do that, your energies — the ones that you could be putting into your own research and career progression — are instead being channeled into that other person. You’re focused on what they’ve done, what they’ve achieved, when what you really need to do is figure out the steps you need to take to get where you want to go.
Protect your reputation. You’re not putting your best foot forward when you rant to other researchers about how Professor X got the promotion that should have come your way.
At best, that kind of talk can paint you as sour and uncollegial. At worst, it can make you seem unprofessional and worth avoiding. I once heard a colleague indulge in just such a rant and I emerged from it thinking: “Well, what’s that person saying about me behind my back? Why would I want to work with them?”
If you do genuinely feel the need to voice your grievances, perhaps speak with a mental-health professional. I say that without a shred of sarcasm. Your mental health is paramount, plus confidentiality is assured.
Develop interests outside the academy. I almost screamed when my therapist made that recommendation. Did he realise how busy I was — what with writing lectures, marking term papers, answering student queries, producing job applications? How could I have any kind of life outside the university?
Yes, all of those activities take time. Plenty of time. Yes, they must be undertaken if you hope to avoid dropping off the scholarly radar.
That said, developing interests outside the seminar rooms and grant-writing workshops can help prevent you from becoming too entangled in the daily grind of scholarly life. Outside interests can help alleviate the strain and tension that can manifest in emotions such as jealousy.
And so, I took my therapist’s advice and purchased a gym membership. I assure you, there’s nothing like sweating it out on the leg press to temporarily block out those multiple deadlines and the heavy teaching week. Stressed about that seminar paper? Work it out on whatever hobby allows you to step outside of your academic persona.
Given that our industry is increasingly volatile, jealousy is an almost inevitable aspect of academic life. But we don’t have to let it overwhelm us.
The term “utopia” is used in two distinct ways:
1) As a term of criticism; as in: “Your ideas are utopian; they are uselessly over-idealistic, they could never work.”
2) As a term of positive appraisal; as in: “These utopian ideas give one real hope: the utopia they describe would be worth aiming for.”
The standard view is that it is utopian in the first sense to seek to radically transform our society. This unfortunately tends to rule out the possibility of utopia in the second sense. And I believe it is that possibility which we have great need of, at the present time.
Why? Because without it, we are probably finished. I mean: we are now in a situation which makes it the case that without radical transformation, without radical hope, we are doomed. Mere reformism will not be enough to save us from climate catastrophe and its causes: rampant fossil fuel interests; uncontrolled capitalist accumulation and commodification; the hegemony of economic growthism; continual production of artificial ‘needs’ (e.g. by advertising); and a profound failure to challenge the resultant individualist ‘aspirational’ consumerism, even on the so-called ‘Left’. Most ‘Leftisms’ are hopelessly in hock to growthism. And to a ‘deprivation model’ which means that their prescription for society is simply: more of the same, shared out a bit better. ‘Ferraris for all’, as the book has it. Yes, that really is the title of a book that someone has written and published. That such books exist is a testament to how desperate our predicament is. That the egregious ‘Spiked’ magazine likes the book a lot tells you most of what you need to know about what is within its covers.
So my claim is that the standard view is exactly wrong. The true utopianism in sense (1) now is: belief in anything like the status quo. For instance, belief in liberal political philosophy, in economic growth, and so forth. What is needed is the ambition to aim at a version of utopianism in sense (2): a radical democratic ecologism, serious about a post-growth future and about relocalising our world. Our ambitions need now to be utopian (in sense (2)): only such ambitions stand a chance of going far enough in the direction of change.
“We are now in a situation which makes it the case that without radical transformation, without radical hope, we are doomed.”
Liberal-individualist, pro-growth, pro-‘fairness’ doctrines, well-intentioned though they usually are, are now the ‘hopeless’, over-the-top kind of utopianism (in sense 1). This is a deliberate inversion of what liberals and conservatives have always said about socialists, Greens and the like. It is vital that we recognise this historic irony, this inversion, now, take it seriously, and act on it, so that we can be bold enough in transforming society away from its current, disastrous neoliberal growthist pathway.
We need to create a culture of sufficiency, a culture of enough. The possibility or indeed actuality of such culture(s) is a constant theme of utopian experiments and writings, and rightly so. For, if we are to have a chance of doing enough to save our descendants from being fried, we need to actualise such a culture as a key part of that doing, as a radical alternative to growthism. ‘Enoughism’, as in the ‘voluntary simplicity’ movement, manifests a genuine love and care for our fellow humans: those poorer than ourselves today, and those at risk of complete destruction and immiseration in the future. Inegalitarian Rawlsian liberalism needs to give way to egalitarianism and the fetish for fairness needs to give way to something much ‘warmer’: love/care. We need to imagine futures without the desire for growth. We need to imagine simpler futures, where we might finally (as I like to put it) build down the threats that at present are hanging over the future itself.
This can no longer be tenably castigated as unwisely utopian. On the contrary, to state again the extraordinary truth, a truth that we need to start to take in and get used to…it is the standard pro-justice/fairness/‘development’/growth agendas that are hopelessly utopian. They pretend that a fairer version of business-as-usual is maybe going to be enough. And here, I am echoing the philosopher Rai Gaita: “[P]lacing the weight that I do on our humanity and on love rather than on, say, the obligated acknowledgement of rights, is more hardheaded than the longing to make secure to reason what reason cannot secure, all the while whistling in the dark.” 
There is no real chance, I believe, of our taking significant enough action fast enough to save the future, if we do not love the future ones – our descendants, deep into the future – with all our hearts. Love them as our loved ones now (starting, but not ending, with our babies and kids). For without such love, we will simply take much of what they need from them, as we are currently doing. We will consign the(ir) world to bleakest misery, and ‘sell’ such misery back to ourselves as being simply us taking what we deserve or ‘need’, a ‘fair’ share. We ought rather to be in awe of our wondrous power over them — and therefore utterly respectful of their vulnerability and beauty. We ought to give our all for them. For us not to be myopic, they need to be real to us.
These are utopian demands (in sense (2)). Demands for an exercise of imagination, of ‘self-restraint’, and of changing our polities and our very world, of a kind that we are not used to. But: anything less is selling the future short. For we have reached a point where we are profoundly imperiling the future and only a radical course correction will save it.
It’s high time for (that type of) utopianism.
The twentieth century saw four basic visions of hell on earth, or dystopia. These were:
Orwellian. Rule by autocratic totalitarian people, party or elite group, limitation of choice, repression of speech and repression of minorities, belief in order, routine and rational-morality. Control by enclosure, fear and explicit violence. Violent repression of dissent (via ‘the party line’). Erotic physicality and sexual freedom suppressed via control of sexual impulses. Control of thought by explicitly policing language (Orwellian Newspeak).
Huxleyan. Rule by democratic totalitarian systems, excess of choice, limitation of access to speech platforms, assimilation of minorities, belief in emotional-morality, ‘imagination’ and flexibility, and control by desire, debt and implicit threat of violence. No overt control of dissent (the system selects for system-friendly voices). Erotic physicality and sexual freedom suppressed via promotion of pornographic sensuality and dissolution. Control of thought by implicitly enclosing language within professional boundaries (Illichian Newspeak, or Uniquack).
Kafkaesque. Rule by bureaucracy. Control of the populace via putting them into writing, forcing people to spend free time on bureaucratic tasks, thereby inducing tractable stress and the schizoid, self-regulating self-consciousness (anxiety about low marks, unlikes, official judgements and the like) that bureaucratic surveillance engenders. Generation of a system which structurally rewards those who seek an indirect relationship with their fellows or who, through fear of life, seek to control it through the flow of paperwork.
Phildickian. Rule by replacing reality with an abstract, ersatz virtual image of it. This technique of social control began with literacy*, with the creation of written symbols, which devalued soft conscious sensuous inspiration, fostered a private (reader-text) interaction with society, and created the illusion that language is a thing, that meaning can be stored, owned and perfectly duplicated, that elite-language is standard and so on. It ends with virtuality: the conversion of classrooms, offices, prisons, shops and similar social spaces into ‘immersive’ on-line holodecks which control and reward participants through permanent, perfect surveillance, the stimulation of positive and negative emotion, offers of godlike powers, and threats to nonconformists of either narco-withdrawal or banishment to an off-line reality now so degraded by the demands of manufacturing an entire artificial universe that only hellish production-facilities, shoddy living-units and prisons can materially function there.
The reader can decide for herself under which of the above we currently struggle to eke out a life worth living. I would like to suggest that all modern societies are both Kafkaesque and Phildickian with either a Huxleyan or Orwellian overarching framework; modern, western, capitalist societies tend to be basically Huxleyan (HKP) and pre-modern, eastern, communist countries tend to be basically Orwellian (OKP).
The reason why ideological managers** (academics, film directors, journalists, etc) prefer to have two (or more) dystopian systems is that it makes us seem like the goodies and them the baddies. Communism is to blame for their foodbanks and breadlines, but capitalism has nothing to do with ours (or vice versa). Sure our masses have the same miserable lives as theirs, reel under the same bureaucratic insanity, stumble around the same shoddy unreal worlds, and witness the same catastrophic destruction of nature and beauty as theirs do, but at least we’ve got democracy! / at least our families stick together! / at least the trains run on time! / at least GTA 9 is coming out soon / at least the Olympics will cheer us up (delete as appropriate).
This is an adapted extract from The [Utopian] Apocalypedia.
* Obviously I’m not suggesting that literacy is inherently or completely dystopian, but it is the beginning of a dangerous and distorting process, which starts with societies demanding literacy for participation — and devaluing orality and improvised forms of expression — and ends with the complete eradication of reality. This danger and distortion increases with every step towards virtuality (print, perspective, photography, television, internet) until, by the time we reach VR, there remains no possibility of reverie, transcendence, humanity, meaning or genuine creativity, all of which become suspect.
** And of course for those who depend on their illusions.
The Google Books N-gram corpus contains an enormous volume of digitized data, which, to the best of our knowledge, sociologists have yet to fully utilize. In this paper, we mine this data to shed light on the discipline itself by conducting the first empirical study to map the disciplinary advancement of sociology from the mid-nineteenth century to 2008. We analyse the usage frequency of the most common terms in five major sociology categories: disciplinary advancement, scholars of sociology, theoretical dimensions, fields of sociology, and research methodologies. We also construct an overall index deriving from all sociology-related key words using the principal component method to demonstrate the overall influence of sociology as a discipline. Charting the historical evolution of the examined terms provides rich insights regarding the emergence and development of sociological norms, practices, and boundaries over the past two centuries. This novel application of massive content analysis using data of unprecedented size helps unpack the transformation of sociocultural dynamics over a long-term temporal scale.
The emergence of big data has opened many research opportunities and topics for the field of social science. As a lens on human culture (Aiden and Michel, 2013), big data offer enormous possibilities to detect historical trajectories, human interactions, social transformations and political practices with rich spatial and temporal dynamics. Forecasting the next five decades of social science research, King (2009: 91) has predicted a ‘historic change’ in which the profusion of gigantic databases and their investigation will promote ‘our knowledge of and practical solutions for problems of government and politics to grow at an enormous rate’.
One particularly promising new tool for massive content analysis is the Google N-gram corpus, a digitized books repository containing enormous volumes of digitized data. Michel et al. (2011) have described the construction of the first edition of the Google N-gram Corpus with approximately 5 million books and examined the usage frequency of words in order to quantitatively analyse human culture trends in ways unimaginable even a decade ago. Following this seminal study, the Google N-gram corpus has been used to explore the politics of disaster (Guggenheim, 2014), the language of contention (Tarrow, 2013), the transformation of economic life (Bentley et al., 2014; Roth, 2014), patterns of poverty and anti-poverty policy (Ravallion, 2011), linguistic and written language development (Twenge et al., 2012), and the psychology of culture (Greenfield, 2013; Zeng and Greenfield, 2015).
Notwithstanding this recent profusion of academic texts employing digitized corpora, sociologists have yet to fully explore the possibilities offered by this new dataset. Almost a decade ago, the ‘coming crisis of empirical sociology’ was tied to sociologists’ failure to engage with the vast proliferation of social data (Savage and Burrows, 2007); today, sociologists need to think just as seriously about the challenges and opportunities posed by big data. As Burrows and Savage recently pointed out (2014: 2):
Sociologists generally used and refined rather familiar methods, talked mainly to each other about esoteric theoretical pre-occupations, and had not caught up with the fact that sociology was no longer an avant-garde discipline which had attracted legions of critical students and scholars in the 1960s and 1970s but had become fully part of the academic machine.
This absence is particularly striking given that the establishment, expansion, and influence of sociology relies on words and phrases, rather than on figures, functions, equations or other mathematical expressions, to a greater degree than any natural science. Books serve as one of the most telling embodiments of a society’s knowledge over time, and the majority of sociology’s most canonical achievements have seen publication in book form. It seems only appropriate, then, to seize the opportunity provided by the Google N-gram corpus to identify and examine the long-term trends and themes that have characterized the field of sociology itself.
Sociology, as one of the core disciplines of the social sciences, is ‘like a caravansary on the Silk Road, filled with all sorts and types of people and beset by bandit gangs of positivists, feminists, interactionists, and Marxists, and even by some larger, far-off states like Economics and the Humanities, all of whom are bent on reducing the place to vassalage’ (Abbott, 2001: 6). Yet, notwithstanding this statement on the complexities of the disciplinary advancement of sociology, there is virtually no empirical sociological research that can attest to the development of these different ‘sorts and types’ of sociological norms, practices and boundaries. In the current study, we conduct the first empirical analysis, to our knowledge, in the field of sociology to use the corpus of digitized books. We analyse the evolution of the usage of the most common words and phrases in terms of disciplinary advancement, sociology scholars, sociology theories, sociology fields and sociology research methodologies between the 1850s and 2008. We also employ the data extracted from the corpus to quantitatively test theories of the development of sociology. Our results show that the annual usage frequency of a particular term, derived through a big-data strategy, not only gives clues as to the historical emergence and progress of sociology – indicating, for example, the longevity or popularity of a particular sociology field or method – but also sheds light on the linkage between the development of sociology and broader sociocultural dynamics over centuries.
Data and method
Since 2004, Google has been engaged in digitizing books held by 40 top universities worldwide, printed as early as 1473 and representing 478 languages (Michel et al., 2011). The first edition of the Google corpus consists of about 5 million volumes of books published between 1550 and 2008, excluding journals and serial publications (around 40 per cent of all scanned publications), which represent a different aspect of culture than do books. To avoid data duplication, the Google corpus team converted billions of book records from over 100 sources of metadata provided by libraries, retailers, and publishers in order to generate a single non-redundant database of book editions (Michel et al., 2011, Supplementary Online Material).
Following exactly the same procedure described in Michel et al. (2011), the second edition of the Google corpus (2012) consists of about 8 million books, representing 6 per cent of all the books printed from the 1500s onward (Lin et al., 2012). Compared to the first edition, the 2012 Google corpus has a larger underlying book collection and higher-quality digitization (Lin et al., 2012). The English corpus alone comprises 4.5 million volumes of books and around half a trillion words (Table 1).

Table 1. The composition of the Google Books corpus
| Language | First edition (2009, 5.2 million books): word count | Second edition (2012, 8.11 million books): book count | Second edition (2012): word count |
|---|---|---|---|
| English | 361 billion | 4.54 million | 468.5 billion |
| French | 45 billion | 0.86 million | 102.2 billion |
| Spanish | 45 billion | 0.79 million | 84 billion |
| German | 37 billion | 0.66 million | 64.7 billion |
| Chinese (simplified) | 13 billion | 0.3 million | 26.9 billion |
| Russian | 35 billion | 0.59 million | 67 billion |
| Hebrew | 2 billion | 0.07 million | 8 billion |
| Italian | – | 0.3 million | 40 billion |
| Total | 538 billion | 8.11 million | 861.3 billion |
The Google Books corpus provides information about how many times per year an ‘n-gram’ appears in all the books included in the corpus. A 1-gram is a string of characters uninterrupted by a space: a single word, for example ‘sociology’, or a number such as ‘1.234’. An n-gram is a contiguous sequence of n 1-grams, such as the phrases ‘sociology theory’ (a 2-gram) and ‘field of sociology’ (a 3-gram). Punctuation and capitalization are preserved in the data set. By searching the Google corpus for a key word or phrase, one can obtain the annual occurrence count of that key word or phrase for a given time period. Although the absolute percentage of any individual word is, of necessity, small, the traces of such words, their rise and fall, can help index the most robust sociocultural trends over a long-term timeline.
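The n-gram notion described above can be illustrated with a few lines of Python. This is a toy sketch of the definition only, not the tooling Michel et al. actually used:

```python
def ngrams(text, n):
    """Return every contiguous n-word string in `text`.

    A 1-gram is a single whitespace-delimited token; an n-gram
    is a sequence of n consecutive 1-grams.
    """
    words = text.split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

# 'field of sociology' is a 3-gram built from three consecutive 1-grams.
print(ngrams("the field of sociology", 3))  # ['the field of', 'field of sociology']
```

The corpus itself then reports, for each such string, how many times it occurs in each publication year.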
In the present analysis, we focus on the English-language books corpus. We also analyse some specific terms in both American English and British English books to make a further comparison across different social contexts.1 In terms of time frame, we restrict our research to between the mid-1850s and 2008 (inclusive) for two reasons. First, sociology emerged as a scholarly discipline in the early part of the nineteenth century and only really started to flourish in the mid-1850s,2 when Karl Marx, Herbert Spencer, and other early-generation scholars began to publish their works in the field (Boudon, 1989). Second, digitization of written texts is a cumulative process. Contemporary holdings of books published in the early 1800s are often incomplete and scant, meaning that information extracted from books before the 1850s could come from a biased sample. At the other end of the timeline, books published after 2008 are still being digitized and included in the Google Books corpus; thus far, there is no data match beyond the year 2008 (Lin et al., 2012).
This language and year restriction can substantially alleviate the potential problem of data accuracy because more than 98 per cent of words are correctly digitized for modern English books (Michel et al. 2011, Supplementary Online Material). Still, two concerns may be raised regarding the representativeness of the Google corpus analysed in the present paper.
First, the corpus was constructed using OCR (optical character recognition) technology. As Michel et al. (2011) mention, books with poor OCR quality (due to size, paper quality, or physical condition) were filtered out. This could lead to a potential sampling problem. Second, the corpus is most likely to be biased towards recent books, since more books are published in more recent years, leading to skewed results of word usage. Regarding the first issue, however, books filtered out due to poor OCR quality accounted for only around 4 per cent of all scanned volumes (Michel et al., 2011, Supplementary Online Material) – a considerably small fraction. As for the second concern, we normalized the total number of appearances of a key word using the frequency of ‘the’ in the same year rather than the total number of all words.3 Thus, we obtained the normalized annual frequency of the word usage of our search terms as:

Rit = Cit / Ct

where Rit denotes the word usage of key word i in year t, Cit represents the total number of appearances of the word i in year t, and Ct is the total number of appearances of ‘the’ in all books published in year t. Conceptually, a higher Rit indicates a higher frequency of word usage and thus higher cultural and social influence for the time period in question.
Drawing on various sociology reference works, including A Dictionary of Sociology (Scott and Marshall, 2009) and Sociology (Giddens and Sutton, 2013), we conducted a panoramic search of the disciplinary advancement of sociology in five major categories: academic significance, masters of sociology, theoretical dimensions, fields of sociology, and analytical methodologies. ‘Academic significance’ refers to the historical position of sociology in human knowledge as a subject related and compared to other subjects; the key word here is ‘sociology’ or ‘sociological’. For ‘masters of sociology’, sociologists’ full names serve as the search terms, and the goal is to chart key figures’ rise to fame and their academic reputations. The key words for ‘theoretical dimensions’ are the names of relevant sociological theories and schools; ‘fields of sociology’ focuses on the sub-branches of sociology and popular research topics; and ‘analytical methodologies’ focuses mainly on the comparison of qualitative and quantitative research methodologies in sociology. Finally, we constructed an overall index derived from all sociology-related key words using the principal component method to demonstrate the overall sociocultural influence of sociology across two centuries of books.
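The paper does not spell out the details of its principal component index, but the generic recipe (standardize each keyword's yearly frequency series, then project each year onto the first principal component) can be sketched with NumPy alone. The data below are invented toy series, not corpus frequencies:

```python
import numpy as np

def overall_index(freq_matrix):
    """First-principal-component score for each year.

    freq_matrix: (years x keywords) array of normalized frequencies.
    Each column is standardized, then the rows are projected onto the
    leading right singular vector (the first principal direction).
    """
    X = (freq_matrix - freq_matrix.mean(axis=0)) / freq_matrix.std(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[0]

# Toy data: three invented keyword series that all trend upward.
rng = np.random.default_rng(0)
trend = np.linspace(0.0, 1.0, 5)
freqs = np.column_stack([trend + 0.01 * rng.standard_normal(5) for _ in range(3)])
index = overall_index(freqs)  # one index value per 'year'
```

Because the toy series share a common upward trend, the resulting index tracks that trend (up to the sign ambiguity inherent in principal components).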
Academic significance of sociology
We first counted the appearances of the key word ‘Sociology’ in the corpus since 1850. As a control group, we also ran a similar search on the four subjects of ‘Philosophy’, ‘Economics’, ‘Anthropology’ and ‘Psychology’. It is worth noting that we did not run a test on ‘Political Science’, because ‘Political’ or ‘Politics’ can be interpreted in numerous ways and the results would therefore likely include non-academic material.
The x-axis of Figure 1 spans the years 1850 to 2008, while the y-axis shows the word frequency statistics for the different subjects. From Figure 1, one can observe that the word ‘Philosophy’ accounts for approximately 0.007 per cent of the total word count. Compared to other subjects, phrases associated with ‘Philosophy’ appeared earlier and more frequently. However, around the turn of the nineteenth to the twentieth century, the curve for ‘Philosophy’ plunged drastically and did not rise again until the early twentieth century. This finding corresponds with the collapse of classical German philosophy, especially the Hegelian school (Solomon, 1988). It is noteworthy that from 1890 to 1920, as the word frequency curve for ‘Philosophy’ dropped, the respective curves for the other subjects rose.
In fact, the word frequency statistics for ‘Sociology’, ‘Economics’ and ‘Anthropology’ rose steadily between the mid-to-late nineteenth century and the 1930s. This was especially so for ‘Economics’, which saw the most drastic uptick in frequency, developing a wide lead over ‘Sociology’, ‘Psychology’ and ‘Anthropology’.
Our analysis yields interesting insights regarding the impact of major world events. For example, during World War I (1914–1918), the statistics for ‘Sociology’, ‘Psychology’ and ‘Economics’ did not drop, but during World War II (1939–1945) they dropped dramatically and only began to increase again with the end of the war. This seems to indicate that WWII had a much greater impact on these disciplines than WWI did. The effect of WWII was reversed, however, in the case of ‘Anthropology’, which saw no decline during the war; indeed, if anything, it saw a slight rise in its statistics. We believe this can be linked to the expansion of the conflict beyond Europe to include Asia, Africa and Oceania, which increased states’ demand for strategic knowledge about non-Western countries. A broader war, on the one hand, secured government funding for anthropology, for strategic purposes, to study nationalism, internationalism, racial supremacy and anti-totalitarianism; on the other hand, it allowed anthropologists themselves to shift their research horizons from traditional subjects such as African and Indian tribes to Eastern Europe and Southeast Asia (Price, 2002). Anthropologist Ruth Benedict’s 1946 study of Japan, The Chrysanthemum and the Sword, stands as arguably one of the best-known examples of such state-driven academic research.
The curves for ‘Sociology’, ‘Economics’, ‘Psychology’ and ‘Anthropology’ all peaked during the 1970s and 1980s, then began another round of slow descent in the 1990s. The descent for each subject might simply represent the dilution of knowledge in a constantly expanding corpus: with the total amount of knowledge possessed by human beings constantly on the rise, the percentage increase year to year for each subject or field might understandably be decreasing. However, for ‘sociology’, the decreasing word frequency does not necessarily mean the decline of the importance of sociology as a discipline. We will analyse this further in a later section.
Masters of sociology

We conducted searches for the full English names of 30 major Western sociologists in the Google N-gram corpus. Figure 2 illustrates the top 12 sociologists in word frequency statistics.4 They are (chronologically): Karl Marx, Herbert Spencer, Max Weber, Emile Durkheim, Georg Simmel, Herbert Marcuse, Talcott Parsons, Erving Goffman, Zygmunt Bauman, Jürgen Habermas, Pierre Bourdieu and Anthony Giddens. From Figure 2, we draw three major findings.
Dilution effect: From Karl Marx to Anthony Giddens, it seems that each new sociologist is destined never to surpass his predecessors’ academic significance. This does not mean that the influence of an individual sociologist can never exceed that of an individual predecessor. For instance, the influence of Pierre Bourdieu after the 1980s exceeded that of his predecessors Georg Simmel and Emile Durkheim and reached 0.00005 per cent around 2003, second only to Karl Marx and Max Weber. However, if we categorize sociologists into generational groups, we can see that the later generation, represented by Talcott Parsons, peaked at 0.00008 per cent in the 1970s, and none of its descendants has ever passed that point, let alone reached the statistics of earlier sociologists like Herbert Spencer and Karl Marx. Thus conceived, it is almost impossible for later generations of sociologists to surpass the fame of the earlier ones.
This phenomenon is due to the explosive growth in the total amount and categories of human knowledge. In other words, sociology constituted a bigger share of given knowledge during the nineteenth century, as that body of knowledge was still being amassed. When it comes to the twentieth and twenty-first centuries, in contrast, though sociology itself has continued to develop and more and more people have become professional sociologists, the discipline’s relative influence in human knowledge has decreased – not unlike the dilution of a substance mixed with ever larger quantities of water. To the extent that Talcott Parsons appears to be the last sociologist with the same level of influence as the generations that came before him, this may well have as much to do with the changing size of the ‘reservoir’ of all human knowledge as it does with Parsons’ work itself.
Exogenous effect: Compared to other sociologists, the word frequency curves with the highest average upward slope were those of Herbert Spencer and Karl Marx. In other words, Spencer and Marx enjoyed the most rapid ascent to positions of authority within the field in terms of influence. The speed of their rise, however, was supported by strong exogenous forces other than academic factors. Herbert Spencer was a generalist – a combination of philosopher, biologist, anthropologist, sociologist, political theorist, and a classic man of letters. He interacted with social elites throughout his life and was connected to many important ideologists and dignitaries. Spencer utilized his high-status social network to gain authority and audience as a generalist, enabling him to become extremely influential in the late nineteenth century, when the total amount of knowledge was still limited. Karl Marx, in comparison, did not enjoy such success in his lifetime; instead, his influence peaked between the 1920s and 1940s, and then again in the 1960s to the 1970s – precisely when Marxism and Communism were becoming influential beyond the academic world and actually changing the course of twentieth-century history.
Acceleration effect: Whereas most of the first generation of sociologists had to enjoy their fame posthumously, twentieth-century sociologists have become influential much earlier in their careers. With the exception of Herbert Spencer, all of the great names of sociology born in the nineteenth century became most reputable after their deaths. Karl Marx became most famous some 20 years after his death; Max Weber’s name began to rise immediately after his death in 1920; and, likewise, none of Emile Durkheim, Georg Simmel or Herbert Marcuse lived to see the years in which their numbers truly blossomed. In contrast, sociologists born in the twentieth century were much luckier. For instance, when Talcott Parsons began to gain fame in the 1940s, he was no more than 40 years old. Anthony Giddens became famous at the same age. Jürgen Habermas and Pierre Bourdieu became highly influential slightly later, but both began their ascent when they were in their fifties, around the 1980s–1990s, and Habermas is still alive today.
This acceleration effect can be ascribed to the development and standardization of sociology as a subject. In the late nineteenth century, as the discipline was still being established, there were fewer scholars and academic standards were, if not lower per se, at the very least less formalized, with greater room for flexibility. Sociology, too, was still in the process of legitimating its claim as a science. All these factors contributed to a longer ‘wait time’, so to speak, for a sociology scholar to reach notable fame. Today, both the discipline and the academic field in general are well established, enabling sociologists to make use of better disciplinary infrastructure and pre-existing channels to increase their influence.
Theoretical dimensions of sociology

The contribution of sociology to human knowledge lies in a series of inspiring and explanatory concepts and theories. As such, we conducted key word searches for classic theories of sociology in order to explore their relative impact. Because most nineteenth-century sociological works are more general in nature – concerned as they were with establishing the basic parameters and goals of the discipline – we focused on the most famous, more specific sociological theories of the twentieth century. As Figure 3 illustrates, we concentrated on the ten most famous sociological theories: Conflict Theory, Social Exchange Theory, Structural Functionalism, Structuration Theory, Symbolic Interactionism, Rational Choice Theory, Ethnomethodology, Neo Functionalism, Strength of Weak Ties, and Structural Holes.
Lifetime trajectory of a theory: We noticed that each theory, from its birth to maturity, from its peak popularity to its point of diminishing returns, has its own life trajectory. In the mid-to-late twentieth century, the majority of the theories reached a peak in their growth rate and usage about 30–40 years after their introduction. After that point, their influence began to diminish. Interestingly, even though the sample of theories is relatively small, this life-cycle pattern fits that found for words more generally by researchers in linguistics. For example, Petersen et al. (2012) have identified universal growth-rate fluctuations in the birth and death rates of words: new words reach a pronounced peak about 30–50 years after they originate, after which point they either enter the long-term lexicon or fall into disuse.
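The peak-lag pattern described here is straightforward to operationalize: given a yearly frequency series for a theory, locate the peak year and subtract the year of introduction. The sketch below uses a synthetic hump-shaped curve, not real n-gram data:

```python
import numpy as np

years = np.arange(1950, 2009)
# Synthetic usage curve for a theory "introduced" in 1950, built to peak
# around 1985 (a Gaussian hump; real n-gram series are much noisier).
usage = np.exp(-((years - 1985) ** 2) / (2 * 12.0 ** 2))

introduced = 1950
peak_year = int(years[np.argmax(usage)])  # year of maximum usage
lag = peak_year - introduced
print(lag)  # 35 -- inside the 30-40 year window noted above
```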
The metabolism of a theory: We also noticed that the influence of earlier theories was superseded by that of newer theories. For instance, the growth rate of Structural Functionalism began to decrease in the mid-1990s while the usage of Structural Holes, a theory 20 years younger, superseded the former. Ethnomethodology and Symbolic Interactionism also appear to be on their way out. Meanwhile, Rational Choice Theory is still increasing in frequency, but now at a slower rate. Furthermore, when we grouped Strength of Weak Ties and Structural Holes together, we found that their total influence had already surpassed that of Structuration Theory and Social Exchange Theory around 2008. In other words, the cultural influence and academic significance of newly developed social capital and social network approaches has already gone beyond that of ‘classical’ sociological theories. Whether they will continue this growth, however, remains to be seen.
Explanatory scale of a theory: Generally speaking, a grand theory possesses greater generalizing power and a wider scope of application. Yet we found that, since at least the mid-twentieth century, the theoretical world is no longer dominated by grand theories. For instance, Anthony Giddens’ Structuration Theory and Talcott Parsons’ Structural Functionalism have fallen significantly below Ethnomethodology, Symbolic Interactionism and Rational Choice Theory, all of which focus on micro-level interactions in society rather than large-scale macro functions of societal structures and institutions. Moreover, as time progresses, there seems to be less and less room reserved for grand theories: theories that thrived after the 1970s, such as Strength of Weak Ties and Structural Holes, all adopt micro or meso perspectives in order to understand human behaviour. While the relative pros and cons of ‘micro’ versus ‘macro’ theories are still the subject of much debate today, we speculate that the ambitious nature of grand theories may have, over time, become a disadvantage, actually limiting their appeal for contemporary theorists. Indeed, it may well be, as many postmodern theorists have already declared, that sociology has entered a ‘post grand theories’ era.
Fields of sociology
Sociology is subdivided into many specialized fields, and these fields change constantly over time. For this analysis, we looked at the shifting pattern of these fields in order to capture the social changes reflected in the larger discipline. We conducted a key word search for eight of the most prominent fields, namely: Educational Sociology (Sociology of Education), Rural Sociology, Urban Sociology, Political Sociology, Economic Sociology, Sociology of Law, Sociology of Religion and Historical Sociology.
A few interesting findings can be observed in Figure 4. First, Educational Sociology emerged early as the most prominent field, but was replaced by the Sociology of Education in the late 1960s. The shift was not merely semantic. Educational Sociology focused primarily on the social and cultural factors affecting relatively small social groups, thus neglecting larger societal influences on education in the post-industrial period. The Sociology of Education, on the other hand, turns its interest to the social function of education and thus investigates the role of education as a social institution (Shimbori, 1972). Second, after the 1990s, both the Sociology of Religion and Historical Sociology progressed at a relatively aggressive pace, particularly when compared with all the other fields, which showed signs of decline. Third, Rural Sociology emerged as a sub-field of the discipline in the early twentieth century and exhibited a very high growth rate from the 1950s to the 1980s. This reflects the fact that Rural Sociology, which grew out of the response to the pronounced differences between rural and urban social organization in the late nineteenth century, was the earliest and most prominent sub-discipline of American sociology, with its development peaking around the 1950s to 1960s (Brunner, 1957; Nelson, 1969).
In addition to the various fields within sociology, we were also interested in shifts in substantive research topics – which subjects were deemed ‘hot’ and when. In Figure 5 we compare eight representative terms from the areas of social stratification and mobility and of social capital and networks: Social Identity, Social Movement, Social Mobility, Social Stratification, Social Capital, Social Network, Social Class and Social Strata.
From Figure 5, we can observe that the growth-rate fluctuations of Social Mobility and Social Stratification peaked around 1975 and then started to decline. The popularity of Social Network rose rapidly from the late 1980s and surpassed Social Mobility around 1997. As Freeman (2004) argues, with the development of desktop computers and computer programs to manage network data, social network research finally took off from the mid-1980s onwards, shifting from ‘network as metaphor’ to ‘network as a mathematical expression’. Around the same time, research on Social Capital exceeded Social Mobility and finally surpassed Social Class around 2003. In other words, research on Social Capital and Social Networks is currently on the rise, while research on Social Mobility and Social Stratification is declining. Meanwhile, research on Social Movements started proliferating around the mid-1960s, when waves of new movements organized around race and gender emerged in both America and Western Europe (Kriesi et al., 1995; Lovenduski, 1986).
Research methodologies of sociology
Which methods are used most by sociologists – quantitative or qualitative methods? To answer this question, we focus on shifts in the relative balance between the two major research methodologies in sociology over the past century.
We first calculated the average score of the annual frequencies of each method in both the quantitative and the qualitative groups from 1950 to 1980. Then we normalized the two groups of average scores into Z values and used Z_QN − Z_QL to obtain an index of quantitative analysis for each year. Figure 6 shows a plot of this index.
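As a sketch, the index construction amounts to two z-normalizations and a subtraction. The arrays below are random placeholders standing in for the annual frequencies of the method terms; only the mechanics mirror the text:

```python
import numpy as np

def zscore(x: np.ndarray) -> np.ndarray:
    # Normalize a series to mean 0, standard deviation 1.
    return (x - x.mean()) / x.std()

rng = np.random.default_rng(0)
# Rows = individual method terms, columns = years (synthetic placeholders).
quant_terms = rng.random((4, 30))  # frequencies of quantitative-method terms
qual_terms = rng.random((4, 30))   # frequencies of qualitative-method terms

z_qn = zscore(quant_terms.mean(axis=0))  # average across terms, then Z
z_ql = zscore(qual_terms.mean(axis=0))
index = z_qn - z_ql  # Z_QN - Z_QL: positive = quantitative relatively ahead
print(index.shape)  # (30,)
```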
From Figure 6, we can see that the two methods took turns ‘in the lead’ across different time periods. From 1950 to 1980, qualitative methods were more prominent, while the usage of quantitative methods surpassed that of qualitative approaches in the 1980s and 1990s, except for a short period around 1995–1997. After 2000, quantitative methods have dominated the majority of scholarship. It is noteworthy that scholars who utilize qualitative methods are also more likely to publish their research in book format, in contrast to quantitative researchers, who are more likely to publish in journals and other formats; therefore, if anything, our calculation likely underestimates the ‘lead’ of quantitative over qualitative methods.
An overall index: influence of sociology
In this section we use the word usage of relevant sociology-related key words in the above categories (except for methodology) to generate an overall measure for the sociocultural influence of sociology in millions of books. We carry out a Principal Components Analysis (PCA) to extract as much information as possible from the corpus while preserving degrees of freedom. We prefer the PCA method to applying the average score of normalized annual frequencies because PCA can ‘concentrate’ much of the sociological signals into the first few factors by ‘screening’ the later factors that are dominated by noise. This is important given that we generate the list of sociology-related words without establishing any theory about how closely the selected signals capture the meaning of ‘sociology’. The factor-predicted score S is calculated by:
S = Σ_{j=1}^{m} (V_j / Σ_{k=1}^{m} V_k) F_j, where m denotes the number of factors with eigenvalues larger than 1, F_j is the score of factor j, and V_j is the proportion of variance explained by factor j, with the m retained factors jointly accounting for a cumulative proportion of explained variance larger than 90 per cent.
We report the factor loadings, variances, and correlations of signals in Table 2. The KMO measure of sampling adequacy and the SMC between each signal and all other signals strongly suggest that these signals pick up sociology-related dynamics in the corpus. The first three principal components account for around 91 per cent of the variance. Using the first three factors and their respective proportions of variance, we can predict the index for the influence of sociology.

Table 2 Factor loadings on and correlations of sociology signals
| Signal | Factor 1 | Factor 2 | Factor 3 | KMO | SMC |
| Social Exchange Theory | 0.1982 | 0.0482 | –0.2378 | 0.9783 | 0.9390 |
| Rational Choice Theory | 0.1732 | 0.2355 | 0.0893 | 0.9217 | 0.9933 |
| Strength of Weak Ties | 0.1861 | 0.1825 | 0.0395 | 0.8835 | 0.9916 |
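The index construction can be sketched with a plain eigendecomposition. The data matrix below is random (rows = years, columns = key-word series), so only the mechanics mirror the text: standardize, keep factors with eigenvalue > 1 up to roughly 90 per cent cumulative variance, and weight the factor scores by explained variance:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((60, 10))  # synthetic: 60 years x 10 sociology key words
Xc = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each series

# Principal components via the correlation matrix.
corr = np.corrcoef(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]          # sort components by eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
explained = eigvals / eigvals.sum()

# Keep factors with eigenvalue > 1, capped where cumulative explained
# variance first reaches 90 per cent.
m = int(np.sum(eigvals > 1))
m = min(m, int(np.searchsorted(np.cumsum(explained), 0.90)) + 1)

scores = Xc @ eigvecs[:, :m]               # factor scores per year
weights = explained[:m] / explained[:m].sum()
overall_index = scores @ weights           # variance-weighted composite
print(overall_index.shape)  # (60,)
```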
In Figure 7, we further present the time series of the z-score equivalents of the overall index for sociology, as well as the time series of the word usage of ‘sociology/sociological’. As the figure shows, the influence of sociology as a discipline took off in the 1970s. Although the word usage of ‘sociology/sociological’ began to decline in the 1980s, the overall usage of sociological terms, including sociological theories and topics, began to skyrocket. This reflects the extent to which sociology has come to penetrate and influence other domains and disciplines. For example, theories of weak ties and structural holes have been widely applied in the study of business management, while social capital has become a popular topic in research on economic development, political participation and public health. We believe the impact of sociology will continue to expand in the foreseeable future.
A research case beyond description
With the help of the Google corpus, we are able to conduct more substantial research into the development of sociology, beyond simply describing the rise and fall of the usage of sociology-related words. We use the case of the early development of sociology in the USA as an example to illustrate how data extracted from the Google corpus can be used to conduct a quantitative study.
Upon the creation of American sociology as a professional discipline circa the 1890s (Cortese, 1995; Young, 2009), the tenets of the social gospel movement made sociology an acceptable course of study in many American denominational colleges. This has led to considerable debate among students of the history of sociology regarding the nature of the connection between sociology and social gospelism (Henking, 1993; Morgan, 1969; Williams and MacLean, 2012). As Morgan (1969: 42) has indicated, ‘the Social Gospel and early sociology were often indistinguishable in terms of both ideas and leading personnel. This close parallelism is seen as a major factor in the early acceptance of sociology as an academic discipline in the nineteenth century universities.’ Research on this question, however, has so far looked only at individual case studies and thus lacks the support of hard data.
Digitized written texts provide a statistical solution to this dilemma. We searched using the key words ‘Sociology’, ‘Social Gospel’ and ‘Hull House’,5 with ‘Anthropology’ as a control group, and compared the results from the American English corpus and the British English corpus. As demonstrated in Figure 8, ‘Social Gospel’ and ‘Sociology’ both show signs of growth from 1890 to 1930 in America, with their respective growth rates close to each other; meanwhile, ‘Anthropology’ shows no visible signs of growth. By contrast, the correlation between the growth of ‘Sociology’ and ‘Social Gospel’ was far less obvious in England.
The above findings, based on visual inspection of the data, provide only preliminary evidence of the effects of the social gospel movement on the development of sociology in America. We thus proceed to use the time series (1890–1930) of ‘Sociology’, ‘Social Gospel’, ‘Hull House’ and ‘Anthropology’ to perform a Granger causality test to formally test the proposed connection between sociology and social gospelism. In the language of time series analysis, X is said to Granger-cause Y if Y can be better predicted using the histories of both X and Y than it can be using the history of Y alone.
Using time series that display the persistence of a unit root process in a standard ordinary least squares equation can lead to spurious correlations. Therefore, we first performed stationarity tests for all four time series using the Dickey–Fuller Generalized Least Squares (DFGLS) method and the Phillips–Perron (P-P) method. We found that all of them are integrated of the first order. We therefore used their first differences to fit a vector autoregressive (VAR) model to examine the relationships among them. The results from the American English corpus in Table 3 clearly show that ‘Social Gospel’ is the Granger-cause of ‘Sociology’ at a 0.05 alpha level, and ‘Hull House’ is the Granger-cause of ‘Sociology’ at the 0.1 alpha level. In addition, the identified time lag suggests that the social gospel movement within the past 4 years can effectively affect the development of sociology at any given time. However, neither of the two terms is the Granger-cause of ‘Anthropology’, even at a 0.1 alpha level. Furthermore, results from the British English corpus demonstrate that there is no Granger-relationship among the time series of ‘Social Gospel’, ‘Hull House’, ‘Sociology’ and ‘Anthropology’ at all. In general, our findings based on time series analyses lend support to the argument that there was a close relationship between the early development of sociology and the social gospel movement in the USA.

Table 3 Granger causality tests for the potential connections between sociology and the social gospel movement using two different corpora
American English Corpus (Lag = 4)

| Null hypothesis | Obs | Statistic | p-value |
| Sg does not Granger-cause Soci | 47 | 22.678*** | 0.000 |
| Hull does not Granger-cause Soci | 47 | 7.896* | 0.095 |
| Anthr does not Granger-cause Soci | 47 | 2.019 | 0.732 |
| Sg does not Granger-cause Anthr | 47 | 2.981 | 0.561 |
| Hull does not Granger-cause Anthr | 47 | 1.195 | 0.879 |
| Soci does not Granger-cause Anthr | 47 | 2.581 | 0.630 |

British English Corpus (Lag = 1)

| Null hypothesis | Obs | Statistic | p-value |
| Sg does not Granger-cause Soci | 50 | 1.112 | 0.292 |
| Hull does not Granger-cause Soci | 50 | 0.155 | 0.694 |
| Anthr does not Granger-cause Soci | 50 | 0.006 | 0.936 |
| Sg does not Granger-cause Anthr | 50 | 0.079 | 0.779 |
| Hull does not Granger-cause Anthr | 50 | 0.119 | 0.729 |
| Soci does not Granger-cause Anthr | 50 | 0.594 | 0.441 |
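The logic of the test can be sketched with a hand-rolled F-comparison on first-differenced series. The two series below are synthetic – one is constructed to lead the other by a year – so this illustrates the procedure rather than re-running the corpus analysis (which used DFGLS/P-P stationarity tests and a VAR):

```python
import numpy as np

def granger_f(y, x, lag=1):
    """F-statistic for H0: lags of x do not help predict y,
    comparing y ~ lags(y) against y ~ lags(y) + lags(x)."""
    n = len(y)
    rows = n - lag
    Y = y[lag:]
    ylags = np.column_stack([y[lag - k - 1:n - k - 1] for k in range(lag)])
    xlags = np.column_stack([x[lag - k - 1:n - k - 1] for k in range(lag)])
    ones = np.ones((rows, 1))
    Xr = np.hstack([ones, ylags])          # restricted model
    Xu = np.hstack([ones, ylags, xlags])   # unrestricted model
    def rss(M):
        beta = np.linalg.lstsq(M, Y, rcond=None)[0]
        return float(np.sum((Y - M @ beta) ** 2))
    df = rows - Xu.shape[1]
    return ((rss(Xr) - rss(Xu)) / lag) / (rss(Xu) / df)

rng = np.random.default_rng(2)
x = rng.normal(size=101).cumsum()          # a unit-root-like series
y = np.empty_like(x)
y[0] = 0.0
y[1:] = 0.8 * x[:-1] + rng.normal(scale=0.1, size=100)  # x leads y

dx, dy = np.diff(x), np.diff(y)            # first differences: I(1) -> I(0)
print(granger_f(dy, dx) > granger_f(dx, dy))  # True: x Granger-leads y
```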
This paper is the first of its kind to use the Google Books N-gram corpus, perhaps the largest electronic corpus yet constructed, to map out the disciplinary advancement of sociology in terms of the discipline in general and its major scholars, theories and research fields from the mid-nineteenth century to 2008. The intention of this research is in no way to suggest an evaluative ranking of the theories, scholars, schools, or methodologies that make up sociology. Instead, our goal has been to respond to Back and Puwar’s (2012) call for a ‘live sociology’ to deal with ‘lively data’, or the challenge posed by big data, the knowledge economy and the digitization of everyday life. As such, the aim and, it is hoped, the contribution of this study has been to show that massive content analysis from digitized books can provide rich insights regarding the historical evolution of professional disciplines and long-term sociocultural changes at a macro level.
Conceptually, examining the high-frequency use of a specific term in a representative sample of written texts is particularly important because it helps ‘identify the dynamics of historical emergence, decline, and comparative significance of a political concept’ (Hassanpour, 2013: 299). This gives corpus methodology significant advantages over traditional survey methods, in which both the quantity and the availability of data are limited (Beer and Burrows, 2013; Lin et al., 2012). The use of newspaper data from one or more localities also tends to produce validity and reliability problems, and there is no standard solution to correct for potential description and selection bias (Earl et al., 2004; Oliver and Myers, 1999). So far, however, the use of corpus data analysis has barely started among sociologists. With the exploding scale of digitization, more and more materials will be included in the historical corpus in the years to come. This will fundamentally change the scope of our research and open up new avenues for sociologists to employ new and creative approaches to social research.
Of course, there is still room for improvement in the present research. First, the full dataset analysed here only accounts for around 6 per cent of all books ever published from 1500 onwards. This means that it may be biased relative to the ensemble of all surviving books. The scanned and digitized books in particular were mainly borrowed from university or public libraries, retailers and publishers, and thus the composition of the corpus reflects the acquisition practices of the participating institutions. Although the assembled collections of books from various participating institutions could still be argued to be representative, the results here are tentative and should be treated with some caution.
Second, there are many searchable sociology terms, and we cover only a small proportion of the sociologists, theories and research fields. The phenomena and patterns observed might therefore not be fully representative. For example, we have only addressed some classic, traditional and established research fields of sociology, such as social class, social movements or social capital; other important newer fields such as globalization, migration, gerontology, gender, and race or ethnicity are increasingly popular among contemporary sociologists but may be under-represented here. The goal of this study has been to use novel data and visualization methods to shed light on the history of sociology itself, not by any means to summarize over a hundred years of sociological research.
Third, the advanced search function of the database is still limited and, therefore, the accuracy of the search results is far from perfect. For instance, different names may be attached to the same sociological terminology, and words are sometimes used in ways that do not convey the same sociological concept as the one intended in the analysis. Even though we have used Google search engines as a control group and chosen the version with the highest level of representation, the accuracy of the results may still be lacking.
Despite its drawbacks, our research strategy is sufficient to show that written literary data in human history can help reinvigorate a sociological imagination able to extrapolate the historical trajectory of a sociological practice. Michel et al. (2011) proposed the concept of ‘culturomics’ to refer to the use of high-throughput digitized resources to study sociocultural trends and the human cultural genome. Similarly, we suggest opening up a new field – ‘socialomics’ – to study the current state of a dynamic, fluid social world with massive digitized data collection and analysis. The value of establishing such an energetic and forward-thinking approach lies in the fact that the amount of human knowledge accessible to sociologists via physical reading is, in fact, very limited. This glass ceiling of academic research could result in a form of myopia, blinding us to the development of social science within and across media and forums not limited to the book format. With ‘genetic’ analysis of word frequency usage in a digitized era, we are likely to gain theoretical inspiration and academic knowledge that earlier generations of sociologists could not even have imagined.
- 1We also examine whether the pattern we find in the main analysis can be applied to the narrative-of-event corpora of newspapers. We searched the same key words in the field of sociology in the corpus of the New York Times and the results show similar general trends. Results of the relevant tests are available from the authors upon request.
- 2Although sociology’s exact timeline as a field/profession/discipline remains contested, this general time period works for the purposes of the current paper.
- 3Here we follow Bentley et al. (2014) and Acerbi et al. (2013), both studies that use this strategy. According to Acerbi et al. (2013), the word ‘the’ stably accounts for around 6 per cent of all words per year, and is thus a good representative of real writing and real sentences.
- 4The curves for the other 18 sociologists all lay beneath the curve for Jürgen Habermas. They are: Herbert Blumer, Charles Cooley, Alfred Schutz, George Mead, Harold Garfinkel, Max Horkheimer, Niklas Luhmann, György Lukács, C. Wright Mills, Robert Merton, Ralf Dahrendorf, Gerhard Lenski, Peter Blau, Randall Collins, Jeffrey Alexander, James Coleman, Immanuel Wallerstein and Norbert Elias.
- 5Hull House was the most famous ‘good-neighbor’ centre in the social gospel movement. Its founder Jane Addams later won a Nobel Peace Prize.
Science fiction likes to depict robots as autonomous machines, capable of making their own decisions and often expressing their own personalities. Yet we also tend to think of robots as property, and as lacking the kind of rights that we reserve for people.
But if a machine can think, decide and act on its own volition, if it can be harmed or held responsible for its actions, should we stop treating it like property and start treating it more like a person with rights?
What if a robot achieves true self-awareness? Should it have equal rights with us and the same protection under the law, or at least something similar?
These are some of the issues being discussed by the European Parliament’s Committee on Legal Affairs. Last year it released a draft report and motion calling for a set of civil law rules on robotics regulating their manufacture, use, autonomy and impact upon society.
Of the legal solutions proposed, perhaps most interesting was the suggestion of creating a legal status of “electronic persons” for the most sophisticated robots.
The report acknowledged that improvements in the autonomous and cognitive abilities of robots make them more than simple tools, and make ordinary rules on liability, such as contractual and tort liability, insufficient for handling them.
For example, the current EU directive on liability for harm by robots only covers foreseeable damage caused by manufacturing defects. In these cases, the manufacturer is responsible. However, when robots are able to learn and adapt to their environment in unpredictable ways, it’s harder for a manufacturer to foresee problems that could cause harm.
The report also asks whether sufficiently sophisticated robots should be regarded as natural persons, legal persons (like corporations), animals or objects. Rather than lumping them into an existing category, it proposes that a new category of “electronic person” is more appropriate.
The report does not advocate immediate legislative action, though. Instead it proposes that legislation be updated if and when robots develop greater behavioural sophistication. If this occurs, one recommendation is to reduce the liability of “creators” in proportion to the autonomy of the robot, with a compulsory “no-fault” liability insurance scheme covering the shortfall.
But why go so far as to create a new category of “electronic persons”? After all, computers still have a long way to go before they match human intelligence, if they ever do.
But it can be agreed that robots, or more precisely the software that controls them, are becoming increasingly complex. Autonomous (or “emergent”) machines are becoming more common. There are ongoing discussions about the legal liability for autonomous vehicles, or whether we might be able to sue robotic surgeons.
These are not complicated problems as long as liability rests with the manufacturers. But what if manufacturers cannot be easily identified, such as if open source software is used by autonomous vehicles? Whom do you sue when there are millions of “creators” all over the world?
Artificial intelligence is also starting to live up to its moniker. Alan Turing, the father of modern computing, proposed a test in which a computer is considered “intelligent” if it fools humans into believing that the computer is human by its responses to questions. Already there are machines that are getting close to passing this test.
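Turing’s proposal is often described as the “imitation game”: a judge questions two hidden respondents and must work out which one is the machine. The sketch below illustrates that protocol in minimal form; the `imitation_game` function, the stand-in respondents, and the guessing judge are all hypothetical names invented for illustration, not part of any real benchmark.

```python
import random

def imitation_game(judge, human, machine, questions):
    """Minimal sketch of Turing's imitation game: a judge questions two
    hidden respondents and must guess which one is the machine."""
    # Randomly assign the hidden labels so the judge cannot rely on order.
    respondents = {"A": human, "B": machine}
    if random.random() < 0.5:
        respondents = {"A": machine, "B": human}

    # Each respondent answers every question; the judge sees only transcripts.
    transcripts = {label: [(q, answer(q)) for q in questions]
                   for label, answer in respondents.items()}
    guess = judge(transcripts)  # judge returns "A" or "B"
    machine_label = "A" if respondents["A"] is machine else "B"
    # The machine "passes" this round if the judge misidentifies it.
    return guess != machine_label

# Hypothetical stand-ins, for illustration only: both respondents give the
# same evasive answer, so even a careful judge would be reduced to guessing.
human = lambda q: "I'd have to think about that."
machine = lambda q: "I'd have to think about that."
naive_judge = lambda transcripts: random.choice(["A", "B"])

passed = imitation_game(naive_judge, human, machine, ["What is love?"])
```

With this guessing judge the machine passes roughly half the time, which is exactly the threshold Turing had in mind: a machine counts as “intelligent” when the judge can do no better than chance.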
There are also other incredible successes, such as the computer that creates soundtracks to videos that are indistinguishable from natural sounds, the robot that can beat CAPTCHA, one that can create handwriting indistinguishable from human handwriting and the AI that recently beat some of the world’s best poker players.
If this progress continues, it may not be long before self-aware robots are more than a product of fanciful speculation.
The EU report is among the first to formally consider these issues, but other countries are also engaging with them. Peking University’s Yueh-Hsuan Weng writes that Japan and South Korea expect humans and robots to coexist by 2030. Japan’s Ministry of Economy, Trade and Industry has created a series of robot guidelines addressing business and safety issues for next-generation robots.
If we did give robots some kind of legal status, what would it be? If they behaved like humans we could treat them like legal subjects rather than legal objects, or at least something in between. Legal subjects have rights and duties, and this gives them legal “personhood”. They do not have to be physical persons; a corporation is not a physical person but is recognised as a legal subject. Legal objects, on the other hand, do not have rights or duties although they may have economic value.
Assigning rights and duties to an inanimate object or software program independent of their creators may seem strange. However, with corporations we already see extensive rights and obligations given to fictitious legal entities.
Perhaps the approach to robots could be similar to that for corporations? The robot (or software program), if sufficiently sophisticated or if it satisfies certain requirements, could be given similar rights to a corporation. This would allow it to earn money, pay taxes, own assets and sue or be sued independently of its creators. Its creators could, like directors of corporations, have rights or duties to the robot and to others with whom the robot interacts.
Robots would still have to be partly treated as legal objects since, unlike corporations, they may have physical bodies. The “electronic person” could thus be a combination of both a legal subject and a legal object.
The European Parliament will vote on the resolution this month. Regardless of the result, reconsidering robots and the law is inevitable and will require complex legal, computer science and insurance research.