SSPC 2013 Conference – Understanding Stigma of Mental Illness

11 Dec

Posting this as a reminder to myself, though if anyone’s interested in going I think I’ll be there.


Against “Right to Work” Legislation – a short and angry list of inconvenient details

10 Dec

As discussions about and demonstrations against Michigan’s “Right to Work” legislation reach their peak (or nadir, depending on the scale of your hostility towards workers), I thought I’d point out a few misunderstandings that have been critical to this arrogant legal deceit.

First, “Right to Work” legislation is not meant to give people the “freedom to work without union involvement”; i.e., it does not unyoke them from burdensome and imposed union dues. The ability to abstain from union membership is already enshrined by law. No one – no one – is currently obliged to pay dues against their will, nor are they prevented from working based on this abstention. In 1947, the Taft-Hartley Act prohibited “closed shops,” that is, workplaces wherein everyone had to be a union member. In its very name, “Right to Work” legislation is a lie.

What the law might actually change is whether workers have to contribute to specific bargaining costs. In most situations relevant here, all workers in a shop, factory, or (if lucky) industry are members of a bargaining unit. This is distinct from the union itself. A bargaining unit represents all workers under a specific contract to an employer, and represents them as a collective during contract negotiations. This stands opposed to an employee-by-employee, individualized negotiation process that would, and has, profoundly undercut wages, benefits, etc., by pitting worker against worker in a race to the bottom. Collective bargaining, which is predicated on bargaining units, is fundamental to whatever remains of the dignity, influence and protection of the North American worker.

Importantly, and I repeat this, a worker is part of these negotiations regardless of union involvement, and is represented by a bargaining committee regardless of paying union dues. As this is a time- and labour-intensive process, however, workers are obliged to pay for their representation at the bargaining table. Those who want to opt out of these fees would invariably be represented anyway, but would be represented for free, at the cost of their fee-paying colleagues.

While the distinction between these fees and union dues might be muddied, consider a parallel to broader governance: you pay taxes for services, “public goods,” that are unconnected to your political affiliation; you can then choose, regardless of those taxes, to support and join a specific political party. The former is payment for services that are necessary to the functioning of a healthy society, of which you are inescapably a part; the latter is a matter of personal choice and commitment.

Second, studies show that RTW legislation in the 23 states that have it has been coincident with an average increase of 26% in manufacturing jobs. This suggests employment moves to those places with weaker bargaining units. Not much of a surprise there. However, in these cases the RTW laws have usually gone hand-in-hand with broader “business attracting” policies (read: environmental deregulation, health and safety laxity, financial incentives and subsidies), so these numbers are tricky to ground. Opposing them, and with the same statistical flaws, are studies that show that in non-RTW states average wages are $1500 higher, employer-sponsored health insurance is 2.6% higher, and employer-sponsored pensions are 4.8% higher. And considering that manufacturing, when it is “brought back onshore,” leads to decreasing job creation (see Paul Krugman’s recent NYT post), we might want to start protecting the quality of the jobs we do have, rather than sacrificing workers on the altar of business worship.

Third, those who ideologically support RTW legislation as a means of “freeing the market” … well, I doubt they’re reading this blog post. I would point them, and urge you to point them, to the fact that there is no “free market,” nor has there ever been one. “Reducing signal noise” by “freeing rational actors” to “rationally choose how to work, spend, or wither” is pop economics. The sort of “rational actor” we take as the first step in free-market arguments is a fiction that my own discipline, anthropology, has shown again and again to be groundless. The “cost/risk analysis” in which this actor is meant to engage is a rational calculus performed by no one in the world (I stress “in the world,” as there are many who perform it on paper from comfortable ergonomic chairs in think tanks).

This was an analytic devised by followers of sadist Italian economist Vilfredo Pareto, whose measure of economic efficiency was an arbitrary one meant to counteract the more “equality and redistribution” theories of the Utilitarians. It and its children are predicated on a notion of “the market” as a disembedded and transcendental system outside of local idioms of goodness, non-economic value, and villainy.

And those who argue for it and act on it do so disingenuously, arguing for “free markets” while ignoring the fact that American foreign policy, for instance, effectively subsidizes domestic industry through import-led interventions (including military support and development, e.g., in Israel) in its client states, props up the oil industry, bails out banks, and engages in innumerable other “market distortions” to suit those who otherwise clamour for its freedom and purity. Collectively-bargaining workers are just one “distortion” – in favour of the most vulnerable and subject – against an overwhelmingly systematic distortion in favour of business interests. In this balance, it is a corrective, not an isolated bias and uniquely selfish tactic, and a hard-won one at that.
To say otherwise is to ignore history, and would be high buffoonery if it weren’t so convenient for those hell-bent on ruining lives. And to profit from this fragile process without having to contribute, cloaking selfishness in free-market ideals and the promiscuous word “freedom,” is definitively in poor taste.

Making “Class” – Paris under the 2nd Empire (Part 2)

10 Dec

Part 1 here.

Louis-Napoléon caricatured by Honoré Daumier.

After the events of 1848, Louis-Napoléon Bonaparte, the nephew of Napoléon I and a one-time revolutionary who had twice tried to stage a Bonapartist coup against Louis-Philippe (for which he spent some time in prison and considerable time in exile), was elected as President of the Republic. He had strong support from the provinces, as well as the advantage of not having participated on either side in 1848. When he got into a stalemate with the National Assembly in 1851, he staged a coup d’état confirmed with a 92 percent “yes” plebiscite (Wright 1987), thus inaugurating the 2nd Empire.

Under Louis-Napoléon, French industry and commerce took on a recognizably modern élan: industrial production doubled between 1852 and 1870, foreign trade tripled, the use of steam power multiplied by five and the use of railways by six (Ibid.). This was largely made possible by Louis-Napoléon’s policies, which included high protective tariffs to provide a domestic enclave for growth, low taxes, non-interference in industry by government, concessions and guarantees for business, and a doubling of governmental expenditure on public works (Ibid.). Given this acceleration, it is no surprise that there were, early on, some self-identified working-class groups with revolutionary principles. Communist Étienne Cabet and socialist Pierre-Joseph Proudhon, for instance, were both active organizers. As early as 1848, in fact, Cabet’s newspaper Le Populaire had a circulation of more than 5000 among Parisian workers, and his communist movement had a substantial following. However, rebuffed in his efforts to build solidarity with the progressive bourgeoisie, and rejecting the inevitability (or even perhaps the ontology) of class conflict, he and many followers abandoned France for America to start a small-scale commune called Icaria. In this case, there was a failure to convincingly stake out the boundaries of his purview, which it seems would be a prerequisite for solidarity and social action. Things might have turned out differently had there been an existing, clearly-defined group to which he could lash his programme.

At the time, however, there simply wasn’t a stable and distinct working class or, at least, one that recognized itself as such. Harvey notes that Paris in mid-century was France’s most important and diversified manufacturing centre; in 1866, 58 percent of its 1.8 million people depended on industry for their livelihood (against the slim 13 percent that relied on commerce). Despite this concentration, however, the conditions of life and work weren’t amenable to connection across firms or trades. In 1847, more than half of Parisian enterprises had fewer than two employees, only 11 percent had more than 10 employees, and no more than 425 had more than 500. Over time, this fragmentation increased. Between 1848 and 1871, the clothing industry had 10 percent more enterprises but 20 percent fewer workers, and the chemical industry had 45 percent more firms but 5 percent fewer workers. Within these numerous firms, it was very often hard to distinguish between owners and workers, both because they worked closely together and thus developed sympathy for and cooperation with each other, and because of mobility between the two positions, whether ascent (through marriage or promotion) or descent (by financial failure). Further, it was hard to distinguish between manufacture and commerce at mid-century because workshops and boutiques were often in the same building with the same owner. While this might have led to some vertical connections, based for instance on these “lower” groups’ resentment of high-level financiers and the increasing commercial monopoly, there was no clear and close enemy against whom they could consolidate. This would result in commiseration, not class consciousness.

Craft-workers, who in 1848 constituted 40 percent of the Parisian workforce, are often held to be the exception. Harvey notes that, in the first half of the 1800s, craft workers had their own hierarchies and organizations, as well as centralized labour markets, which allowed them to make coherent demands as a body and thus negotiate collectively over wages and working conditions. However, their enduring tradition of medieval fraternities, or compagnonnages, as described by Wright, involved rituals, secret meetings and rivalries. This meant they were likely an obstacle to inter-trade solidarity (and in fact their residue as late as the 20th century complicated the formation of trade unions); they certainly excluded the bulk of specialized and unskilled workers. But in the last half of the 1800s, this latter group was growing substantially and absorbing many craft-workers who had been bypassed by changing labour processes. By 1870, even their collective labour market, based on common hiring locations to which potential employers had to gingerly make their way, had disappeared entirely.

As rent prices in the city soared, many large industries moved to the periphery, but in their place small specialty producers mushroomed (Harvey 2003). Rather than balancing out the effects of heavy industry’s dispersal, however, these smaller firms tended to rely on subcontracting, along with the hyper-specialization and deskilling of tasks, lowering their overhead and pushing the costs of rent and materials onto the workers. This not only degraded the position of craft-workers, who no longer had a privileged place in most work-flows, but also furthered worker fragmentation and lowered wages through increased individual-level competition. (For precisely this reason, subcontracting had been outlawed during the brief period in 1848 when workers had leverage; it was quickly legalized again in 1852.)

The deskilling of labour also allowed for the use of unskilled immigrant workers from the provinces, who flooded into Paris after 1860 (Stovall 1990), as well as women and children. Worker scarcity from 1852 until the 1860s had given labourers a small degree of bargaining power regarding wages; the use of women and children offset this, and nominal wages stagnated while inflation rose. Only some workers stayed ahead of inflation, such as carpenters and mechanics, while the rest sank into penury, thus polarizing an already fractured work-force (Harvey 2003). The privileged few were still able to take off “Saint Monday,” the informal holiday reserved for the nurturing of Sunday-excursion hangovers. The rest were working up to 14 hours a day in small sweatshops, with single men making just enough money to survive and working women having to marry just to stay afloat (Ibid.).

As Haussmann cleared away old housing in the center of the city and rents rose, workers moved away from the increasingly-consolidated rich quarters in the west – in which they had once lived under the same roofs as their employers – and into more homogeneously working-class neighbourhoods closer to the periphery, generally into the north-eastern 19th and 20th and the south-eastern 13th arrondissements (Stovall 1990). The large public works projects attracted a flood of immigrants from the poorer provinces. The Parisian population was growing, but not reproducing itself (the rate of births over deaths wasn’t higher than 1 percent in any year between 1860 and 1900), and the unskilled new arrivals were amassing in the densely-populated outer quarters: against the inner arrondissements (the 1st through the 10th), which grew by 7.1 percent from 1861 to 1896, the 11th through 20th grew by 103 percent, resulting in severe overcrowding, with rates as high as 64.2, 65.5 and 66 percent for the residents of the 13th, 19th and 20th arrondissements, respectively (compared to a city-wide average of 14.9 percent) (Ibid.).

This concentration, as well as increased separation between those living at different economic levels, led to regional solidarities, as workers from different trades and traditions were brought into close proximity. This was particularly so in the neighbourhood cafes and cabarets, which became centres for political agitation and organization (Harvey 2003). However, there was little means of connecting across these regional solidarities to forge class identity based on shared interest rather than shared living quarters. Efforts to match the bourgeois mastery of space, for instance by creating a working-class, city-wide press, or voting in an independent political candidate to serve as a lightning rod for otherwise diffuse working-class energies, were quashed by Republicans (Ibid.).

This fragmentation didn’t necessarily prevent collective action, as the working class was at least minimally defined (though in negative) by pressures from above and, if not uniformly shared, at least significantly overlapping conditions of labour. Falling wages, the dismantling of welfare in favour of unstable public works projects by Georges-Eugène Haussmann (prefect of the Seine département and urban planner who overhauled the Parisian centre), repressive laws against workers’ organizations before 1867, including a law against association dating from 1852, and totalitarian surveillance all affected workers as a group (though not them exclusively). However, Harvey notes, “[If] Paris had a rather more conventional sort of proletariat in 1870 than it did in 1848, the working classes were still highly differentiated” (2003: 233). During Louis-Napoléon’s “liberal” period after 1859, though, a number of changes would open channels of solidarity between these groups, and a class consciousness seems gradually to have emerged despite this differentiation.

Making “Class” – Paris under the 2nd Empire (Part 1)

10 Dec

I talk a lot about work. Between participation in our Teaching Assistants’ union, courses on labour and distribution, and more-or-less heated-or-tipsy conversations with friends, it seems to come up a lot. Yes, work is changing, always, and our situation now is marked by precariousness, contract-work, the privatization and individualization of the security net, uncertain futures, fluid and unpredictable financial foundations, etc., etc. No, we’re not sure how to act within this new space, and so we talk about mobilization, aesthetics, demand, collectivity, and on. But seldom do we tackle class. Perhaps we’ve unthinkingly capitulated to the powerful (and erroneous) discourse that tells us class left with factory work – presumably to the “3rd World” – and that we’re united now by our more comfortable and modern Canadian concerns, which range from leisurely stabs at Rob Ford (oaf! Atwood!) to murmurings of ISP-masked revolution re: our nation’s neutered Netflix catalogue.

As though our own working futures weren’t dismal. Or our cities weren’t ringed by suburbs of (to the one side) distant and affluent 905ers and (to the other) geographically isolated and under-serviced migrant enclaves. Or our province wasn’t attacking one of the few remaining bastions of collective worker sentiment, the public employees. Or our council wasn’t debating a 5-cent “plastic bag fee” while our country’s “4th world” native citizens mobilize marches just to have their basic rights and needs recognized.

I think class is still a useful optic because there are many here who are still structurally, materially excluded and collected. Yet we don’t talk about it when we plan or dream; at least, my friends and I don’t, despite all our chat about work and the world.

As a humble first step towards recognizing where we’re at, I’ll be posting a series of writings about class consolidation in Paris during the 1800s. This is meant to be a case study of loose contemporary relevance. Working-class struggle there and then was brutal, bloody, and urgent, and to this day is a source of pride to the many who still remember. Knowing how “class” came into their lives as a way of seeing themselves and recognizing others is important, I think, and might help us consider the details of our own confusion today.

This will likely spread across a few days of posts, and works cited will come at the end when I’m able to collect them all.



On the far side of this history is the revolution and violent counter-revolution of 1848, in which 1 500 – 3 000 workers were killed and 12 000 arrested, with many sent to Algeria as punishment (Wright 1987). According to Wright’s history, workers before 1848 lived in horrid abjection, working 15 or more hours a day for wages so low that many people depended on charity to supplement their income, ate meat once or twice a year and spent 30 – 50 percent of their wage on bread. Prompted by these conditions and united with Republicans who opposed the hyper-conservative Bourbon king Charles X, workers participated in the July Revolution of 1830, the “Three Glorious Days” that ousted the king and brought in the relatively more liberal monarch Louis-Philippe, the Duc d’Orléans. Under his rule, however, dissatisfaction in the lower quarters persisted, as the upper 5 percent had 75.8 percent of the inherited wealth and the bottom 75 percent had only 0.6 percent (Harvey 2003). This inequality, as well as his heightening conservatism, led to resentment that condensed into barricades on February 22, 1848, resulting in his abdication.

The resulting provisional government included both moderate and radical Republicans, and responded to the workers’ plight with a study of potential reforms and the setting up of National Workshops, which organized work for the surplus labour force. Due to a collapse in Parisian industry and an influx of provincial migrants, enrollment in these Workshops increased from 14 000 to 117 000 between March and June (Harvey 2003). Some pro-labour laws were put into effect, including a 10-hour limit to the work day and a prohibition on subcontracted, piece-rate labour. An Executive Commission was also created and sent to the Luxembourg Palace to work, in relative isolation, on a comprehensive labour plan (Marx 1959 [1850]). However, on April 23, elections gave control of the National Assembly to conservative forces, who were fearful of losing their grip on workers and so used the National Workshops, unpopular with peasants who paid the bills but saw no benefit, as a leverage point for aggressive attack. They began to make access to the workshops more difficult, converting them to piece-rate work rather than daily wages, exiling non-Paris-born workers to miserable earthworks projects in Sologne, and eventually even excluding all unmarried workers (Ibid.). On June 22, the day after this latter exclusion was declared, a second insurrection erupted, the “June Days” in which 1 500 – 3 000 rebels (workers and sympathetic radical Republicans) were killed in clashes with the largely bourgeois National Guard. A further 12 000 were arrested, with many exiled to Algeria (Wright 1987). The compromised leaders of the radical Republicans were removed from the playing board. And though the conservatives had secured their position, their betrayal of those who had helped them oust the monarch would long suppurate beneath the surface of the relative peace that followed.

While clearly an uprising against the consolidating power of the conservative faction in the National Assembly, any class lines drawn around the insurrection’s participants can be very easily blurred, despite Marx’s celebrations of it as a distinctly working-class rebellion (1959 [1850]). Craft-workers were certainly key to its organization as well as the bloody fighting that ensued (Harvey 2003). Alongside them was a motley host of less skilled workers, with whom the craft-workers could agitate but probably not identify as being of the same class (as discussed below). Complicating the picture is the presence of the “lumpen-proletariat,” those lowest-class members of the “vile multitude.” Denigrated from above and below, they were considered by Marx (1959 [1850]) to be the main weapon of the counter-revolutionary forces, enrolled by the conservatives into their Mobile Guard. Later academics have revised this history, noting that the Mobile Guard was in fact indistinguishable from the workers, apart from their having been isolated and given an esprit de corps (Hayes 1993). At any rate, due to the insecurity of labour, the distinction between the “productive” and “non-productive” lower classes was porous to the point of dissolution. Similarly, the petty bourgeoisie, supposed by Marx to have been unequalled in their fanatical fight “for the salvation of property and the restoration of credit” (1959 [1850]: 311) were in fact integrated on both sides of the struggle. Even apart from the close-knit relations many of the workers and owners had with each other, and the sympathies that would result, a portion of the bourgeoisie had economic motivations for participating. Having been indebted by France’s disastrous war against Prussia, many wanted to see the end of the 3rd Republic that had, one week before, enacted a law enforcing repayment of all late debts (Hayes 1993).

Marx describes the 1848 struggle as “the first great battle… fought between the two classes that split modern society” (1959 [1850]: 304). With the exception perhaps of that small minority of those involved who had ingested the radical social theories that had begun to circulate (more on this below), I suspect the participants would recognize little of their experience in this stark binary. Indeed, while brought together in common cause, there is little reason to suppose that even those working men of the same neighbourhood, unless they were part of the same craft guild or workshop, would have thought of each other as being part of the same group. This was most likely a multitude, heterogeneous in background and interest, brought together against a shared enemy.


Aaron Butler Xylophone Performance

10 Dec

Beautifully played and expertly curated trip through early minimalism.

Watch and re-watch the first moments when he approaches the instrument, and imagine the bodily relationship he has with it. He squares himself to it with extraordinary intimacy and care, demonstrating how much of learning and playing isn’t intellectual recall but is immediate, pre-reflective, and deeply practical.

I’m off now to do the same to a book: place it on a desk, then approach it with mute and earnest care as though it were a BMX jump I was considering. I’ll update you all on how it turns out, in a post most likely to be titled “How to Read with Your Stomach.”

Some Thoughts on Habit

9 Dec

“The great thing, then, in all education, is to make our nervous system our ally instead of our enemy. It is to fund and capitalize our acquisitions, and live at ease upon the interest of the fund. For this we must make automatic and habitual, as early as possible, as many useful actions as we can, and guard against growing into ways that are likely to be disadvantageous to us, as we should guard against the plague. The more of the details of our daily life we can hand over to the effortless custody of automatism the more our higher powers of mind will be set free for their own proper work…. If there be such daily duties not yet ingrained in any one of my readers, let him begin this very hour to set the matter right.”

(William James, “Habit,” from Selected Papers on Philosophy.)

I had a conversation yesterday about actions and character, running along the usual lines of “you are who you are” vs. “you are what you do.” As is always the case when trying to talk of these things, whatever balance between the two apparent options we might have had was lost as slight disagreement led to dramatic opposition, and we were both reactively taking positions neither would probably want to defend. I hate watching a talk fumble, but there didn’t seem to be a way out – once you enter into talk that cleaves identity from actions, you can’t pull them back together. There’s very little language for it.

I remember literally staying up at nights trying to balance my sympathy for existentialism – an identification founded on the personal experiences of isolation, doubt, shaky self-definition (i.e. my early 20s) – and my increasing acceptance of a restrained structuralism. For those who find this opposition a bit crude, let me explain that I was living alone in an apartment in downtown Montreal eating far too much chickpea curry and pizza; it’ll ravage one’s sensitivities to nuance. But it did seem impossible to reconcile the lived experience of chosen, decisive action and the reality of connection, cause, and the co-determination of, apparently, all things. (I lean a bit materialist, for the record, though it’s a subtle tilt, having been fired up by Raymond Williams and, through him, Louis Althusser.)

This forced choice – radical freedom or extensive causality – parallels, though isn’t necessarily reducible to, so many others. Indeed, it seems to be the fault line beneath all kinds of conversation, despite some semblance of surface calm. We talk about deeds done, indiscretions, faults committed, and we balance these acts against intention, circumstance, the mitigation that context allows; we pick and choose which of our feelings or choices indicates our deep character, bracketing the rest as meaningless and exceptional; we lie on couches and map our anxieties onto childhood, our early formation, and tie those determinations tight while trying to find ways to let our “true self” manifest; the list stretches long…

My own recent work has looked at Alcoholics Anonymous, which offers a rather meaty example of these contradictions and ambivalences. Mariana Valverde’s Diseases of the Will details these conflicts, as popular understandings of alcoholism have swung endlessly between drunkenness as a moral failing and drunkenness as an innate disposition (and so a medical, rather than moral, concern). My own reading of the official literature and discourses of A.A. suggests that this tendency to swing between two poles is as present in those suffering from alcohol dependence as it is in those who try to describe that suffering from the outside. The disease model of alcoholism suggests that an alcoholic is who one is; yet the moral culpability for one’s drinking, and the imperative to remain abstinent, suggests that alcoholism is what one chooses to do. Again, the determination of one’s action (drinking) by one’s character (a diseased alcoholic) is set against the reverse: the determination of one’s character (the identity of “alcoholic”) by one’s action (a history of drink).

I won’t elaborate further on the distinction I just mentioned, as I’ve been advised by several loved ones to keep these posts shorter and sweeter. (If any of you are reading this, I expect acknowledgement and praise.) I have described it, though, because in this case there is an evident middle ground – the American tradition of pragmatism, specifically, its expression in William James’ writings about habit. This excites me, so here’s a link to a short paper of James’ about the subject. If you’ve read him before, you’ll know this will be a pleasure to flip through; if you haven’t, you owe yourself.

Between the extremes of radically-free action and over-determining “character” or “identity,” habit sits unassumingly and barely noticed. But it offers a way out, at least in Valverde’s view (and my own). Our early actions accumulate and establish later predispositions, and later tendencies grow out of our earlier choices and behaviours; our decisions settle like sediment and ultimately establish, with more or less room for upheaval, the terrain of our character. But, importantly, this character isn’t reducible to an “identity” – there is no substrate of “type” or “category” that serves to determine or explain our actions (which include feelings and thoughts, in my view). James famously wrote that we are “a bundle of habits,” and this metaphor helps us escape the feeling that we are agents of haphazard choosing (for we are in fact “bundled”) and, equally, the suspicion that we are causally chained to our identity (bundles are loose, and no one stick sits beneath the rest). While the whole still depends on its parts, and still forms a whole, no one part is foundational, determining, irreducible.

Two important things follow from this: 1) we’re better able to make sense of things that weren’t “readable” through the bifocals we were earlier limited to; 2) the false dichotomy between the bounded self and the pervasion of the social collapses, and with it falls the wall between internally-perceived freedom and externally-observed cause. I’ll provide a quick example for the first, a brief comment for the second, and leave it at that, with the hope of eliciting some commentary from my dedicated readership (Boots, are you online right now?).

1) Efforts to understand the “work” of Alcoholics Anonymous persistently center on “identity conversion,” the process whereby new affiliates take on the identity of “alcoholic,” as well as its presuppositions about the “disease,” “alcohol allergy,” the need for abstinence, etc. This new identity is performed publicly through story-telling, i.e. “My name is Job, and I am an alcoholic.” This parallels the creedal conventions that permeate certain strains of Christianity and so, along with references to God in the 12 Steps, leads people to (falsely) reduce A.A. to a religious or cultish group. This view is unable to make much sense of the instability of a system that simultaneously stresses moral responsibility (to attend meetings, make amends, etc.) and amoral medical disease. This difficulty follows from the assumption of a unified, coherent “alcoholic identity” that is taken on by participants; its contradictions must inherently lead to contradictions within oneself or ambivalences which get bracketed. But if you instead read the situation as a shifting of habits, the picture becomes clearer. First, contradictions in the official ideology and, thence, the “alcoholic identity” needn’t be problematic because a “bundle of habits” doesn’t necessitate any sort of internal coherence. There is not a single identity set against itself in terms of moral or medical explanation, but instead an assortment of habits, one of which might best be explained by moral responsibility (the repeated choice to enter bars despite one’s problems with drink) and another by medical disease (a habit of drinking, so repeatedly reinforced that it has become ironclad). “Identity,” as such, no longer matters. In fact, as Valverde points out, it never has mattered much within the program’s actual functioning – she points to the slogans, posters, mottos, advice, etc., that are the framework for one’s recovery, and notes that their general tenor (e.g. “one day at a time”) has little to do with identity and very much to do with habit. Since the founders of the program were strongly influenced by William James and American Pragmatism, this is no surprise.

2) Behaviour is learned, modeled on others, and so its repetition and sedimentation into habit reflects those others and, as a result, bears a strong connection to one’s professional, class, ethnic and familial contexts. There needn’t be any determining relationship between culture/society and the self, because they’re of the same clay – social boundaries are maintained by the behaviours, beliefs, values and tastes of people (see Bourdieu and habitus) who are themselves habituated by those who came before. Individual habits take on the contours of larger social systems by virtue of upbringing, but there is always room for change because it is not a hard-line determination. It isn’t, for instance, as in crude readings of Marx, that the economic base determines the sociocultural superstructure, but that they are both part of the same co-determining and self-reproducing system. The oppositions of individual to society, character to act, etc., are all dichotomies imposed after the fact; focusing on the immanence of social life – the way matter and value are patterned and reproduced, and so not chaotic or meaningless, without being beholden to any determining foundation or label – is both more epistemologically sound and a bloody relief.

I’ll stop there. That last paragraph might be a bit dense, so please feel free to ask for elaboration or to virtually prod at it like a falsely calm wasps’ nest.

The Neuroscience of Justice, or, the Tumour as Agent

9 Dec

This post was actually written in the summer of 2011, so please pardon its dust.

I’ve just finished reading David Eagleman’s recent article “The Brain on Trial” (The Atlantic, July/August 2011, available online here), and feel… threatened, I suppose. I strongly suggest reading it – which is why it threatens, because its logic is fair and its intent agreeable; certainly the effort has my sympathies. But looked at closely, this picture of justice and culpability has many fault lines, and building a new judicial system on its terrain is, at best, unwise and, at worst, a guarantee of the unacceptable triumph of the who over the what in sentencing. (Of course, neither is ideal – but replacing one inadequate extreme with its equally or more problematic opposite is preposterous.)

I won’t synopsize, as we’ve all got lives to tend to, but in brief (and with excerpts):
The dense interaction of neural networks, the lack of an “uncaused cause” (à la God) within the brain, the lack of a truly “free” will that follows from it, and the supposed progress made in identifying the biological bases of aberrant behaviour all lead him to the conclusion that the presumed subject of judicial sanction – the deliberate “criminal” – is a convenient fiction. That is, the coherent, deliberate and rational decision-maker who has chosen to break the law or delve into perversity is a construction that maps rarely, if ever, onto reality. (I agree with this, though with some caveats that lie outside my present purpose.) As “possession,” reckless abandon and the sin of drunkenness have been de-moralized and medicalized as, for example, schizophrenia, bipolar disorder and substance-abuse disorders, so too will pedophilia, shoplifting and premeditated murder ultimately be reduced to the plane of neurological disease or dysfunction.

As we wait for technology to catch up with this intuition, and provide the means of identifying the concrete brain mechanisms of criminality, Eagleman and his colleagues have developed a system meant to maximize reflection and self-control, and so to minimize recidivism (not changing the criminal instincts, whatever they are, but toughening the muscles of restraint). From his article:

We may be on the cusp of finding new rehabilitative strategies as well, affording people better control of their behavior, even in the absence of external authority. To help a citizen reintegrate into society, the ethical goal is to change him as little as possible while bringing his behavior into line with society’s needs. My colleagues and I are proposing a new approach, one that grows from the understanding that the brain operates like a team of rivals, with different neural populations competing to control the single output channel of behavior. Because it’s a competition, the outcome can be tipped. I call the approach “the prefrontal workout.”
And on the increased ability to control impulse:
This prefrontal workout is designed to better balance the debate between the long- and short-term parties of the brain, giving the option of reflection before action to those who lack it. And really, that’s all maturation is… The frontal lobes are sometimes called the organ of socialization, because becoming socialized largely involves developing the circuitry to squelch our first impulses.
So, where’s the problem? There are several, and for the sake of length I’ll stick to a couple, and for the sake of clarity I’ll divide them into the descriptive and prescriptive. Hell, for the sake of all of our afternoons, I’ll even use bullet points.
  • The increased and increasing ability to identify neurological bases for behaviour. A comparison is drawn between the supposed bases of criminality and those of, e.g., depression. Indeed, the comparison may be worthwhile, but not in the way Eagleman thinks. While he’s right that specific drugs have concrete effects for some mental illnesses – reduced symptoms, increased functioning – it doesn’t follow that these illnesses themselves have a discrete etiological base (“etiology” refers to the cause or set of causes of a medical condition). Quite unlike, say, a broken leg, mental illnesses are diagnosed via a set of symptom checklists, as provided by the near-biblical Diagnostic and Statistical Manual (DSM-IV-TR, soon to be in its 5th edition), rather than by any sort of clear identification of a “cause” (i.e. no faulty wiring, no neural cluster, no leaky cortical piping). Diseases of this sort refer to clusters of symptoms, and not single physical causes – indeed, the DSM and its lists of symptoms strive for reliability (consistency of application and diagnosis), not validity (a demonstrable connection with neural reality). While drugs do treat patients effectively, they are not rationally designed to correct known and specific neural problems, but are instead tested and re-tested for their ability to mollify symptoms, and then applied in cases where they have the most effect – hence their changing application and use in treatment (see: Risperidone), off-label uses, etc. In fact, the diagnostic criteria in the DSM have evolved to better match the drugs, rather than the reverse.
The Manual’s third edition, the first that could be called “modern” and biomedical in its models, sought to provide reliable ways to gather together populations of patients for drug testing; the testing results would change the working definitions of “persons with syndrome X,” the diagnostic criteria would change (“be refined”) as a result, thus affecting later patient testing populations, and so later diagnostic revisions, etc., in an ongoing feedback loop. So yes, perhaps criminality has neurobiological bases, but in this respect their similarity with mental illnesses is telling: they would always lie a step beyond our efforts at direct identification. For more on certain aspects of the above, I strongly recommend Andrew Lakoff’s book Pharmaceutical Reason: Knowledge and Value in Global Psychiatry, available here. (I realize some may argue that neural imaging allows for identification of specific regions of the brain that may serve as the “root” of criminality or brain dysfunction. I counter with the suggestion that the dense interconnection of neural networks, which is taken as a given in Eagleman’s case against free will, means that the more precise and localized the region identified, the less it will be determining of any given set of criminal behaviours or mental symptoms, as such – dense interconnection means that any such behaviour, at least as currently known and recognized, can’t be reduced to any one region. Any meaningful etiological claims – that is, identifying elements that are relevant to and determining of criminal behaviour – will most likely have to be multifactorial and span several registers: the neuroanatomical, the psychological, the social, etc.)
  • The loss of difference. I’ll exploit the comparison with mental illness once again. Lakoff’s book explores how diagnosis of the DSM sort is able to bring incredibly disparate experiences together under the label, e.g., “depression.” He has a useful example: a suicidal teenaged woman in Buenos Aires is said to have the same mental illness as a recently-bankrupted 54-year-old male in Texas, despite the extreme difference of illness onset, prognosis, treatment options, social context, expression of symptoms, etc. These two cases are brought together by the assumption that a shared cluster of symptoms is indicative of a shared “disease” – that is, a shared chemical imbalance of some sort. While this may be true, virtually all relevant aspects of their suffering, treatment and prognosis differ substantially (with the exception perhaps of one of the several drugs each would likely be treated with). While useful in some contexts (notably, the global sale of pharmaceuticals), this diagnostic convergence masks deep differences. The same would be true of reducing criminal behaviour – which takes a multitude of shapes across a staggering range of social and personal contexts – to a common neural substrate. Premeditation, lack of remorse, denial; these may serve as markers for recidivism in a parole hearing, but it’d be problematic to use them as diagnostic criteria for Criminal Condition X, because it’d reduce or exclude some of the most important dimensions of social crime: class, colour, creed, etc. This would be quite a loss in our discourses on crime and justice. (To be fair, Eagleman actually accounts for much variation; his understanding of causality is as something extensive and diffuse, and so gives due credit to the relevance of the personal and idiosyncratic. But in practice, reducing that which occurs at the level of society to specific, recurrent neural problems risks a staggering loss of comprehension.)
  • The equally arbitrary drawing of boundaries. Where to draw the line of agency? We can contract the boundaries of the responsible, ethical agent to circle tightly around a given neural region, but we can with as much validity extend the boundaries to include the immensity of all social contact and influence. Neither is a given, nor more ontologically true than the other. Our judicial system at the moment finds an expedient, though flawed, middle ground – assuming agency begins and ends with the individual, but that this agency is more or less mature, more or less compromised, and so “going easy” on those with mental illness, who aren’t of age of consent, who were engulfed by the passions. Examples of either pole – of the contracted and neural, of the extended and diffuse – needn’t be drawn solely from other cultures, marked by supposed exoticism and so inapplicable to our world. Of the former, Eagleman provides examples, and indeed there are many more: a father, stricken with a brain tumour, who without precedent savagely murders his son; a mother, suffering a post-partum chemical imbalance, who drowns her children in a car; and on, and on. Of the latter, I ask you to picture a parent taking responsibility for the actions of his or her drunken son, or an apartment owner, deemed liable for the accidental fall of his air conditioner from the sill to the sidewalk below, paying damages to an injured pedestrian. What constitutes a “culpable agent,” in our own society as in all others, is very broad and context-dependent. In effect, then, I agree with Eagleman insofar as this flexible view of responsibility calls for a judicial reform of some sort (I can make no claim as to what, exactly); but I take issue with the implication that the neural offers a more solid bedrock for attributions or negations of responsibility in all or most cases.


P.S. Please, hold the comments about how I obviously don’t understand the purpose of bullet points. We’re all doing our best over here at… what’s this blog called? Whatever, finish your drink and shuffle on out of here.

Psychiatric Caricature and Luhrmann’s “Of Two Minds”

9 Dec

In which I organize some rambling about psychiatric suspicion and Tanya Luhrmann’s fantastic book “Of Two Minds: The Growing Disorder in American Psychiatry” (available here).

The book – a multi-site ethnography drawing on fieldwork in medical schools, psychiatric residency programs, in-patient treatment institutions, and hospitals – deservedly won both the Victor Turner Prize for Ethnographic Writing and the Boyer Prize for Psychological Anthropology. It traces the origins of the gulf between the psychodynamic and the biomedical models of mental suffering and psychiatric treatment, as well as the ascendance of the latter as “managed care” insurance programs came to dominate the funding of mental health interventions (and, as a result, to leverage decreases in time, cost and consistency of care). I won’t synopsize further; rather, I’d like to stress the subtlety and respect (albeit critical) with which she treats her subject matter.
Too often, criticism of psychiatry takes a facile view of the discipline, reducing its evolving and often contradictory discourses to a single plane – usually biomedical, often indifferent to the complexities of subjective life, always oriented toward profit. This last is the bursting coffer of individual therapists, of psychiatrists as a horde, or, through the alchemy of capital, of the psycho-pharmaceutical industry. Granted, the links between industry pressure, gift-giving, conference and support-group funding, etc., and mental health professionals all deserve acute scrutiny – but simplifying them into caricatured profiteering by all involved only confuses the picture. If you want to understand anatomy, you don’t take a scalpel to a straw man. Similarly, the psychodynamic model is popularly condemned, Freud having been “debunked,” as though the merits of its practice (ongoing and intensive analysis of the patient’s subjectivity and personal history) are negated by its flawed theory and ontology. But the one needn’t be reduced to the other, and in fact medical professionals largely lament the loss of the psychodynamic approach as a complement to their now all-too-brief encounters with patients.
The picture Luhrmann paints is in shades of grey, thankfully. Rather than taking for granted the hostility with which practitioners on either side of the divide dismiss one another, she listens to her informants as they criticize the new, rigidly biomedical regime, even as they rake in far more money per working day than ever before. Profit, while certainly working to sustain a practitioner’s subscription to the biomedical model, is a secondary player in the broader field of change. The increasingly foundational role of the Diagnostic and Statistical Manual (DSM-IV-TR, soon to be DSM-V) and its use by insurers to insist on more immediate and “treatable” (i.e. reducing dangerous symptoms within several days of in-patient holding or through out-patient drug treatment) diagnoses has been the real catalyst. When insurers began offering “managed care” insurance, through which patients are able only to receive pre-approved treatments at pre-approved institutions, they became able to use their portfolios of thousands (or more) of potential patients to leverage dramatically decreased costs for treatments, making lengthy psychodynamic interventions not just less profitable by comparison, but entirely unworkable as a treatment option.
The result was, first, a generation of psychiatrists who, despite their zeal for the biomedical model, still maintained the practical worth of a complementary psychodynamic approach; and, in the generation following, a complete lack of substantial training or experience in this latter approach, leaving it neglected and increasingly irrelevant. The ambivalence and sometimes despair of psychiatrists caught in this changing tide – forced to provide desperate patients with less than their conditions require; regretful that they no longer “know” any one of their patients with anything like the intimacy they once saw as foundational to therapy; ethically frustrated by their conflicting responsibilities to help patients and families while also having to insist on a hospital stay even when it may bankrupt them; pleased by the reduced risk and increased functioning of patients on the right drugs, a pleasure tinged with the knowledge that they no longer have any role to play in these patients’ support networks, and that they’ll likely see them return again to the emergency room – this ambivalence punctures any notion of psychiatry as a purely exploitative or corrupt discipline. I believe this is important work.

UTSC Centre for Ethnography Writing Fellowships

9 Dec


The Centre for Ethnography at UToronto Scarborough is currently accepting applications for 2013 Writing Fellows. This is a 10-week appointment with a $10,000 stipend, meant to provide the space and time needed for critical reflection and quality writing. 

This is intended for doctoral students in the later stages of dissertation writing, or those post-doctoral students who are turning their dissertations into a manuscript. Fellows may not hold teaching appointments or other employment during the 10 weeks. Regular presence at the Centre is expected, as well as attendance at all talks and colloquia. 

Applications of five double-spaced pages, describing the applicant’s work and stage of writing, can be submitted by March 1, 2013 to centreforethnography(at)utsc(dot)utoronto(dot)ca.