11/19/13

Toward a plural theory of Anthropocenes

For all I know this has been said before, but: the Anthropocene is a world-concept.

The normal way to understand the Anthropocene is as a historical period, defined more or less as the era when human beings acquire the capacity to affect the ecology of the entire planet, thereby opening the door to mass extinction, disastrous climate change, and, at the limit, the disappearance of the species. Generally people want to date it to the beginning of the Industrial Revolution, though you see arguments for dating it to the beginning of agriculture. Since the challenge we are facing collectively at the moment (and for the next centuries) is the immediate result of the dramatic expansion in carbon-based energy (oil, gas) use that comes from the Industrial Revolution, my impression is that most people are inclined towards that date.

But that’s just because the scope of this environmental event is in fact the entire planet Earth. I want to suggest that we become aware of/wish to designate the Anthropocene at this crucial moment precisely because of that scope, that because like the various other -cenes (the Pleistocene, the Holocene) this era involves ecological/geological/meteorological activity that is planet-wide, we feel comfortable declaring it to be “epoch”-worthy. That is, the epoch (that which can be designated by the -cene, that which is a scene for the -cene) is partially a temporal metaphor for spatial scale.

This is true of all world-concepts, and trivial. But now what we can do is to scale down the Anthropocene from the world to a world, and recognize that, unlike the Pleistocene or Holocene, we can use the concept to refer to any “world” (that is, any relatively closed totality, relatively closed because like our totality it can be potentially escaped from, in our case via rocket ships/space colonization) that is capable of producing self-extinction through the manipulation of its environment.

In that case there have been other Anthropocenes, some of them, perhaps, not even human. Any virus that kills its host too rapidly–before the host has a chance to infect others–is Anthropocenic in this sense. We might also think of the series of extinctions on Easter Island as one example of a quasi-Anthropocene (resolved by the arrival of European explorers). Or, an extreme and fanciful case, of a literary character like Raskolnikov.

I am not sure that it is politically useful to think of the Anthropocene this way — it may be that there’s more traction in terms of getting people to think about how to live, or die, in it if they can have the narcissistic pleasure of imagining themselves to be historically unique. But it may also be that philosophers and other humanists could benefit from a plural theory, a theory of Anthropocenes, both as a structure for comparative analysis and as a humbling reminder that self-destruction, when it happens, is usually a matter of differences of degree, not kind, from ordinary life.

11/11/13

Critical Distance and the Crisis in Criticism (2007)

One of the things I want to do sometimes is to repost stuff from Printculture’s archives, because it tends to be hard to find. Here is a series of discussions on the topic of something I called “leverage,” by which I meant, as Mark McGurl pointed out in the comments, “critical distance.” The conversation that ensues sees the two of us thinking through and explaining some of the things that motivated The Program Era and The Hypothetical Mandarin. The entire series of posts (which are combined below) dates from October 2007. I will also say that one of the weird things about rereading this stuff is realizing how old some of my ideas are; I swear I’ve repeated some of the things I say below in the last couple of years as though they’d just occurred to me.

Leverage as a function of critical capability and interest

It occurred to me the other day — and in fact I may have already bored one or two Printculture readers with this — that it would be useful to think about why so much academic work on contemporary material isn’t very good. But perhaps the premises bear repeating: (1) a higher percentage of literary critical or cultural analysis of contemporary material — fiction, poetry, film, the culture in general — says, by my standards, completely predictable things than does work on material removed from us in time, and (2) is therefore no good. I have no data to back the first part of this up; it’s merely an impression. For the movement from the first to the second premise, I rely on my belief that literary critical analysis should, in general, aim to teach us things we don’t already know about the world.

The question I’m setting out to answer here is why this is true. Why, that is, does work on contemporary material so often simply tell me what I (feel like I) already know?

The answer has to do, I think, with leverage. By leverage I mean that my ability to tell you something about X that X doesn’t already know about itself, and isn’t obviously saying to anyone who’s paying attention, depends to a very large extent on the difference I am able to generate between myself, and what I know or see, and what X knows or sees on its own.

Continue reading

11/11/13

Recidivism in weight loss

Nice article from NY Mag on the psychological and physiological adjustments that come with having lost large amounts of weight.

Cultural fantasies of weight loss present a tidy, attractive proposition – lose weight, gain self-acceptance – without addressing the whole truth: that body image post-weight loss is often quite complicated. Perhaps that helps explain why the rate of recidivism among people who have lost significant amounts of weight is shockingly high – by some estimates, more than 90 percent of people who lose a lot of weight will gain it back. Of course, there are lots of other reasons: genetic predisposition towards obesity, for one. For another, someone who’s lost 100 pounds to get to 140 pounds will need to work harder – including eating much less each day – to maintain that weight than someone who’s been at it her entire life. (Tara Parker-Pope’s excellent piece “The Fat Trap” explains these physiological factors in much greater detail.) But what about the psychological? Who would be surprised if a person – contending with both a new body that looks different from the one she feels she was promised, and the loneliness of feeling there’s no way to express that disappointment – returned to the familiar comfort of overeating? At least its effects are predictable.

Two thoughts: first, that the last bit is of a piece with a more general understanding of how psychologically difficult deprivation is, and how things like being fat or being poor change the wiring of our bodies and our brains. Beginning from that understanding makes compassion for the choices others make far easier (and moralizing judgment oriented around disgust more difficult).

Second, I wonder if anyone’s ever done a comparative analysis of the disappointment one feels after losing a great deal of weight and the disappointment that attends the post-pregnancy/childbirth body. Both are situations in which one does not return (unless one is a certain sort of celebrity, I suppose) to the status quo ante; in the case of weight loss this is exacerbated or made more weird, of course, by the fact that the new status quo may never have been ante. I was 6’1″, 215 pounds at age 16; 6’3″, 240 at 18; and 6’3″, 278 in summer 2002. Since 2007 I’ve bounced between 190 and 200 (I was at 184 at one point, but never again) and I’m still not used to it.

11/8/13

Bonus points to the cynical guy

So a few months ago I predicted that one day actors would be hired by firms like Coursera to teach MOOCs (because once you don’t have to respond to student questions live, who cares who reads from the script? Might as well be a hottie…).

And now one of the leading MOOC firms, EdX, is considering hiring Matt Damon to teach a course.

Casting Damon in a MOOC is just an idea, for now: In meetings, officials have proposed trying one run of a course with someone like Damon, to see how it goes. But even to consider swapping in a star actor for a professor reveals how much these free online courses are becoming major media productions—ones that may radically change the traditional role of professors.

One for-profit MOOC producer, Udacity, already brings in camera-friendly staff members to appear with professors in lecture videos. One example is an introduction to psychology course developed earlier this year in partnership with San Jose State University. It had three instructors: Gregory J. Feist, an associate professor of psychology at San Jose State University, who has been teaching for more than 25 years and who wrote a popular textbook on the subject; Susan Snycerski, a lecturer at the university who has taught for 15 years; and Lauren Castellano, a Udacity employee who recently finished a master’s in psychology from the university, advised by Feist.

11/1/13

Radical Hospitality

One sometimes hears objections to Halloween from people who are afraid it’s an anti-Christian holiday, demonic, Satanic, hedonistic, or pre-Christian. It might be any of the above (it’s certainly pre-Christian, in its guise as Samhain). But I wonder about the implied notion that a non-Christian holiday might be somehow morally deficient, that it would be virtuous to abstain from costumes, trick-or-treating, or jack o’lanterns.

After all, what passes for a Christian holiday? The Día de los Muertos is an ingenious cultural hybrid (All Saints’ Day plus the lingering memory of the Aztec gods). So is Christmas/Yule. Mardi Gras falls somewhere between Lupercalia and the Greater Dionysia. Isn’t Easter named after the pagan goddess Ēostre? If we go on at this rate discovering predecessors, we won’t have any Christian festivals left.

And let’s look at the content of the holiday. On Christmas, there is feasting, singing, and giving of gifts. Some of us make charitable donations (especially Americans, when the first of January looms and the IRS comes to mind). But the greater part of the gifts, and certainly the massive advertising, decorating, carol-singing, red-nosed public side of the thing, has to do with circulating good stuff among friends and family– which is not far from benefiting oneself. If “it is more blessed to give than to receive,” most Christmas gifts qualify only under a technicality.

On Halloween, we throw our doors open to all strangers, the stranger the strangers the better: space aliens, pirates, warrior princesses, dinosaurs, zombies, Elvises, Time Lords, Scotsmen, Yellow Submarines. And into their outstretched baskets we pour candy, not caring when they eat it or whether they got more than they deserved. Sure, there’s the threat of a trick in the absence of a treat, and it may be that atavistically we are bribing the ghosts and spirits to stay away from our homesteads for another year. But it’s still an act of radical hospitality that sweeps the candy away from, not around in, the domestic circle. So I say: Cast thy Twix upon the waters! Go thou and do likewise!

10/27/13

Lord of the Files

I’ll be using this title to compile some reflections on the NSA spying scandals. Much of what I say will be obvious. If you want to have friends, don’t treat them like enemies. Don’t spy on them and don’t lie to them about what you did if there is the remotest chance of being found out.

I can imagine that there would be a justification for listening in on suspects already identified as “of interest” in a developing case– this is just acting on what is known as “probable cause,” and although not entirely charming, passes the prudence test (does the behavior result in tangibly reduced risk for the people you are responsible for and care about? If yes, check the box and go ahead, though with misgivings). But it’s in the nature of an uncontrolled, secret program to expand as far as the money will take it (and there’s always more money), until the spying apparatus is pretty soon listening in on everybody who is connected to everybody who might be of interest in a potential case that might eventually surface– which means everybody. It’s quite a thing, mathematically speaking, to stay on top of the trillions of relationships that obtain among a billion or so people. But I’m not proud that we’re accomplishing this triumph; it would have been a better cause for pride if we left more people alone.

Expanding the definition of “national security” to the point that it includes eavesdropping on all parties to any negotiation in which we and our allies are involved is simply a sign of pathology. People who are deeply mentally sick alienate their friends pretty quickly. Put otherwise, it is hard to be a friend of a person who is paranoid and prone to violent outbursts and claims to be the world savior– and who taps your phone and email. Such a person, if you have the poor luck to be stuck on an island with him or her, I would style the Lord of the Files.

And that is the person that the US has become. Blame is raining down on Obama, and the question (for those who care) is, as it was for Nixon and Reagan, “what did he know and when did he know it?” According to the White House press secretary, he didn’t know anything. But we knew that was going to be the answer. The fact that repugnant opposition politicians, who would do ten times worse if they had the chance, are jumping in to score points shouldn’t dissuade us from asking the question. But I don’t think it matters so much what Barack H. Obama, Esq., knew. The presidency is a legal person, like a king under the old “two-bodies” theory, and in the transcendent sense of that personality the presidency knows, authorizes, and is responsible for a hell of a lot, and it has been so since the XYZ affair was rattling the ruffled cuffs of the young Republic.

What was Obama’s reaction when he learned about these nifty intercepts that gave him a preview of what our friends were thinking (but not, apparently, what the Russians or the Chinese were thinking)? What kind of courage would it have taken for him to push it all aside and say to the Director of National Intelligence, “Cut those wires and fire those spooks. I don’t need to tap the phones of our most trusted allies. We can compete in the big old world on a fair-and-square basis”? Did he think, “Well, this is not totally kosher, but I didn’t actually order the surveillance, and it might come in handy some day after all”? Did he have an impulse to reject the Faustian package, only to receive this remonstrance from the spook in charge: “Sir, this intelligence could be of national importance. It could save lives. It could make the difference between Boeing winning a contract and Airbus winning it. Your personal moral quibbles must cede to the national interest”? I don’t know. I suspect presidents don’t like to be called choirboys.

But a salient piece of Washington folklore was uttered by a member of the policy circle most likely to have the task of making up justifications for the US President being the Lord of the Files. Mike Rogers, a Republican member of Congress and head of the House Intelligence committee, offered, according to the Guardian, the following interpretation of twentieth-century history:

Going further, Rogers claimed that the emergence of fascism in Europe in the early 20th century could be partly explained by a conscious decision by the US not to monitor its allies.

“We said: ‘We’re not going to do any kinds of those things, that would not be appropriate,’” he said. “Look what happened in the 30s: the rise of fascism, the rise of communism, the rise of imperialism. We didn’t see any of it. And it resulted in the deaths of tens of millions of people.”

I don’t think Congressman Rogers would pass a high school history test. What dates are associated with (a) the rise of fascism? (b) the rise of communism? (c) the rise of imperialism? Half-credit for (a), Mr. Rogers (the conspicuous events of the 1930s do include Hitler’s election in 1933, but fascism got its start in Italy in 1922); zero credit for your other two answers.

However, what he is parroting is a familiar line based on at least a fragment of fact. In the first decades of the twentieth century the US had a Black Chamber, or cryptanalysis bureau, based, as much of our present spying activity is, in a commercial telecommunications node (in that instance, the Chamber was hidden in a firm that compiled telegraphic codebooks). In 1929, Henry Stimson, the Secretary of State, is said to have declared, “Gentlemen do not read each other’s mail,” and disbanded the Black Chamber. It was pulled back together in a hurry in response to the next war (I am getting this from David Kahn’s 1967 book The Codebreakers, a favorite of my father’s).

Now perhaps Stimson’s code of ethics did not prepare the US for conditions as ungentlemanly as those endured by the combatants of WWII. But Mike Rogers, twisting the anecdote, is demonstrating a degree of paranoia and self-centeredness that is quite magnificent, even for the spoiled children of our elected assemblies. He is saying that the fact that we were (allegedly) not spying on the Germans, the Russians, the British, and the rest accounts for “the rise of fascism, the rise of communism, the rise of imperialism… the deaths of tens of millions of people.” Astonishing! Where is a global policeman when you need one? Why, indeed, did the US not steam forth in 1931 and whack the Japanese who were overrunning China, invade Germany in 1933 and slap them around for their poor judgment in electing Hitler, parachute into Windsor Castle in 1757 and make old George say uncle and give Bengal back to its Nawab? Why can’t we poke our NSA earbuds into every wire and satellite and issue executive orders about every damn thing we please, lest somebody, somewhere, get up to some evil? Foreigners are all right, I guess, if carefully observed and called to order at the slightest sign of going wrong.

10/22/13

Scott Adams is a strange man

…with lots of ideas about the future of online education.

I suppose by “strange” I mean that his politics (if you look at his blog) operate from a position that imagines itself as entirely apolitical but is nonetheless quite interested in politics. So it produces frequent pox-on-both-houses language, but also pragmatic suggestions for various kinds of things (including online ed, in the link above) with no real concern for what I think of as the “normal” language of American politics (involving concepts like the moral, the just, and so on).

And then you ask yourself — well, who would Dilbert vote for? — and you realize that Adams’s politics are perfectly in tune with the strip, because the answer is totally unknowable. Even the grounds on which Dilbert might vote for someone are unknowable.

10/20/13

And the MOOC revolution seems to be over

At least according to this reading of a Chronicle story by Chris Newfield. Short version: both faculty and university presidents agree that MOOCs will have a negative impact on higher ed, and this opinion is held by people who nonetheless seem open to technological innovation and other kinds of innovation in teaching (so it’s not just a thoughtless resistance to change).

And yet, the problem is that for about 18 months state legislatures were allowed to pretend (or pretended to pretend) that the MOOC would allow for further cutting of state support for higher education…

In other words, when universities lose MOOCs as a budget solution, they lose the main source of hope that state politicians had for a free fix of the college cost problem for a less affluent, not wonderfully educated younger generation.  MOOCs were the austerity solution to the mass quality problem.  Without them, tempers will flare, fingers will point, and funding will not be restored. In the meantime, faculty are going to have to lead higher ed innovation anyway, and the good news is that post-MOOC-as-cure-all faculty don’t need to focus on the technology to the exclusion of the “human side” of teaching and learning.

Now that the MOOC seems to be a non-viable solution, we can look forward to the rapid restoration of that missing funding.

10/7/13

How Someone Ends Up Working in Disability Studies…

… or at least thinking about it.

Those of you who know me and my family know that our son, Jules, was born with a very rare genetic disability (known as 9p deletion syndrome). He’s fine, at least medically, though it was no fun for the first three weeks of his life and has on various occasions been a little less fun than it otherwise might have been (cleft palate surgery, some ongoing concerns, now faded, about his heart). Cognitively, we know less about the future than we might, partly because the syndrome is so rare (maybe 150 cases in the United States), partly because it produces such a wide range of outcomes, and partly because the treatment of the disabled has changed so radically in the United States in the last 60 years that evidence gathered on the basis of a 30-, 40-, or 50-year-old 9p deletion person does you little to no good, since that person lived through a radically different set of approaches to disability than will any child born ten or twenty or thirty years later.

I know less than I should about how disabled people are treated in the United States. More than I used to know, of course, before Jules was born, before he spent 2.5 of his first 3 years in an amazing day care facility, in which he was fully integrated with the other kids (a process known as “mainstreaming,” now the normal thing to do in the United States), and to which state-provided therapists (occupational, physical, speech, developmental) showed up for 7 hours a week to help Jules catch up with his peers.

The idea behind mainstreaming and the therapy (which is known generally as “early intervention”) is simple and twofold: first, that the earlier you can work with disabled (or even potentially disabled) children, the better you can help them reach their maximum genetic potential (I know that’s a fuzzy concept, but let’s use it loosely here to express something like the maximal cognitive capacity someone can reach, all other things being equal); and, second, that surrounding (potentially) disabled children with other children who are developmentally “ahead” of them actually encourages the (potentially) disabled children to rise to the level of their peers. In this, mainstreaming takes advantage of two well-established developmental facts: that early and frequent intervention produces better developmental outcomes, and that peer effects are powerful social, physical, and cognitive motivators (for good and ill–just ask someone who chooses to live in a frat house).

So by the summer of 2013 Jules barely qualified to continue in the state-provided program that provided the 7 hours of extra attention per week that he had been getting since he was four months old. He had made amazing progress, and was catching up to his peers on a number of the levels that the state measures to determine eligibility for its programs (gross motor, fine motor, speech, social/psychological maturity, etc.). But we were thrilled that he still qualified, because we knew that the more help he got, the better off he’d be in the long run. (None of this stuff means he’ll stay caught up with his peers, which is why this early intervention is so important.)

And then we decided to move to Germany for the academic year.

Continue reading

08/27/13

Journalism in Pre-War Conditions

The art of showing you pictures of babies killed in bombardments, so that the public will support another bombardment that will kill more babies whose pictures you won’t be shown.

I apologize for the cynicism, but I can’t think of an intervention, by the US or anyone else, since 1945 that did what it was supposedly going to do. Nor am I a fan of sitting by and watching when horrors are going on. But there isn’t a good way to take the weapons away from the bullies without (a) triggering the deaths of thousands more people on both sides, and (b) rolling out a carpet for the very things the US professes to wish did not exist (civil war, jihadist governments, regional power-projection of Iran, China, you name it).

Want to support something? Support medical assistance to the population (Médecins du Monde is deeply engaged).

Representative discussion among people most of whom I would not dismiss as crazy or ignorant, here. I’d like to know how this is going down across the breakfast tables of America now.

08/25/13

Reviewing Scholarly Books

I write a lot of book reviews. (In fact, I’m overdue for one now.) And I just finished copy-editing 23 reviews gathered for the Journal Which Shall Remain Nameless— let me in passing thank the book review editors who recruited the reviewers and kept after them to submit their copy. The two-day bulimic transit through 23 reviews, from 6 to 8 pages in length apiece, has prepared me to discourse to you on the state of the art, which is, on this showing, fairly dismal. What do I like and dislike in a book review? How can I persuade folks to write more intriguing and insightful reviews? It’s not that hard.

Cardinal rule no. 1: Ink is frightfully expensive. Don’t waste it. All right, you know that’s not true; ink is cheap and they’re practically giving away pixels at the moment, but for the person who wants to use either substance well, they’re best treated like gold dust or the finest cocaine. Your reader is probably, like most academics, a slave to duty, but that doesn’t give you a license to waste their time. If the book review carries a header saying, for example,

Théodule-Mongin Pfeffernuss, A Comprehensive Catalogue of Gallo-Roman Fibulae Discovered in the Drain of the Caldarium at Aix-la-Chapelle. Leiden: Brill, 2013. Pp. xxi + 430. $1,375.00 (hardback).

Continue reading

08/24/13

Duple Scruple

Today is August 24th, St. Bartholomew’s Day. Ernest Renan said it: there are some things that every French person needs to forget, such as the crusade against the Albigensians or the St. Bartholomew’s Day Massacre. Renan meant that these old grievances, if opened anew, would set French people to fighting amongst themselves rather than building a common Republic or warding off outer enemies.

Continue reading

08/7/13

For External Use Only

One of the things I am always asking France– because, vous savez, lovers are always full of questions– is why the ideas that are taken to their worst extremes of actualization elsewhere have so often begun here. France, mère des arts, des armes, et des lois, I know. But that’s not all. Alongside a lot of civilisation and rayonnement, égalité and parité, it was on French territory that the theory of the fascist state grew to completeness (so that Mussolini could then borrow it from the Action française), that racism and antisemitism took their modern forms (Gobineau, Drumont), that the most eminent medical researchers, decorated with Nobel prizes, advocated a strong eugenics program (Charles Richet, Alexis Carrel). But in comparison with other places, France had a mild case of fascism, antisemitism, racism, eugenics, etc. These could achieve a loud minority, a persistent subtheme, but not (so far) domination of French political life.

You might say: Vichy. But Vichy was a capitulation to invaders who came waving a monstrous growth of bad French ideas. Vichy is an example of what happens when the precarious balance of things that kept people like Maurras and Barrès on the loud lunatic fringe is broken. And no, I am not denying the existence of plenty of nasty racists and exterminationists in la grande patrie, some of them elected officials.

How, by and large, were the moments of crisis averted here, the moments when the same ideas jumped into the saddle elsewhere? My theory is not that French people are uniquely virtuous or that France has some secret ingredient (too bad for me; I could be writing best-sellers and New York Times Magazine pieces about the special Frenchness of the French!), but just that the democratic process kept going here despite the many coups, restorations, revolutions, wars and invasions. Not immaculately; just enough. We can all take encouragement from that.

08/2/13

Bullet Democracy

I’m always interested in the question of whether a social process limits itself or goes on escalating indefinitely. As an example of self-limitation, consider an epidemic that kills so many victims that there are no new bodies left to infect. Or, more optimistically, consider the “asymmetrical” modes of struggle described by Bateson, which he thought ensured greater social stability than symmetrical modes, always apt to escalate into violence without limit. The proposals we’ve been hearing over the last few months for arming more and more people, from kindergarten teachers to garbage collectors, would make the US a society of damagingly “symmetrical” conflicts, in which anyone could shoot anyone for anything. And some people are just fine with that.

I wonder, though, what skills or accomplishments brought a Wayne LaPierre to the head of the National Rifle Association? Was he a recognized Top Gun, a grandmaster of the Bushmaster? I suspect not; he probably got to the top of that particular heap by being good at public relations, communication, rhetoric, and in particular by being more hardline and “on message” than lesser mortals. But even a “hardline” PR man is soft when you compare him to a real gunslinger. I propose that, going forward, the NRA should recruit all its spokesmen and officials from its membership through a system of ranked duels. Any member can challenge another member, just like competitors in tennis or chess, and claim a recognized rank, but only after engaging in a fight to the death.

Setting up this system will be interesting and gratifying for the fans of the gun, and will create jobs in the betting industry as non-gun-owners rush to get in on the excitement by laying odds. There will be symbolic upsets and mythic confrontations. (Can Clint Eastwood really shoot, or is it just for the movies? How about Charlton Heston?) Best of all, the ranks of the NRA will be thinned of the sort of people who just like macho posturing but are not actually good at shooting: these probably pose the greater danger to the public in armed confrontations, so we all benefit. And anyone who demurs from a challenge will be allowed to exit the NRA, taking their year’s membership fee with them.

Let’s see some real bullet democracy at last. (For my part, from behind a thick wall of sandbags.)

08/1/13

Where Were You From?

Living in the UK and in North America as an ethnic minority, I am often asked, in different situations: “Where were you from?” And in fact, with the growing ethnic, linguistic, and cultural complexity of the Hong Kong population, I was asked that question fairly frequently even there. How this question is asked indicates, of course, the differing sociopolitical presumptions and connotations of the questioner. While some people are sincerely and genuinely curious about who I am, others often turn the conversation into a kangaroo-court-style investigation, making me feel not only uncomfortable but also violated.

Continue reading

07/30/13

The Real War

We’re at war and don’t know it. Attention, energy, and lives are being wasted on defending against an “enemy” who doesn’t really have the capacity or the will to do us that much harm (if you define harm in terms of citizen lives)– i.e., “terrorists.” But the war we should be attentive to is going on all around us. Prosecuting this war is going to be complicated. It has an infinity of fronts. The enemy is a shadowy, formless, crafty, non-state actor. You probably shook hands with members of this invisible army yesterday, or watched them on the TV. And you weren’t aware that they killed or maimed American citizens in many multiples of the number killed and wounded in military service.

Let us first figure out who they are, and then find a way to stop them. Nothing could be stupider than to be at war and not know it.

07/25/13

Bipartisan Breakthrough!

The very Republicans who are dug in to a scorched-earth, never-never-never position on any piece of legislation or nominee brought forth by Democrats have found that the reauthorization of massive spying on American citizens is a cause they can put themselves behind, linking arms for once with the White House. Isn’t bipartisan concord beautiful, especially when it occurs at the expense of those civil liberties that, as we used to say, “made America great”?