3.17.2017 Doc of the Day

1. Paul Boyer, 2010.
2. William Gibson, 2011.
3. Wolfgang Huemer, 2014.
4. Cynthia McKinney, 2016.
Numero Uno: “Garry Wills is a national treasure.  His 40-odd books range from Saint Augustine to John Wayne, but the heart of his work lies with American politics, from the Founding Fathers, the Federalist Papers, and Lincoln’s Gettysburg Address to the brilliant Nixon Agonistes: The Crisis of the Self-Made Man (1970); The Kennedy Imprisonment: A Meditation on Power (1982); and Reagan’s America: Innocents at Home (1987).  The product of a rigorous Catholic education, and a one-time conservative protégé of William F. Buckley, Wills over a long career has evolved into a public intellectual.  In all his work, he brings to bear his graduate training in the classics, a profound knowledge of U.S. history and constitutional thought, and, above all, a humane yet rigorous moral sensibility.

Now, at 75, Wills in his latest book explores the vast and, in his view, deplorable expansion of the executive branch since World War II.  He examines the emergence of powerful and shadowy bureaucracies such as the Central Intelligence Agency, the National Security Council, and the National Security Agency.  He probes the obsession with secrecy by which successive administrations have hidden their illegal and maladroit actions.  He looks at the inflation of the president’s status from the limited function as commander-in-chief of the military prescribed by the Constitution to ‘Commander in Chief’ of the entire nation, granted sycophantic deference by legislators, the media, and the public.

In the tradition of Arthur M. Schlesinger Jr.’s The Imperial Presidency (1973), Wills measures today’s bloated presidency against the closely circumscribed office intended by the Founders. James Madison’s assertion in Federalist 51 that “in republican government, the legislative authority necessarily predominates” becomes for Wills the standard by which the contemporary executive branch must be judged.

Wills recognizes that the process he documents long predates 1945. He notes Thomas Jefferson’s breathtaking purchase of the Louisiana Territory without congressional authorization, Abraham Lincoln’s Emancipation Proclamation and abrogation of habeas corpus as military measures, and Teddy Roosevelt’s fondness for policy-setting “executive orders.” (He might have added Andrew Jackson’s arrogant defiance of the Supreme Court in the 1832 Indian-removal case Worcester v. Georgia [“John Marshall has made his decision; now let him enforce it!”]; the Wilson Administration’s constitutional violations during and after World War I; and Franklin D. Roosevelt’s expansive presidency. Boasted FDR to an aide in 1936: “There is one issue in this campaign. It is myself.”)

For Wills, however, these earlier assertions of presidential power pale in comparison to recent times when, in the name of national security, the presidency has inexorably expanded at the expense of (indeed, often enabled by) the legislative and judicial branches. While noting periodic congressional and judicial efforts to rein in executive power, such as the constraints imposed on the CIA in the 1947 National Security Act, the 1973 War Powers Act, and the Supreme Court’s 1971 ruling in the Pentagon Papers case, New York Times Co. v. United States, he also shows that just as routinely the executive branch ignored or bypassed such attempts.

Wills cites many aggrandizing actions by Republican administrations, including Nixon’s secret bombing of Cambodia, overthrow of Chile’s elected government, and Watergate crimes, and Reagan’s Iran-contra illegalities. Citing Peter Irons’s War Powers: How the Imperial Presidency Hijacked the Constitution (2005) and Jane Mayer’s The Dark Side: The Inside Story of How the War on Terror Turned into a War on American Ideals (2008), he devastatingly critiques the “crescendo of presidential arrogance” in the Bush-Cheney years, including John Yoo’s now familiar but still chilling “torture memos” and Bush’s incessant use of “signing statements” to put his own spin on a law’s meaning and to define how (or whether) he chose to enforce it.

But as Wills documents the roots of the National Security State in the Truman years and the Obama Administration’s depressing echoing of the secrecy and executive-privilege claims of the Bush-Cheney years, he concludes that the long-term expansion of executive power transcends party and is so deeply entrenched that hopes of reversing it are slight. As he writes in his afterword, summing up the book’s argument:

The momentum of accumulating powers in the executive is not easily reversed, checked, or even slowed. It was not created by the Bush administration. The whole history of America since World War II caused an inertial rolling of power toward the executive branch. The monopoly on [the] use of nuclear weaponry, the cult of the Commander in Chief, the worldwide web of military bases to maintain nuclear alert and supremacy, the secret intelligence agencies, the whole National Security State, the classification and clearance systems, the expansion of state secrets, the withholding of evidence and information, the permanent emergency that has melded World War II with the Cold War and the Cold War with the war on terror—all these make a vast and intricate structure that may not yield to efforts at dismantling it. Sixty-eight straight years of war emergency powers . . . have made the abnormal normal, and constitutional diminishment the settled order.

As this passage—and the book’s title, Bomb Power—suggests, Wills sees the advent of nuclear weapons, and the president’s power to order a nuclear attack, as crucial to the emergence of the National Security State and the ensuing burgeoning of executive power. The super-secret Manhattan Project he calls “the seed of all the growing powers that followed.” Ever since, he contends, the Bomb has driven the steady expansion of presidential power. After all, if a president has the Zeus-like capacity to destroy entire nations and snuff out millions of lives instantaneously, all other powers are trivialized in comparison. When the 1946 Atomic Energy Act granted this cosmic authority solely to the president, Wills writes, “the nature of the presidency was irrevocably altered,” leading to a vast expansion of executive power in all directions, including a sprawling security apparatus to protect nuclear secrets. This consequence was foreseen by such early post-Hiroshima commentators as Dwight Macdonald, Lewis Mumford, and the constitutional scholar Edward S. Corwin (Total War and the Constitution, 1947), and Wills powerfully confirms and updates their warnings.

Not all extensions of executive power can be so clearly tied to the Bomb, however. The plots to assassinate foreign leaders and overthrow governments seem more linked to the imperatives of U.S. global economic interests and hegemony. Even more problematic for Wills’s argument is the expansion of federal regulatory powers through executive-branch agencies—some dating to the Progressive or New Deal eras, and others to more recent times: the Food and Drug Administration, the Securities and Exchange Commission, the Environmental Protection Agency, the Federal Trade Commission’s Bureau of Consumer Protection, etc. These more “benign” enlargements of executive power reflect domestic political dynamics unrelated to the Bomb.

Wills might have explored more fully how growing executive power relates to the larger pattern of corporate power in Cold War America, memorably described by Dwight D. Eisenhower in 1961 as “the military-industrial complex.” More attention to this interconnection, examined in such now-distant works as C. Wright Mills’s The Power Elite (1956), Fred J. Cook’s The Warfare State (1962), and Seymour Melman’s The Permanent War Economy (1974), would have enabled Wills to link the process he examines to broader social and economic developments in postwar America.

With this book, Wills joins a considerable company of writers who over the years have issued similar warnings, ranging from Raoul Berger’s Executive Privilege: A Constitutional Myth (1974) to such recent works as Chalmers Johnson’s The Sorrows of Empire: Militarism, Secrecy and the End of the Republic (2004) and In Democracy’s Shadow: The Secret World of National Security (2005), edited by Marcus G. Raskin and A. Carl LeVan.  This polemical tradition cuts across ideological lines.  New Left radicals of the 1960s denounced LBJ’s Gulf of Tonkin deception and the nexus of government and corporate power they called ‘the Establishment.’  At the same time, denunciations of expanding executive power (particularly regulatory power) have been a staple of the libertarian strand of conservative ideology pioneered by the economist and political theorist Ludwig von Mises, who as early as 1944 published Omnipotent Government: The Rise of the Total State and Total War. Mises’ antigovernment animus, directed not least at executive-branch usurpations, lives on in such works as Murray Rothbard’s coauthored New History of Leviathan: Essays on the Rise of the American Corporate State (1972) and John Denson’s Reassessing the Presidency: The Rise of the Executive State and the Decline of Freedom (2001), published by the Ludwig von Mises Institute.

In his conclusion, Wills again concedes that the process he describes seems nearly irresistible.  Nevertheless, he insists, Arthur Miller-fashion, attention must be paid.  If citizens only realized the danger of such power, he seems to suggest, the republican system of checks and balances envisioned by the Founders, with a strictly limited executive branch, might yet be restored.  But as Wills’s classical training, including an awareness of the collapse of Athenian democracy and of the Roman Republic alike, surely reminds him, the hope is slim.  This is a bleak and pessimistic work from a major American public intellectual.” Paul Boyer, “The Imbalance of Power: How the Manhattan Project Gave Birth to the Imperial Presidency,” an American Scholar book review, 2010

Numero Dos: “Vancouver, British Columbia, sits just on the far side of the American border, a green-glass model city set in the dish of the North Shore Mountains, which enclose the city and support, most days, a thick canopy of fog.  There are periods in the year when it’ll rain for forty days, William Gibson tells me one mucky day there this winter, and when visibility drops so low you can’t see what’s coming at you from the nearest street corner.  But large parts of Vancouver are traversed by trolley cars, and on clear nights you can gaze up at the wide expanse of Pacific sky through the haphazard grid of their electric wires.

Gibson came to Vancouver in 1972, a twenty-four-year-old orphan who’d spent the past half-decade trawling the counterculture in Toronto on his wandering way from small-town southern Virginia.  He had never been to the Far East, which would yield so much of the junk-heap casino texture of his early fiction.  He hadn’t been to college and didn’t yet intend to go.  He hadn’t yet heard of the Internet, or even its predecessors ARPANET and Telenet.  He thought he might become a film-cell animator.  He hadn’t yet written any science fiction—he hadn’t read any science fiction since adolescence, having discarded the stuff more or less completely at fourteen, just, he says, as its publishers intended.

Today, Gibson is lanky and somewhat shy, avuncular and slow to speak—more what you would expect from the lapsed science-fiction enthusiast he was in 1972 than the genre-vanquishing hero he has become since the publication of his first novel, the hallucinatory hacker thriller Neuromancer, in 1984.  Gibson resists being called a visionary, yet his nine novels constitute as subtle and clarifying a meditation on the transformation of culture by technology as has been written since the beginning of what we now know to call the information age.  Neuromancer, famously, gave us the term cyberspace and the vision of the Internet as a lawless, spellbinding realm.  And, with its two sequels, Count Zero (1986) and Mona Lisa Overdrive (1988), it helped establish the cultural figure of the computer hacker as cowboy hero.  In his Bridge series—Virtual Light (1993), Idoru (1996), and All Tomorrow’s Parties (1999), each of which unfolds in a Bay Bridge shantytown improvised after a devastating Pacific earthquake transforms much of San Francisco—he planted potted futures of celebrity journalism, reality television, and nanotechnology, each prescient and persuasive and altogether weird.

Neuromancer and its two sequels were set in distant decades and contrived to dazzle the reader with strangeness, but the Bridge novels are set in the near future—so near they read like alternate history, Gibson says, with evident pride.  With his next books, he began to write about the present day or, more precisely, the recent past: each of the three novels in the series is set in the year before it was written.  He started with September 11, 2001.

Pattern Recognition was the first of that series.  It has been called ‘an eerie vision of our time’ by The New Yorker, ‘one of the first authentic and vital novels of the twenty-first century’ by The Washington Post Book World, and, by The Economist, ‘probably the best exploration yet of the function and power of product branding and advertising in the age of globalization.’  The Pattern Recognition books are also the first since Mona Lisa Overdrive in which Gibson’s characters speak of cyberspace, and they speak of it elegiacally.  ‘I saw it go from the yellow legal pad to the Oxford English Dictionary,’ he tells me.  ‘But cyberspace is everywhere now, having everted and colonized the world.  It starts to sound kind of ridiculous to speak of cyberspace as being somewhere else.’

You can tell the term still holds some magic for him, perhaps even more so now that it is passing into obsolescence.  The opposite is true for cyberpunk, a neologism that haunts him to this day.  On a short walk to lunch one afternoon, from the two-story mock-Tudor house where he lives with his wife, Deborah, he complained about a recent visit from a British journalist, who came to Vancouver searching for ‘Mr. Cyberpunk’ and was disappointed to find him ensconced in a pleasantly quiet suburban patch of central Vancouver.  Mr. Cyberpunk seemed wounded by having his work pigeonholed, but equally so by the insult to his home, which is quite comfortable, and his neighborhood, which is, too.  ‘We like it quiet,’ he explained.

—David Wallace-Wells

INTERVIEWER

What’s wrong with cyberpunk?

GIBSON

A snappy label and a manifesto would have been two of the very last things on my own career want list.  That label enabled mainstream science fiction to safely assimilate our dissident influence, such as it was.  Cyberpunk could then be embraced and given prizes and patted on the head, and genre science fiction could continue unchanged.

INTERVIEWER

What was that dissident influence? What were you trying to do?

GIBSON

I didn’t have a manifesto. I had some discontent. It seemed to me that midcentury mainstream American science fiction had often been triumphalist and militaristic, a sort of folk propaganda for American exceptionalism. I was tired of America-as-the-future, the world as a white monoculture, the protagonist as a good guy from the middle class or above. I wanted there to be more elbow room. I wanted to make room for antiheroes.

I also wanted science fiction to be more naturalistic. There had been a poverty of description in much of it. The technology depicted was so slick and clean that it was practically invisible. What would any given SF favorite look like if we could crank up the resolution? As it was then, much of it was like video games before the invention of fractal dirt. I wanted to see dirt in the corners.

INTERVIEWER

How do you begin a novel?

GIBSON

I have to write an opening sentence. I think with one exception I’ve never changed an opening sentence after a book was completed.

INTERVIEWER

You won’t have planned beyond that one sentence?

GIBSON

No. I don’t begin a novel with a shopping list—the novel becomes my shopping list as I write it. It’s like that joke about the violin maker who was asked how he made a violin and answered that he started with a piece of wood and removed everything that wasn’t a violin. That’s what I do when I’m writing a novel, except somehow I’m simultaneously generating the wood as I’m carving it.

E. M. Forster’s idea has always stuck with me—that a writer who’s fully in control of the characters hasn’t even started to do the work. I’ve never had any direct fictional input, that I know of, from dreams, but when I’m working optimally I’m in the equivalent of an ongoing lucid dream. That gives me my story, but it also leaves me devoid of much theoretical or philosophical rationale for why the story winds up as it does on the page. The sort of narratives I don’t trust, as a reader, smell of homework.

INTERVIEWER

Do you take notes?

GIBSON

I take the position that if I can forget it, it couldn’t have been very good.

But in the course of a given book, I sometimes get to a point where the narrative flow overwhelms the speed at which I can compose. So I’ll sometimes stop and make cryptic notes that are useless by the time I get back to them. Underlined three times, with no context—“Have they been too big a deal?”

INTERVIEWER

What is your writing schedule like?

GIBSON

When I’m writing a book I get up at seven. I check my e-mail and do Internet ablutions, as we do these days. I have a cup of coffee. Three days a week, I go to Pilates and am back by ten or eleven. Then I sit down and try to write. If absolutely nothing is happening, I’ll give myself permission to mow the lawn. But, generally, just sitting down and really trying is enough to get it started. I break for lunch, come back, and do it some more. And then, usually, a nap. Naps are essential to my process. Not dreams, but that state adjacent to sleep, the mind on waking.

INTERVIEWER

And your schedule is steady the whole way through?

GIBSON

As I move through the book it becomes more demanding. At the beginning, I have a five-day workweek, and each day is roughly ten to five, with a break for lunch and a nap. At the very end, it’s a seven-day week, and it could be a twelve-hour day.

Toward the end of a book, the state of composition feels like a complex, chemically altered state that will go away if I don’t continue to give it what it needs. What it needs is simply to write all the time. Downtime other than simply sleeping becomes problematic. I’m always glad to see the back of that.

INTERVIEWER

Do you revise?

GIBSON

Every day, when I sit down with the manuscript, I start at page one and go through the whole thing, revising freely.

INTERVIEWER

You revise the whole manuscript every day?

GIBSON

I do, though that might consist of only a few small changes. I’ve done that since my earliest attempts at short stories. It would be really frustrating for me not to be able to do that. I would feel as though I were flying blind.

The beginnings of my books are rewritten many times. The endings are only a draft or three, and then they’re done. But I can scan the manuscript very quickly, much more quickly than I could ever read anyone else’s prose.

INTERVIEWER

Does your assessment of the work change, day by day?

GIBSON

If it were absolutely steady I don’t think it could be really good judgment. I think revision is hugely underrated. It is very seldom recognized as a place where the higher creativity can live, or where it can manifest. I think it was Yeats who said that literary revision was the only place in life where a man could truly improve himself.

INTERVIEWER

How much do you write in a typical day?

GIBSON

I don’t know. I used to make printouts at every stage, just to be comforted by the physical fact of the pile of manuscript. It was seldom more than five manuscript pages. I was still doing that with Pattern Recognition, out of nervousness that all the computers would die and take my book with them. I was printing it out and sending it to first readers by fax, usually beginning with the first page. I’m still sending my output to readers every day. But I’ve learned to just let it live in the hard drive, and once I’d quit printing out the daily output, I lost track.

INTERVIEWER

For a while it was often reported, erroneously, that you typed all your books on a typewriter.

GIBSON

I wrote Neuromancer on a manual portable typewriter and about half of Count Zero on the same machine. Then it broke, in a way that was more or less irreparable. Bruce Sterling called me shortly thereafter and said, “This changes everything!” I said, “What?” He said, “My Dad gave me his Apple II. You have to get one of these things!” I said, “Why?” He said, “Automation—it automates the process of writing!” I’ve never gone back.

But I had only been using a typewriter because I’d gotten one for free and I was poor. In 1981, most people were still writing on typewriters. There were five large businesses in Vancouver that did nothing but repair and sell typewriters. Soon there were computers, too, and it was a case of the past and the future mutually coexisting. And then the past just goes away.

INTERVIEWER

For someone who so often writes about the future of technology, you seem to have a real romance for artifacts of earlier eras.

GIBSON

It’s harder to imagine the past that went away than it is to imagine the future. What we were prior to our latest batch of technology is, in a way, unknowable. It would be harder to accurately imagine what New York City was like the day before the advent of broadcast television than to imagine what it will be like after life-size broadcast holography comes online. But actually the New York without the television is more mysterious, because we’ve already been there and nobody paid any attention. That world is gone.

My great-grandfather was born into a world where there was no recorded music. It’s very, very difficult to conceive of a world in which there is no possibility of audio recording at all. Some people were extremely upset by the first Edison recordings. It nauseated them, terrified them. It sounded like the devil, they said, this evil unnatural technology that offered the potential of hearing the dead speak. We don’t think about that when we’re driving somewhere and turn on the radio. We take it for granted.

INTERVIEWER

Was television a big deal in your childhood?

GIBSON

I can remember my father bringing home our first set—this ornate wooden cabinet that was the size of a small refrigerator, with a round cathode-ray picture tube and wooden speaker grilles over elaborate fabric. Like a piece of archaic furniture, even then. Everybody would gather around at a particular time for a broadcast—a baseball game or a variety show or something. And then it would go back to a mandala that was called a test pattern, or nothing—static.

We know that something happened then. We know that broadcast television did something—did everything—to us, and that now we aren’t the same, though broadcast television, in that sense, is already almost over. I can remember seeing the emergence of broadcast television, but I can’t tell what it did to us because I became that which watched broadcast television.

The strongest impacts of an emergent technology are always unanticipated. You can’t know what people are going to do until they get their hands on it and start using it on a daily basis, using it to make a buck and using it for criminal purposes and all the different things that people do. The people who invented pagers, for instance, never imagined that they would change the shape of urban drug dealing all over the world. But pagers so completely changed drug dealing that they ultimately resulted in pay phones being removed from cities as part of a strategy to prevent them from becoming illicit drug markets. We’re increasingly aware that our society is driven by these unpredictable uses we find for the products of our imagination.

INTERVIEWER

What was it like growing up in Wytheville, Virginia?

GIBSON

Wytheville was a small town. I wasn’t a very happy kid, but there were aspects of the town that delighted me. It was rather short on books, though. There was a rotating wire rack of paperbacks at the Greyhound station on Main Street, another one at a soda fountain, and another one at a drugstore. That was all the book retail anywhere in my hometown.

My parents were both from Wytheville. They eventually got together, though rather late for each of them. My father had been married previously, and my mother was probably regarded as a spinster. My mother’s family had been in Wytheville forever and was quite well-off and established, in a very small-town sort of way. My father’s father had moved down from Pennsylvania to start a lumber company. Once the railroads had gotten far enough back into the mountains, after the Civil War, there were a lot of fortunes being made extracting resources.

My mother had had some college, which was unusual for a young woman in that part of the world, but she hadn’t married, which was basically all a woman of her class was supposed to do. When she did eventually marry my father, he was the breadwinner. He had had some college, too, had studied engineering, which enabled him to wind up working postwar for a big construction company. My earliest memories are of moving from project to project, every year or so, as this company built Levittown-like suburbs in Tennessee and North Carolina.

INTERVIEWER

And as these projects were being built you would live in one of the houses?

GIBSON

We did, in these rather sadly aspirational ranch-style houses within brand-new, often unoccupied suburbs. It was right at the beginning of broadcast television, and the world on television was very much the world of that sort of house, and of the suburb. It was a vision of modernity, and I felt part of that.

But my father was often away—he traveled constantly on business trips. When I was about six, he left on one business trip and died. Within a week, my mother and I were back in Wytheville.

INTERVIEWER

How did he die?

GIBSON

It’s odd the way families try to help people grieve—it doesn’t always work out. I was told at the time that he had died of a heart attack. Then later, I began to think, You know, he was young—that’s pretty scary! Twenty years later somebody said to me, Actually, he choked on something in a restaurant. It was a Heimlich maneuver death prior to the Heimlich maneuver.

It was a hugely traumatic loss, and not just because I’d lost my father. In Wytheville, I felt I wasn’t in that modern world anymore. I had been living in a vision of the future, and then suddenly I was living in a vision of the past. There was television, but the world outside the window could have been the 1940s or the 1930s or even the 1900s, depending on which direction you looked. It was a very old-fashioned place.

Towns like that in the South were virtually tribal in those days. Everything was about who your kin were. I was this weird alienated little critter who wasn’t even that into his own kin. I was shy and withdrawn. I just wanted to stay in my room and read books and watch television, or go to the movies.

INTERVIEWER

What drew you to that stuff?

GIBSON

It was a window into strangeness. Any kind of foreign material got my interest, anything that wasn’t from the United States I would walk around the block to see. Most of what you could see on television or at the movies was very controlled, but sometimes you could just turn on your television and see some fabulous random thing, because the local channels had space they couldn’t afford to fill with network material. They might show old films more or less at random, and they wouldn’t necessarily have been screened for content. So there were occasionally coincidences of this kind of odd, other universe—some dark, British crime film from the 1940s, say.

My mother got me an omnibus Sherlock Holmes for a tenth-birthday present and I loved it. I remember casting one particular brick building that I walked by every day as a building in Sherlock Holmes’s London. That could be in London, that building, I thought. I developed this special relationship with the facade of this building, and when I was in front of it I could imagine that there was an infinite number of similar buildings in every direction and I was in Sherlock Holmes’s London.

Part of my method for writing fiction grew out of that fundamental small-town lack of novelty. It caused me to develop an inference mechanism for imagining distant places. I would see, perhaps, a picture of a Sunbeam Alpine sports car and infer a life in England. I always held on to that, and it migrated into my early fiction, particularly where I would create an imaginary artifact in the course of writing and infer the culture that had produced it.

INTERVIEWER

Do you think fiction should be predictive?

GIBSON

No, I don’t. Or not particularly. The record of futurism in science fiction is actually quite shabby, it seems to me. Used bookstores are full of visionary texts we’ve never heard of, usually for perfectly good reasons.

INTERVIEWER

You’ve written that science fiction is never about the future, that it is always instead a treatment of the present.

GIBSON

There are dedicated futurists who feel very seriously that they are extrapolating a future history. My position is that you can’t do that without having the present to stand on. Nobody can know the real future. And novels set in imaginary futures are necessarily about the moment in which they are written. As soon as a work is complete, it will begin to acquire a patina of anachronism. I know that from the moment I add the final period, the text is moving steadily forward into the real future.

There was an effort in the seventies to lose the usage science fiction and champion speculative fiction. Of course, all fiction is speculative, and all history, too—endlessly subject to revision. Particularly given all of the emerging technology today, in a hundred years the long span of human history will look fabulously different from the version we have now. If things go on the way they’re going, and technology keeps emerging, we’ll eventually have a near-total sorting of humanity’s attic.

In my lifetime I’ve been able to watch completely different narratives of history emerge. The history now of what World War II was about and how it actually took place is radically different from the history I was taught in elementary school. If you read the Victorians writing about themselves, they’re describing something that never existed. The Victorians didn’t think of themselves as sexually repressed, and they didn’t think of themselves as racist. They didn’t think of themselves as colonialists. They thought of themselves as the crown of creation.

Of course, we might be Victorians, too.

INTERVIEWER

The Victorians invented science fiction.

GIBSON

I think the popular perception that we’re a lot like the Victorians is in large part correct. One way is that we’re all constantly in a state of ongoing technoshock, without really being aware of it—it’s just become where we live. The Victorians were the first people to experience that, and I think it made them crazy in new ways. We’re still riding that wave of craziness. We’ve gotten so used to emergent technologies that we get anxious if we haven’t had one in a while.

But if you read the accounts of people who rode steam trains for the first time, for instance, they went a little crazy. They’d traveled fifteen miles an hour, and when they were writing the accounts afterward they struggled to describe that unthinkable speed and what this linear velocity does to a perspective as you’re looking forward. There was even a Victorian medical complaint called “railway spine.”

Emergent technologies were irreversibly altering their landscape. Bleak House is a quintessential Victorian text, but it is also probably the best steampunk landscape that will ever be. Dickens really nailed it, especially in those proto-Ballardian passages in which everything in nature has been damaged by heavy industry. But there were relatively few voices like Dickens then. Most people thought the progress of industry was all very exciting. Only a few were saying, Hang on, we think the birds are dying.

INTERVIEWER

Were you hunting around for books as a kid?

GIBSON

I knew what day of the month the truck would come and put new books on those wire racks around town, but sometimes I would just go anyway, on the off chance that I had missed something during the last visit. In those days you could have bought all of the paperback science fiction that was being published in the United States, monthly, and it probably wouldn’t have cost you five dollars. There was just very little stuff coming out, and it was never enough for me.

A couple of times I found big moldering piles of old science fiction in junk shops and bought it all for a dollar and carted it home. These magazines were probably eight or ten years old, but to me they were ancient—it felt like they were from the nineteenth century. That there could be something in one of these magazines that was completely mind blowing was an amazing thing.

INTERVIEWER

What was so affecting about it?

GIBSON

It gave me an uncensored window into very foreign modes of thought. There was a lot of inherent cultural relativism in the science fiction I discovered then. It gave me the idea that you could question anything, that it was possible to question anything at all. You could question religion, you could question your own culture’s most basic assumptions. That was just unheard of—where else could I have gotten it? You know, to be thirteen years old and get your brain plugged directly into Philip K. Dick’s brain!

That wasn’t the way science fiction advertised itself, of course. The self-advertisement was: Technology! The world of the future! Educational! Learn about science! It didn’t tell you that it would jack your kid into this weird malcontent urban literary universe and serve as the gateway drug to J. G. Ballard.

And nobody knew. The people at the high school didn’t know, your parents didn’t know. Nobody knew that I had discovered this window into all kinds of alien ways of thinking that wouldn’t have been at all acceptable to the people who ran that little world I lived in.

INTERVIEWER

Who were the writers that were most important to you?

GIBSON

Alfred Bester was among the first dozen science-fiction writers I read when I was twelve years old, and I remember being amazed, doing my own science-fiction-writer reconnaissance work a decade or two later, that someone I had discovered that young still seemed to me to be so amazing.

Bester had been doing it in the fifties—a Madison Avenue hepcat who had come into science fiction with a bunch of Joyce under his belt. He built his space-opera future out of what it felt like to be young and happening in New York, in the creative end of the business world in 1955. The plotlines were pulp and gothic and baroque, but what I loved most was the way it seemed to be built out of something real and complex and sophisticated. I hadn’t found that in a lot of other science fiction.

INTERVIEWER

What other writing interested you then?

GIBSON

Fritz Leiber was another culturally sophisticated American science-fiction writer—unusually sophisticated. Samuel Delany, too. I was a teenager, just thirteen or fourteen, reading novels Delany had written as a teenager—that was incredible to me.

I started reading so-called adult science fiction when I was eleven or twelve, and by the time I was fourteen or fifteen I had already moved on, into other kinds of fiction, but somewhere in that very short period I discovered British science fiction and what was at that time called British New Wave science fiction, led, it seemed to me, by J. G. Ballard.

There was a kind of literary war underway between the British New Wave people and the very conservative American science-fiction writers—who probably wouldn’t even have thought of themselves as very conservative—saying, That’s no good, you can’t do that, you don’t know how to tell a story, and besides you’re a communist. I remember being frightened by that rhetoric. It was the first time I ever saw an art movement, I suppose.

When I decided to try to write myself, in my late twenties, I went out and bought a bunch of newer science fiction—I hadn’t been reading the stuff for a long while. It was incredibly disappointing. That window to strangeness just didn’t seem to be there anymore. It was like, when I was twelve there was country blues, and when I’m twenty-six there’s plastic Nashville country—it was that kind of change. My intent, when I began to write, was to be a one-man science-fiction roots movement. I remember being horrified that critics who were taken quite seriously, at least within the genre, habitually referred to the category of all writing that was not science fiction or fantasy as “the mundane.” It didn’t make any sense to me. If there was mundane literature, then certainly a lot of it was science fiction. You know, if James Joyce is mundane but Edgar Rice Burroughs isn’t—I’m out of here.

INTERVIEWER

When did you encounter the Beats?

GIBSON

More or less the same time I found science fiction, because I found the Beats when the idea of them had been made sufficiently mainstream that there were paperback anthologies on the same wire rack at the bus station. I remember being totally baffled by one Beat paperback, an anthology of short bits and excerpts from novels. I sort of understood what little bits of Kerouac were in this thing—I could read him—but then there was William S. Burroughs and excerpts from Naked Lunch. I thought, What the heck is that? I could tell that there was science fiction, somehow, in Naked Lunch. Burroughs had cut up a lot of pulp-noir detective fiction, and he got part of his tonality from science fiction of the forties and the fifties. I could tell it was kind of like science fiction, but that I didn’t understand it.

INTERVIEWER

Was Dick important to you?

GIBSON

I was never much of a Dick fan. He wrote an awful lot of novels, and I don’t think his output was very even. I loved The Man in the High Castle, which was the first really beautifully realized alternate history I read, but by the time I was thinking about writing myself, he’d started publishing novels that were ostensibly autobiographical, and which, it seems to me, he probably didn’t think were fiction.

Pynchon worked much better for me than Dick for epic paranoia, and he hasn’t yet written a book in which he represents himself as being in direct contact with God. I was never much of a Raymond Chandler fan, either.

INTERVIEWER

Why not?

GIBSON

When science fiction finally got literary naturalism, it got it via the noir detective novel, which is an often decadent offspring of nineteenth-century naturalism. Noir is one of the places that the investigative, analytic, literary impulse went in America. The Goncourt brothers set out to investigate sex and money and power, and many years later, in America, you wind up with Chandler doing something very similar, though highly stylized and with a very different agenda. I always had a feeling that Chandler’s puritanism got in the way, and I was never quite as taken with the language as true Chandler fans seem to be. I distrusted Marlowe as a narrator. He wasn’t someone I wanted to meet, and I didn’t find him sympathetic—in large part because Chandler, whom I didn’t trust either, evidently did find him sympathetic.

But I trusted Dashiell Hammett. It felt to me that Hammett was Chandler’s ancestor, even though they were really contemporaries. Chandler civilized it, but Hammett invented it. With Hammett I felt that the author was open to the world in a way Chandler never seems to me to be.

But I don’t think that writers are very reliable witnesses when it comes to influences, because if one of your sources seems woefully unhip you are not going to cite it. When I was just starting out people would say, Well, who are your influences? And I would say, William Burroughs, J. G. Ballard, Thomas Pynchon. Those are true, to some extent, but I would never have said Len Deighton, and I suspect I actually learned more for my basic craft reading Deighton’s early spy novels than I did from Burroughs or Ballard or Pynchon.

I don’t know if it was Deighton or John le Carré who, when asked about Ian Fleming, said, I love him, I have been living on his reverse market for years. I was really interested in that idea. Here’s Fleming, with this classist, late–British Empire pulp fantasy about a guy who wears fancy clothes and beats the shit out of bad guys who generally aren’t white, while driving expensive, fast cars, and he’s a spy, supposedly, and this is selling like hotcakes. Deighton and le Carré come along and completely reverse it, in their different ways, and get a really powerful charge out of not offering James Bond. You’ve got Harry Palmer and George Smiley, neither of whom are James Bond, and people are willing to pay good money for them not to be James Bond.

INTERVIEWER

Were you happy in Wytheville?

GIBSON

I was miserable, but I probably would have been anywhere. I spent a year or two being increasingly weird and depressed. I was just starting to get countercultural signals. It’s almost comical, in retrospect—1966 in this small Southern town, and I’m like a Smiths fan or something, this mopey guy who likes to look at fashion magazines but isn’t gay. I was completely out of place, out of time. None of it was particularly dramatic, but I’m sure it was driving my mother crazy. Pretty soon I had become so difficult and hard to get out of bed that I let myself be packed off to a boys’ boarding school in Tucson.

INTERVIEWER

Were you close with your mother?

GIBSON

She was difficult. She was literate—she was actually a compulsive reader, and really respected the idea of writing—and she was very encouraging of any artistic impulses I might have had. Writers were her heroes, and that made her kind of a closeted freak in that town. She was one of maybe ten people who had a subscription to the Sunday New York Times.

But she was also an incredibly anxious, fearful, neurotic person, and I would imagine she was pretty much constantly depressed, except that depression didn’t exist in those days, people were just “down” or “difficult.” But she was a chronically depressed, anxiety-ridden single parent who wanted nothing more than to read novels, chain-smoke Camels, and drink bad coffee all day long. There are worse things a parent can do, but it was still hard.

INTERVIEWER

Were you in Arizona when she died?

GIBSON

I was still in school, but not for much longer. I was sufficiently upset, after she died, that they wound up sending me home after a couple of months. But I didn’t get along with my relatives, so my mother’s best friend and her husband finally took me in. This was a woman who’d been my mother’s literary buddy all her life. She was the only other person in town who cared about modern literature, as far as I knew. It was lifesaving for me, because it gave me somewhere I could be where the people I was with weren’t trying to figure out how to get me into the army.

INTERVIEWER

Had you already decided to avoid the draft?

GIBSON

I’m not sure what would have happened if I had been drafted. I was not the most tightly wrapped package at that time, and I think it would have depended on the day I got the draft notice. I suspect I would have been equally capable of saying, Fuck it, I’m going to Vietnam.

I never did get drafted, but I went off to Canada on a kind of exploratory journey to figure out what I might do if I ever was drafted. I got to Toronto early in 1967 and it was the first time I had been in a big city that was pedestrian friendly, not to mention foreign, so I just stayed there. I figured if they drafted me I was already there. But I found that I couldn’t hang out with the guys who’d been drafted.

INTERVIEWER

Why not?

GIBSON

I didn’t belong. I hadn’t made their decision. And I found them too sad, too angry. Some of their families had disowned them. They could feel, I guessed, that they’d brought dishonor on their families by resisting the draft. Some of these were people who had no intention of ever leaving the United States. There were suicides, there was a lot of drug abuse. Nobody knew that a few years down the road it would all be over and that all would be forgiven. And that wasn’t my situation. I was there because I liked it there.

It was 1967, and the world was in the middle of some sort of secular millenarian convulsion. Young people thought everything would change in some Rapture-like way. Nobody knew what it was going to be like, but everybody knew that pretty soon everything would be different.

INTERVIEWER

Did you?

GIBSON

I do remember thinking that the world I was seeing around me probably was going to be very different in relatively short order. But I didn’t assume that it would necessarily be better.

I had become interested at some point, before I got to Toronto, in popular delusions and the madness of crowds. Science-fiction writers had long accessed popular delusions as a source of material—intentional communities where people all believe something nobody else in the world believes, groups of people under some sort of great emotional stress who decide that something is about to happen, people who commit suicide en masse, people who invest in Ponzi schemes. When the sixties cranked up, I felt already familiar with what was happening. Moving to the woods always creeped me out, so I just stayed in cities and watched the whole thing congeal.

INTERVIEWER

Congeal?

GIBSON

Like bacon fat in the bottom of the pan. It was ghastly—the nuked psychic ruins of 1967.

INTERVIEWER

And how were you passing the time?

GIBSON

I was one of those annoying people who know they are going to do something in the arts, but never do anything about it. But then, in 1967 and 1968, if you were a part of the secular millenarian movement, even on the fringes, you basically didn’t do anything, you just got up in the morning and walked around, and figured out what you had to do to make that happen again the next day—where you were going to sleep and what could be done to pay the rent. Soon, the hippie rapture would happen and it would all be okay. In the meantime you just hung out. While I suspected that wasn’t really sustainable, I couldn’t think of anything else to do.

I had been hugely fond of Toronto as I first found it in 1967, but by 1972 I had lost that fondness. Montreal had always been the business capital of Canada, and when the Quebecois separatist movement got problematic enough for the country to be placed under martial law, all of the big companies fled to Toronto—the stock market even moved there—and the mood of the place changed very quickly.

INTERVIEWER

You met your wife in Toronto, didn’t you?

GIBSON

I took her coffee one morning. I was staying at my friend’s place, and he had spent the night with some woman and didn’t want to get out of bed, so he called to me and asked me to make them some coffee. I said sure, I made them some coffee, brought it up on a tray, and there was my wife.

After we had been together for a while, I began complaining about the weather in Toronto. I told her, I can’t do this winter, I forgot how bad this is. She said, I know an easier way—come with me to Vancouver. We’ve been here ever since.

INTERVIEWER

That’s when you went back to school.

GIBSON

Those days it was fantastically easy to get a degree at UBC. I discovered very quickly that they were in effect paying me for studying things I was already interested in. I could cool it for four years, and I wouldn’t have to worry about what I was going to do for the rest of my life.

But my wife started to talk about having a child. She already had a job, a real job at the university. Everyone I had known during that four-year period was also trying to get a job. It startled me. They hadn’t really been talking about getting jobs before. But some part of me I had never heard from before sat me down and said, You’ve been bullshitting about this art thing since you were fifteen years old, you’ve never done anything about any of it, you’re about to be shoved into the adult world, so if you’re going to do anything about the art thing, you’ve got to do it right now, or shut up and get a job.

That was really the beginning of my career. My wife continued to have a job after she had the baby, so I became the caregiver guy, the house husband guy, and simultaneously I found that it actually provided ample time to write. When he was asleep, I could write, I knew that was the only time I would have to write. Most of the short fiction I wrote at the beginning was written when our son was asleep.

INTERVIEWER

You wrote your first story for a class, didn’t you?

GIBSON

A woman named Susan Wood had come to UBC as an assistant professor. We were the same age, and I met her while reconnoitering the local science-fiction culture. In my final year she was teaching a science-fiction course. I had become really lazy and thought, I won’t have to read anything if I take her course. No matter what she assigns, I’ve read all the stuff. I’ll just turn up and bullshit brilliantly, and she’ll give me a mark just for doing that. But when I said, “Well, you know, we know one another. Do I really have to write you a paper for this class?” she said, “No, but I think you should write a short story and give me that instead.” I think she probably saw through whatever cover I had erected over my secret plan to become a science-fiction writer.

I went ahead and did it, but it was incredibly painful. It was the hardest thing I did in my senior year, writing this little short story. She said, “That’s good. You should sell it now.” And I said, “No.” And she said, “Yeah, you should sell it.” I went and found the most obscure magazine that paid the least amount of money. It was called Unearth. I submitted it to them, and they bought it and gave me twenty-seven dollars. I felt an enormous sense of relief. At least nobody will ever see it, I thought. That was “Fragments of a Hologram Rose.”

INTERVIEWER

How did you meet John Shirley?

GIBSON

Shirley was the only one of us who was seriously punk. I’d gone to a science-fiction convention in Vancouver, and there I encountered this eccentrically dressed young man my age who seemed to be wearing prison pajamas. He was an extremely outgoing person, and he introduced himself to me: “I’m a singer in a punk band, but my day job is writing science fiction.” I said, “You know, I write a little science fiction myself.” And he said, “Published anything?” And I said, “Oh, not really. This one story in this utterly obscure magazine.” He said, “Well, send me some of your stuff, I’ll give you a critique.”

As soon as he got home he sent me a draft of a short story he had written perhaps an hour beforehand: “This is my new genius short story.” I read it—it was about someone who discovers there are things that live in bars, things that look like drunks and prostitutes but are actually something else—and I saw, as I thought at the time, its flaws. I sat down to write him a critique, but it would have been so much work to critique it that instead I took his story and rewrote it. It was really quick and painless. I sent it back to him, saying, “I hope this won’t piss you off, but it was actually much easier for me to rewrite this than to do a critique.” The next thing I get back is a note—“I sold it!” He had sold it to this hardcover horror anthology. I was like, Oh, shit. Now my name is on this weird story.

People kept doing that to me, and it’s really good that they did. I’d give various friends stuff to read, and they’d say, “What are you going to do with this?” And I’d say, “Nothing, it’s not nearly there yet.” Then they’d Xerox it and submit it on my behalf, to places I would have been terrified to submit to. It seemed unseemly to me to force this unfinished stuff on the world at large.

INTERVIEWER

Do you still consider that work unfinished?

GIBSON

I had a very limited tool kit when I began writing. I didn’t know how to handle transitions, so I used abrupt breaks, the literary equivalent of jump cuts. I didn’t have any sense of how to pace anything. But I had read and admired Ballard and Burroughs, and I thought of them as very powerful effect pedals. You get to a certain place in the story and you just step on the Ballard.

INTERVIEWER

What was the effect?

GIBSON

A more genuine kind of future shock. I wanted the reader to feel constantly somewhat disoriented and in a foreign place, because I assumed that to be the highest pleasure in reading stories set in imaginary futures. But I’d also read novels where the future-weirdness quotient overwhelmed me and simply became boring, so I tried to make sure my early fiction worked as relatively solid genre pieces. Which I still believe is harder to do. When I started Neuromancer, for instance, I wanted to have an absolutely familiar, utterly well-worn armature of pulp plot running throughout the whole thing. It’s the caper plot that carries the reader through.

INTERVIEWER

What do you think of Neuromancer today?

GIBSON

When I look at Neuromancer I see a Soap Box Derby car. I felt, writing it, like I had two-by-fours and an old bicycle wheel and I’m supposed to build something that will catch a Ferrari. This is not going to fly, I thought. But I tried to do it anyway, and I produced this garage artifact, which, amazingly, is still running to this day.

Even so, I got to the end of it, and I didn’t care what it meant, I didn’t even know if it made any sense as a narrative. I didn’t have this huge feeling of, Wow, I just wrote a novel! I didn’t think it might win an award. I just thought, Phew! Now I can figure out how to write an actual novel.

INTERVIEWER

How did you come up with the title?

GIBSON

Coming up with a word like neuromancer is something that would earn you a really fine vacation if you worked in an ad agency. It was a kind of booby-trapped portmanteau that contained considerable potential for cognitive dissonance, that pleasurable buzz of feeling slightly unsettled.

I believed that this could be induced at a number of levels in a text—at the microlevel with neologisms and portmanteaus, or using a familiar word in completely unfamiliar ways. There are a number of well-known techniques for doing this—all of the classic surrealist techniques, for instance, especially the game called exquisite corpse, where you pass a folded piece of paper around the room and write a line of poetry or a single word and fold it again and then the next person blindly adds to it. Sometimes it produces total gibberish, but it can be spookily apt. A lot of what I had to learn to do was play a game of exquisite-corpse solitaire.

INTERVIEWER

Where did cyberspace come from?

GIBSON

I was painfully aware that I lacked an arena for my science fiction. The spaceship had been where science fiction had happened for a very long time, even in the writing of much hipper practitioners like Samuel Delany. The spaceship didn’t work for me, viscerally. I know from some interviews of Ballard’s that it didn’t work for him either. His solution was to treat Earth as the alien planet and perhaps to treat one’s fellow humans as though they were aliens. But that didn’t work for me. I knew I wouldn’t be able to function in a purely Ballardian universe. So I needed something to replace outer space and the spaceship.

I was walking around Vancouver, aware of that need, and I remember walking past a video arcade, which was a new sort of business at that time, and seeing kids playing those old-fashioned console-style plywood video games. The games had a very primitive graphic representation of space and perspective. Some of them didn’t even have perspective but were yearning toward perspective and dimensionality. Even in this very primitive form, the kids who were playing them were so physically involved, it seemed to me that what they wanted was to be inside the games, within the notional space of the machine. The real world had disappeared for them—it had completely lost its importance. They were in that notional space, and the machine in front of them was the brave new world.

The only computers I’d ever seen in those days were things the size of the side of a barn. And then one day, I walked by a bus stop and there was an Apple poster. The poster was a photograph of a businessman’s jacketed, neatly cuffed arm holding a life-size representation of a real-life computer that was not much bigger than a laptop is today. Everyone is going to have one of these, I thought, and everyone is going to want to live inside them. And somehow I knew that the notional space behind all of the computer screens would be one single universe.

INTERVIEWER

And you knew at that point you had your arena?

GIBSON

I sensed that it would more than meet my requirements, and I knew that there were all sorts of things I could do there that I hadn’t even been able to imagine yet. But what was more important at that point, in terms of my practical needs, was to name it something cool, because it was never going to work unless it had a really good name. So the first thing I did was sit down with a yellow pad and a Sharpie and start scribbling—infospace, dataspace. I think I got cyberspace on the third try, and I thought, Oh, that’s a really weird word. I liked the way it felt in the mouth—I thought it sounded like it meant something while still being essentially hollow.

What I had was a sticky neologism and a very vague chain of associations between the bus-stop Apple IIc advertisement, the posture of the kids playing arcade games, and something I’d heard about from these hobbyist characters from Seattle called the Internet. It was more tedious and more technical than anything I’d ever heard anybody talk about. It made ham radio sound really exciting. But I understood that, sometimes, you could send messages through it, like a telegraph. I also knew that it had begun as a project to explore how we might communicate during a really shit-hot nuclear war.

I took my neologism and that vague chain of associations to a piece of prose fiction just to see what they could do. But I didn’t have a concept of what it was to begin with. I still think the neologism and the vague general idea were the important things. I made up a whole bunch of things that happened in cyberspace, or what you could call cyberspace, and so I filled in my empty neologism. But because the world came along with its real cyberspace, very little of that stuff lasted. What lasted was the neologism.

INTERVIEWER

Where did you get the prefix cyber?

GIBSON

It came from the word cybernetics, which was coined around the year I was born by a scientist named Norbert Wiener. It was the science of feedback and control systems. I was familiar with the word through science fiction more than anything else.

Science fiction had long offered treatments of the notional space inside the computer. Harlan Ellison had written a story called “I Have No Mouth, and I Must Scream,” which was set in what we would call a virtual world within a computer. You could even go back to Ray Bradbury’s story “The Veldt,” which was one of his mordantly cautionary fables about broadcast television. So I didn’t think it was terribly original, my concept of cyberspace. My anxiety, rather, was that if I had thought of it, twenty or thirty other science-fiction writers had thought of it at exactly the same time and were probably busy writing stories about it, too.

There’s an idea in the science-fiction community called steam-engine time, which is what people call it when suddenly twenty or thirty different writers produce stories about the same idea. It’s called steam-engine time because nobody knows why the steam engine happened when it did. Ptolemy demonstrated the mechanics of the steam engine, and there was nothing technically stopping the Romans from building big steam engines. They had little toy steam engines, and they had enough metalworking skill to build big steam tractors. It just never occurred to them to do it. When I came up with my cyberspace idea, I thought, I bet it’s steam-engine time for this one, because I can’t be the only person noticing these various things. And I wasn’t. I was just the first person who put it together in that particular way, and I had a logo for it, I had my neologism.

INTERVIEWER

Were you hoping to make cyberspace feel unfamiliar when you were first writing about it?

GIBSON

It wasn’t merely unfamiliar. It was something no one had experienced yet. I wanted the reader’s experience to be psychedelic, hyperintense. But I also knew that a more rigorous and colder and truer extrapolation would be to simply present it as something the character scarcely even notices. If I make a phone call to London right now, there’s absolutely no excitement in that—there’s nothing special about it. But in a nineteenth-century science-fiction story, for someone in Vancouver to phone someone in London would have been the biggest thing in the story. People in the far-flung reaches of the British Empire will all phone London one day!

Giving in to this conflict, I inserted an odd little edutainment show running on television in the background at one point in Neuromancer—“Cyberspace, a consensual realm.” Partly it was for the slower reader who hadn’t yet figured it out, but also it was to get me off the hook with my conscience, because I knew I was going to hit the pulp buttons really big-time and do my best to blow people out of the water with this psychedelic cyberspace effect.

Of course, for the characters themselves, cyberspace is nothing special—they use it for everything. But you don’t hear them say, Well, I’ve got to go into cyberspace to speak to my mother, or I’ve got to go to cyberspace to get the blueberry-pie recipe. That’s what it really is today—there are vicious thieves and artificial intelligence sharks and everything else out there, swimming in it, but we’re still talking to our mothers and exchanging blueberry-pie recipes and looking at porn and tweeting all the stuff we’re doing. Today I could write a version of Neuromancer where you’d see the quotidian naturalistic side, but it wouldn’t be science fiction. With the fairly limited tool kit I had in 1981, I wouldn’t have been able to do that, and, of course, I didn’t know what it would be like.

INTERVIEWER

What was needed that you were missing?

GIBSON

I didn’t have the emotional range. I could only create characters who have really, really super highs and super lows—no middle. It’s taken me eight books to get to a point where the characters can have recognizably complex or ambiguous relationships with other characters. In Neuromancer, the whole range of social possibility when they meet is, Shall we have sex, or shall I kill you? Or you know, Let’s go rob a Chinese corporation—cool!

I knew that cyberspace was exciting, but none of the people I knew who were actually involved in the nascent digital industry were exciting. I wondered what it would be like if they were exciting, stylish, and sexy. I found the answer not so much in punk rock as in Bruce Springsteen, in particular Darkness on the Edge of Town, which was the album Springsteen wrote as a response to punk—a very noir, very American, very literary album. And I thought, What if the protagonist of Darkness on the Edge of Town was a computer hacker? What if he’s still got Springsteen’s character’s emotionality and utterly beat-down hopelessness, this very American hopelessness? And what if the mechanic, who’s out there with him, lost in this empty nightmare of America, is actually, like, a robot or a brain in a bottle that nevertheless has the same manifest emotionality? I had the feeling, then, that I was actually crossing some wires of the main circuit board of popular culture and that nobody had ever crossed them this way before.

INTERVIEWER

How did the Sprawl, a megalopolis stretching from Atlanta to Boston, originate?

GIBSON

I had come to Vancouver in 1972, and I wasn’t really trying to write science fiction until 1982. There was a decade gap where I’d been here and scarcely anywhere else—to Seattle for the odd weekend, and that was it. I was painfully aware of not having enough firsthand experience of the contemporary world to extrapolate from. So the Sprawl is there to free me from the obligation to authentic detail.

It had always felt to me as though Washington, D.C., to Boston was one span of stuff. You never really leave Springsteenland, you’re just in this unbroken highway and strip-mall landscape. I knew that would resonate with some readers, and I just tacked on Atlanta out of sci-fi bravura, to see how far we could push this thing. Sometimes in science fiction you can do that. The reader really likes it if you add Atlanta, because they’re going, Shit, could you do that? Could that be possible? If you’re visiting the future, you really want to have a few of the “shit, could they do that?” moments.

INTERVIEWER

Do readers often ask you to explain things about your books you yourself don’t understand?

GIBSON

The most common complaint I received about Neuromancer, from computer people, was that there will never be enough bandwidth for any of this to be possible. I didn’t want to argue with them because I scarcely knew what bandwidth was, but I assumed it was just a measure of something, and so I thought, How can they know? It’s like saying there’ll never be enough engines, there’ll never be enough hours for this to happen. And they were wrong.

INTERVIEWER

Why did you set the novel in the aftermath of a war?

GIBSON

In 1981, it was pretty much every intelligent person’s assumption that on any given day the world could end horribly and pretty well permanently. There was this vast, all-consuming, taken-for-granted, even boring end-of-the-world anxiety that had been around since I was a little kid. So one of the things I wanted to do with Neuromancer was to write a novel in which the world didn’t end in a nuclear war. In Neuromancer, the war starts, they lose a few cities, then it stops when multinational corporations essentially take the United States apart so that can never happen again. There’s deliberately no textual evidence that the United States exists as a political entity in Neuromancer. On the evidence of the text America seems to be a sort of federation of city-states connected to a military-industrial complex that may not have any government controlling it. That was my wanting to get away from the future-is-America thing. The irony, of course, is how the world actually went. If somebody had been able to sit me down in 1981 and say, You know how you wrote that the United States is gone and the Soviet Union is looming in the background like a huge piece of immobile slag? Well, you got it kind of backward.

That war was really a conscious act of imaginative optimism. I didn’t quite believe we could be so lucky. But I didn’t want to write one of those science-fiction novels where the United States and the Soviet Union nuke themselves to death. I wanted to write a novel where multinational capital took over, straightened that shit out, but the world was still problematic.

INTERVIEWER

The world of the Sprawl is often called dystopian.

GIBSON

Well, maybe if you’re some middle-class person from the Midwest. But if you’re living in most places in Africa, you’d jump on a plane to the Sprawl in two seconds. Many people in Rio have worse lives than the inhabitants of the Sprawl.

I’ve always been taken aback by the assumption that my vision is fundamentally dystopian. I suspect that the people who say I’m dystopian must be living completely sheltered and fortunate lives. The world is filled with much nastier places than my inventions, places that the denizens of the Sprawl would find it punishment to be relocated to, and a lot of those places seem to be steadily getting worse.

INTERVIEWER

There’s a famous story about your being unable to sit through Blade Runner while writing Neuromancer.

GIBSON

I was afraid to watch Blade Runner in the theater because I was afraid the movie would be better than what I myself had been able to imagine. In a way, I was right to be afraid, because even the first few minutes were better. Later, I noticed that it was a total box-office flop in its first theatrical release. That worried me, too. I thought, Uh-oh. He got it right and nobody cares! Over a few years, though, I started to see that in some weird way it was the most influential film of my lifetime, up to that point. It affected the way people dressed, it affected the way people decorated nightclubs. Architects started building office buildings that you could tell they had seen in Blade Runner. It had had an astonishingly broad aesthetic impact on the world.

I met Ridley Scott years later, maybe a decade or more after Blade Runner was released. I told him what Neuromancer was made of, and he had basically the same list of ingredients for Blade Runner. One of the most powerful ingredients was French adult comic books and their particular brand of Orientalia—the sort of thing that Heavy Metal magazine began translating in the United States.

But the simplest and most radical thing that Ridley Scott did in Blade Runner was to put urban archaeology in every frame. It hadn’t been obvious to mainstream American science fiction that cities are like compost heaps—just layers and layers of stuff. In cities, the past and the present and the future can all be totally adjacent. In Europe, that’s just life—it’s not science fiction, it’s not fantasy. But in American science fiction, the city in the future was always brand-new, every square inch of it.

INTERVIEWER

Cities seem very important to you.

GIBSON

Cities look to me to be our most characteristic technology. We didn’t really get interesting as a species until we became able to do cities—that’s when it all got really diverse, because you can’t do cities without a substrate of other technologies. There’s a mathematics to it—a city can’t get over a certain size unless you can grow, gather, and store a certain amount of food in the vicinity. Then you can’t get any bigger unless you understand how to do sewage. If you don’t have efficient sewage technology the city gets to a certain size and everybody gets cholera.

INTERVIEWER

It seems like most if not all of your protagonists are loners, orphans, and nomads, detached from families and social networks.

GIBSON

We write what we know, and we write what we think we can write. I think so many of my characters have been as you just described because it would be too much of a stretch for me to model characters who have more rounded emotional lives.

Before we moved to Vancouver, my wife and I went to Europe. And I realized that I didn’t travel very well. I was too tense for it. I was delighted that I was there, and I had a sense of storing up the sort of experiences I imagined artists had to store up in order to be artists. But it was all a bit extreme for me—Franco’s Spain is still the only place I’ve ever had a gun pointed at my head. I always felt that everybody else had parents somewhere who would come and get their ass out of trouble. But nobody was going to come get me out of trouble. Nobody was going to take care of me. The hedonic risk taking that so many of my peers were into just made me anxious. A lot of people got into serious trouble taking those risks. I never wanted to get into serious trouble.

INTERVIEWER

The protagonist of Count Zero, Bobby Newmark, has a comparatively mundane life—he lives with his mother.

GIBSON

One of the very first so-called adult science-fiction novels I ever read was Starship Troopers by Robert Heinlein. I’d gone away on a trip with my mother and I had nothing to read, and the only thing for sale was this rather adult-looking paperback. I was barely up to the reading skill required for Starship Troopers, but I can remember figuring out the first couple of pages, and it blew the top off my head. Later, when I managed to read it all the way through, I got the feeling that I was more like the juvenile delinquents who got beat up by the Starship Troopers than I was like the Starship Troopers themselves. And I remember wondering, Where did the juvenile delinquents go after they got beaten up by the Starship Troopers? What happened to them? Where did they live? Bobby is sort of the answer. They lived with their mothers and they were computer hackers!

INTERVIEWER

In Mona Lisa Overdrive, your third novel, Bobby ends up in a peculiar contraption called the “Aleph.”

GIBSON

I think I was starting to realize that the only image I had for total artificial intelligence or total artificial reality was Borges’s Aleph, a point in space that contains all other points. In his story “The Aleph,” which may be his greatest, Borges managed to envision this Aleph without computers or anything like them. He skips the issue of what it is and how it works. It just sits there under the stairs in the basement of some old house in Buenos Aires, and nobody says why, but you have to go down the stairs, lie on your back, look at this thing, and if you get your head at the right angle, then you can see everything there is, or ever was, anywhere, at any time.

I think I was probably twelve years old when I read that, and I never got over the wonder of that story, and how Borges in this very limited number of words could make you feel that he’s seen every last thing in the universe, just by sonorously listing a number of very peculiar and mismatched items and events. If Bobby was going to go somewhere, that was probably going to be it.

INTERVIEWER

What interested you about Joseph Cornell?

GIBSON

Beginning with Count Zero I had the impulse to use the text to honor works of art that I particularly loved or admired. With Mona Lisa Overdrive, it’s heavily Joseph Cornell, especially his extraordinary talent for turning literal garbage into these achingly superb, over-the-top, poetic, cryptic statements.

Gradually, Cornell became a model of creativity for me. I’ve always had a degree of impostor syndrome about being or calling myself an artist, but I’m pretty sure that there’s some way in which I’m an outsider, and what I’m doing has to be outsider art. I felt that I’ve worked with found objects at times in a similar way because I valued bits of the real world differently than I valued the bits I created myself.

When I was going to start writing All Tomorrow’s Parties, John Clute suggested to me that all of my books had become Cornell boxes. The Bridge in Virtual Light, he said, was my biggest Cornell box. It really spooked me. I think that’s why I wound up burning the Bridge.

INTERVIEWER

Tell me about the Bridge.

GIBSON

The Bridge is a fable about counterculture, the kind of counterculture that may no longer be possible. There are no backwaters where things can breed—our connectivity is so high and so global that there are no more Seattles and no more Haight-Ashburys. We’ve arrived at a level of commodification that may have negated the concept of counterculture. I wanted to create a scenario in which I could depict something like that happening in the recognizably near future.

I woke up one morning in San Francisco and looked out the window and had this great archetypal San Francisco experience—there was nothing but fog. Nothing but fog except this perfectly clear diorama window up in the air, brilliantly lit by the sun, containing the very top of the nearest support tower of the Bay Bridge. I couldn’t see anything else in the city, just this little glowing world. I thought, Wow, if you had a bunch of plywood, two-by-fours, you could build yourself a little house on top of that thing and live there.

The Bridge novels were set just a few years into the future, which is now a few years in the past, and so they read almost like alternate-history novels—the present in flamboyant cyberpunk drag. And the Bridge itself, a shantytown culture improvised in the wake of a devastating Bay Area earthquake, is a piece of emergent technology.

INTERVIEWER

Many readers have argued that the Bridge books offer a theory of technology.

GIBSON

More like a rubbing—like rubbing brass in a cathedral or a tombstone in a graveyard. I’m not a didactic storyteller. I don’t formulate theories about how the world works and then create stories to illustrate my theories. What I have in the end is an artifact and not a theory.

But I take it for granted that social change is driven primarily by emergent technologies, and probably always has been. No one legislates technologies into emergence—it actually seems to be quite a random thing. That’s a vision of technology that’s diametrically opposed to the one I received from science fiction and the popular culture of science when I was twelve years old.

In the postwar era, aside from anxiety over nuclear war, we assumed that we were steering technology. Today, we’re more likely to feel that technology is driving us, driving change, and that it’s out of control. Technology was previously seen as linear and progressive—evolutionary in that way our culture has always preferred to misunderstand Darwin.

INTERVIEWER

You don’t see technology evolving that way?

GIBSON

What I mainly see is the distribution of it. The poorer you are, the poorer your culture is, the less cutting-edge technology you’re liable to encounter, aside from the Internet, the stuff you can access on your cell phone.

In that way, I think we’re past the computer age. You can be living in a third-world village with no sewage, but if you’ve got the right apps then you can actually have some kind of participation in a world that otherwise looks like a distant Star Trek future where people have plenty of everything. And from the point of view of the guy in the village, information is getting beamed in from a world where people don’t have to earn a living. They certainly don’t have to do the stuff he has to do every day to make sure he’s got enough food to be alive in three days.

On that side of things, Americans might be forgiven for thinking the pace of change has slowed, in part because the United States government hasn’t been able to do heroic nonmilitary infrastructure for quite a while. Before and after World War II there was a huge amount of infrastructure building in the United States that gave us the spiritual shape of the American century. Rural electrification, the highway system, the freeways of Los Angeles—those were some of the biggest things anybody had ever built in the world at the time, but the United States really has fallen far behind with that.

INTERVIEWER

Is computer technology not heroic?

GIBSON

I do think it’s a really big deal, although the infrastructure is not physical. There’s hardware supporting the stuff, but the digital infrastructure is a bunch of zeros and ones—something that amounts to a kind of language.

It looks to me as though that prosthetic-memory project is going to be what we are about, as a species, because our prosthetic memory now actually stands a pretty good chance of surviving humanity. We could conceivably go extinct and our creations would live on. One day, in the sort of science-fiction novel I’m unlikely ever to write, intelligent aliens might encounter something descended from our creations. That something would introduce itself by saying, Hey, we wish our human ancestors could have been around to meet you guys because they were totally fascinated by this moment, but at least we’ve got this PowerPoint we’d like to show you about them. They don’t look anything like us, but that is where we came from, and they were actually made out of meat, as weird as that seems.

INTERVIEWER

When did you decide to write about the contemporary world?

GIBSON

For years, I’d found myself telling interviewers and readers that I believed it was possible to write a novel set in the present that would have an effect very similar to the effect of novels I had set in imaginary futures. I think I said it so many times, and probably with such a pissy tone of exasperation, that I finally decided I had to call myself on it.

A friend knew a woman who was having old-fashioned electroshock therapy for depression. He’d pick her up at the clinic after the session and drive her not home but to a fish market. He’d lead her to the ice tables where the day’s catch was spread out, and he’d just stand there with her, and she’d look at the ice tables for a really long time with a blank, searching expression. Finally, she’d turn to him and say, “Wow, they’re fish, aren’t they!” After electroshock, she had this experience of unutterable, indescribable wonderment at seeing these things completely removed from all context of memory, and gradually her brain would come back together and say, Damn, they’re fish. That’s kind of what I do.

INTERVIEWER

What is “pattern recognition”?

GIBSON

It is the thing we do that other species on the planet are largely incapable of doing. It’s how we infer everything. If you’re in the woods and a rock comes flying from somewhere in your direction, you assume that someone has thrown a rock at you. Other animals don’t seem capable of that. The fear leverage in the game of terrorism depends on faulty pattern recognition. After all, terrorist acts are rare and tend to kill fewer people than, say, automobile accidents or drugs and alcohol.

INTERVIEWER

Had you already begun to write Pattern Recognition before 9/11?

GIBSON

I had, but as soon as that happened, just about everything else in the manuscript dried up and blew away.

INTERVIEWER

Why did the September 11 attacks have such an effect on you?

GIBSON

Because I had had this career as a novelist, Manhattan was the place in the United States that I visited most regularly. I wound up having more friends in New York than I have anywhere else in the United States. It has that quality of being huge and small at the same time—and noble. So without even realizing it, I had come to know it, I had come to know lower Manhattan better than any place other than Vancouver. When 9/11 happened it affected me with a directness I would never have imagined possible.

In a strange sort of way that particular relationship with New York ended with 9/11 because the post–9/11 New York doesn’t feel to me to be the same place.

INTERVIEWER

Are you glad you wrote a book that had so much 9/11 in it?

GIBSON

I’m really glad. I felt this immense gratitude when I finished, and I was sitting there looking at the last page, thinking, I’m glad I got a shot at this thing now, because for sure there are dozens of writers all around the world right this minute, thinking, I have to write about 9/11. And I thought, I’m already done, I won’t have to revisit this material, and it’s largely out of my system.

INTERVIEWER

Alongside that public narrative runs a very private one, with Cayce chasing through the maze of the Internet after the source of some mesmerizing film material she calls “the footage.”

GIBSON

Having assumed that there were no longer physical backwaters in which new bohemias could spawn and be nurtured, I was intrigued by the idea and the very evident possibility that in the post-geographic Internet simply having a topic of sufficient obscurity and sufficient obsessive interest to a number of geographically diverse people could replicate the birth of a bohemia.

When I started writing about the footage, I don’t think I had ever seen a novel in which anybody had had a real emotional life unfolding on a listserv, but I knew that millions of people around the world were living parts of their emotional lives in those places—and moreover that the Internet was basically built by those people! They were meeting one another and having affairs and getting married and doing everything in odd special-interest communities on the Internet. Part of my interest in the footage was simply trying to rise to the challenge of naturalism.

INTERVIEWER

You’ve called science fiction your native literary culture. Do you still feel that way, having written three books that are set in the present?

GIBSON

Yes, but native in the sense of place of birth. Science fiction was the first literary culture I acquired, but since then I’ve acquired a number of other literary cultures, and the bunch of them have long since supplanted science fiction.

INTERVIEWER

Do you think of your last three books as being science fiction?

GIBSON

No, I think of them as attempts to disprove the distinction or attempts to dissolve the boundary. They are set in a world that meets virtually every criterion of being science fiction, but it happens to be our world, and it’s barely tweaked by the author to make the technology just fractionally imaginary or fantastic. It has, to my mind, the effect of science fiction.

If you’d gone to a publisher in 1981 with a proposal for a science-fiction novel that consisted of a really clear and simple description of the world today, they’d have read your proposal and said, Well, it’s impossible. This is ridiculous. This doesn’t even make any sense. Granted, you have half a dozen powerful and really excellent plot drivers for that many science-fiction novels, but you can’t have them all in one novel.

INTERVIEWER

What are those major plot drivers?

GIBSON

Fossil fuels have been discovered to be destabilizing the planet’s climate, with possibly drastic consequences.  There’s an epidemic, highly contagious, lethal sexual disease that destroys the human immune system, raging virtually uncontrolled throughout much of Africa.  New York has been attacked by Islamist fundamentalists, who have destroyed the two tallest buildings in the city, and the United States in response has invaded Afghanistan and Iraq.

INTERVIEWER

And you haven’t even gotten to the technology.

GIBSON

You haven’t even gotten to the Internet.  By the time you were telling them about the Internet, they’d be showing you the door.  It’s just too much science fiction.”  William Gibson, “The Art of Fiction No. 211,” Paris Review Interview, 2011


Numero Tres“Franz Clemens Brentano (1838–1917) is mainly known for his work in philosophy of psychology, especially for having introduced the notion of intentionality to contemporary philosophy.  He made important contributions to many fields in philosophy, especially to metaphysics and ontology, ethics, logic, the history of philosophy, and philosophical theology.  Brentano was strongly influenced by Aristotle and the Scholastics as well as by the empiricist and positivist movements of the early nineteenth century.  Due to his introspectionist approach of describing consciousness from a first-person point of view, on the one hand, and his rigorous style and his contention that philosophy should be done with methods as exact as those of the natural sciences, on the other, Brentano is often considered a forerunner of both the phenomenological movement and the tradition of analytic philosophy.  A charismatic teacher, Brentano exerted a strong influence on the work of Edmund Husserl, Alexius Meinong, Christian von Ehrenfels, Kasimir Twardowski, Carl Stumpf, and Anton Marty, among others, and thereby played a central role in the philosophical development of central Europe in the early twentieth century.

1. Life and Work

Franz Brentano was born on January 16, 1838 in Marienberg am Rhein, Germany, a descendant of a strongly religious German-Italian family of intellectuals (his uncle Clemens Brentano and his aunt Bettina von Arnim were among the most important writers of German Romanticism and his brother Lujo Brentano became a leading expert in social economics). He studied mathematics, poetry, philosophy, and theology in Munich, Würzburg, and Berlin. Already in high school he became acquainted with Scholasticism; at university he studied Aristotle with Trendelenburg in Berlin, and read Comte as well as the British Empiricists (mainly John Stuart Mill), all of whom had a great influence on his work. Brentano received his Ph.D. in 1862, with his thesis On the Several Senses of Being in Aristotle.

After graduation Brentano prepared to take his vows; he was ordained a Catholic priest in 1864. Nevertheless, he continued his academic career at the University of Würzburg, where he presented his Habilitationsschrift on The Psychology of Aristotle in 1867. Despite reservations in the faculty about his priesthood, he eventually became a full professor in 1873. During this period, however, Brentano struggled more and more with the official doctrine of the Catholic Church, especially with the dogma of papal infallibility, promulgated at the first Vatican Council in 1870. Shortly after his promotion at the University of Würzburg, Brentano withdrew from the priesthood and from his position as professor.

After his Habilitation, Brentano had started work on a large-scale study of the foundations of psychology, which he entitled Psychology from an Empirical Standpoint. The first volume was published in 1874, a second volume (The Classification of Mental Phenomena) followed in 1911, and fragments of the third volume (Sensory and Noetic Consciousness) were published posthumously by Oskar Kraus in 1928.

Shortly after the publication of the first volume, Brentano took a job as a full professor at the University of Vienna, where he continued a successful teaching career. During his tenure in Vienna, Brentano, who was very critical towards his own writing, no longer wrote books but turned instead to publishing various lectures. The topics range from aesthetics (Das Genie [The Genius], Das Schlechte als Gegenstand dichterischer Darstellung [Evil as Object of Poetic Representation]) and issues in historiography to The Origin of the Knowledge of Right and Wrong, in which Brentano laid out his views on ethics. The latter, in 1902, became Brentano’s first book to be translated into English.

When in 1880 Brentano and Ida von Lieben decided to wed, they had to confront the fact that the prevailing law in the Austro-Hungarian Empire denied matrimony to persons who had been ordained priests – even if they had later resigned from the priesthood. They surmounted this obstacle by moving to Saxony and becoming citizens there, where they finally got married. This was possible only because Brentano temporarily gave up his Austrian citizenship and, in consequence, his position as full professor at the University of Vienna. When Brentano came back to Vienna a few months later, the Austrian authorities did not reinstate him in his position. Brentano became Privatdozent, a status that allowed him to go on teaching – but did not entitle him to receive a salary or to supervise theses. For several years he tried in vain to get his position back. In 1895, after the death of his wife, he left Austria disappointed; on this occasion, he published a series of three articles in the Viennese newspaper Die neue freie Presse entitled Meine letzten Wünsche für Österreich [My Last Wishes for Austria] (which soon afterwards appeared as a self-standing book), in which he outlined his philosophical position as well as his approach to psychology, but also harshly criticized the legal situation of former priests in Austria. In 1896 he settled in Florence, where he married Emilie Ruprecht in 1897.

Brentano has often been described as an extraordinarily charismatic teacher. Throughout his life he influenced a great number of students, many of whom became important philosophers and psychologists in their own right, such as Edmund Husserl, Alexius Meinong, Christian von Ehrenfels, Anton Marty, Carl Stumpf, Kasimir Twardowski, as well as Sigmund Freud. Many of his students became professors all over the Austro-Hungarian Empire – Marty and Ehrenfels in Prague, Meinong in Graz, and Twardowski in Lvov – and so spread Brentanianism over the whole Austro-Hungarian Empire, which explains the central role of Brentano in the philosophical development in central Europe, especially in what was later called the Austrian Tradition in philosophy.

Brentano always emphasized that he meant to teach his students to think critically and in a scientific manner, without holding prejudices or paying undue respect to philosophical schools or traditions. When former students took a critical approach to his own work, however, criticizing some of his doctrines and modifying others to adapt them to their own goals, Brentano reacted bitterly. He often refused to discuss criticism, ignored improvements, and thus became more and more isolated, a development that was reinforced by his increasing blindness.

Due to these eye problems Brentano could no longer read or write, but had his wife read to him and dictated his work to her. Nonetheless, he produced a number of books in his years in Florence. In 1907 he published Untersuchungen zur Sinnespsychologie [Investigations in Sensory Psychology], a collection of shorter texts on psychology. In 1911 he presented not only the second volume of his Psychology from an Empirical Standpoint, but also two books on Aristotle: in Aristotle and his World View he provides an outline and interpretation of Aristotle’s philosophy; in Aristoteles Lehre vom Ursprung des menschlichen Geistes [Aristotle’s Doctrine of the Origin of the Human Mind] he continues a debate with Zeller. This debate had begun in the 1860s, when Brentano criticized Zeller’s interpretation of Aristotle in his Psychology of Aristotle, and it became quite intense and aggressive during the 1870s and 1880s.

When Italy entered the war against Germany and Austria during World War I, Brentano, who felt himself a citizen of all three countries, moved from Florence to neutral Switzerland. He passed away in Zurich on March 17, 1917.

Brentano left a huge number of unpublished manuscripts and letters on a wide range of philosophical topics in his last domicile in Zurich and in his summer residence in Schönbühel bei Melk; some manuscripts were probably left behind in Florence. After his death, Alfred Kastil and Oskar Kraus, who were students of Brentano’s former student Anton Marty in Prague, worked on the Nachlass. Their attempt to set up a Brentano archive in Prague was supported by Tomáš Masaryk, a former student of Brentano who had become founder and first President (from 1918 to 1935) of the Republic of Czechoslovakia. Alas, due to the political turbulence that was to come over central Europe, the project was doomed to fail. Substantial parts of the Nachlass were transferred to different places in the United States; some of it was later brought back to Europe, especially to the Brentano-Forschungsstelle at the University of Graz, Austria, and the Brentano family archive in Blonay, Switzerland. (For a detailed history of Brentano’s Nachlass, cf. Binder (2013).)

Kastil and Kraus did succeed, however, in beginning the posthumous publication of some of the lecture notes, letters, and drafts he had left. They tried to present Brentano’s work as best they could, putting together various texts into what they thought were rounded, convincing works, sometimes following questionable editorial criteria. Their work was continued by other, more careful editors, but is far from complete: a much-needed critical edition of his complete œuvre is still outstanding.

2. Philosophy as a Rigorous Science and the Rise of Scientific Psychology

One of Brentano’s main principles was that philosophy should be done with methods that are as rigorous and exact as the methods of the natural sciences. This standpoint is clearly mirrored in his empirical approach to psychology. It is noteworthy here that Brentano’s use of the word “empirical” deviates substantially from what has become its standard meaning in psychology today. He emphasized that all our knowledge should be based on direct experience. He did not hold, however, that this experience needs to be gained from a third-person point of view, and he thus opposed what has become a standard of empirical science nowadays. Brentano rather argued for a form of introspectionism: doing psychology from an empirical standpoint means for him to describe what one directly experiences in inner perception, from a first-person point of view.

Brentano’s approach, like that of other introspectionist psychologists of the late nineteenth century, was harshly criticized with the rise of scientific psychology in the tradition of logical positivism, especially by the behaviorists. This should not obscure the fact that Brentano did play a crucial role in the process of psychology becoming an independent science. He distinguished between genetic and empirical or, as he later called it, descriptive psychology, a distinction that is most explicitly drawn in his Descriptive Psychology. Genetic psychology studies psychological phenomena from a third-person point of view. It involves the use of empirical experiments and thus satisfies the scientific standards we nowadays expect of an empirical science. Even though Brentano never practiced experimental psychology himself, he very actively supported the installation of the first laboratories for experimental psychology in the Austro-Hungarian Empire, a development that was continued by his student Alexius Meinong in Graz. Descriptive psychology (to which Brentano sometimes also referred as “phenomenology”) aims at describing consciousness from a first-person point of view. Its goal is to list “fully the basic components out of which everything internally perceived by humans is composed, and … [to enumerate] the ways in which these components can be connected” (Descriptive Psychology, 4). Brentano’s distinction between genetic and descriptive psychology strongly influenced Husserl’s development of the phenomenological method, especially in its early phases, a development of which Brentano could not approve, for it involved the intuition of abstract essences, the existence of which Brentano denied.

3. Brentano’s Theory of Mind

Brentano’s main goal was to lay the basis for a scientific psychology, which he defines as “the science of mental phenomena” (Psychology, 18). To flesh out this definition of the discipline, he provides a more detailed characterization of mental phenomena. He proposes six criteria to distinguish mental from physical phenomena, the most important of which are: (i) mental phenomena are the exclusive object of inner perception, (ii) they always appear as a unity, and (iii) they are always intentionally directed towards an object. I will discuss the first two criteria in this section, and the third in a separate section below.

All mental phenomena have in common, Brentano argues, “that they are only perceived in inner consciousness, while in the case of physical phenomena only external perception is possible” (Psychology, 91). According to Brentano, the former of these two forms of perception provides unmistakable evidence of what is true. Since the German word for perception (Wahrnehmung), literally translated, means “taking-true”, Brentano says that it is the only kind of perception in a strict sense. He points out that inner perception must not be confused with inner observation, i.e., it must not be conceived as a full-fledged act that accompanies another mental act towards which it is directed. It is rather interwoven with the latter: in addition to being primarily directed towards an object, each act is incidentally directed towards itself as a secondary object. As a consequence, Brentano denies the idea that there could be unconscious mental acts: since every mental act is incidentally directed towards itself as a secondary object, we are automatically aware of every occurring mental act. He admits, however, that we can have mental acts of various degrees of intensity. In addition, he holds that the degree of intensity with which the object is presented is equal to the degree of intensity in which the secondary object, i.e., the act itself, is presented. Consequently, if we have a mental act of a very low intensity, our secondary consciousness of this act also will have a very low intensity. From this Brentano concludes that sometimes we are inclined to say that we had an unconscious mental phenomenon when actually we only had a conscious mental phenomenon of very low intensity.

Consciousness, Brentano argues, always forms a unity. While we can perceive a number of physical phenomena at one and the same time, we can only perceive one mental phenomenon at a specific point in time. When we seem to have more than one mental act at a time, as when we hear a melody while tasting a sip of red wine and enjoying the beautiful view from the window, all these mental phenomena melt into one, they become moments or, to stick with Brentano’s terminology, divisives of a collective. If one of the divisives ends in the course of time, e.g., when I swallow the wine and close my eyes but continue to listen to the music, the collective continues to exist. Brentano’s views on the unity of consciousness entail that inner observation, as explained above, is strictly impossible, i.e., that we cannot have a second act which is directed towards another mental act which it accompanies. One can remember another mental act one had a moment earlier, or expect future mental acts, but due to the unity of consciousness one cannot have two mental acts at the same time, one of which is directed towards the other.

Brentano points out that we can be directed towards one and the same object in different ways, and he accordingly distinguishes three kinds of mental phenomena: presentations, judgments, and phenomena of love and hate. These are not three distinct classes, though. Presentations are the most basic kind of acts; we have a presentation each time we are directed towards an object, whether we are imagining, seeing, remembering, or expecting it. In his Psychology Brentano held that two presentations can differ only in the object towards which they are directed. Later he modified his position, though, and argued that they can also differ in various modes, such as temporal modes. The two other categories, judgments and phenomena of love and hate, are based on presentations. In a judgment we accept or deny the existence of the presented object. A judgment, thus, is a presentation plus a qualitative mode of acceptance or denial. The third category, which Brentano names “phenomena of love and hate,” comprises emotions, feelings, desires, and acts of will. In these acts we have positive or negative feelings towards an object.

4. Intentionality

Brentano is probably best known for having introduced the notion of intentionality to contemporary philosophy. He first characterizes this notion with the following words, which have become the classical, albeit not completely unambiguous, formulation of the intentionality thesis:

Every mental phenomenon is characterized by what the Scholastics of the Middle Ages called the intentional (or mental) inexistence of an object, and what we might call, though not wholly unambiguously, reference to a content, direction toward an object (which is not to be understood here as meaning a thing), or immanent objectivity. Every mental phenomenon includes something as object within itself… (Brentano, Psychology, 88)

[Jedes psychische Phänomen ist durch das charakterisiert, was die Scholastiker des Mittelalters die intentionale (auch wohl mentale) Inexistenz eines Gegenstandes genannt haben, und was wir, obwohl mit nicht ganz unzweideutigen Ausdrücken, die Beziehung auf einen Inhalt, die Richtung auf ein Objekt (worunter hier nicht eine Realität zu verstehen ist), oder die immanente Gegenständlichkeit nennen würden. Jedes enthält etwas als Objekt in sich… (Brentano, Psychologie, 124f)]

This quotation must be understood in context: in this passage, Brentano aims at providing one (of six) criteria to distinguish mental from physical phenomena in order to define the subject matter of scientific psychology – and not to develop a systematic account of intentionality. The passage clearly suggests, however, that the intentional object towards which we are directed is part of the psychological act. It is something mental rather than physical. Brentano, thus, seems to advocate a form of immanentism, according to which the intentional object is “in the head,” as it were. Some Brentano scholars have recently argued that this immanent reading of the intentionality thesis is too strong. In the light of other texts by Brentano from the same period they argue that he distinguishes between intentional correlate and object, and that the existence of the latter does not depend on our being directed towards it.

When Brentano’s students took up his notion of intentionality to develop more systematic accounts, they often criticized it for its unclarity regarding the ontological status of the intentional object: if the intentional object is part of the act, it was argued, we are faced with a duplication of the object. Next to the real, physical object, which is perceived, remembered, thought of, etc., we have a mental, intentional object, towards which the act is actually directed. Thus, when I think about the city of Paris, I am actually thinking of a mental object that is part of my act of thinking, and not about the actual city. This view leads to obvious difficulties, the most disastrous of which is that two persons can never be directed towards one and the same object. If we try to resolve the problem by taking the intentional object to be identical with the real object, on the other hand, we face the difficulty of explaining how we can have mental phenomena that are directed towards non-existing objects such as Hamlet, the golden mountain, or a round square. Like my thinking about the city of Paris, all these acts are intentionally directed towards an object, with the difference, however, that their objects do not really exist.

Brentano’s initial formulation of the intentionality thesis does not address these problems concerning the ontological status of the intentional object. The first attempt of Brentano’s students to overcome these difficulties was made by Twardowski, who distinguished between content and object of the act, the former of which is immanent to the act, the latter not. This distinction strongly influenced other members of the Brentano School, mainly the two students for whom the notion of intentionality had the most central place, Meinong and Husserl.

Meinong’s theory of objects can best be understood as a reaction to the ontological difficulties in Brentano’s account. Rather than accepting the notion of an immanent content, Meinong argues that the intentional relation is always a relation between the mental act and an object. In some cases the intentional object does not exist, but even in these cases there is an object external to the mental act towards which we are directed. According to Meinong, even non-existent objects are in some sense real. Since we can be intentionally directed towards them, they must subsist (bestehen). Not all subsisting objects exist; some of them cannot even exist for they are logically impossible, such as round squares. The notion of intentionality played a central role also in Husserlian phenomenology. Applying his method of the phenomenological reduction, however, Husserl addresses the problem of directedness by introducing the notion of ‘noema.’

Brentano was not very fond of his students’ attempts to resolve these difficulties, mainly because he rejected their underlying ontological assumptions. He was quick to point out that he never intended the intentional object to be immanent to the act. Brentano thought that this interpretation of his position was obviously absurd, for it would be “paradoxical to the extreme to say that a man promises to marry an ens rationis and fulfills his promise by marrying a real person” (Psychology, 385). In later texts, he therefore suggested seeing intentionality as an exceptional form of relation. A mental act does not stand in an ordinary relation to an object, but in a quasi-relation (Relativliches). For a relation to exist, both relata have to exist. A person a is taller than another person b, for example, only if both a and b exist (and a is, in fact, taller than b). This does not hold for the intentional quasi-relation, Brentano suggests. A mental phenomenon can stand in a quasi-relation to an object independently of whether the object exists or not. Mental acts, thus, can stand in a quasi-relation to existing objects like the city of Paris as well as to non-existing objects like the Golden Mountain. Brentano’s later account, which is closely related to his later metaphysics, especially to his turn towards reism, i.e., the view that only individual objects exist, can hardly be considered a solution to the problem of the ontological status of the intentional object. Rather, he introduces a new term to reformulate the difficulties.

5. Time-Consciousness

According to Brentano’s theory, mental acts cannot have duration. This brings up the question of how we can perceive temporally extended objects like melodies. Brentano accounts for these cases by arguing that an object towards which we are directed does not immediately vanish from consciousness once the mental act is over. It rather remains present in altered form, modified from ‘present’ to ‘past.’ Every mental phenomenon triggers an ‘original association’ or ‘proteraesthesis,’ as he calls it later, a kind of memory which is not a full-fledged act of remembering, but rather a part of the act that keeps alive what was experienced a moment ago. When I listen to a melody, for example, I first hear the first tone. In the next moment I hear the second tone, but am still directed towards the first one, which is now modified as past. Then I hear the third tone; now the second tone is modified as past, and the first is pushed back even further into the past. In this way Brentano can explain how we can perceive temporally extended objects and events. The details of Brentano’s account of time-consciousness changed over time, owing to changes in his overall position. At one point he thought that the temporal modifications were part of the object; later he thought that they belonged to judgments; and later still he argued that they were modes of presentation.

Brentano’s account of time-consciousness greatly influenced his students, especially Edmund Husserl, whose notion of ‘retention’ bears a close resemblance to Brentano’s notion of ‘original association.’

6. Logic, Ethics, Aesthetics, and Historiography

According to Brentano, psychology plays a central role in the sciences; he considers especially logic, ethics, and aesthetics as practical disciplines that depend on psychology as their theoretical foundation. Brentano’s conception of these three disciplines is closely related to his distinction between the three kinds of mental phenomena: presentations, judgments, and phenomena of love and hate, i.e., emotions.

Logic, according to Brentano, is the practical discipline that is concerned with judgments, i.e., with the class of mental phenomena in which we take a positive or a negative stance towards the (existence of the) object by affirming or denying it. In addition, judgments are correct or incorrect; they have a truth-value. According to Brentano, a judgment is true when it is evident, i.e., when one perceives (in inner perception that is directed towards the judgment) that one judges with evidence. Brentano, thus, rejects the correspondence theory of truth, suggesting that “a person judges truly, if and only if, his judgment agrees with the judgment he would make if we were to judge with evidence” (Chisholm 1986, 38). Notwithstanding this dependence on the notion of judgment, however, truth, for Brentano, is not a subjective notion: if one person affirms an object and another person denies the same object, only one of them judges correctly. (For a more detailed discussion of Brentano’s contributions to logic, cf. the entry Brentano’s Theory of Judgement.)

Ethics, on the other hand, is concerned with phenomena of love and hate. When experiencing a phenomenon of this class, we take an emotional stance towards an object, i.e., a stance that can be positive or negative. Moreover, phenomena of this class can be correct or incorrect. In these two aspects we have a formal analogy between judgments and emotions. An emotion is correct, according to Brentano, “when one’s feelings are adequate to their object — adequate in the sense of being appropriate, suitable, or fitting” (Brentano, Origins, 70). If it is correct to love an object, we can say that it is good; if it is correct to hate it, it is bad. The question of whether or not it is correct to have a positive emotion towards an object is not a subjective one; according to Brentano it is impossible that one person correctly loves an object and another person correctly hates it.

Aesthetics, finally, is based on the most basic class of mental phenomena: presentations. According to Brentano, every presentation is in itself of value; this holds even for those that become the basis of a correct, negative judgment or a correct, negative emotion. Thus, while judgments and emotions consist in taking either a positive or a negative stance, the value of a presentation is always positive, though it comes in degrees: some presentations are of higher value than others. Not every presentation is of particular aesthetic value, though; in order to be so, it has to become the object of an emotion in which one correctly takes a positive stance towards it. In short, according to Brentano, an object is beautiful if a presentation that is directed at it arouses a correct, positive emotion, i.e., a form of pleasure; it is ugly, on the other hand, if a presentation that is directed at it arouses a correct, negative emotion, a form of displeasure.

This discussion shows that Brentano’s philosophy has strong psychologistic tendencies. Whether or not one should conclude that he adopts a form of psychologism depends on how exactly that term is defined: Brentano vehemently rejects the charge of psychologism, which he takes to stand for a subjectivist and anthropocentric position. At the same time, however, he explicitly defends the claim that psychology is the theoretical science on which the practical disciplines of logic, ethics, and aesthetics are based. Hence, he does adopt the form of psychologism Husserl seems to have had in mind in the Prolegomena to his Logical Investigations, where he defines logical psychologism as the position according to which:

… [T]he essential theoretical foundations of logic lie in psychology, in whose field those propositions belong — as far as their theoretical content is concerned — which give logic its characteristic pattern. … Often people talk as if psychology provided the sole, sufficient, theoretical foundation for logical technology (Husserl 2001, 40).

Die wesentlichen theoretischen Fundamente liegen in der Psychologie; in deren Gebiet gehören ihrem theoretischen Gehalt nach die Sätze, die der Logik ihr charakteristisches Gepräge geben. … Ja nicht selten spricht man so, als gäbe die Psychologie das alleinige und ausreichende theoretische Fundament für die logische Kunstlehre. (Husserl, 1900, 51)

Brentano’s interest in the history of philosophy is reflected not only in his extensive work on Aristotle, but also in his historiographical considerations, in which psychology again plays a fundamental role. In his text The Four Phases of Philosophy and Its Current State (1998) he defended the metaphilosophical thesis that progress in philosophy can be explained according to principles of cultural psychology. Progress in philosophy takes place in cycles: each philosophical period, Brentano holds, can be subdivided into four phases. The first is a creative phase of renewal and ascending development; the other three are phases of decline, dominated by a turn towards practical interests, by scepticism, and finally by mysticism. After the fourth phase, a new period begins with a creative phase of renewal. With this scheme Brentano gives his philosophical preferences an intellectual justification; it allows him to explain his fascination with Aristotle, the Scholastics, and Descartes, as well as his dislike of Kant and the German idealists.

7. Brentano’s Metaphysics

Even though Brentano worked on problems in metaphysics and ontology throughout his life, he hardly published on these topics during his lifetime. The impact of his views is due to the fact that, from his early lectures at the University of Würzburg onwards, he discussed them with his students, both in class and (especially in later years) in correspondence.

Even though Brentano’s views underwent considerable changes over the years, his general attitude can be characterized as sober, parsimonious, and (in the current use of the term) nominalistic: at no point did he admit the existence of universals; rather, he relied on mereological principles to account for classical problems in ontology.

Brentano’s early metaphysics, which is the result of his critical reading of Aristotle, is a form of conceptualism. He does distinguish between substance and accidents, but argues that both are but fictions cum fundamento in re. With this, he wants to suggest that they do not have actual existence, but that we can make correct judgments about real things that contain references to substances and accidents. This view is closely connected to his epistemic notion of truth, according to which the question of whether a judgment is true does not depend on its corresponding to reality, but rather on whether it can be judged with evidence. Brentano elucidates the relation between a thing and its properties on the basis of the mereological notions of “logical part” and “metaphysical part,” the former accounting for abstract, repeatable properties, the latter for the concrete properties of a thing. Neither is considered a denizen of reality in the narrow sense; rather, both are fictions that have a foundation in reality. (For a reconstruction and discussion of the details of Brentano’s early ontology, cf. Chrudzimski 2004.)

After the introduction of the notion of intentionality in his Psychology from an Empirical Standpoint (1874), Brentano struggled to account for the ontological status of the intentional object. When he first introduces the notion, suggesting, as we have seen above, that “[e]very mental phenomenon includes something as object within itself” (Brentano, Psychology, 88), he seems to be interested primarily in presenting a psychological thesis and does not seem to be overly worried about its ontological implications; at this point, the talk of an “immanent object” might have been a mere façon de parler (cf. Chrudzimski and Smith, 2003, 205). Soon, however, Brentano finds himself needing to address this question and, as a result, to enrich the domain of objects in his ontology. He seems to admit that next to concrete things there are irrealia, that is, objects that do not really exist but have the status of thought-objects or, as he puts it, entia rationis, which do not have an essence and do not stand in causal relations. Brentano does not systematically elaborate his ontological position in this period; rather, we find a bundle of ideas of which he did not seem to be fully convinced. This underlines that these views were formulated not with the intention of making a contribution to ontology, but rather in reply to concerns that had emerged from his introduction of the notion of intentionality.

In his late philosophy, from 1904 on, Brentano rediscovers the virtue of ontological parsimony and takes up the main insights of his conceptualist period, developing (and radicalizing) them into a form of reism, according to which the only items that exist are individual things (res). “While the young Brentano tried to ontologically play down certain ways of speaking, the late Brentano tried to eliminate them from philosophical discourse” [“Der junge Brentano versuchte gewisse Redeweisen ontologisch zu bagatellisieren, der späte Brentano versuchte sie aus dem philosophischen Diskurs zu eliminieren”] (Chrudzimski 2004, 177). He abandons the notion of irrealia, which he now regards as linguistic fictions, and continues to deny the existence of universals or abstract entities. Instead, he conceives of both substances and accidents as real things that are related to one another by a particular mereological relation: an accident not only ontologically depends on the substance, it also contains the substance as a part without, however, containing any supplementary part. A white table, accordingly, is an accident that contains the table as a part. If we were to paint it red, the white table would cease to exist and the red table would come into existence, the continuity between the two being guaranteed by the table, which was part of the white table and is now part of the red table.

Brentano’s ontology is known to a broader audience only through posthumously published works that were edited by his students Oskar Kraus and Alfred Kastil, who considered his late position the most important and accordingly put less emphasis on Brentano’s earlier phases. Only recently has the development of Brentano’s views on ontology gained more attention, mainly through the work of scholars who were able to study unpublished manuscripts in the archives (cf., for example, Chrudzimski 2004). This underlines once more the need for a critical edition of Brentano’s entire Nachlass, which would make it possible for a broader audience to assess critically the development of his views in ontology.

8. The Impact of Brentano’s Philosophy

Brentano’s contributions to philosophy were widely discussed among philosophers and psychologists at the end of the nineteenth and the beginning of the twentieth century. After some time his influence was eclipsed by the work of his students, some of whom founded philosophical traditions of their own: Husserl started the phenomenological movement, Meinong the Graz School, Twardowski the Lvov-Warsaw School. As a result, in the second half of the twentieth century Brentano was often mentioned as the philosopher who had (re-)introduced the notion of intentionality, as “grandfather” of the phenomenological movement, or for his influence on early analytic philosophy, but his philosophical views and arguments were hardly discussed.

There are notable exceptions to this tendency, though. Roderick Chisholm, for example, made a continuous effort to show Brentano’s significance to contemporary philosophy, both by adopting his results in his own contributions to the philosophy of mind and in presentations of various aspects of Brentano’s thought (cf. Chisholm 1966, 1982, and 1986). Moreover, in recent decades the tradition often referred to as “Austrian philosophy” has attracted increasing interest from a broader philosophical audience, due mainly to the work of Rudolf Haller, Barry Smith, Peter Simons, and Kevin Mulligan, among others. By showing the systematic relevance of Brentano’s (and other Austrian philosophers’) contributions to problems discussed in ontology, logic, the theory of emotions, and consciousness, they have counteracted the tendency to reduce Brentano’s contribution to philosophy to the notion of intentionality.

In recent years an increasing number of philosophers from different fields have rediscovered and elaborated on themes from Brentano’s philosophy. His views on ethics, for example, which have gained more attention in English-speaking countries than in central Europe (probably because of the early English translation of his 1902 lecture on ethics), have been taken up in fitting-attitude theories of value, which analyze ethical value in terms of correct or incorrect forms of approval or disapproval. His theory of mind has inspired neo-Brentanian accounts of consciousness that aim to do justice to the systematic nature of his theory, in which the notion of intentionality is closely intertwined with the conception of secondary consciousness and the thesis of the unity of consciousness.

In particular, it has been suggested that Brentano’s notion of secondary consciousness (i.e., the thesis that every mental phenomenon is incidentally directed also towards itself as a secondary object) can provide the means to overcome the higher-order theories of consciousness that were widely discussed in the late twentieth century. Brentano, who argued that every mental phenomenon is an object of inner perception, has sometimes been regarded as an early proponent of a higher-order perception theory of consciousness (cf., for example, Güzeldere 1997, 789). This interpretation, however, does not pay due attention to the fact that, according to Brentano, inner perception is not a self-standing mental phenomenon of a higher level, but rather a structural moment of every mental phenomenon. Moreover, Brentano explicitly rejects the basic assumption of all higher-order perception theories of consciousness, i.e., the idea that we can have two mental phenomena (of distinct levels) at the same time, one of them directed towards the other: higher-order perception theories postulate what Brentano calls ‘inner observation’ (as opposed to inner perception), which he holds to be impossible, as we have seen above.

Accordingly, a number of recent interpreters have suggested that Brentano was an advocate of a one-level account of consciousness: ‘Since the features that make an act conscious are firmly located within the act itself rather than bestowed on it by a second act, this locates Brentano’s view as a one-level view of consciousness’ (Thomasson, 2000, 192). This reading has given rise to neo-Brentanian theories such as Thomasson’s adverbial account (cf. Thomasson 2000) or self-representational approaches (cf., for example, Kriegel 2003a,b) that build on the thesis that ‘every conscious state has a dual representational content. Its main content is the normal content commonly attributed to mental representations. But it also has a (rather peripheral) special content, namely, its own occurrence’ (Kriegel 2003a, 480), which they take as Brentano’s central thesis. Moreover, Kriegel suggests that for Brentano this self-representational aspect is a necessary condition for having a presentation (Kriegel 2013).

Other interpreters have taken more cautious lines. Mark Textor (2006), for example, has relied on Brentano’s thesis of the unity of consciousness to account for the relation between primary and secondary consciousness. A mental phenomenon, according to Textor’s interpretation of Brentano’s theory, does not become conscious by representing itself, but rather by being unified or fused with an immediately evident cognition of it. Dan Zahavi, too, has insisted that Brentano does distinguish two levels of perception, which casts doubt on the one-level interpretation: ‘It could be argued that Brentano’s claim that every conscious intentional state takes two objects, a primary (external) object and secondary (internal) object, remains committed to a higher-order account of consciousness; it simply postulates it as being implicitly contained in every conscious state’ (Zahavi, 2006, 5). In short, Brentano’s distinction between primary and secondary consciousness ‘introduces some kind of level-distinction into the structure of experience’ (Brandl 2013, 61) but does not conceive of higher-order perception as a full-fledged mental phenomenon in its own right, which is why Brandl has recently proposed to regard it as a ‘one-and-a-half-level theory’ (Brandl 2013, 61f).”  Wolfgang Huemer, “Franz Brentano;” Stanford Encyclopedia of Philosophy, 2014


Numero Cuatro“Either by omission or by commission, the US media actively misinform the public on crucial issues that matter.  The reason they do this is because they legally can.

My mentor and dissertation committee member, Dr. Peter Dale Scott, recently wrote on his Facebook page: ‘Inadequate decently priced housing is one of America’s most urgent domestic problems, with developers vacating neighborhoods to build third and fourth homes for the one percent.  It is a symptom of what’s wrong that Cynthia McKinney, one of the relatively few former members of Congress with a Ph.D., has to go to RT to discuss a crisis that is so under-reported in the US media.’

And therein lies the problem with US media: The news is so filtered and in some cases propagandized that it bears little resemblance to the day-to-day intellectual needs of the average US citizen. It fails to provide solutions, let alone information that allows US citizens to cast informed votes. Either by omission or by commission, the US media actively under-, ill-, or misinform the public on crucial issues that matter! The reason they do this is because they legally can: media in the US have at least one court ruling that allows them to knowingly lie to the public.

Let’s start with the First Amendment to the US Constitution that protects freedom of speech. Courts in the US have ruled on many occasions that freedom of speech also includes the freedom to lie. The rationale is that such rulings give space for unpopular statements of fact. For example, in 2012, the US Supreme Court voted 6-3 to affirm a lower court decision to overturn a conviction for lying about one’s credentials.

The lower court judge in that case wrote, “How can you develop a reputation as a straight shooter if lying is not an option?”

The Washington State Supreme Court even ruled that lying to get votes was not something the state should police by distinguishing between fact and opinion. It wrote that the people, and not the government, should be the final arbiter of truth in a political debate.

Now, the First Amendment does not protect some types of lying: lying while under oath, for instance, lying to a government official, or lying to sell a product. Even in defamation cases, the plaintiff has a high threshold to overcome, especially if the person targeted is a “public person.” The Supreme Court, then, has emphatically ruled that individuals have a right to lie; but what about corporations and media outlets? In 2010, the Supreme Court extended First Amendment rights to organizations and corporations in its Citizens United decision.

My local newspaper, the Atlanta Journal and Constitution (AJC), ran a headline against me just days before my election that read: “McKinney Indicted.” One had to pore over the article to learn that the McKinney referred to was neither me nor my father, nor anyone related to me. But the AJC never stated that fact. It was a dirty trick carried out by the US press. And sadly, it happens all the time. I filed a lawsuit against the AJC, but had to withdraw it for lack of money to finance it and, worse, because of the environment hostile to anyone’s efforts to make the media tell the truth. I remained powerless before the media monolith and wondered why and how they could get away with such blatant and outright lies.

Then, in 2010, ‘Project Censored’ ran a story that caught my eye: “The Media Can Legally Lie.” After my own series of run-ins with local media that consistently failed to report the truth about me, I was drawn to this story. Project Censored is a media watchdog based at Sonoma State University in California. Its goal is to end the junk food news diet of misinformation and disinformation fed to the US public by the corporate media. It is a project of students and faculty to shine a light on underreported or unreported stories that should be of great interest to the public. The Project Censored movie tells a part of its important story.

The 2010 story centers on two journalists, hired by FOX News as investigative journalists, who became whistleblowers when they were instructed to report “news” that they knew was not true.

According to Project Censored, in February 2003, FOX News argued that there was no prohibition on media outlets distorting or falsifying the news in the United States.  And, skipping ahead, FOX News won on that claim!  But to backtrack and provide some context: the issue was the introduction of Bovine Growth Hormone (BGH), manufactured by Monsanto, into the milk supply without labeling.

A husband-and-wife reporting team produced a four-part series revealing the health risks for humans of drinking milk from cows treated with BGH to boost milk production.  FOX News wanted the reporters to add statements from Monsanto that the couple knew were not factual.  When they refused to make the suggested edits, the couple was fired.  They sued, and a Florida jury decided the couple had been wrongfully fired.  FOX News appealed the case.  Basically, the Florida Appeals Court ruled that there is no law, rule, or even regulation against distorting the news and that the decision to report honestly resides with the news outlet.

FOX News was joined in its court action by other news outlets, notably Cox Television, Inc., a sister organization to the Cox-owned Atlanta Journal and Constitution.  In an incredible and chilling turnabout, the two truth-telling journalists were ordered to pay FOX News millions of dollars to cover the company’s attorney fees.  The reporters were told by FOX News executives: ‘The news is what we say it is.’

And there we have it.  This court action immediately affected the right of people in the US to know what is in the food they buy.  Media consolidation in the US is such that six corporations control 90 percent of the junk food news and entertainment fed to the people of the US and around the world.  And US courts not only say that this is OK, but have also decided that it’s OK for the media to knowingly lie to the public.  That, in a nutshell, is why the US media lie: Because they can.  And that, in a nutshell, is why the people of the US are increasingly turning to RT and alternative news outlets for information: Because they must.”  Cynthia McKinney, “Why Does the U.S. Media Lie So Much;” RT, March 2016