February 28th marks Rare Disease Day in non-leap years, and always represents World Tailors Day, while India commemorates its National Science Day on this date; when the hunters and tribal peoples of Europe were still barely into the Bronze Age, twenty-two hundred nineteen years ago, China’s first Han emperor Gaozu held his coronation and initiated four centuries of dynastic rule; fourteen hundred forty-eight years afterward, in 1246 CE, not quite half a world away, in the Iberian Peninsula, the later stages of Spain’s Christian Reconquista unfolded with a Castilian victory at the Siege of Jaén against the Taifa of that theretofore Islamic outpost; four hundred ninety-two years in advance of this exact passing day, the mass-murdering ‘conqueror’ Hernán Cortés oversaw the execution of Cuauhtémoc, a homicide that he had planned and orchestrated in keeping with Spanish imperial goals and protocols; eight years beyond that chilling act, in 1533 back in Western Europe, the baby boy came bounding into the world who would grow into the redoubtable intelligence and scathing penslinger, Michel de Montaigne; one year more than a quarter millennium later, in 1784, the beloved John Wesley first chartered a Methodist Church; a hundred ninety years back, a pair of local scions and builders in Maryland impacted the economic development of early America when they incorporated the Baltimore and Ohio Railroad; two decades further along, in 1847, California became at least till now irretrievably American when U.S.
forces crushed Mexican defenders at the Battle of the Sacramento River; just two years forward from that instant, in 1849, the steamboat that took nearly a hundred sixty days to sail round South America from New York powered into San Francisco Bay, confirming the first regular ship connections between the East and West Coasts of the United States; another eighteen years subsequently, in 1867, the United States indulged its distrust of and distaste for Catholic creeds by ending diplomatic relations with ‘the Holy See,’ a suspension of connection that lasted till the second Reagan administration, 1984; in a contrasting sort of imperial thrust a thousand ninety-six days onward in time and space, in 1870, the Ottoman Empire’s Sultanate decreed the establishment of the Bulgarian Exarchate, an autonomous Orthodox church for its Balkan Bulgarian subjects; marking this as a benchmark day in the annals of corporate rule, a decade and a half thereafter, in 1885, a key subsidiary of Bell Telephone came into being in New York as its own outfit, the American Telephone and Telegraph Company; nine years on the dot still more proximate to the present pass, in 1894, the male infant was born who would soon enough enchant audiences as the writer and performer and producer, Ben Hecht; three additional years on the temporal arc, in 1897, maybe eight thousand miles South and East, French forces overthrew the last queen of Madagascar, Ranavalona, and assumed imperial command over the African island that has served as an outpost of Southeast Asia in Africa itself; the very next year, in 1898, the Supreme Court of the United States found constitutional a law that limited Utah’s mine and smelter workers’ hours to eight a day; another three year stretch in the direction of today, in 1901, the baby boy opened his eyes who would rise as the champion of science and knowledge and peace, Linus Pauling; a mere two years nearer to the here and now, in 1903, an infant boy emerged from his mother en route to a life as the force of nature and film,
Vincente Minnelli; a dozen years along the road to today, in 1915, plus a ‘leap day’ in addition, South Carolina passed a compulsory school attendance law that in effect eliminated child labor in mills and mines prior to a youngster’s fourteenth birthday; the very next year exactly, in 1916, the dark and daunting storyteller and thinker Henry James breathed his last; six years henceforth, in 1922, England unilaterally ended its ‘protectorate’ over Egypt and declared its colony’s putative ‘independence’ from London’s sway; taking immediate political advantage of the serendipitous disaster of the Reichstag fire, Hitler’s new government announced the passage of the Reichstag Fire Decree the very next day, exactly eleven years beyond England’s ‘generosity’ to Cairo, in 1933; two years farther down the pike, in 1935, a DuPont scientist confirmed a formula for a manmade compound, which as nylon revolutionized many industries; another three years on time’s relentless march, across the continent in 1938, Chinese Ladies’ Garment Workers’ Union members initiated what would become a successful 120 day strike for better wages and conditions in their factory and storefront positions; just shy of a decade on the way to today, in 1947, Chinese Nationalists on Taiwan savagely attacked protests against the reactionary regime, in the process killing tens of thousands of civilians; a single additional solar cycle in the general direction of now, in 1948, police gunfire in the British colonial capital of Accra killed three former imperial soldiers who were protesting for promised pensions, the upshot of which was a week of mayhem and uprising that laid the basis for the emergence of an eventually independent Ghana from the British imperial asset of the Gold Coast; a half decade even closer to the current context, in 1953, scientists James Watson and Francis Crick communicated to friends that they had untangled the structure and chemistry of the stuff-of-life, the DNA molecule, and
the male infant entered our midst who would mature as the Nobel Prize winning establishment economist, Paul Krugman; just a year hence, in 1954, factories released the first color television sets that embodied National Television System Committee
standards; a further five years en route to today’s light and air, in 1959, one of America’s hoped-for ‘spy satellites,’ Discoverer 1, lifted off from the launchpad only to fail to achieve the polar orbit that its handlers had intended it to reach; an eight year interlude more on track for today, in 1967, the denizen of ruling class cultural control, Henry Luce, came to the point of his personal final edition; a half decade yet later on, in 1972, representatives of the U.S. and China inaugurated the ‘New World Order’ with their signatures on the Shanghai Communique; eleven years further down the road, in 1983, the popular television show, M*A*S*H, broadcast its final episode, which, with well over a hundred million viewers, was the most watched original programming in TV history; meanwhile, three years subsequent to that cultural crossroads, in 1986, South African workers displayed the sort of international solidarity on which working class power, and human survival, likely depend, walking off their jobs in protest at the layoff of 450 union members in New Jersey who worked for the same multinational conglomerate as the Elandsfontein laborers seven thousand miles to the South and East, and a likely fascist assassin blew away the liberal social democratic Prime Minister of Sweden, Olof Palme; seven years in even greater proximity to the present point in time and space, in 1993, Bureau of Alcohol, Tobacco, and Firearms stormtroopers initiated a bloody intervention at the Branch Davidian religious compound near
Waco, Texas, when they tried to serve a warrant to arrest David Koresh, resulting in the deaths of nine police and church members on that day and a nearly three month siege that ended with fiery murder on a much larger scale; just five further turns around the solar center afterward, in 1998, Serbian police four thousand miles to the East began an even larger and bloodier intervention against the CIA-backed Kosovo Liberation Army; four years still later, in 2002, another excrescence of state-involved ‘religious violence’ erupted in India’s Gujarat region, in an outbreak of mass killing in the context of riotous uprising in which well over 150 Muslims died; seven hundred thirty days past that carnage, in 2004, a quieter and more natural reaping transpired when historian and Librarian of Congress Daniel J. Boorstin breathed his last at the exact same time that, around the world, upwards of one million Taiwanese joined hands, literally, to form a five hundred kilometer long chain to commemorate the slaughter by American-supported soldiers and police of more than thirty thousand ancestors and friends on the same day fifty-seven years previously; three years more on the temporal road, in 2007, the travels of establishment historian and critic Arthur M. Schlesinger formally ended; seven years farther in the direction of our own day, just three years past in 2014, ninety-eight year old mathematician and civil rights and social justice advocate Lee Lorch lived out his final day.
Numero Uno—“THE SIMPLEST of our extravagant expectations concerns the amount of novelty in the world. There was a time when the reader of an unexciting newspaper would remark, ‘How dull is the world today!’ Nowadays he says, ‘What a dull newspaper!’ When the first American newspaper, Benjamin Harris’ Publick Occurrences Both Forreign and Domestick, appeared in Boston on September 25, 1690, it promised to furnish news regularly once a month. But, the editor explained, it might appear oftener ‘if any Glut of Occurrences happen.’ The responsibility for making news was entirely God’s, or the Devil’s. The newsman’s task was only to give ‘an Account of such considerable things as have arrived unto our Notice.’
Although the theology behind this way of looking at events soon dissolved, this view of the news lasted longer. ‘The skilled and faithful journalist,’ James Parton observed in 1866, ‘recording with exactness and power the thing that has come to pass, is Providence addressing men.’ The story is told of a Southern Baptist clergyman before the Civil War who used to say, when a newspaper was brought in the room, ‘Be kind enough to let me have it a few minutes, till I see how the Supreme Being is governing the world.’ Charles A. Dana, one of the great American editors of the nineteenth century, once defended his extensive reporting of crime in the New York Sun by saying, ‘I have always felt that whatever the Divine Providence permitted to occur I was not too proud to report.’
Of course, this is now a very old‑fashioned way of thinking. Our current point of view is better expressed in the definition by Arthur MacEwen, whom William Randolph Hearst made his first editor of the San Francisco Examiner: ‘News is anything that makes a reader say, “Gee whiz!”’ Or, put more soberly, ‘News is whatever a good editor chooses to print.’
We need not be theologians to see that we have shifted responsibility for making the world interesting from God to the newspaperman. We used to believe there were only so many “events” in the world. If there were not many intriguing or startling occurrences, it was no fault of the reporter. He could not be expected to report what did not exist.
Within the last hundred years, however, and especially in the twentieth century, all this has changed. We expect the papers to be full of news. If there is no news visible to the naked eye, or to the average citizen, we still expect it to be there for the enterprising newsman. The successful reporter is one who can find a story, even if there is no earthquake or assassination or civil war. If he cannot find a story, then he must make one: by the questions he asks of public figures, by the surprising human interest he unfolds from some commonplace event, or by “the news behind the news.” If all this fails, then he must give us a “think piece,” an embroidering of well‑known facts, or a speculation about startling things to come.
This change in our attitude toward “news” is not merely a basic fact about the history of American newspapers. It is a symptom of a revolutionary change in our attitude toward what happens in the world, how much of it is new, and surprising, and important. Toward how life can be enlivened, toward our power and the power of those who inform and educate and guide us, to provide synthetic happenings to make up for the lack of spontaneous events. Demanding more than the world can give us, we require that something be fabricated to make up for the world’s deficiency. This is only one example of our demand for illusions.
Many historical forces help explain how we have come to our present immoderate hopes. But there can be no doubt about what we now expect, nor that it is immoderate. Every American knows the anticipation with which he picks up his morning newspaper at breakfast or opens his evening paper before dinner, or listens to the newscasts every hour on the hour as he drives across country, or watches his favorite commentator on television interpret the events of the day. Many enterprising Americans are now at work to help us satisfy these expectations. Many might be put out of work if we should suddenly moderate our expectations. But it is we who keep them in business and demand that they fill our consciousness with novelties, that they play God for us.
The new kind of synthetic novelty which has flooded our experience I will call “pseudo‑events.” The common prefix “pseudo” comes from the Greek word meaning false, or intended to deceive. Before I recall the historical forces which have made these pseudo‑events possible, have increased the supply of them and the demand for them, I will give a commonplace example.
The owners of a hotel, in an illustration offered by Edward L. Bernays in his pioneer Crystallizing Public Opinion (1923), consult a public relations counsel. They ask how to increase their hotel’s prestige and so improve their business. In less sophisticated times, the answer might have been to hire a new chef, to improve the plumbing, to paint the rooms, or to install a crystal chandelier in the lobby. The public relations counsel’s technique is more indirect. He proposes that the management stage a celebration of the hotel’s thirtieth anniversary. A committee is formed, including a prominent banker, a leading society matron, a well‑known lawyer, an influential preacher, and an “event” is planned (say a banquet) to call attention to the distinguished service the hotel has been rendering the community. The celebration is held, photographs are taken, the occasion is widely reported, and the object is accomplished. Now this occasion is a pseudo‑event, and will illustrate all the essential features of pseudo‑events.
This celebration, we can see at the outset, is somewhat — but not entirely —misleading. Presumably the public relations counsel would not have been able to form his committee of prominent citizens if the hotel had not actually been rendering service to the community. On the other hand, if the hotel’s services had been all that important, instigation by public relations counsel might not have been necessary. Once the celebration has been held, the celebration itself becomes evidence that the hotel really is a distinguished institution. The occasion actually gives the hotel the prestige to which it is pretending.
It is obvious, too, that the value of such a celebration to the owners depends on its being photographed and reported in newspapers, magazines, newsreels, on radio, and over television. It is the report that gives the event its force in the minds of potential customers. The power to make a reportable event is thus the power to make experience. One is reminded of Napoleon’s apocryphal reply to his general, who objected that circumstances were unfavorable to a proposed campaign: “Bah, I make circumstances!” The modern public relations counsel —‑and he is, of course, only one of many twentieth‑century creators of pseudo‑events — has come close to fulfilling Napoleon’s idle boast. “The counsel on public relations,” Mr. Bernays explains, “not only knows what news value is, but knowing it, he is in a position to make news happen. He is a creator of events.”
The intriguing feature of the modern situation, however, comes precisely from the fact that the modern news makers are not God. The news they make happen, the events they create, are somehow not quite real. There remains a tantalizing difference between man‑made and God‑made events.
A pseudo‑event, then, is a happening that possesses the following characteristics:
(1) It is not spontaneous, but comes about because someone has planned, planted, or incited it. Typically, it is not a train wreck or an earthquake, but an interview.
(2) It is planted primarily (not always exclusively) for the immediate purpose of being reported or reproduced. Therefore, its occurrence is arranged for the convenience of the reporting or reproducing media. Its success is measured by how widely it is reported. Time relations in it are commonly fictitious or factitious; the announcement is given out in advance ‘for future release’ and written as if the event had occurred in the past. The question, ‘Is it real?’ is less important than, ‘Is it newsworthy?’
(3) Its relation to the underlying reality of the situation is ambiguous. Its interest arises largely from this very ambiguity. Concerning a pseudo‑event the question, ‘What does it mean?’ has a new dimension. While the news interest in a train wreck is in what happened and in the real consequences, the interest in an interview is always, in a sense, in whether it really happened and in what might have been the motives. Did the statement really mean what it said? Without some of this ambiguity a pseudo‑event cannot be very interesting.
(4) Usually it is intended to be a self‑fulfilling prophecy. The hotel’s thirtieth‑anniversary celebration, by saying that the hotel is a distinguished institution, actually makes it one. …
UNTIL RECENTLY we have been justified in believing Abraham Lincoln’s familiar maxim: ‘You may fool all the people some of the time; you can even fool some of the people all the time; but you can’t fool all of the people all the time.’ This has been the foundation‑belief of American democracy. Lincoln’s appealing slogan rests on two elementary assumptions. First, that there is a clear and visible distinction between sham and reality, between the lies a demagogue would have us believe and the truths which are there all the time. Second, that the people tend to prefer reality to sham, that if offered a choice between a simple truth and a contrived image, they will prefer the truth.
Neither of these any longer fits the facts. Not because people are less intelligent or more dishonest. Rather because great unforeseen changes — the great forward strides of American civilization — have blurred the edges of reality. The pseudo‑events which flood our consciousness are neither true nor false in the old familiar senses. The very same advances which have made them possible have also made the images — however planned, contrived, or distorted — more vivid, more attractive, more impressive, and more persuasive than reality itself.
We cannot say that we are being fooled. It is not entirely inaccurate to say that we are being ‘informed.’ This world of ambiguity is created by those who believe they are instructing us, by our best public servants, and with our own collaboration. Our problem is the harder to solve because it is created by people working honestly and industriously at respectable jobs. It is not created by demagogues or crooks, by conspiracy or evil purpose. The efficient mass production of pseudo‑events — in all kinds of packages, in black‑and white, in technicolor, in words, and in a thousand other forms — is the work of the whole machinery of our society. It is the daily product of men of good will. The media must be fed! The people must be informed! Most pleas for ‘more information’ are therefore misguided. So long as we define information as a knowledge of pseudo‑events, ‘more information’ will simply multiply the symptoms without curing the disease.
The American citizen thus lives in a world where fantasy is more real than reality, where the image has more dignity than its original. We hardly dare face our bewilderment, because our ambiguous experience is so pleasantly iridescent, and the solace of belief in contrived reality is so thoroughly real. We have become eager accessories to the great hoaxes of the age. These are the hoaxes we play on ourselves.
Pseudo-events from their very nature tend to be more interesting and more attractive than spontaneous events. Therefore in American public life today pseudo-events tend to drive all other kinds of events out of our consciousness, or at least to overshadow them. Earnest, well‑informed citizens seldom notice that their experience of spontaneous events is buried by pseudo‑events. Yet nowadays, the more industriously they work at “informing” themselves, the more this tends to be true.
In his now‑classic work, Public Opinion, Walter Lippmann in 1922 began by distinguishing between “the world outside and the pictures in our heads.” He defined a “stereotype” as an oversimplified pattern that helps us find meaning in the world. As examples he gave the crude “stereotypes we carry about in our heads,” of large and varied classes of people like “Germans,” “South Europeans,” “Negroes,” “Harvard men,” “agitators,” etc. The stereotype, Lippmann explained, satisfies our needs and helps us defend our prejudices by seeming to give definiteness and consistency to our turbulent and disorderly daily experience. In one sense, of course, stereotypes — the excessively simple, but easily grasped images of racial, national, or religious groups — are only another example of pseudo‑events. But, generally speaking, they are closer to propaganda. For they simplify rather than complicate. Stereotypes narrow and limit experience in an emotionally satisfying way; but pseudo‑events embroider and dramatize experience in an interesting way. This itself makes pseudo‑events far more seductive; intellectually they are more defensible, more intricate, and more intriguing. To discover how the stereotype is made, to unmask the sources of propaganda, is to make the stereotype less believable. Information about the staging of a pseudo‑event simply adds to its fascination.
Lippmann’s description of stereotypes was helpful in its day. But he wrote before pseudo‑events had come in full flood. Photographic journalism was then still in its infancy. Wide World Photos had just been organized by The New York Times in 1919. The first wirephoto to attract wide attention was in 1924, when the American Telephone and Telegraph Company sent to The New York Times pictures of the Republican Convention in Cleveland which nominated Calvin Coolidge. Associated Press Picture Service was established in 1928. Life, the first wide‑circulating weekly picture news magazine, appeared in 1936; within a year it had a circulation of 1,000,000, and within two years, 2,000,000. Look followed, in 1937. The newsreel, originated in France by Pathé, had been introduced to the United States only in 1910. When Lippmann wrote his book in 1922, radio was not yet reporting news to the consumer; television was of course unknown.
Recent improvements in vividness and speed, the enlargement and multiplying of news‑reporting media, and the public’s increasing news hunger now make Lippmann’s brilliant analysis of the stereotype the legacy of a simpler age. For stereotypes made experience handy to grasp. But pseudo‑events would make experience newly and satisfyingly elusive. In 1911 Will Irwin, writing in Collier’s, described the new era’s growing public demand for news as “a crying primal want of the mind, like hunger of the body.” The mania for news was a symptom of expectations enlarged far beyond the capacity of the natural world to satisfy. It required a synthetic product. It stirred an irrational and undiscriminating hunger for fancier, more varied items. Stereotypes there had been and always would be; but they only dulled the palate for information. They were an opiate. Pseudo‑events whetted the appetite; they aroused news hunger in the very act of satisfying it.
In the age of pseudo‑events it is less the artificial simplification than the artificial complication of experience that confuses us. Whenever in the public mind a pseudo‑event competes for attention with a spontaneous event in the same field, the pseudo‑event will tend to dominate. What happens on television will overshadow what happens off television. Of course I am concerned here not with our private worlds but with our world of public affairs.
Here are some characteristics of pseudo‑events which make them overshadow spontaneous events:
(1) Pseudo‑events are more dramatic. A television debate between candidates can be planned to be more suspenseful (for example, by reserving questions which are then popped suddenly) than a casual encounter or consecutive formal speeches planned by each separately.
(2) Pseudo‑events, being planned for dissemination, are easier to disseminate and to make vivid. Participants are selected for their newsworthy and dramatic interest.
(3) Pseudo‑events can be repeated at will, and thus their impression can be re‑enforced.
(4) Pseudo‑events cost money to create; hence somebody has an interest in disseminating, magnifying, advertising, and extolling them as events worth watching or worth believing. They are therefore advertised in advance, and rerun in order to get money’s worth.
(5) Pseudo‑events, being planned for intelligibility, are more intelligible and hence more reassuring. Even if we cannot discuss intelligently the qualifications of the candidates or the complicated issues, we can at least judge the effectiveness of a television performance. How comforting to have some political matter we can grasp!
(6) Pseudo‑events are more sociable, more conversable, and more convenient to witness. Their occurrence is planned for our convenience. The Sunday newspaper appears when we have a lazy morning for it. Television programs appear when we are ready with our glass of beer. In the office the next morning, Jack Paar’s (or any other star performer’s) regular late‑night show at the usual hour will overshadow in conversation a casual event that suddenly came up and had to find its way into the news.
(7) Knowledge of pseudo‑events—of what has been reported, or what has been staged, and how‑becomes the test of being “informed.” News magazines provide us regularly with quiz questions concerning not what has happened but concerning “names in the news”—what has been reported in the news magazines. Pseudo‑events begin to provide that “common discourse” which some of my old‑fashioned friends have hoped to find in the Great Books.
(8) Finally, pseudo‑events spawn other pseudo‑events in geometric progression. They dominate our consciousness simply because there are more of them, and ever more.
By this new Gresham’s law of American public life, counterfeit happenings tend to drive spontaneous happenings out of circulation. The rise in the power and prestige of the Presidency is due not only to the broadening powers of the office and the need for quick decisions, but also to the rise of centralized news gathering and broadcasting, and the increase of the Washington press corps. The President has an ever more ready, more frequent, and more centralized access to the world of pseudo‑events. A similar explanation helps account for the rising prominence in recent years of the Congressional investigating committees. In many cases these committees have virtually no legislative impulse, and sometimes no intelligible legislative assignment. But they do have an almost unprecedented power, possessed now by no one else in the Federal government except the President, to make news. Newsmen support the committees because the committees feed the newsmen: they live together in happy symbiosis. The battle for power among Washington agencies becomes a contest to dominate the citizen’s information of the government. This can most easily be done by fabricating pseudo‑events.
A perfect example of how pseudo‑events can dominate is the recent popularity of the quiz show format. Its original appeal came less from the fact that such shows were tests of intelligence (or of dissimulation) than from the fact that the situations were elaborately contrived, with isolation booths, armed bank guards, and all the rest, and that they purported to inform the public. The application of the quiz show format to the so‑called “Great Debates” between Presidential candidates in the election of 1960 is only another example. These four campaign programs, pompously and self‑righteously advertised by the broadcasting networks, were remarkably successful in reducing great national issues to trivial dimensions. With appropriate vulgarity, they might have been called the $400,000 Question (Prize: a $100,000‑a‑year job for four years). They were a clinical example of the pseudo‑event, of how it is made, why it appeals, and of its consequences for democracy in America.
In origin the Great Debates were confusedly collaborative between politicians and news makers. Public interest centered around the pseudo‑event itself: the lighting, make‑up, ground rules, whether notes would be allowed, etc. Far more interest was shown in the performance than in what was said. The pseudo‑events spawned in turn by the Great Debates were numberless. People who had seen the shows read about them the more avidly, and listened eagerly for interpretations by news commentators. Representatives of both parties made “statements” on the probable effects of the debates. Numerous interviews and discussion programs were broadcast exploring their meaning. Opinion polls kept us informed on the nuances of our own and other people’s reactions. Topics of speculation multiplied. Even the question whether there should be a fifth debate became for a while a lively “issue.”
The drama of the situation was mostly specious, or at least had an extremely ambiguous relevance to the main (but forgotten) issue: which participant was better qualified for the Presidency. Of course, a man’s ability, while standing under klieg lights, without notes, to answer in two and a half minutes a question kept secret until that moment, had only the most dubious relevance, if any at all, to his real qualifications to make deliberate Presidential decisions on long‑standing public questions after being instructed by a corps of advisers. The great Presidents in our history (with the possible exception of F.D.R.) would have done miserably; but our most notorious demagogues would have shone. A number of exciting pseudo‑events were created, for example, the Quemoy‑Matsu issue. But that, too, was a good example of a pseudo‑event: it was created to be reported, it concerned a then‑quiescent problem, and it put into the most factitious and trivial terms the great and real issue of our relation to Communist China.
The television medium shapes this new kind of political quiz-show spectacular in many crucial ways. Theodore H. White has proven this with copious detail in his The Making of the President 1960 (1961). All the circumstances of this particular competition for votes were far more novel than the old word “debate” and the comparisons with the Lincoln-Douglas debates suggested. Kennedy’s great strength in the critical first debate, according to White, was that he was in fact not “debating” at all, but was seizing the opportunity to address the whole nation; while Nixon stuck close to the issues raised by his opponent, rebutting them one by one. Nixon, moreover, suffered a handicap that was serious only on television: he has a light, naturally transparent skin. On an ordinary camera that takes pictures by optical projection, this skin photographs well. But a television camera projects electronically, by an “image‑orthicon tube” which has an x‑ray effect. This camera penetrates Nixon’s transparent skin and brings out (even just after a shave) the tiniest hair growing in the follicles beneath the surface. For the decisive first program Nixon wore a make‑up called “Lazy Shave” which was ineffective under these conditions. He therefore looked haggard and heavy‑bearded by contrast to Kennedy, who looked pert and clean‑cut.
This greatest opportunity in American history to educate the voters by debating the large issues of the campaign failed. The main reason, as White points out, was the compulsions of the medium. “The nature of both TV and radio is that they abhor silence and ‘dead time.’ All TV and radio discussion programs are compelled to snap question and answer back and forth as if the contestants were adversaries in an intellectual tennis match. Although every experienced newspaperman and inquirer knows that the most thoughtful and responsive answers to any difficult question come after long pause, and that the longer the pause the more illuminating the thought that follows it, nonetheless the electronic media cannot bear to suffer a pause of more than five seconds; a pause of thirty seconds of dead time on air seems interminable. Thus, snapping their two‑and‑a‑half‑minute answers back and forth, both candidates could only react for the cameras and the people, they could not think.” Whenever either candidate found himself touching a thought too large for two‑minute exploration, he quickly retreated. Finally the television‑watching voter was left to judge, not on issues explored by thoughtful men, but on the relative capacity of the two candidates to perform under television stress.
Pseudo-events thus lead to emphasis on pseudo-qualifications. Again the self-fulfilling prophecy: if we test Presidential candidates by their talents on TV quiz performances, we will, of course, choose Presidents for precisely these qualifications. In a democracy, reality tends to conform to the pseudo-event. Nature imitates art.
We are frustrated by our very efforts publicly to unmask the pseudo‑event. Whenever we describe the lighting, the make‑up, the studio setting, the rehearsals, etc., we simply arouse more interest. One newsman’s interpretation makes us more eager to hear another’s. One commentator’s speculation that the debates may have little significance makes us curious to hear whether another commentator disagrees.
Pseudo‑events do, of course, increase our illusion of grasp on the world, what some have called the American illusion of omnipotence. Perhaps, we come to think, the world’s problems can really be settled by ‘statements,’ by ‘Summit’ meetings, by a competition of ‘prestige,’ by overshadowing images, and by political quiz shows.
Numero Dos—“I believe that there will never again be a great world war – a war in which the terrible weapons involving nuclear fission and nuclear fusion would be used. And I believe that it is the discoveries of scientists upon which the development of these terrible weapons was based that is now forcing us to move into a new period in the history of the world, a period of peace and reason, when world problems are not solved by war or by force, but are solved in accordance with world law, in a way that does justice to all nations and that benefits all people.
Let me again remind you, as I did yesterday in my address of acceptance of the Nobel Peace Prize for 1962, that Alfred Nobel wanted to invent ‘a substance or a machine with such terrible power of mass destruction that war would thereby be made impossible forever.’ Two thirds of a century later scientists discovered the explosive substances that Nobel wanted to invent: the fissionable substances uranium and plutonium, with explosive energy ten million times that of Nobel’s favorite explosive, nitroglycerine, and the fusionable substance lithium deuteride, with explosive energy fifty million times that of nitroglycerine. The first of the terrible machines incorporating these substances, the uranium-235 and plutonium-239 fission bombs, were exploded in 1945, at Alamogordo, Hiroshima, and Nagasaki. Then in 1954, nine years later, the first of the fission-fusion-fission superbombs was exploded, the 20-megaton Bikini bomb, with energy of explosion one thousand times greater than that of a 1945 fission bomb.
This one bomb, the 1954 superbomb, contained less than one ton of nuclear explosive. The energy released in the explosion of this bomb was greater than that of all of the explosives used in all of the wars that have taken place during the entire history of the world, including the First World War and the Second World War.
Thousands of these superbombs have now been fabricated; and today, eighteen years after the construction of the first atomic bomb, the nuclear powers have stockpiles of these weapons so great that if they were to be used in a war hundreds of millions of people would be killed, and our civilization itself might not survive the catastrophe.
Thus the machines envisaged by Nobel have come into existence, and war has been made impossible forever.
The world has now begun its metamorphosis from its primitive period of history, when disputes between nations were settled by war, to its period of maturity, in which war will be abolished and world law will take its place. The first great stage of this metamorphosis took place only a few months ago – the formulation by the governments of the United States, Great Britain, and the Soviet Union, after years of discussion and negotiation, of a Treaty banning the testing of nuclear weapons on the surface of the earth, in the oceans, and in space, and the ratification and signing of this treaty by nearly all of the nations in the world.
I believe that the historians of the future may well describe the making of this treaty as the most important action ever taken by the governments of nations, in that it is the first of a series of treaties that will lead to the new world from which war has been abolished forever.
We see that science and peace are related. The world has been greatly changed, especially during the last century, by the discoveries of scientists. Our increased knowledge now provides the possibility of eliminating poverty and starvation, of decreasing significantly the suffering caused by disease, of using the resources of the world effectively for the benefit of humanity. But the greatest of all the changes has been in the nature of war: the several-million-fold increase in the power of explosives and corresponding changes in methods of delivery of bombs.
These changes have resulted from the discoveries of scientists, and during the last two decades scientists have taken a leading part in bringing them to the attention of their fellow human beings and in urging that vigorous action be taken to prevent the use of the new weapons and to abolish war from the world.
The first scientists to take actions of this sort were those involved in the development of the atomic bomb. In March, 1945, before the first nuclear explosion had been carried out, Leo Szilard prepared a memorandum to President Franklin Delano Roosevelt in which he pointed out that a system of international control of nuclear weapons might give civilization a chance to survive. A committee of atomic scientists, with James Franck as chairman, on June 11, 1945, transmitted to the U.S. Secretary of War a report urging that nuclear bombs not be used in an unannounced attack against Japan, as this action would prejudice the possibility of reaching an international agreement on control of these weapons.
In 1946 Albert Einstein, Harold Urey, and seven other scientists formed an organization to educate the American people about the nature of nuclear weapons and nuclear war. This organization, the Emergency Committee of Atomic Scientists (usually called the Einstein Committee), carried out an effective educational campaign over a five-year period. The nature of the campaign is indicated by the following sentences from the 1946 statement by Einstein:
“Today the atomic bomb has altered profoundly the nature of the world as we know it, and the human race consequently finds itself in a new habitat to which it must adapt its thinking… Never before was it possible for one nation to make war on another without sending armies across borders. Now with rockets and atomic bombs no center of population on the earth’s surface is secure from surprise destruction in a single attack… Few men have ever seen the bomb. But all men if told a few facts can understand that this bomb and the danger of war is a very real thing, and not something far away. It directly concerns every person in the civilized world. We cannot leave it to generals, senators, and diplomats to work out a solution over a period of generations… There is no defense in science against the weapon which can destroy civilization. Our defense is not in armaments, nor in science, nor in going underground. Our defense is in law and order… Future thinking must prevent wars.”
During the same period and later years, many other organizations of scientists were active in the work of educating people about nuclear weapons and nuclear war; among them I may mention especially the Federation of American Scientists (in the United States), the Atomic Scientists’ Association (Great Britain), and the World Federation of Scientific Workers (with membership covering many countries).
On July 15, 1955, a powerful statement, called the Mainau Declaration, was issued by fifty-two Nobel laureates. This statement warned that a great war in the nuclear age would imperil the whole world, and ended with the sentences: “All nations must come to the decision to renounce force as a final resort of policy. If they are not prepared to do this, they will cease to exist.”
A document of great consequence, the Russell-Einstein Appeal, was made public by Bertrand Russell on July 9, 1955. Russell, who for years remained one of the world’s most active and effective workers for peace, had drafted this document some months earlier, and it had been signed by Einstein two days before his death, and also by nine other scientists. The Appeal began with the sentence: “In the tragic situation which confronts humanity, we feel that scientists should assemble in conference to appraise the perils that have arisen as a result of the development of weapons of mass destruction…” And it ended with the exhortation: “There lies before us, if we choose, continual progress in happiness, knowledge, and wisdom. Shall we, instead, choose death, because we cannot forget our quarrels? We appeal, as human beings, to human beings: Remember your humanity, and forget the rest. If you can do so, the way lies open to a new Paradise; if you cannot, there lies before you the risk of universal death.”
This Appeal led to the formation of the Pugwash Continuing Committee, with Bertrand Russell as chairman, and to the holding of a series of Pugwash Conferences (eleven during the years 1957 to 1963). Financial support for the first few conferences was provided by Mr. Cyrus Eaton, and the first conference was held in his birthplace, the village of Pugwash, Nova Scotia.
Among the participants in some of the Pugwash Conferences have been scientists with a close connection with the governments of their countries, as well as scientists without government connection. The Conferences have permitted the scientific and practical aspects of disarmament to be discussed informally in a thorough, penetrating, and productive way and have led to some valuable proposals. It is my opinion that the Pugwash Conferences were significantly helpful in the formulation and ratification of the 1963 Bomb Test Ban Treaty.
Concern about the damage done to human beings and the human race by the radioactive substances produced in nuclear weapons tests was expressed with increasing vigor in the period following the first fission-fusion-fission bomb test at Bikini on March 1, 1954. Mention was made of radioactive fallout in the Russell-Einstein Appeal and also in the statement of the First Pugwash Conference. In his Declaration of Conscience issued in Oslo on April 24, 1957, Dr. Albert Schweitzer described the damage done by fallout and asked that the great nations cease their tests of nuclear weapons. Then on May 15, 1957, with the help of some of the scientists in Washington University, St. Louis, I wrote the Scientists’ Bomb Test Appeal, which within two weeks was signed by over two thousand American scientists and within a few months by 11,021 scientists of forty-nine countries. On January 15, 1958, as I presented the Appeal to Dag Hammarskjöld as a petition to the United Nations, I said to him that in my opinion it represented the feelings of the great majority of the scientists of the world. The Bomb Test Appeal consists of five paragraphs. The first two are the following:
“We, the scientists whose names are signed below, urge that an international agreement to stop the testing of nuclear bombs be made now.
Each nuclear bomb test spreads an added burden of radioactive elements over every part of the world. Each added amount of radiation causes damage to the health of human beings all over the world and causes damage to the pool of human germ plasm such as to lead to an increase in the number of seriously defective children that will be born in future generations.”
Let me now say a few words to amplify the last statement, about which there has been controversy. Each year, of the nearly 100 million children born in the world, about 4,000,000 have gross physical or mental defects, such as to cause great suffering to themselves and their parents and to constitute a major burden on society. Geneticists estimate that about five percent, 200,000 per year, of these children are grossly defective because of gene mutations caused by natural high-energy radiation – cosmic rays and natural radioactivity, from which our reproductive organs cannot be protected. This numerical estimate is rather uncertain, but geneticists agree that it is of the right order of magnitude.
Moreover, geneticists agree that any additional exposure of the human reproductive cells to high-energy radiation produces an increase in the number of mutations and an increase in the number of defective children born in future years, and that this increase is approximately proportional to the amount of the exposure.
The explosion of nuclear weapons in the atmosphere liberates radioactive fission products: cesium 137, strontium 90, iodine 131, and many others. In addition, the neutrons that result from the explosion combine with nitrogen nuclei in the atmosphere to form large amounts of a radioactive isotope of carbon, carbon 14, which then is incorporated into the organic molecules of every human being. These radioactive fission products are now damaging the pool of human germ plasm and increasing the number of defective children born.
Carbon 14 deserves our special concern. It was pointed out by the Soviet scientist O.I. Leipunsky in 1957 that this radioactive product of nuclear tests would cause more genetic damage to the human race than the radioactive fallout (cesium 137 and other fission products), if the human race survives over the 8,000-year mean life of carbon 14. Closely agreeing numerical estimates of the genetic effects of bomb-test carbon 14 were then made independently by me and by Drs. Totter, Zelle, and Hollister of the United States Atomic Energy Commission. Especially pertinent is the fact that the so-called “clean” bombs, involving mainly nuclear fusion, produce when they are tested more carbon 14 per megaton than the ordinary fission bombs or fission-fusion-fission bombs.
A recent study by Reidar Nydal, of the Norwegian Institute of Technology in Trondheim, shows the extent to which the earth is being changed by the tests of nuclear weapons. Carbon 14 produced by cosmic rays is normally present in the atmosphere, oceans, and biosphere, in such amount as to be responsible for between one and two percent of the genetic damage caused by natural high-energy radiation. Nydal has reported that the amount of carbon 14 in the atmosphere has been more than doubled because of the nuclear weapons tests of the last ten years, and that in a few years the carbon-14 content of human beings will be two or three times the normal value, with a consequent increase in the gene mutation rate and the number of defective children born.
Some people have pointed out that the number of grossly defective children born as a result of the bomb tests is small compared with the total number of defective children and have suggested that the genetic damage done by the bomb tests should be ignored. I, however, have contended, as have Dr. Schweitzer and many others, that every single human being is important and that we should be concerned about every additional child that is caused by our actions to be born to live a life of suffering and misery. President Kennedy in his broadcast to the American people on July 26, 1963, said: “The loss of even one human life, or the malformation of even one baby – who may be born long after we are gone – should be of concern to us all. Our children and grandchildren are not merely statistics towards which we can be indifferent.”
We should know how many defective children are being born because of the bomb tests. During the last six years I have made several attempts to estimate the numbers. My estimates have changed somewhat from year to year, as new information became available and as continued bomb testing increased the amount of radioactive pollution of the earth, but no radical revision of the estimates has been found necessary.
It is my estimate that about 100,000 viable children will be born with gross physical or mental defects caused by the cesium 137 and other fission products from the bomb tests carried out from 1952 to 1963, and 1,500,000 more, if the human race survives, with gross defects caused by the carbon 14 from these bomb tests. In addition, about ten times as many embryonic, neonatal, and childhood deaths are expected: about 1,000,000 caused by the fission products and 15,000,000 by carbon 14. An even larger number of children may have minor defects caused by the bomb tests; these minor defects, which are passed on from generation to generation rather than being rapidly weeded out by genetic death, may be responsible for more suffering in the aggregate than the major defects.
About five percent of the fission-product effect and 0.3 percent of the carbon-14 effect may appear in the first generation; that is, about 10,000 viable children with gross physical or mental defects, and 100,000 embryonic, neonatal, and childhood deaths.
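These first-generation figures follow by simple proportion from the totals Pauling gives in the preceding paragraph; a quick check of the arithmetic:

```latex
\begin{aligned}
\text{gross defects: } & 0.05 \times 100{,}000 \;+\; 0.003 \times 1{,}500{,}000 \;=\; 5{,}000 + 4{,}500 \;\approx\; 10{,}000\\
\text{deaths: } & 0.05 \times 1{,}000{,}000 \;+\; 0.003 \times 15{,}000{,}000 \;=\; 50{,}000 + 45{,}000 \;\approx\; 100{,}000
\end{aligned}
```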
These estimates are in general agreement with those made by other scientists and by national and international committees. The estimates are all very uncertain because of the deficiencies in our knowledge. The uncertainty is usually expressed by saying that the actual numbers may be only one-fifth as great or may be five times as great as the estimates, but the errors may be even larger than this.
Moreover, it is known that high-energy radiation can cause leukemia, bone cancer, and some other diseases. Scientists differ in their opinion about the carcinogenic activity of small doses of radiation, such as produced by fallout and carbon 14. It is my opinion that bomb-test strontium 90 can cause leukemia and bone cancer, iodine 131 can cause cancer of the thyroid, and cesium 137 and carbon 14 can cause these and other diseases. I make the rough estimate that, because of this somatic effect of these radioactive substances that now pollute the earth, about 2,000,000 human beings now living will die five or ten or fifteen years earlier than if the nuclear tests had not been made. The 1962 estimate of the United States Federal Radiation Council was 0 to 100,000 deaths from leukemia and bone cancer in the U.S. alone, caused by the nuclear tests to the end of 1961.
The foregoing estimates are for 600 megatons of bombs. We may now ask: At what sacrifice is the atmospheric test of a single standard 20-megaton bomb carried out? Our answer, none the less horrifying because uncertain, is: with the sacrifice, if the human race survives, of about 500,000 children, of whom about 50,000 are viable but have gross physical or mental defects; and perhaps also of about 70,000 people now living who may die prematurely of leukemia or some other disease caused by the test.
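The per-bomb figures rest on simple proportionality: a single 20-megaton test is one-thirtieth of the 600-megaton total, so each estimate scales by that fraction:

```latex
\frac{20}{600}=\frac{1}{30}:\qquad
\frac{100{,}000+1{,}500{,}000}{30}\approx 53{,}000 \text{ gross defects},\qquad
\frac{1{,}000{,}000+15{,}000{,}000}{30}\approx 530{,}000 \text{ child deaths},\qquad
\frac{2{,}000{,}000}{30}\approx 67{,}000 \text{ premature deaths}
```

which rounds, roughly, to the 50,000, 500,000, and 70,000 quoted.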
We may be thankful that most of the nations of the world have, by subscribing to the 1963 treaty, agreed not to engage in nuclear testing in the atmosphere. But what a tragedy it is that this treaty was not made two years earlier! Of the total of 600 megatons of tests so far, three-quarters of the testing, 450 megatons, was done in 1961 and 1962. The failure to formulate a treaty in 1959 or 1960 or 1961 was attributed by the governments of the United States, Great Britain, and the Soviet Union to the existing differences of opinion about methods of inspection of underground tests. These differences were not resolved in 1963; but the treaty stopping atmospheric tests was made. What a tragedy for humanity that the governments did not accept this solution before taking the terrible step of resuming the nuclear tests in 1961!
I shall now quote and discuss the rest of the nuclear test ban petition of six years ago.
“So long as these weapons are in the hands of only three powers, an agreement for their control is feasible. If testing continues, and the possession of these weapons spreads to additional governments, the danger of outbreak of a cataclysmic nuclear war through the reckless action of some irresponsible national leader will be greatly increased.
An international agreement to stop the testing of nuclear bombs now could serve as a first step toward a more general disarmament and the ultimate effective abolition of nuclear weapons, averting the possibility of a nuclear war that would be a catastrophe to all humanity.
We have in common with our fellowmen a deep concern for the welfare of all human beings. As scientists we have knowledge of the dangers involved and therefore a special responsibility to make those dangers known. We deem it imperative that immediate action be taken to effect an international agreement to stop the testing of all nuclear weapons.”
How cogent is this argument? Would a great war, fought with use of the nuclear weapons that now exist, be a catastrophe to all humanity? Consideration of the nature of nuclear weapons and the magnitude of the nuclear stockpiles gives us the answer: it is Yes.
A single 25-megaton bomb could largely destroy any city on earth and kill most of its inhabitants. Thousands of these great bombs have been fabricated, together with the vehicles to deliver them.
Precise information about the existing stockpiles of nuclear weapons has not been released. The participants in the Sixth Pugwash Conference, in 1960, made use of the estimate 60,000 megatons. This is 10,000 times the amount of explosive used in the whole of the Second World War. It indicates that the world’s stockpile of military explosives has on the average doubled every year since 1945. My estimate for 1963, which reflects the continued manufacture of nuclear weapons during the past three years, is 320,000 megatons.
This estimate is made credible by the following facts. On November 12, 1961, the U.S. Secretary of Defense stated that the U.S. Strategic Air Command then included 630 B-52’s, 55 B-58’s, and 1,000 B-47’s, a total of 1,685 great bombers. These bombers carry about 50 megatons of bombs apiece – two 25-megaton bombs on each bomber. Accordingly, these 1,685 intercontinental bombers carry a load totaling 84,000 megatons. I do not believe that it can be contended that the bombs for these bombers do not exist. The Secretary of Defense also stated that the United States has over 10,000 other planes and rockets capable of carrying nuclear bombs in the megaton range. The total megatonnage of nuclear bombs tested by the Soviet Union is twice that of those tested by the United States and Great Britain, and it is not unlikely that the Soviet stockpile is also a tremendous one, perhaps one-third or one-half as large as the U.S. stockpile.
The significance of the estimated total of 320,000 megatons of nuclear bombs may be brought out by the following statement: if there were to take place tomorrow a 6-megaton war, equivalent to the Second World War in the power of the explosives used, and another such war the following day, and so on, day after day, for 146 years, the present stockpile would then be exhausted – but, in fact, this stockpile might be used in a single day, the day of the Third World War.
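The 146-year figure is a direct division, assuming one 6-megaton “Second World War” expended each day against the estimated 320,000-megaton stockpile:

```latex
\frac{320{,}000\ \text{megatons}}{6\ \text{megatons/day}} \approx 53{,}000\ \text{days} \approx \frac{53{,}000}{365}\ \text{years} \approx 146\ \text{years}
```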
Many estimates have been made by scientists of the probable effects of hypothetical nuclear attacks. One estimate, reported in the 1957 Hearings before the Special Subcommittee on Radiation of the Joint Committee on Atomic Energy of the Congress of the United States, was for an attack on population and industrial centers and military installations in the United States with 250 bombs totaling 2,500 megatons. The estimate of casualties presented in the testimony, corrected for the increase in population since 1957, is that sixty days after the day on which the attack took place ninety-eight million of the 190 million American people would be dead, and twenty-eight million would be seriously injured but still alive; many of the remaining seventy million survivors would be suffering from minor injuries and radiation effects.
This is a small nuclear attack made with use of about one percent of the existing weapons. A major nuclear war might well see a total of 30,000 megatons, one-tenth of the estimated stockpiles, delivered and exploded over the populated regions of the United States, the Soviet Union, and the other major European countries. The studies of Hugh Everett and George E. Pugh, of the Weapons Systems Evaluation Division, Institute of Defense Analysis, Washington, D.C., reported in the 1959 Hearings before the Special Subcommittee on Radiation, permit us to make an estimate of the casualties of such a war. This estimate is that sixty days after the day on which the war was waged, 720 million of the 800 million people in these countries would be dead, sixty million would be alive but severely injured, and there would be twenty million other survivors. The fate of the living is suggested by the following statement by Everett and Pugh: “Finally, it must be pointed out that the total casualties at sixty days may not be indicative of the ultimate casualties. Such delayed effects as the disorganization of society, disruption of communications, extinction of livestock, genetic damage, and the slow development of radiation poisoning from the ingestion of radioactive materials may significantly increase the ultimate toll.”
No dispute between nations can justify nuclear war. There is no defense against nuclear weapons that could not be overcome by increasing the scale of the attack. It would be contrary to the nature of war for nations to adhere to agreements to fight “limited” wars, using only “small” nuclear weapons – even little wars today are perilous, because of the likelihood that a little war would grow into a world catastrophe.
The only sane policy for the world is that of abolishing war.
This is now the proclaimed goal of the nuclear powers and of all other nations.
We are all indebted to the governments of the United States, the Soviet Union, and Great Britain for their action of formulating a test ban agreement that has been accepted by most of the nations of the world. As an American, I feel especially thankful to our great President, John F. Kennedy, whose tragic death occurred only nineteen days ago. It is my opinion that this great international agreement could not have been formulated and ratified except for the conviction, determination, and political skill of President Kennedy.
The great importance of the 1963 Test Ban Treaty lies in its significance as the first step toward disarmament. To indicate what other steps need to be taken, I shall now quote some of the statements made by President Kennedy in his address to the United Nations General Assembly on the 26th of September, 1961.
“The goal (of disarmament) is no longer a dream. It is a practical matter of life or death. The risks inherent in disarmament pale in comparison to the risks inherent in an unlimited arms race…
Our new disarmament program includes:…
First, signing the test-ban treaty by all nations…
Second, stopping production of fissionable materials and preventing their transfer to (other) nations;…
Third, prohibiting the transfer of control over nuclear weapons to other nations;
And Sixth, halting… the production of strategic nuclear delivery vehicles, and gradually destroying them.”
The first of these goals has been approached, through the 1963 treaty, but not yet reached. Six weeks ago, by a vote of ninety-seven to one, the Political Committee of the United Nations General Assembly approved a resolution asking that the eighteen-nation Disarmament Committee take supplementary action to achieve the discontinuance of all test explosions of nuclear weapons for all time. We must strive to achieve this goal.
The fourth action proposed by President Kennedy, that of keeping nuclear weapons from outer space, was taken two months ago, in the United Nations, through a pledge of abstention subscribed to by many nations.
Action on the third point, the prevention of the spread of nuclear weapons, could lead to a significant diminution in international tensions and in the chance of outbreak of a world war. The 1960 treaty making Antarctica a nuclear-free zone provides a precedent. Ten Latin American nations have proposed that the whole of Latin America be made into a second zone free of nuclear weapons; and a similar proposal has been made for Africa. Approval of these proposals would be an important step toward permanent peace.
Even more important would be the extension of the principle of demilitarization to Central Europe, as proposed by Rapacki, Kennan, and others several years ago. Under this proposal the whole of Germany, Poland, and Czechoslovakia, and perhaps some other countries would be largely demilitarized, and their boundaries and national integrity would be permanently assured by the United Nations. I am not able at the present time to discuss in a thorough way the complex problem of Berlin and Germany; but I am sure that if a solution other than nuclear destruction is ever achieved, it will be through demilitarization, not remilitarization.
President Kennedy, President Johnson, Chairman Khrushchev, Prime Minister Macmillan, and other national leaders have proclaimed that, to prevent the cataclysm, we must move toward the goal of general and complete disarmament, we must begin to destroy the terrible nuclear weapons that now exist, and the vehicles for delivering them. But instead of destroying the weapons and the delivery vehicles, the great nations continue to manufacture more and more of them, and the world remains in peril.
Why is no progress being made toward disarmament? I think that part of the answer is that there are still many people, some of them powerful people, who have not yet accepted the thesis that the time has now come to abolish war. And another part of the answer is that there exists a great nation that has not been accepted into the world community of nations – the Chinese People’s Republic, the most populous nation in the world. I do not believe that the United States and the Soviet Union will carry out any major stage of the process of disarmament unless that potentially great nuclear power, the Chinese People’s Republic, is a signatory to the disarmament agreement; and the Chinese People’s Republic will not be a signatory to such a treaty until she is accepted into the community of nations under conditions worthy of her stature. To work for the recognition of China is to work for world peace.
We cannot expect the now existing nuclear weapons to be destroyed for several years, perhaps for decades. Moreover, there is the possibility, mentioned by Philip Noel-Baker in his Nobel lecture in 1959, that some nuclear weapons might be concealed or surreptitiously fabricated, and then used to terrorize and dominate the disarmed world; this possibility might slow down the program of destroying the stockpiles.
Is there no action that we can take immediately to decrease the present great danger of outbreak of nuclear war, through some technological or psychological accident or as the result of a series of events such that even the wisest national leaders could not avert the catastrophe?
I believe that there is such an action, and I hope that it will be given consideration by the national governments. My proposal is that there be instituted, with the maximum expedition compatible with caution, a system of joint national-international control of the stockpiles of nuclear weapons; such that use could be made of the American nuclear armaments only with the approval both of the American government and of the United Nations, and that use could be made of the Soviet nuclear armament only with the approval both of the Soviet government and of the United Nations. A similar system of dual control would of course be instituted for the smaller nuclear powers if they did not destroy their weapons.
Even a small step in the direction of this proposal, such as the acceptance of United Nations observers in the control stations of the nuclear powers, might decrease significantly the probability of nuclear war.
There is another action that could be taken immediately to decrease the present great hazard to civilization. This action would be to stop, through a firm treaty incorporating a reliable system of inspection, the present great programs of development of biological and chemical methods of waging war.
Four years ago the scientists participating in the Fifth Pugwash Conference concluded that at that time the destructive power of nuclear weapons was far larger than that of biological and chemical weapons, but that biological and chemical weapons have enormous lethal and incapacitating effects against man and could also effect tremendous harm by the destruction of plants and animals. Moreover, there is a vigorous effort being made to develop these weapons to the point where they would become a threat to the human race equal to or greater than that of nuclear weapons. The money expended for research and development of biological and chemical warfare by the United States alone has now reached 100 million dollars per year, an increase of sixteenfold in a decade, and similar efforts are probably being exerted in the Soviet Union and other countries.
To illustrate the threat, I may mention the plans to use nerve gases that, when they do not kill, produce temporary or permanent insanity, and the plans to use toxins such as the botulism toxin, viruses such as the virus of yellow fever, or bacterial spores such as of anthrax, to kill tens or hundreds of millions of people.
The hazard is especially great in that, once the knowledge is obtained through a large-scale development program such as is now being carried out, it might well spread over the world and might permit some small group of evil men, perhaps in one of the smaller countries, to launch a devastating attack.
This terrible prospect could be eliminated now by a general agreement to stop research and development of these weapons, to prohibit their use, and renounce all official secrecy and security controls over microbiological, toxicological, pharmacological, and chemical-biological research. Hundreds of millions of dollars per year are now being spent in the effort to make these malignant cells of knowledge. Now is the time to stop. When once the cancer has developed and its metastases have spread over the world, it will be too late.
The replacement of war by law must include not only great wars but also small ones. The abolition of insurrectionary and guerrilla warfare, which often is characterized by extreme savagery and a great amount of human suffering, would be a boon to humanity.
There are, however, countries in which the people are subjected to continuing economic exploitation and to oppression by a dictatorial government, which retains its power through force of arms. The only hope for many of these people has been that of revolution, of overthrowing the dictatorial government and replacing it with a reform government, a democratic government that would work for the welfare of the people.
I believe that the time has come for the world as a whole to abolish this evil, through the formulation and acceptance of some appropriate articles of world law. With only limited knowledge of law, I shall not attempt to formulate a proposal that would achieve this end without permitting the possibility of the domination of the small nations by the large nations. I suggest, however, that the end might be achieved by world legislation under which there would be, perhaps once a decade, a referendum, supervised by the United Nations, on the will of the people with respect to their national government, held, separately from the national elections, in every country in the world.
It may take many years to achieve such an addition to the body of world law. In the meantime, much could be done through a change in the policies of the great nations. During recent years insurrections and civil wars in small countries have been instigated and aggravated by the great powers, which have moreover provided weapons and military advisers, increasing the savagery of the wars and the suffering of the people. In four countries during 1963 and several others during preceding years, democratically elected governments, with policies in the direction of social and economic reform, have been overthrown and replaced by military dictatorship, with the approval, if not at the instigation, of one or more of the great powers. These actions of the great powers are associated with policies of militarism and national economic interest that are now antiquated. I hope that the pressure of world opinion will soon cause them to be abandoned and to be replaced by policies that are compatible with the principles of morality, justice, and world brotherhood.
In working to abolish war, we are working also for human freedom, for the rights of individual human beings. War and nationalism, together with economic exploitation, have been the great enemies of the individual human being. I believe that, with war abolished from the world, there will be improvement in the social, political, and economic systems in all nations, to the benefit of the whole of humanity.
I am glad to take this opportunity to express my gratitude to the Norwegian Storting [Parliament] for its outstanding work for international arbitration and peace during the last seventy-five years. In this activity the Storting has been the leader among the parliaments of nations. I remember the action of the Storting in 1890 of urging that permanent treaties for arbitration of disputes between nations be made, and the statement that ‘the Storting is convinced that this idea has the support of an overwhelming proportion of our people. Just as law and justice have long ago replaced the rule of the fist in disputes between man and man, so the idea of settling disputes among peoples and nations is making its way with irresistible strength. More and more, war appears to the general consciousness as a vestige of prehistoric barbarism and a curse to the human race.’
Now we are forced to eliminate from the world forever this vestige of prehistoric barbarism, this curse to the human race. We, you and I, are privileged to be alive during this extraordinary age, this unique epoch in the history of the world, the epoch of demarcation between the past millennia of war and suffering, and the future, the great future of peace, justice, morality, and human well-being. We are privileged to have the opportunity of contributing to the achievement of the goal of the abolition of war and its replacement by world law. I am confident that we shall succeed in this great task; that the world community will thereby be freed not only from the suffering caused by war but also, through the better use of the earth’s resources, of the discoveries of scientists, and of the efforts of mankind, from hunger, disease, illiteracy, and fear; and that we shall in the course of time be enabled to build a world characterized by economic, political, and social justice for all human beings and a culture worthy of man’s intelligence.” Linus Pauling, “Science & Peace;” Nobel Peace Prize lecture, 1963
Numero Tres—“I am delighted to have been invited to reminisce on our meetings. In a way this is an appropriate place for me to do so. I went to my first AMS meetings while a graduate student at the University of Cincinnati.
During Thanksgiving 1935 I hitchhiked from Cincinnati to Lexington, Kentucky, for this purpose. There I encountered Fritz John and his wife, recent refugees from Nazi Germany. They invited me to sleep on the couch in their living room, which I was glad to do. Leon Cohen was also then at the University of Kentucky, and also very friendly to a beginning graduate student. Their friendship, which was to become long-standing, still evokes a warm glow.
That Christmas the AMS winter meeting was in St. Louis. Again I hit the highways, now to make my first visit to the city where my parents had met and married. En route, J. L. Doob (whom I had come to know while using the Columbia University mathematics library) picked me up. As you know, he later became President of the AMS. In this capacity, he designated me as his representative to the Mathematical Sciences Conference Board on the occasion when another commitment prevented his personal participation. This was the only time any AMS President has appointed me to any committee or function. The membership has twice elected me to serve as a member-at-large of the Council, following my nomination by petition, so I don’t feel neglected by my colleagues.
Back to St. Louis. In those days even the winter meetings were what we would now regard as quite small. There was room enough for the American Economics Association (AEA) to hold its meetings also at Washington University at the same time. By chance, I picked up the flyer announcing an AEA luncheon for 85 cents at which there were to be three speakers. The following morning there was a correction. The three speeches were cancelled and the price raised to $1.00. I have always wondered whether the initial listing (or threat) of three speeches had been intended as a ploy to prepare the way for the then high price for a luncheon.
Another experience with a meeting luncheon has stuck in my mind. Still a graduate student, I ventured to an AMS meeting at Columbia University while home in New York during a vacation. Of course, students didn’t go to the associated luncheon, held at the faculty club. On the way back to the afternoon session, a couple of us passed by just as D. J. Struik, whose wonderful talk yesterday at this meeting evoked a standing ovation, was descending the club steps. I asked the usual stupid question: ‘How was the lunch?’ A happy smile lit up his face: ‘Far better than I expected; the food was mediocre.’
Not all events connected with AMS meetings were that cheerful or that light. Some of the things of which I’ll speak now will require a willingness to face ourselves as we were. To prepare for this, I have chosen a few lines from the official poem read by its author, Maya Angelou, at the inauguration of President Clinton:
“History, despite its wrenching pain,
Cannot be unlived, but if faced
With courage, need not be lived again.”
The welcome, or, rather, lack of welcome, to minorities and women lasted over a long period of time at our meetings. Unless we are willing to face this in these days of backlash, we’ll be living this all over again. Here are some examples from the AMS and the Mathematical Association of America (MAA).
This was first made a matter of record in 1951 when I was teaching at Fisk University in Nashville, Tennessee, a leading historically Black university. The Southeast regional meeting of the MAA took place with Vanderbilt University as host. There was an official banquet at which the national President of the MAA was the speaker. Using rather vulgar language, the chair of the local arrangements committee, a Vanderbilt professor, said that no tickets would be available to Negro members. I’m using the polite version of the word he employed.
On April 20, 1951, my department sent a letter to the Board of Governors of the MAA and (well aware that the AMS behaved no better) also to the Council of the AMS, describing the situation and making certain suggestions. Then, with a covering note, I sent it to Science, there being no AMS or MAA outlets for letters then. They appeared on August 10, 1951. (See Attachment 1.) It was reprinted (together with quite a bit of related material) as an appendix to the book Black Mathematicians and Their Works.
The April letter’s first paragraph formulates what would appear to be minimum obligations owed by any organization to its members. Further on, it explains why such conditions need to be made explicit and then continues by citing precedents in place in other professional organizations. Finally, it establishes that the by-laws we requested were entirely practicable even in terms of the law and practices of that time. (See p. 91.)
Unfortunately, neither the AMS nor the MAA has ever pioneered in facing these issues. Other major organizations, such as the American Psychological Association, including at least one which was entirely southern-based (Southern College Personnel Association), were already behaving much better than we were. However, the demand from the Mathematics Department at Fisk University, supported by colleagues elsewhere, did bring some action. Policy statements were adopted calling for meetings to be run so that all could participate. That was a step forward, but a rather gingerly one. The concept of participation seems to have quite different meanings for different people. The statements included wording to the effect that where accommodations are provided for some they will be provided for all. But what does that mean?
This necessitated a second letter on December 17, 1951. (See Attachment 2.) It recorded that the Fisk Department was pleased at the anti-discrimination affirmation by then adopted by the MAA, but pointed out that implementation was lacking and repeated the need for the procedures already requested in the April 20, 1951, letter (Attachment 1). This time the letter added some AMS history to emphasize the need for definite, unambiguous, enforceable policy.
It records, e.g., that “When the Society met at the University of Georgia in 1947, not one Negro was present.” After I wrote that letter, I learned that there was more to the story than that. Actually, one had wanted to participate. This was J. Ernest Wilkins, Jr., present in this room, who, many years later, was elected to the AMS Council and, more importantly, to the National Academy of Engineering.
In 1947 Wilkins was a few years past the Ph.D. he had earned at the University of Chicago slightly before his nineteenth birthday. He received a letter from the AMS Associate Secretary for that region urging him to come and saying that very satisfactory arrangements had been made with which they were sure he’d be pleased: they had found a “nice colored family” with whom he could stay and where he would take his meals! The hospitality of the University of Georgia (and of the AMS) was not for him. This is why the meeting there was totally white.
In 1951 I would be informed by the Secretary of MAA’s Southeastern section that in all the twenty years in which he held that post not a single Black mathematician had attempted to participate in any way in MAA meetings in that region — until my Fisk colleagues and I did so that year, only to be excluded from the official banquet addressed by the MAA national President.
The University of Georgia figures again in this same period in another example of AMS insensitivity to its Black constituency. In 1951 the AMS sold its library to the University of Georgia, which was the highest of six bidders. A careful search of AMS records does not disclose any assurances given — or even sought — that all AMS members, regardless of race, would be able to use it. This was at a time of intense segregation mandated by Georgia state law. (At the other four U.S. institutions bidding, access would not have been a problem.)
In that period, David Blackwell, then at Howard University, later to become the first Black member of the National Academy of Sciences, the first on the AMS Council, the first (and only) to serve as a Vice President (now no longer active in AMS), attempted to attend an AMS meeting in the Washington area. He drove to the meeting, but it took only twenty minutes for him to decide to turn around and drive right back home.
W.W.S. Claytor, a distinguished point-set topologist, suffered even more unpleasant experiences at AMS meetings, with the result that he became unwilling to attend any. The two letters following reveal much about both the universities and the AMS in that time. The first letter is to Virginia Newell, who, as an editor of Black Mathematicians and Their Works, had sought information from the late Walter R. Talbot, himself an early Black PhD in mathematics. He, in turn, forwards a letter written by Claytor’s widow, a university professor in another discipline:
Dear Mrs. Newell:
I was just about to give up on getting a write-up on Claytor and tell you that I had lost count on the number of times I had been promised the requested information. Then the enclosed material came yesterday as the mailing cover will show. I have made some pencil marks on the papers. I remember when Claytor was on a post-doctoral at Michigan and they had a vacancy for which he was qualified. They would not offer him the position, and the student newspaper took up the issue but to no avail. I believe that incident in discrimination was one of the main chilling, if not killing, points in the research career of a brilliant mathematician. There are references in the literature to his work, but he lost his spirit. I wish Mae had included that item, but I wouldn’t want to burden her with more questions or requests. Needless to say, I hope you can find a way to include Mae’s contributions on Claytor. He definitely belongs among the top few of our research persons even with his short career of doing research. His spirit was broken by discrimination.
Good wishes always.
Walter R. Talbot
I am sorry about being late with this but it is just difficult for me to write about Bill. I am still at the point where I do not like to go back and think. In order to get much of this material, I had to go to what I call our memory books and looking at pictures and sort of reliving Bill; it just hurts a bit too much. I hope this is O.K. There is so much I just cannot put on paper. Even writing about Bill and his presentation at the Math Society, I thought about the days Bill used to tell me how owing to the Black-White mess, he had to stay at a private home when the others were at the hotel where the Association met. Over the years when the color-line became less, he never would attend any more meetings. Kline used to come to see us periodically and try to get Bill to go with him but I guess the hurt went too deeply with him. After he left, I found old papers and letters he had when Kline was trying to get him in Princeton as a Fellow and whew, again it was the color mess. At Princeton, the administration said the students might object to a “culud” person which was a laugh, they would never have known it. I do hope what I have written is O.K.
[Mrs. William Claytor]
The Canadian and US governments have apologized, as indeed they should, for the internment of their citizens of Japanese descent during World War II. I know of no plans to apologize for the generations of slavery and discrimination inflicted upon those of African descent. Some day there will be an African country with the same economic and political clout as contemporary Japan. Then we can look forward to similar apologies to the descendants of Africa. But our mathematical organizations could apologize for past behavior before then.
Time went on and episodes continued. I remember yet another, in 1960, when A. Shabazz and S.C. Saxena, both on the faculty of Atlanta University (now Clark-Atlanta), and their graduate student W.E. Brodie were subjected yet again to jimcrow treatment at the spring meeting of the Southeastern Section of MAA. This, it should be noted, was several years after AMS and MAA commitments to the contrary. They had not been warned in advance that such discourtesy would be in store. The three left in protest.
And so in 1969 the National Association of Mathematicians (NAM) came into being to address the needs of the Black mathematical community. This was a turbulent period. A group of more or less left-oriented mathematicians established the Mathematicians Action Group (MAG) that same year. We were motivated largely by concern over the Vietnam war, the militarization of mathematics, the lack of democracy in the AMS, the existence of racism and sexism, and related social issues as they impinged on mathematicians and vice versa.
This led to the liveliest Business Meeting (New Orleans, January 1969) that the Society has ever had. I noticed Everett Pitcher smiling just now when I made that remark. He was at that time AMS Secretary and was sporting a beard; more of that in a moment. In the summer of 1968 there had been what an official commission of enquiry was later to denounce as a police riot. This was at the Democratic National Convention in Chicago. Richard Daley, the father of the present Mayor, was Mayor then. The police were called out against grass roots delegations, including one involving mathematicians, there to protest the US government’s undeclared war against Vietnam. The wild and brutal behavior of the police shocked the world: massive use of tear gas, random and vicious beatings of demonstrators, onlookers and passers-by. Anybody with a beard was beaten.
What did all this have to do with the AMS? Our next spring (regional) meeting was scheduled for Chicago and a great many of us were determined that the AMS should not go about “business as usual.” So MAG, of which I was a member, decided to call upon the Society to move the Spring 1969 meeting away from Chicago in protest. MAG designated me to introduce such a motion at the January 25, 1969, AMS Business Meeting. Secretary Pitcher’s report on that Business Meeting is included as Attachment 3.
Well over 400 members attended that Business Meeting, an enormous contrast to the typical attendance. After extensive debate, the motion to move the upcoming meeting that I made on behalf of MAG passed decisively. It called upon the AMS Executive Committee to move the meeting, since the business meeting itself lacked the power to do so. That Committee had rejected earlier an individual request to make the move, but fortunately recognized the feeling among the members and complied. The new venue, by the way, was Cincinnati.
MAG brought other current issues formulated in five resolutions to the New Orleans Business Meeting, these via Ed Dubinsky who is also at this meeting. The texts of these resolutions, which for technical reasons could not come to an immediate vote, are included in Attachments 3 and 4. The AMS Council voted by 29–1 to dissociate itself from these resolutions and introduced Resolution B, which it regarded as contradictory to the five MAG resolutions. (See Attachment 4.) It then held a hasty referendum, simultaneously calling upon the membership to defeat the five and adopt the allegedly contradictory Resolution B. (See Attachment 4.) And that is what happened. No time for discussion was allowed. I submitted a letter on these resolutions and their handling which the Notices refused to publish. It then came out in the MAG Newsletter with an explanation of the context. (See Attachment 5.)
I could say a lot about this, but I’ll restrain myself, except for comments on Resolution 5, which read:
“Whereas the shortage of mathematicians in North American Universities is different and greater among black and brown Americans than among whites, and whereas this situation is not improving, be it resolved that the AMS appoint a committee composed of black and third world mathematicians to study this problem and other problems concerning black and third world mathematicians, and report their conclusions and recommendations to the Society.”
Just imagine! The Council called for the defeat by a vote of 29-1 of such a resolution. And offered no substitute. No wonder NAM was established that same year. Fortunately, with the growth of members’ demands and the existence of NAM, changes began. Eventually, the substance of that resolution was adopted. Resolution 4 also rose from the ashes of defeat as the current Notices letters section now attests.
The spirit of Resolution 5, representing sensitivity to the anguish of the Black community, can be found in another lively Business Meeting. This had its roots in the establishment in 1972 of a reciprocity agreement with the South African Mathematical Society. Strong objections were made when this became known. I am not aware of all of them, but I recall a letter from Gail Young to the January 1974 meeting of the AMS Council to which I had just been elected. I moved that the agreement be cancelled and a lively discussion ensued. From the observers’ seats James A. Donaldson, Chair of the Howard University Mathematics Department and later a Council member, emphasized that Black members of the AMS could not “avail themselves fully and equally of the privileges of membership in the South African Mathematical Society”. Donaldson stressed that the AMS would be inflicting another insulting discrimination on Black members and predicted that many would not wish to remain members under such offensive circumstances.
Ultimately, my motion to cancel the agreement passed. But more was to come. At a subsequent meeting, the Council (despite strong objection from some members) decided to eliminate all reciprocity agreements and institute individual foreign memberships in their stead. (See Attachment 6.) This would have taken the sting out of the anti-apartheid stance that the cancellation of the agreement with South Africa had placed on record. Fortunately, the Bylaws required securing the consent of a Business Meeting when a new class of membership is created.
Postponed from the August 1976 Business Meeting because of vagueness, the Council proposal was debated — and defeated — in the January 1977 Business Meeting, a very vigorous gathering. The members were in no mood to soften the anti-apartheid implications of the cancellation of the South African agreement, nor did they wish to lose their participation rights in mathematical societies abroad as a result of cancellation of all reciprocity agreements. So there is a genuine function for serious business meetings; they don’t all have to be pro-forma.
History knows many turns. Now South Africa has a government led by those imprisoned or exiled by the apartheid regimes. So, on May 3, 1994, I wrote to the AMS officers asking them to put before the Council a proposal to offer the South African Mathematical Society a reciprocity agreement and recounting some of the foregoing background. Somehow or other, the Secretary thought that he could just do it, without reference to the Council. My letter was not circulated, nor was the item placed on the agenda. However, Vice President Jean Taylor, apprised of this, raised the issue. On her motion the Council authorized the establishment of this reciprocity agreement.
Now the Society is in the happy historical position of having deliberately distanced itself from apartheid South Africa and then having promptly offered its friendship to a South Africa distancing itself from its gruesome past. The vote in the January 1977 Business Meeting gave us this opportunity.
If I have not referred explicitly to the important role of our women colleagues as a collective in the Business Meetings of our Society, it is because Alice Schafer, an early President of the Association for Women in Mathematics (AWM), has done so at this session, not for want of appreciation. They as individuals, and the AWM as an organization, have added much to the profession, including our meetings and the leadership of our organization. The programs and discussions they have organized have raised essential issues and mobilized activity. I have mentioned NAM, but not enough. These two organizations bring systematically before the AMS and the MAA activities and views which would otherwise die on the vine if raised at all.
What they have done and do is needed by and is beneficial to the entire Society. This is obvious enough in terms of combatting discrimination, encouraging affirmative action to overcome past discrimination, and developing activities to these ends. But they bring also a spirit of democracy embracing all our activities.
Just one example: Alice Schafer recounted how Mary Gray, founding AWM President, opened meetings of the Council. These meetings had been completely closed, no uninvited observers permitted. Her insistence on the right to observe what is being done in our collective name has established that right for the entire membership and helped create a more open Society, even if democratic norms are yet to be fully realized. Now those of us who are not on Council can observe and even contribute to the discussions.
It has been an arduous process, one in which the end remains elusive yet. The atmosphere has changed. No longer would we meet where jimcrow rules or where overt sexism is proclaimed. There is recognition of the obligation to meet where all our members can be comfortable, safe, and welcome. This was demonstrated by the overwhelming agreement to move the January 1995 meetings away from Colorado when that state adopted an amendment removing human rights protections from homosexuals. How different was that discussion from those of earlier years when some of us sought decent treatment for our Black colleagues!
True, the atmosphere has changed. But has it changed enough? The position of female and minority mathematicians and the opportunities for members of these communities to become mathematicians are still far short of what they should be. Unemployment afflicts our successors and we don’t know what to do about it.
Around the world, today is Polar Bear Day; Roman co-emperors sixteen hundred thirty-five years ago issued edicts ‘requesting’ all citizens to convert to ‘trinitarian’ Christianity; forty-five years hence, in 425, the daughter-in-law of one of those imperial masters helped to found the University of Constantinople; four hundred fifty-five years back, England’s rulers and the Lords of the Congregation of Scotland validated the Treaty of Berwick to expel the French from Scotland; two hundred thirty-three years before the here and now, Parliament voted against further funding of war with America’s colonies; two hundred fourteen years prior to the present pass, the District of Columbia by law came under the direct jurisdiction of the U.S. Congress; six years later, in 1807, a baby boy was born who grew up as popular poet Henry Wadsworth Longfellow; MORE HERE
A Thought for the Day
The nearly unimaginable scope of things in general means, of course, that every single fanciful notion that we entertain—no matter how engaging and fulsome, bursting with potential for insight and action, even if it becomes as popular as the latest telecommunications marvel—has a likelihood that approaches zero of corresponding precisely to the shape and parameters of the entire universal enterprise of which each of us nevertheless manifests an indelible and inalienable part, the unfolding of which dynamic delineates one of the profound and persistent paradoxes of conscious existence, to wit that we have as our ‘existential’ job, so to speak, the task of seeking what we can never, ever, ever actually find or achieve, the comprehensive comprehension of ourselves and all that is, in the total web of relations and interconnections that define even the most mundane attributes of the cosmos, whether such a routine component of everything has the label ‘I’ or ‘thee’ or something else altogether.
Quote of the Day
It was impossible for me to believe that conditions in Europe could be worse than they were in the Polish section of Chicago, and in many Italian and Irish tenements, or that any workshops could be worse than some of those I had seen in our foreign quarters. – Alice Hamilton
Doc of the Day
1. John Steinbeck, 1962.
2. Dee Brown, 1973.
3. Ralph Nader, various dates.
4. Paul Sweezy, 1987.
The 30th annual Rally of Writers Conference will be held on April 8 on the West Campus of Lansing Community College in Lansing, Michigan. The conference features workshops, craft and publishing talks, and author readings in poetry, fiction, and creative nonfiction. Participating writers and publishing professionals include poet Terry Wooten; fiction writers Landis Lain, Steven Piziks, and Jess Wells; nonfiction writer Andrea King Collier; and agent Alice Speilburg (Speilburg Literary). Fiction writer Lori Nelson Spielman will deliver the keynote. The cost of the conference is $85 ($60 for students) in advance, and $100 ($70 for students) on-site. Visit the website for more information.
A fellowship, which includes a stipend of $20,000, is given annually to a writer of nonfiction (including creative nonfiction) working on a book relating to the literature, history, culture, or art of the Americas before 1830. The fellowship includes a two-month research period to be conducted at the John Carter Brown Library on the campus of Brown University in Providence, Rhode Island, and a two-month writing term at the C. V. Starr Center for the Study of the American Experience at Washington College in Chestertown, Maryland. Submit a writing sample of any length, a project description, a curriculum vitae, and contact information for three references by March 15. There is no entry fee. Visit the website for complete guidelines.
Belhaven University, a Christian university committed to the ministry of integrating biblical truth and learning, is searching for the ideal candidate for Assistant Professor of Creative Writing who has a heart for Christ Jesus and students at the Jackson, Mississippi campus.
Around the world, today is Polar Bear Day; Roman co-emperors sixteen hundred thirty-seven years ago issued edicts ‘requesting’ all citizens to convert to ‘trinitarian’ Christianity; forty-five years hence, in 425, the daughter-in-law of one of those imperial masters helped to found the University of Constantinople; four hundred fifty-seven years back, England’s rulers and the Lords of the Congregation of Scotland validated the Treaty of Berwick to expel the French from Scotland; two hundred thirty-five years before the here and now, Parliament voted against further funding of war with America’s colonies; two hundred sixteen years prior to the present pass, the District of Columbia by law came under the direct jurisdiction of the U.S. Congress; six years later, in 1807, a baby boy was born who grew up as popular poet Henry Wadsworth Longfellow; another half-decade afterward, in 1812, Manuel Belgrano first raised the flag of an independent Argentina, and Lord Byron delivered his first House-of-Lords speech, a defense of
Luddite protests in Nottinghamshire; one hundred fifty-seven years ago, Abraham Lincoln delivered his controversial-but-acclaimed speech at Cooper Union in New York, which many credit with his subsequent election as President; three hundred sixty-six days subsequently, in 1861, Russian troops fired on Polish protesters who demanded independence from Russian rule; two years further on, in 1863, a baby boy came into the world who would mature as philosopher and academic George Herbert Mead; another half-decade further on, in 1868, a female infant uttered her first cry en route to a life as environmental and occupational health scientist and advocate Alice Hamilton; two decades thereafter, in 1888, the baby male came along who would become popular and academic historian Arthur Schlesinger; a hundred twenty years back, the baby girl who would become world-famous singer Marian Anderson was born; three years thereafter, in 1900, the British Labour Party came into existence; two years closer to the present, in 1902, the baby boy took a first breath who would grow up to win the Nobel Prize in literature as John Steinbeck; a decade hence to the day, in 1912, halfway round the world in British India, a male child gave his first cry on his way to a life as acclaimed novelist Lawrence Durrell; ninety-six years before this precise moment, the International Working Union of Socialist Parties came into existence in opposition to the Bolsheviks’ dominance in the Third International; a year later exactly, in 1922, the Supreme Court upheld female suffrage as promulgated by the Nineteenth Amendment; eleven years after that point, in 1933, the German Parliament building, the Reichstag, burned in a stepping-stone to Nazi dominance of German politics; just a year afterward, in 1934, across the Atlantic, two baby boys entered the world, the first to grow up as activist, lawyer, and author Ralph Nader, the second to become thinker and writer of Native American matters, N.
Scott Momaday; eighty-one years ago, Russian Nobelist Ivan Pavlov died; three years later, in 1939, the Supreme Court fulfilled its standard function of ruining the rights of workers with its declaration that sit-down strikes were illegal; three hundred sixty-five days beyond that, in 1940, two scientists at Berkeley discovered the radioactive isotope of carbon that revolutionized the dating of ancient artifacts; two years nearer the present pass, in 1942, a girl infant took her first breath on the way to becoming, as Charlayne Hunter-Gault, one of the first two African Americans to attend the University of Georgia, ultimately becoming a working journalist; a year beyond that juncture, in 1943, across the Atlantic in Germany, the Rosenstrasse protests unfolded against the detention of Jewish men who had married non-Jewish German women; forty-four years back, American Indian Movement activists occupied Wounded
Knee, South Dakota in protest at ill-treatment and historical crimes against indigenous Americans; sixteen years later, in 1989, three thousand miles south in Venezuela, the Caracazo riots against repressive rule began; thirteen years before this point, the initial report from John Jay College, commissioned by the U.S. Conference of Catholic Bishops, came out with its damning conclusions about sexual abuse on the part of priests, and Marxist economist and thinker Paul Sweezy took his final breath; four years after that day, in 2008, ‘conservative’ thinker William F. Buckley enjoyed his final day on Earth.
Numero Uno—“Your Majesties, Your Royal Highnesses, Min Vackra Fru, Ladies and Gentlemen.
I thank the Swedish Academy for finding my work worthy of this highest honor.
In my heart there may be doubt that I deserve the Nobel award over other men of letters whom I hold in respect and reverence – but there is no question of my pleasure and pride in having it for myself.
It is customary for the recipient of this award to offer personal or scholarly comment on the nature and the direction of literature. At this particular time, however, I think it would be well to consider the high duties and the responsibilities of the makers of literature.
Such is the prestige of the Nobel award and of this place where I stand that I am impelled, not to squeak like a grateful and apologetic mouse, but to roar like a lion out of pride in my profession and in the great and good men who have practiced it through the ages.
Literature was not promulgated by a pale and emasculated critical priesthood singing their litanies in empty churches – nor is it a game for the cloistered elect, the tinhorn mendicants of low calorie despair.
Literature is as old as speech. It grew out of human need for it, and it has not changed except to become more needed.
The skalds, the bards, the writers are not separate and exclusive. From the beginning, their functions, their duties, their responsibilities have been decreed by our species.
Humanity has been passing through a gray and desolate time of confusion. My great predecessor, William Faulkner, speaking here, referred to it as a tragedy of universal fear so long sustained that there were no longer problems of the spirit, so that only the human heart in conflict with itself seemed worth writing about.
Faulkner, more than most men, was aware of human strength as well as of human weakness. He knew that the understanding and the resolution of fear are a large part of the writer’s reason for being.
This is not new. The ancient commission of the writer has not changed. He is charged with exposing our many grievous faults and failures, with dredging up to the light our dark and dangerous dreams for the purpose of improvement.
Furthermore, the writer is delegated to declare and to celebrate man’s proven capacity for greatness of heart and spirit – for gallantry in defeat – for courage, compassion and love.
In the endless war against weakness and despair, these are the bright rally-flags of hope and of emulation.
I hold that a writer who does not passionately believe in the perfectibility of man, has no dedication nor any membership in literature.
The present universal fear has been the result of a forward surge in our knowledge and manipulation of certain dangerous factors in the physical world.
It is true that other phases of understanding have not yet caught up with this great step, but there is no reason to presume that they cannot or will not draw abreast. Indeed it is a part of the writer’s responsibility to make sure that they do.
With humanity’s long proud history of standing firm against natural enemies, sometimes in the face of almost certain defeat and extinction, we would be cowardly and stupid to leave the field on the eve of our greatest potential victory.
Understandably, I have been reading the life of Alfred Nobel – a solitary man, the books say, a thoughtful man. He perfected the release of explosive forces, capable of creative good or of destructive evil, but lacking choice, ungoverned by conscience or judgment.
Nobel saw some of the cruel and bloody misuses of his inventions. He may even have foreseen the end result of his probing – access to ultimate violence – to final destruction. Some say that he became cynical, but I do not believe this. I think he strove to invent a control, a safety valve. I think he found it finally only in the human mind and the human spirit. To me, his thinking is clearly indicated in the categories of these awards.
They are offered for increased and continuing knowledge of man and of his world – for understanding and communication, which are the functions of literature. And they are offered for demonstrations of the capacity for peace – the culmination of all the others.
Less than fifty years after his death, the door of nature was unlocked and we were offered the dreadful burden of choice.
We have usurped many of the powers we once ascribed to God.
Fearful and unprepared, we have assumed lordship over the life or death of the whole world – of all living things.
The danger and the glory and the choice rest finally in man. The test of his perfectibility is at hand.
Having taken Godlike power, we must seek in ourselves for the responsibility and the wisdom we once prayed some deity might have.
Man himself has become our greatest hazard and our only hope.
So that today, St. John the apostle may well be paraphrased …
Numero Dos—“1960s seatbelt laws represented a cultural change
A cultural shift took place with the use of seat belts. I still remember when a seat belt was an ‘after-market’ item for a car. Few if any cars I saw had seat belts in the 1960s. By the late 1960s, pressure from Ralph Nader and consumer groups caused Congress to pass laws that mandated the installation of seat belts as required equipment for a new car. Most states didn’t force people to wear them–then the law only required people to have them. It was sort of like creating a safety net for trapeze artists, but leaving the net on the solid ground. Most people just stuffed the seat belts behind the seats to keep from sitting on them.
Public-service ads featuring crash dummies started giving us graphic reasons to think again about the use of seatbelts. It started seeming like a pretty good idea after all. And within a few more years, it was more than a good idea — it was the law.
When you see the paralysis of the government, when you see Washington, D.C., be corporate-occupied territory, every department agency controlled by overwhelming presence of corporate lobbyists, corporate executives in high government positions, turning the government against its own people, one feels an obligation [to run for President].
Source: Meet the Press: 2008 “Meet the Candidates” series, Feb 24, 2008
Obama & McCain differ, but neither takes on corporations
Q: Do you see differences between Barack Obama and John McCain on the war, on tax cuts, on the environment, on a lot of issues?
A: Yeah. There are differences, obviously. The question is not what their differences are verbally or what they put on their Web sites; the question is, what is their record? Senator Obama’s record has not been a challenging one. He’s not been a Senator Wellstone or Senator Metzenbaum by any means. He has leaned, if anything, more toward the pro-corporate side of policymaking. The issue is, do they have the moral courage? Do they have the fortitude to stand up against the corporate powers and get things done? Yes, get things done for the American people? In 1950, President Truman proposed universal health care. We still don’t have it.
Democracy is gone when elections are commercialized
Senator Gravel knows that elections have been commercialized to the point where the very media expectation of candidates is determined by how much money they’ve raised in every quarter. It’s almost like a corporation: What is the quarterly report? Money from commercial interests, with their 10,000 political action committees, comes heavily in terms of quid pro quo.
Senator Gravel understands that we must take the domination of just about everything by giant corporations as a major issue. If you don’t make this a major issue, it will affect our economy and our electoral reforms, and we will be avoiding a critical issue and engaging in rhetorical charades, slogans, clichés, and self-censorship.
If money is the index of electoral politics, Senator Gravel rightly believes our democracy is gone. We’re supposed to have a government of the people, by the people, and for the people. There can be no democracy if it is a government of the Exxons, by the General Motors, for the DuPonts.
Corporations have too much control over people’s lives
What we have to do when we talk about injustice is get people indignant, who ordinarily would not get indignant even though they know about this injustice. In a Business Week poll in 2000, 72% of the people polled said they think corporations have too much control over their lives.
That was before the corporate crime wave that started with Enron and the looting and draining of pensions and small investors. You can imagine what it would be now.
We have to touch people with descriptions and narrations of different types of corporate control. Some descriptions will leave people cold. A lot of people don’t really have empathy for inner-city exposure to business crime like payday loan interest rates of 400%, or asbestos and lead. Other people resonate with the corporate seizure of our tax dollars, where a huge portion of our tax dollars is cycled back to corporate welfare and bailouts. There are people who really get angry about that. We’ve got to enter that constituency as well.
Corporations control government; that defines fascism
Every single agency and department in Washington is overwhelmingly controlled by corporations. They have 10,000 Political Action Committees. They put their high executives in high government positions. They have 35,000 full-time lobbyists. Just imagine that–even the Labor Department is not controlled by trade unions–it’s controlled by corporations.
It isn’t just the government under the CONTROL of corporations–the government IS the Corporation now! The corporation IS the government!
That’s the kind of thing that Franklin Roosevelt called “fascism.” In 1938 he said to Congress, “When government is controlled by private economic power, that is fascism”
Citizens’ agenda for cracking down on corporate crime
This is a citizens’ agenda for cracking down on corporate crime.
Establish a public online corporate crime database at the Dept. of Justice, and produce an annual corporate crime report
Increase Corporate Crime Prosecution Budgets of the SEC, which have been chronically underfunded.
Ban Corporate Criminals from Government Contracts – including corporate contracts in Iraq.
Punish corporate tax escapees by closing the offshore reincorporation loophole and banning government contracts and subsidies for companies that relocate their headquarters to an offshore tax haven.
Restore the Rights of Defrauded Investors in seeking restitution
Require shareholder authorization of top executive compensation at annual shareholder meetings.
Enact corporate sunshine laws that force corporations to provide better information about their records on the environment, human rights, worker safety, and taxes, as well as their criminal and civil litigation records.
In 1980, Detroit was desperate for economic revitalization. General Motors offered to build a new complex if a suitable site could be found and given to GM. Detroit agreed that the best site was Poletown, a neighborhood consisting of second-generation Polish-Americans and African-Americans, [which was given to GM by eminent domain].
[In a reversal this week], the Michigan Supreme Court rejected the suggestion that “a vague economic benefit stemming from a private profit-maximizing enterprise is a ‘public use.’” Rather, land may be seized and transferred to a private party only if “the property remains subject to public oversight after transfer to a private entity.”
This decision makes good sense as a matter of constitutional law & fundamental fairness. People’s homes can no longer be seized to achieve speculative benefits or to reward usually large corporate welfare kings. Courts nationwide should follow Michigan’s lead and reestablish their rightful role in our constitutional system.
Legislative “tort deform” for consumers, not corporations
Another tort deform bill has failed in the Senate this week. American consumers should be thankful that the “Class Action Fairness Act” was mired in election year posturing by both parties. Some, mainly Republicans & corporations, would have you believe that this is a “victory for trial lawyers.” It is not. Sadly, this is not even that much of a victory for the aggrieved consumers who, as a result of the failed legislation, will retain access to their state judges and courts.
The reason we are seeing tort deformers push the myriad pieces of legislation that would immunize doctors from malpractice responsibility; that would protect oil companies from cleaning up polluting components of gasoline from our drinking water sources; or that would make more onerous the ability of class actions to succeed against wealthy cigarette manufacturers, asbestos manufacturers and other corporations, is because they need only establish a few federal legislative precedents to open the tort deform floodgates.
Economic powers control our lives and our elections
Our country and its principles are abandoned by the very economic powers that control our destiny. Autocratic global corporations are deep into strategic planning. They openly and confidently strive to control our jobs; our environment; our political and educational institutions; our food, drugs, and other consumptions; our savings; our childhoods; our culture; even our genetic futures. Toward these ends, they incessantly move to control our elections and our governmental institutions.
Source: The Good Fight, by Ralph Nader, p. 3, Jul 6, 2004
Capitalism can lead to fascism
[Quoting Franklin Delano Roosevelt]: “The liberty of a democracy is not safe if the people tolerate the growth of private power to a point where it becomes stronger than the democratic state itself. That in its essence is fascism: ownership of the government by an individual, by a group or any controlling private power.”
Source: The Good Fight, by Ralph Nader, p. 4, Jul 6, 2004
Corporations should not legally be counted as individuals
Whether we think in terms of justice under law or equal protection of the laws, it is untenable that artificial entities called corporations are given most of the constitutional rights of real humans while aggregating powers, privileges, and immunities that individuals, no matter how wealthy, could never come close to attaining. The primacy of civic values, rooted in our Declaration of Independence and the Constitution, must become our common objective for the common good.
Source: The Good Fight, by Ralph Nader, p. 8, Jul 6, 2004
Giant corporations roam the Earth making people into serfs
Giant corporations roam the Earth, pitting societies against one another in search of the lowest costs from serf labor and other exactions from authoritarian regimes while pulling down standards of living in more democratic countries. This downward drift is accelerated by transnational, autocratic systems of commercial governance known as the World Trade Organization (WTO), the North American Free Trade Agreement (NAFTA), and the African Growth and Opportunity Act (AGOA).
Source: The Good Fight, by Ralph Nader, p. 10, Jul 6, 2004
Corporate politics is only free speech because money talks
At the Republican National Convention, I managed to observe that while more than $13 million in taxpayer funding had gone to this convention because an earlier Congress viewed such gatherings as civic affairs, the Republicans had added to that millions of corporate dollars. Elections, I told the reporters, are supposed to be for real people–the voters–not for corporations, artificial entities that cannot vote (at least not yet).
While I was talking with reporters, a wandering corporate fellow, having overheard my remarks about the convention’s corporate omnipresence, blurted, “It’s free speech, Ralph.” I responded, “Sure, money talks freely, doesn’t it?”
And business money donated to the Republican Party and its convention made even more public money gush in its service.
[At the Republican National Convention], lavish parties were setting spending records for national political conventions. These events are the “convention behind the convention.” The talk almost always centers around big-business demands–contracts, subsidies, tax breaks, bail-outs, and reducing or eliminating regulation.
Paying for these concessions with ever-larger campaign donations gives new meaning to what the wry Will Rogers once said about Congress: “the best money could buy.” So when the corporate greasers & persuaders finished their work at the Republican convention, they took a few days off & then flew to Los Angeles for the opening of the Democratic convention.
For them, it was the same racket, just different coastlines. Democratic National Committee donors who gave $50,000 enjoyed a private reception. The biggest donors watched the action from private skyboxes far above the floor.
The business discussed [at either convention] has very little to do with the “Other America.”
In June 2000, Nader released a document laying out his personal finances. Due to privacy concerns-his old bugaboo-Nader refused to make his tax returns public, a step taken by most candidates. As a compromise, the Federal Election Commission accepted a financial-disclosure document prepared by Nader himself.
Nader’s net worth was $3.8 million as of June 2000. He owned $1.2 million in Cisco Systems stock and more than $100,000 worth of shares in Fidelity’s Magellan Fund, which has stakes in defense contractors and business interests in South American rain forests. This, from a man who once urged GM’s shareholders to revolt in an effort to force the company to be more socially responsible. Then again, Nader estimated during his presidential run that he had made $14 million since 1967. He gave the bulk of the money away, financing his own causes and numerous others. As always, Nader is full of contradictions inside of contradictions, like Chinese boxes.
Nader’s very first fight, and perhaps his most famous, was with General Motors. At the time, people pegged him for an auto safety crusader, exclusively. But Nader worked hard to wriggle out of that pigeonhole. He went on to address a vast litany of issues: unsanitary food preparation, flammable clothing, avaricious sports franchises, the limits of standardized testing. If you get bumped from a plane and the airline provides you a voucher for another flight, thank Nader. When you get an X-ray and the technician covers you with a lead apron – again, Nader deserves thanks.
The net result of all Nader’s frenetic activity was a full-blown movement, consumerism. The name is a bit misleading: consumerism is not just about the price of a cup of sugar, at least not at its core. It is more of a political and economic theory, born out of Nader’s distinct observations about the ongoing struggle between corporations and individual citizens, producers and consumers.
What does Nader stand for? His raison d’etre for his candidacy: shifting power from “giant corporations, which have a grip over our government, environment, workplace, and marketplace” to “workers, consumers, taxpayers, and the voters of America.”
Source: Scot Lehigh, Boston Globe, page D1, Oct 8, 2000
1995: Gov. Bush’s tort reform benefited him personally
The pro-business Texas legislature passed 7 bills to address the mounting perception that the court system was unfair to companies being sued. The new laws, which were quickly signed by the governor, limited punitive damages, overhauled the state’s deceptive trade practices, made it more difficult to sue doctors for malpractice, limited the liability of companies when more than one was to blame, and allowed judges to sanction lawyers for filing frivolous lawsuits.
A 1995 study by Public Citizen, a watchdog group founded by Ralph Nader, later released to the Associated Press, showed that 3/4 of the companies in which Bush owned stock were defendant corporations that could be drastically affected by lawsuit reforms.
The governor claimed that tort reform was “good for business” in Texas, and that everyone benefited because insurance premiums would fall. However, in the following years, rates did not decline and the insurance industry recorded profits at a 40-year high.
Ethical rules should REQUIRE reporting corporate misconduct
In response to evidence of grave corporate misconduct, a lawyer merely resigning provides insufficient protection to society. That is why ethical rules should REQUIRE–not merely permit–a corporate lawyer to notify the proper authorities in specifically defined and limited circumstances. These circumstances should include instances in which, for example, the lawyer is reasonably certain that the corporation is involved in criminal conduct, or in cases where the public health and safety is likely to be substantially harmed, such as the marketing of a known dangerous product or the serious violation of pollution laws. In addition, if the circumstances justify whistle-blowing, the law should protect in-house attorneys from suffering job sanctions merely for complying with their ethical responsibilities.
Adopting these conscience-releasing reforms would create a powerful deterrent to corporate wrongdoing. Many corporate attorneys would howl in protest if confronted with this proposal.
Legal delaying tactics cause crisis in confidence in law
Too often power-lawyer strategies are aimed at one goal: delay. That word describes so much of what many corporate lawyers do. They delay by obstructing the fact-finding process, or by filing questionable appeals. As a consequence, it often takes many years for a civil case to reach its ultimate conclusion.
When justice moves at the pace of continental drift, it contributes mightily to the public’s current crisis of confidence in attorneys, judges, and, indeed, the entire civil justice system. And with some power lawyers going even further and deliberately concealing evidence, it is evident that dramatic changes are needed.
Evidence of misconduct is exposed only through determined and expensive discovery litigation, or never turns up at all. For every case where the wrong-doing finally comes out, how many cases are there where smoking gun documents remain buried in the files or damning eyewitness testimony never comes to light? Presently, there is no way to know.
Corporate state gives away public assets to private monopoly
With relentless focus and resources, [corporate lawyers] work to indenture government and the people to large corporations, to have government subsidize in many ways entire industries, to compromise the arm’s-length relationship between government and business by systemically undermining the rule of just law that is supposed to protect wronged or harmed citizens. By complex means, public assets–from natural resources to medical research and development–are given away to private monopoly ownership and/or control. Neutering the purpose of the law vis-a-vis corporations–as with health and safety regulations–has been done particularly in the auto and drug areas.
All the while, innocent people have suffered in the workplace, marketplace, and environment.
Corporate power lawyers are not just any citizens equipped only with the influence of their facts and arguments. They are paid to be architects of the corporate state from whence they derived their power, their wealth, and their status.
Instrumental in food safety laws as well as auto safety
In his career as consumer advocate Nader has been instrumental in the passing of the following legislation:
National Motor Vehicle and Highway Traffic Safety Act (1965)
Wholesome Meat Act
Wholesome Poultry Product Act
Safe Water Drinking Act
Tire safety & grading disclosure law
Center for Auto Safety
Fisherman’s Clear Water Action Group
Professional Drivers (PROD) (truck safety)
Professionals for Auto Safety
Source: Green Party 2008 Presidential Candidate Questionnaire, Feb 3, 2008
1965 book “Unsafe At Any Speed” saved millions of lives
In 1963 [I wanted] to bring the reckless, hyper-horsepower-minded automobile industry under the rule of law. The more I learned about the simple safety features–seat belts, collapsible steering columns–that could make crashes survivable, the more I was driven to press for mandatory vehicle safety standards. My book “Unsafe at Any Speed” came out in 1965, and by 1966, Congress had passed the Motor Vehicle and Highway Safety Acts, bringing the auto industry under federal regulation. The book created an uproar in Detroit. Congress responded to overwhelming evidence that safer cars could greatly diminish highway casualties. Isn’t this the way our political system is supposed to work?
More than a million lives have been saved and many millions of injuries prevented or reduced in severity because of implementation of these laws. The system worked–government responded to an engaged citizenry, and the fatality rate declined from 5.6 deaths per hundred million vehicle miles in 1966 to 1.6 in 2000.
Auto safety devices are simple & cheap; but take years
When I started on motor vehicle safety issues back in the 1950s, what impressed me most was the simple nature of safety devices that were not in cars. For example, the padded dash panel that was invented by the makers of the Roman chariots in ancient Rome. The collapsible steering column was patented before WWI. Seat belts were available to pilots in WWI. When I started criticizing the auto companies for not putting these simple, lifesaving features in cars, that was considered a radical move by the auto companies and by quite a few commentators as well.
Studies showed that in frontal collisions, if you hit your head against the rearview mirror and it did not break away, it could be a fatal injury. It took us years to get the auto company executives to let their engineers do what they knew how to do and put the breakaway rearview mirrors that we have today into cars. All of these safety devices cost a pittance even on the first round of installation.
Source: Remarks to the Detroit Economic Club, Oct 10, 2000
Safety regulation works; but Auto Safety Agency sold out
The enormous success in the first few years of the Auto Safety Agency’s administration [is] still to our benefit today. The death toll per 100 million vehicle miles in 1966 was 5.6 fatalities for every 100 million vehicle miles driven. Last year it was 1.6. So regulation does work, and a coordinated national effort to have everybody involved, address the problem, can diminish the problem.
What has happened now is that the Auto Safety Agency has become a consulting firm for the auto industry. The process started under Ronald Reagan and George Bush and continued unabated under the Clinton-Gore administration. With the exception of the airbag standard, there has been very little advance in automotive safety and fuel-efficiency technology in people’s motor vehicles in the last 20 years. The last statutory fuel-efficiency standard was established in 1975, and the goal was that by 1985, motor vehicle average fuel efficiency would be 27.5 miles per gallon.
Source: Remarks to the Detroit Economic Club, Oct 10, 2000
More regulation for auto safety, with criminal penalties
We need, in this country, new motor vehicle statutory authorities, with the following amendments enacted by Congress:
To impose criminal penalties for knowing and willful violation of motor vehicle standards, or knowing and willful refusal to recall known defective cars that are impairing human life.
To increase the [maximum] civil penalties from $925,000 to $15 million.
To require the testing before certifying for compliance with safety standards.
To extend the statute of limitations. Right now if you have a car that is over eight years old, and the company discovers a serious defect, they don’t have to recall the vehicle. After eight years, they are in the clear.
Now, [for] all of these and other knowing and willful criminal behavior, coverups, there is no criminal penalty. But if you are ever in Colorado or Wyoming and Idaho, and you get caught harassing a wild ass, you can go to jail for one year.
Source: Remarks to the Detroit Economic Club, Oct 10, 2000
Cancel R&D giveaways to auto industry; let them do it
The Partnership for a New Generation of Vehicles (PNGV) is a public/private partnership between seven federal agencies and the big three automakers. The program represents an effort to coordinate the transfer of property rights for federally funded research and development to the automotive industry. It is hard to imagine an industry less in need of government support than the highly capitalized auto industry, which is reporting record profits year after year. The government is supporting research that the industry would or should do on its own in response to market demands, or could easily be required to do in order to meet tougher environmental standards.
The PNGV is not the only example of a federal research program that should be eliminated. Research and development programs in areas like nuclear power and fossil fuels (among them the clean coal technology program) invest funds in undesirable non-renewable technologies. Such programs are not defensible.
Gore has given auto industry and other polluters a free ride
Q: Why do you speak so harshly about Al Gore?
A: Gore said he was going to take on the auto industry. He gave the auto industry eight years of free ride on fuel efficiency standards, which have actually gone down; they’re at their lowest level since 1980–one reason for this oil price increase. He’s been weak on pesticides; given the biotech industry a free ride; supported GATT and NAFTA, which are anti-environmental. He’s had eight years to convince us–we can’t believe him on that.
Source: Nader-Buchanan debate on ‘Meet the Press’, Oct 1, 2000
Motor vehicles are the greatest environmental hazard
Year after year, through its traumatic and polluting impact, the motor vehicle performs as the greatest environmental hazard in this country. The inceptions and consequences of this hazard do not conform neatly to municipal, county, and State boundaries. In terms of unused capacity, fuel consumption per passenger, injuries, pollution, and total time displacement of drivers and passengers, automotive travel is probably the most wasteful and inefficient mode of travel by industrial man.
Source: VoteNader.com, “Auto Safety”, Feb 21, 2000
DOT: Focus on safety and mass transit
Q: If you were president, how would you run the Department of Transportation differently?
A: I would make its mission safety, No. 1 — whether it’s aviation, highway safety, motor vehicle safety. It isn’t No. 1 now. It’s basically a department that’s a consulting firm to the motor vehicle industry and all its component parts — trucking industry, automobile manufacturers, the highway lobby, etc. It’s not enforcing the law. I would enforce the law.
I would dramatically expand investment in modern public transit. Instead of spending billions keeping our boys in Europe and East Asia to defend against nonexistent enemies on behalf of prosperous countries, I’d put that money into job production for public transit and other public works, like schools, clinics, sewage systems and drinking water systems.
Q: On the issue of pollution emissions tests and controls, you’ve commented, “The more you try to control pollution at the end point, the more expensive it gets and the more pissed off people get with administrators and having to go and get their car inspected and get a sticker.” So why isn’t it controlled at the point of production?
A: Because at the point of production the company has to change the product, whereas at the point of emission it’s more at the [consumer] end. It’s more, “You haven’t kept it up. You haven’t maintained it,” etc. But also, if you control it at the emission point you don’t have to raise the question of displacing the internal combustion engine with a new propulsion system. You don’t have to answer the question why the auto companies have been promising electric cars for all these years. I saw it at the 1939 World’s Fair at the GM exhibit. And now the head of the Automobile Manufacturer’s Association is quoted in the press as saying it’s still ten years off.
Source: Alternative Radio, interview by David Barsamian, Dec 8, 1995
Ralph Nader on Consumer Rights
Long history of pushing for reforms to make consumers safer
Nader has a long history of pushing for reforms that will make consumers safer. “Because of Ralph Nader, we drive safer cars, eat healthier food, breathe better air, drink cleaner water and work in safer environments,” according to the campaign Web site. But Nader is not without controversy, including public spats with the Democratic Party–particularly in 2000, when some accused him of derailing then-Vice President Al Gore’s presidential run.
Source: 2008 Third Party debate, in Lorain County Chronicle-Telegram, Oct 19, 2008
Instrumental in Consumer Product Safety Act and related laws
In his career as consumer advocate Nader has been instrumental in the passing of the following legislation:
Consumer Product Safety Act
Foreign Corrupt Practices Act
Mine Health and Safety Act
Whistleblower Protection Act
Medical Devices safety
Nuclear power safety
Mobile home safety
Consumer credit disclosure law
Pension protection law
Co-Op Bank Bill (1978)
Funeral home cost disclosure law
Occupational Safety and Health Act (OSHA) 1970
Source: Green Party 2008 Presidential Candidate Questionnaire, Feb 3, 2008
Help for ordinary people should replace corporate welfare
Ralph Nader railed against big business from the heart of corporate America yesterday. Nader criticized New York City for offering multimillion-dollar tax breaks and other incentives to persuade the New York Stock Exchange to stay in Manhattan. “Here is this bastion of global capitalism on welfare. It will take hundreds of millions of taxpayer dollars in order to build them a new building. At the same time, hundreds of neighborhoods are suffering from inadequate funding of their vital needs.”
Source: Jayson Blair, NY Times, Sep 1, 2000
Address corporate crimes piecemeal AND by revoking charters
Q: What are your views on Richard Grossman of the Program on Corporations, Law and Democracy? He criticizes the piecemeal, one-by-one approach of addressing corporate crimes and advocates revoking their charters to do business whenever they harm the common good.
A: I agree with both. I think you’ve got to do the retail law enforcement, which of course not only helps people immediately who are being harmed or cheated by these criminal violations or fraudulent behavior, but it informs people. Every time there’s a prosecution, every time there’s law enforcement, it informs people of the misdeeds of these corporations. On the other hand, you’ve got to go to the basic charter that state governments, and in some respects the federal government, provide to create these corporations. The conditioning of proper corporate behavior by these charters, when they were given by legislatures in various states in the early part of the nineteenth century, has been forgotten.
Source: Alternative Radio interview with David Barsamian, Feb 23, 2000
More public disclosure of corporate lawsuit outcomes
In order for people to make informed decisions about how they will conduct their lives, about which products to purchase and which to avoid, about which companies to patronize, and the like, they need access to information.
Today, a parent may buy a child a car safety seat unaware that other children have been severely injured or killed in that model due to a product defect. A secret settlement may have swept the potential danger under the rug. A doctor may prescribe medication unaware that there are potential side effects being kept from the public by the drug company, side effects that may take his patient’s life. Why? Because corporate attorneys made a secret settlement that buried information about the side effects. A woman may seek help from a mental health professional unaware that he has been repeatedly accused of sexually assaulting his patients, because he secretly paid his previous victims for their silence through confidential settlements.
Billing quotas pressure corporate lawyers to pad bills
Young law firm associates generally must meet billing quotas, often two thousand billable hours a year, and the number actually expected of lawyers who hope to advance in the firm may be even higher. Experienced law firm bill auditors say that to tally 2000 hours legitimately chargeable to clients in a year, a lawyer must put in six or seven full days of work a week, virtually without a break.
The pressure to bill extracts a terrible toll on health, energy, and family life. As a consequence, many of these young attorneys end up padding bills or doing unnecessary work simply to feed the voracious billing quota monster. This situation makes their stomachs–and their consciences–queasy. The idealistic reasons for which the lawyer may have entered the profession are easily swept away in a blizzard of time sheets, late nights working on minutiae, and lost sleep worrying about whether all of the hard work will pay off with a lucrative partnership.
Stop giving corporations the same rights as people
Q:Are you advocating reform, tinkering with the system, or would you support a fundamental overhaul?
A: When you have fundamental problems you’ve got to have fundamental overhauls. Right now corporations are treated as persons, just like flesh-and-blood individuals. A corporation, an artificial legal entity, [is treated] as a person having all the rights under our Constitution. It’s absurd. You can’t have equality under the law with General Motors and John and Jane Doe having the same rights.
One of the reasons why corporate lobbyists are well paid is that they are expected to follow orders, allow no internal wavering, and click their heels for maximum greed. The post-9-11 frenzy started with the airline industry–headed by the same company bosses who, year after year, rejected one proposal for airline security after another by our aviation safety group and safety-conscious aviation engineers and legislators, including the simplest one of all: toughening cockpit doors and latches. In one swoop on Congress and after announcing 80,000 layoffs, the industry came away with $5 billion in cash and $10 billion in loan guarantees. The workers got nothing; the top executives maintained their ample pay.
Scrutinize even “good” corporate welfare which helps public
If a program involves the government giving more to private companies than it gets back, then it should be considered corporate welfare. This definition suggests analytic inquiries other than whether a program is “good” or “bad.” It allows for the possibility of “good” corporate welfare–programs that confer subsidies on business but are merited because of overall public gain. There ARE cases of “good” corporate welfare, but these too should be subjected to proper procedural and substantive checks.
Source: Cutting Corporate Welfare, p. 31, Oct 9, 2000
Corporate welfare is a function of political corruption
Corporate welfare–the enormous and myriad subsidies, bailouts, giveaways, tax loopholes, debt revocations, loan guarantees, and other benefits conferred by government on business–is a function of political corruption. Corporate welfare programs siphon funds from appropriate public investments, subsidize companies ripping minerals from federal lands, enable pharmaceutical companies to gouge consumers, perpetuate anti-competitive oligopolistic markets, injure national security, & weaken our democracy.
Source: Cutting Corporate Welfare, p. 13, Oct 9, 2000
S&L bailout helped bankers & hurt consumers
Perhaps the largest corporate welfare expenditure of all time–ultimately set to cost taxpayers $500 billion in principal and interest–the savings and loan bailout is in large part a story of political corruption, the handiwork of the industry’s legion of lobbyists and political payoffs to campaign contributors. The well-connected S&L industry successfully lobbied Congress for a deregulatory bill in the early 1980s, which freed the industry from historic constraints and paved the way for the speculative and corrupt failures that came soon after.
When Congress finally did address the problem, it put the bailout burden on the backs of taxpayers, rather than on the financial industry.
Congress even refused in the bailout legislation to include measures to empower consumers to band together into financial consumer associations-a modest quid pro quo that would have imposed zero financial cost and would have enabled consumers to act on their own to prevent future S&L-style crises and bailouts.
Disallow benefits to companies except for public purposes
A series of inquisitive screens can be applied to corporate welfare programs, regardless of their merit:
Does the program serve some broad public purpose that suggests it has merits beyond the benefits it confers on particular companies? If not, the program should be cancelled.
If it does serve some public interest, can the government achieve the same ends by retaining an interest in an asset being given away or doing the service in-house?
Does the program involve functions that should be properly left to the market?
Is there any reason the government should not charge for services provided?
Are there non-monetary reciprocal obligations that should be demanded? These might include reasonable pricing of government-subsidized goods and services provided to consumers.
Is the program subject to judicial challenge? What are the avenues for citizen challenge?
Is there an institutional means of periodic review? Are criteria delineated by which the program should be evaluated?
Stadiums & other local tax abatements ignore small business
Large corporations routinely pit states and cities against each other in bidding contests that are structurally biased in favor of Big Business. The price of their doing business, they communicate explicitly and implicitly, is massive subsidization by local and state authorities–through tax abatements, government financing of building projects, improper use of eminent domain, or other supports. This is corporate welfare in its rawest form. One of the most outrageous types of bidding for business involves sports stadiums. Now gambling casinos are looking for similar subsidies.
Many tax breaks and abatements are directed to specific companies. They properly raise the public ire as citizens demand to know why the rich and powerful have taxes forgiven while local small businesses are required to pay their fair share without special dispensation. This should sharpen the cutting edge of a nascent movement to end corporate welfare as we know it.
Federal regulation of state & local abatements & subsidies
Some corrective policy initiatives to oppose local and state giveaways:
States and localities should adopt a policy of annual disclosure of all corporate welfare recipients.
Where state and local governments decide that taxpayer support for a business is necessary, they should include binding commitments that recipients deliver on job creation and other promises.
Congress should encourage states to refuse to enter a race to the bottom against each other in terms of special tax breaks and related benefits.
The federal government should levy a surtax on companies receiving state and local tax breaks, treating the tax breaks as income upon which federal tax should be paid.
Finally, there must be court tests of the claim that the provision of tax subsidies and similar incentives distorts economic decision-making concerning the location of business activity and therefore constitutes an unconstitutional infringement of Congress’ power to regulate interstate commerce.
Bailouts: require payback; practice prevention by regulation
The bailout, a premier form of corporate welfare, is yet another market distortion against the interests of small and medium-sized businesses. Bailouts are different from other corporate welfare categories in that they are ad hoc and unplanned. Some lessons from recent bailout experience:
Congress should prioritize the issue of payback in full, after the company is nursed back to health.
Monetary payback is not enough. Because the government is doing more than making a market-justified loan, it has the right to make demands designed to prevent the need for future bailouts.
The S&L crisis was triggered in large part by industry deregulation. This should be an important cautionary note: that underregulation paves the way for bailouts, especially in the financial sector.
Strong anti-trust policy and enforcement is a vital prophylactic against the emergence of too-big-to-fail institutions which are sure to benefit from a government bailout in the face of potential collapse.
A simple bill could provide a valuable tool for citizen education and organizing. Such legislation would not propose a permanent ban on corporate welfare, but would require affirmative re-commencement of subsidies under both procedural safeguards and reciprocal obligations. The central operative language for such a bill might read:
Every federal agency shall terminate all below-market-rate sales or arrangements with for-profit beneficiaries; shall cease making any below-market-rate loans; shall terminate all export assistance or marketing promotion for corporations; and shall terminate any below-market-value technology transfer or subsidy of any kind to for-profit corporations.
The Internal Revenue Code is amended to eliminate all corporate tax expenditures.
The Internal Revenue Code is amended so that the value of local and state tax subsidies to corporations shall be treated as income.
$1000 bounty for suing for abuse of corporate welfare
Citizen Standing to Sue to Challenge Corporate Welfare Abuses: Citizens could be empowered to mount judicial challenges to runaway agencies that reach beyond their statutory powers. Taxpayers could be given standing to file such suits, by awarding a $1,000 “bounty” (plus reasonable attorney’s fees and court costs) for those who successfully challenge improper agency action. Consideration should be given to creating an incentive for such suits by awarding successful plaintiffs a percentage of the money saved through such suits, perhaps according to a sliding scale of declining percentage returns for higher savings with a cap set at certain amounts. Just as qui tam suits under the False Claims Act have helped curtail oil company underpayment of royalties owed the federal government, so such a measure would create a structural counterbalance to corporate influence over federal agencies.
Over the past twenty years we have seen the unfortunate resurgence of big business influence, generating its unique brand of wreckage, propaganda and ultimatums on American labor, consumers, taxpayers and most generically, American voters. Big business has been colliding with American democracy and democracy has been losing.
Source: Nomination Acceptance Speech, Jun 25, 2000
Corporate sponsorship turns debates into beer commercials
Complaining that corporate sponsorship is turning presidential debates into beer commercials, Ralph Nader and others filed a suit today against the Federal Election Commission over how the debates are financed. The suit, which the plaintiffs say could affect this fall’s presidential debates, says that corporate financing of the debates amounts to an illegal corporate campaign contribution. It asks the court to strike down the Federal Election Commission regulations that allow corporations – like Anheuser-Busch, the maker of Budweiser and a sponsor of this fall’s debates — to contribute millions of dollars to the staging of the debates. “It’s turning our presidential debates into a beer commercial,” Nader said in a telephone interview. “And these companies are really sponsoring an exclusive campaign commercial for Bush and Gore.”
Corporate government has hijacked political leadership
Over the past 20 years, big business has increasingly dominated our political economy. This control by the corporate government over our political government is creating a widening “democracy gap.” Active citizens are left shouting their concerns over a deep chasm between them and their government. This is a world away from the legislative milestones seen in the 60s and 70s. At that time, informed and dedicated citizens powered their concerns through the channels of government to produce laws that bettered the lives of millions of Americans. Today we face grave and growing societal problems in health care, education, labor, energy and the environment. These are problems for which active citizens have solutions, yet their voices are not carrying across the democracy gap. Citizen groups and individual thinkers have generated a tremendous capital of ideas and solutions, while our government has been drawn away from us by a corporate government. Our political leadership has been hijacked.
Source: Green Party Announcement Speech, Feb 21, 2000
States & the public should oppose corporate tax breaks
What are the possible remedies for the megabillion-dollar corporate welfare epidemic? State governments should agree among themselves not to engage in races to the bottom [via tax breaks]. And the national government should work to abolish such subsidies entirely. The public should initiate a constitutional challenge to tax inducements designed to lure companies across state lines. [Some scholars] argue that such actions violate the interstate commerce clause. It’s a sound argument.
Source: “In the Public Interest” newspaper column, Apr 14, 1999
Role of government is to counteract power of corporations
I like Thomas Jefferson’s definition of government: Do together what we can’t do by ourselves. And that the function of representative government is to counteract what he called “the excesses of the monied interests”-that today is the corporate interests. There’s other things-not only the defense of the country but public health, public safety, research and development-that only government can generate, and we’d better take control of it and have it represent people instead of corporations.
Source: National Public Radio, interview by Diane Rehm, Apr 3, 1996
Coined the term “corporate welfare”
Q: The term “corporate welfare” has been around since 1956, when Ralph Nader first brought it up. What do you mean by corporate welfare?
A: Corporate welfare comes in two forms and many variations. One is the active form. That includes agribusiness subsidies, military contractor subsidies, loan guarantees, bailouts of S&Ls. There are giveaways of minerals on federal lands, there are giveaways of computer databases. Then there’s the passive corporate welfare, which are the tax breaks, the loopholes. There are dozens of those, they make up about half the tax code. One example is that foreign corporations don’t pay many taxes at all. And then there are the rates themselves — if you’re an individual you pay a higher rate than a corporation.
Numero Tres—“It began with Christopher Columbus, who gave the people the name Indios. Those Europeans, the white men, spoke in different dialects, and some pronounced the word Indien, or Indianer, or Indian. Peaux-rouges, or redskins, came later. As was the custom of the people when receiving strangers, the Tainos on the island of San Salvador generously presented Columbus and his men with gifts and treated them with honor.
‘So tractable, so peaceable, are these people,’ Columbus wrote to the King and Queen of Spain, ‘that I swear to your Majesties there is not in the world a better nation. They love their neighbors as themselves, and their discourse is ever sweet and gentle, and accompanied with a smile; and though it is true that they are naked, yet their manners are decorous and praiseworthy.’
All this, of course, was taken as a sign of weakness, if not heathenism, and Columbus, being a righteous European, was convinced the people should be ‘made to work, sow and do all that is necessary and to adopt our ways.’ Over the next four centuries (1492–1890) several million Europeans and their descendants undertook to enforce their ways upon the people of the New World.
Columbus kidnapped ten of his friendly Taino hosts and carried them off to Spain, where they could be introduced to the white man’s ways. One of them died soon after arriving there, but not before he was baptized a Christian. The Spaniards were so pleased that they had made it possible for the first Indian to enter heaven that they hastened to spread the good news throughout the West Indies.
The Tainos and other Arawak people did not resist conversion to the Europeans’ religion, but they did resist strongly when hordes of these bearded strangers began scouring their islands in search of gold and precious stones. The Spaniards looted and burned villages; they kidnapped hundreds of men, women, and children and shipped them to Europe to be sold as slaves. Arawak resistance brought on the use of guns and sabers, and whole tribes were destroyed, hundreds of thousands of people in less than a decade after Columbus set foot on the beach of San Salvador, October 12, 1492.
Communications between the tribes of the New World were slow, and news of the Europeans’ barbarities rarely overtook the rapid spread of new conquests and settlements. Long before the English-speaking white men arrived in Virginia in 1607, however, the Powhatans had heard rumors about the civilizing techniques of the Spaniards. The Englishmen used subtler methods. To ensure peace long enough to establish a settlement at Jamestown, they put a golden crown upon the head of Wahunsonacook, dubbed him King Powhatan, and convinced him that he should put his people to work supplying the white settlers with food. Wahunsonacook vacillated between loyalty to his rebellious subjects and to the English, but after John Rolfe married his daughter, Pocahontas, he apparently decided that he was more English than Indian. After Wahunsonacook died, the Powhatans rose up in revenge to drive the Englishmen back into the sea from which they had come, but the Indians underestimated the power of English weapons. In a short time the eight thousand Powhatans were reduced to less than a thousand.
In Massachusetts the story began somewhat differently but ended virtually the same as in Virginia. After the Englishmen landed at Plymouth in 1620, most of them probably would have starved to death but for aid received from friendly natives of the New World. A Pemaquid named Samoset and three Wampanoags named Massasoit, Squanto, and Hobomah became self-appointed missionaries to the Pilgrims. All spoke some English, learned from explorers who had touched ashore in previous years. Squanto had been kidnapped by an English seaman who sold him into slavery in Spain, but he escaped through the aid of another Englishman and finally managed to return home. He and the other Indians regarded the Plymouth colonists as helpless children; they shared corn with them from the tribal stores, showed them where and how to catch fish, and got them through the first winter. When spring came they gave the white men some seed corn and showed them how to plant and cultivate it.
For several years these Englishmen and their Indian neighbors lived in peace, but many more shiploads of white people continued coming ashore. The ring of axes and the crash of falling trees echoed up and down the coasts of the land which the white men now called New England. Settlements began crowding in upon each other. In 1625 some of the colonists asked Samoset to give them 12,000 additional acres of Pemaquid land. Samoset knew that land came from the Great Spirit, was as endless as the sky, and belonged to no man. To humor these strangers in their strange ways, however, he went through a ceremony of transferring the land and made his mark on a paper for them. It was the first deed of Indian land to English colonists.
Most of the other settlers, coming in by thousands now, did not bother to go through such a ceremony. By the time Massasoit, great chief of the Wampanoags, died in 1662 his people were being pushed back into the wilderness. His son Metacom foresaw doom for all Indians unless they united to resist the invaders. Although the New Englanders flattered Metacom by crowning him King Philip of Pokanoket, he devoted most of his time to forming alliances with the Narragansetts and other tribes in the region.
In 1675, after a series of arrogant actions by the colonists, King Philip led his Indian confederacy into a war meant to save the tribes from extinction. The Indians attacked fifty-two settlements, completely destroying twelve of them, but after months of fighting, the firepower of the colonists virtually exterminated the Wampanoags and Narragansetts. King Philip was killed and his head publicly exhibited at Plymouth for twenty years. Along with other captured Indian women and children, his wife and young son were sold into slavery in the West Indies.
When the Dutch came to Manhattan Island, Peter Minuit purchased it for sixty guilders in fishhooks and glass beads, but encouraged the Indians to remain and continue exchanging their valuable peltries for such trinkets. In 1641, Willem Kieft levied tribute upon the Mahicans and sent soldiers to Staten Island to punish the Raritans for offenses which had been committed not by them but by white settlers. The Raritans resisted arrest, and the soldiers killed four of them. When the Indians retaliated by killing four Dutchmen, Kieft ordered the massacre of two entire villages while the inhabitants slept. The Dutch soldiers ran their bayonets through men, women, and children, hacked their bodies to pieces, and then leveled the villages with fire.
For two more centuries these events were repeated again and again as the European colonists moved inland through the passes of the Alleghenies and down the westward-flowing rivers to the Great Waters (the Mississippi) and then up the Great Muddy (the Missouri).
The Five Nations of the Iroquois, mightiest and most advanced of all the eastern tribes, strove in vain for peace. After years of bloodshed to save their political independence, they finally went down to defeat. Some escaped to Canada, some fled westward, some lived out their lives in reservation confinement.
During the 1760s Pontiac of the Ottawas united tribes in the Great Lakes country in hopes of driving the British back across the Alleghenies, but he failed. His major error was an alliance with French-speaking white men who withdrew aid from the peaux-rouges during the crucial siege of Detroit.
A generation later, Tecumseh of the Shawnees formed a great confederacy of midwestern and southern tribes to protect their lands from invasion. The dream ended with Tecumseh’s death in battle during the War of 1812.
Between 1795 and 1840 the Miamis fought battle after battle, and signed treaty after treaty, ceding their rich Ohio Valley lands until there was none left to cede.
When white settlers began streaming into the Illinois country after the War of 1812, the Sauks and Foxes fled across the Mississippi. A subordinate chief, Black Hawk, refused to retreat. He created an alliance with the Winnebagos, Pottawotamies, and Kickapoos, and declared war against the new settlements. A band of Winnebagos, who accepted a white soldier chief’s bribe of twenty horses and a hundred dollars, betrayed Black Hawk, and he was captured in 1832. He was taken East for imprisonment and display to the curious. After he died in 1838, the governor of the recently created Iowa Territory obtained Black Hawk’s skeleton and kept it on view in his office.
In 1829, Andrew Jackson, who was called Sharp Knife by the Indians, took office as President of the United States. During his frontier career, Sharp Knife and his soldiers had slain thousands of Cherokees, Chickasaws, Choctaws, Creeks, and Seminoles, but these southern Indians were still numerous and clung stubbornly to their tribal lands, which had been assigned them forever by white men’s treaties. In Sharp Knife’s first message to his Congress, he recommended that all these Indians be removed westward beyond the Mississippi. “I suggest the propriety of setting apart an ample district west of the Mississippi . . . to be guaranteed to the Indian tribes, as long as they shall occupy it.”
Although enactment of such a law would only add to the long list of broken promises made to the eastern Indians, Sharp Knife was convinced that Indians and whites could not live together in peace and that his plan would make possible a final promise which never would be broken again. On May 28, 1830, Sharp Knife’s recommendations became law.
Two years later he appointed a commissioner of Indian affairs to serve in the War Department and see that the new laws affecting Indians were properly carried out. And then on June 30, 1834, Congress passed An Act to Regulate Trade and Intercourse with the Indian Tribes and to Preserve Peace on the Frontiers. All that part of the United States west of the Mississippi “and not within the States of Missouri and Louisiana or the Territory of Arkansas” would be Indian country. No white persons would be permitted to trade in the Indian country without a license. No white traders of bad character would be permitted to reside in Indian country. No white persons would be permitted to settle in the Indian country. The military force of the United States would be employed in the apprehension of any white person who was found in violation of provisions of the act.
Before these laws could be put into effect, a new wave of white settlers swept westward and formed the territories of Wisconsin and Iowa. This made it necessary for the policy makers in Washington to shift the “permanent Indian frontier” from the Mississippi River to the 95th meridian. (This line ran from Lake of the Woods on what is now the Minnesota-Canada border, slicing southward through what are now the states of Minnesota and Iowa, and then along the western borders of Missouri, Arkansas, and Louisiana, to Galveston Bay, Texas.) To keep the Indians beyond the 95th meridian and to prevent unauthorized white men from crossing it, soldiers were garrisoned in a series of military posts that ran southward from Fort Snelling on the Mississippi River to forts Atkinson and Leavenworth on the Missouri, forts Gibson and Smith on the Arkansas, Fort Towson on the Red, and Fort Jesup in Louisiana.
More than three centuries had now passed since Christopher Columbus landed on San Salvador, more than two centuries since the English colonists came to Virginia and New England. In that time the friendly Tainos who welcomed Columbus ashore had been utterly obliterated. Long before the last of the Tainos died, their simple agricultural and handicraft culture was destroyed and replaced by cotton plantations worked by slaves. The white colonists chopped down the tropical forests to enlarge their fields; the cotton plants exhausted the soil; winds unbroken by a forest shield covered the fields with sand. When Columbus first saw the island he described it as “very big and very level and the trees very green . . . the whole of it so green that it is a pleasure to gaze upon.” The Europeans who followed him there destroyed its vegetation and its inhabitants—human, animal, bird, and fish—and after turning it into a wasteland, they abandoned it.
On the mainland of America, the Wampanoags of Massasoit and King Philip had vanished, along with the Chesapeakes, the Chickahominys, and the Potomacs of the great Powhatan confederacy. (Only Pocahontas was remembered.) Scattered or reduced to remnants were the Pequots, Montauks, Nanticokes, Machapungas, Catawbas, Cheraws, Miamis, Hurons, Eries, Mohawks, Senecas, and Mohegans. (Only Uncas was remembered.) Their musical names remained forever fixed on the American land, but their bones were forgotten in a thousand burned villages or lost in forests fast disappearing before the axes of twenty million invaders. Already the once sweet-watered streams, most of which bore Indian names, were clouded with silt and the wastes of man; the very earth was being ravaged and squandered. To the Indians it seemed that these Europeans hated everything in nature—the living forests and their birds and beasts, the grassy glades, the water, the soil, and the air itself.
The decade following establishment of the “permanent Indian frontier” was a bad time for the eastern tribes. The great Cherokee nation had survived more than a hundred years of the white man’s wars, diseases, and whiskey, but now it was to be blotted out. Because the Cherokees numbered several thousands, their removal to the West was planned to be in gradual stages, but discovery of Appalachian gold within their territory brought on a clamor for their immediate wholesale exodus. During the autumn of 1838, General Winfield Scott’s soldiers rounded them up and concentrated them into camps. (A few hundred escaped to the Smoky Mountains and many years later were given a small reservation in North Carolina.) From the prison camps they were started westward to Indian Territory. On the long winter trek, one of every four Cherokees died from cold, hunger, or disease. They called the march their “trail of tears.” The Choctaws, Chickasaws, Creeks, and Seminoles also gave up their homelands in the South. In the North, surviving remnants of the Shawnees, Miamis, Ottawas, Hurons, Delawares, and many other once mighty tribes walked or traveled by horseback and wagon beyond the Mississippi, carrying their shabby goods, their rusty farming tools, and bags of seed corn. All of them arrived as refugees, poor relations, in the country of the proud and free Plains Indians.
Scarcely were the refugees settled behind the security of the “permanent Indian frontier” when soldiers began marching westward through the Indian country. The white men of the United States—who talked so much of peace but rarely seemed to practice it—were marching to war with the white men who had conquered the Indians of Mexico. When the war with Mexico ended in 1847, the United States took possession of a vast expanse of territory reaching from Texas to California. All of it was west of the “permanent Indian frontier.”
In 1848 gold was discovered in California. Within a few months, fortune-seeking easterners by the thousands were crossing the Indian Territory. Indians who lived or hunted along the Santa Fe and Oregon trails had grown accustomed to seeing an occasional wagon train licensed for traders, trappers, or missionaries. Now suddenly the trails were filled with wagons, and the wagons were filled with white people. Most of them were bound for California gold, but some turned southwest for New Mexico or northwest for the Oregon country.
To justify these breaches of the “permanent Indian frontier,” the policy makers in Washington invented Manifest Destiny, a term which lifted land hunger to a lofty plane. The Europeans and their descendants were ordained by destiny to rule all of America. They were the dominant race and therefore responsible for the Indians—along with their lands, their forests, and their mineral wealth. Only the New Englanders, who had destroyed or driven out all their Indians, spoke against Manifest Destiny.
In 1850, although none of the Modocs, Mohaves, Paiutes, Shastas, Yumas, or a hundred other lesser-known tribes along the Pacific Coast were consulted on the matter, California became the thirty-first state of the Union. In the mountains of Colorado gold was discovered, and new hordes of prospectors swarmed across the Plains. Two vast new territories were organized, Kansas and Nebraska, encompassing virtually all the country of the Plains tribes. In 1858 Minnesota became a state, its boundaries being extended a hundred miles beyond the 95th meridian, the “permanent Indian frontier.”
And so only a quarter of a century after enactment of Sharp Knife's Indian Trade and Intercourse Act, white settlers had driven in both the north and south flanks of the 95th meridian line, and advance elements of white miners and traders had penetrated the center.
It was then, at the beginning of the 1860s, that the white men of the United States went to war with one another—the Bluecoats against the Graycoats, the great Civil War. In 1860 there were probably 300,000 Indians in the United States and Territories, most of them living west of the Mississippi. According to varying estimates, their numbers had been reduced by one-half to two-thirds since the arrival of the first settlers in Virginia and New England. The survivors were now pressed between expanding white populations on the East and along the Pacific coasts—more than thirty million Europeans and their descendants. If the remaining free tribes believed that the white man’s Civil War would bring any respite from his pressures for territory, they were soon disillusioned.
The most numerous and powerful western tribe was the Sioux, or Dakota, which was separated into several subdivisions. The Santee Sioux lived in the woodlands of Minnesota, and for some years had been retreating before the advance of settlements. Little Crow of the Mdewkanton Santee, after being taken on a tour of eastern cities, was convinced that the power of the United States could not be resisted. He was reluctantly attempting to lead his tribe down the white man’s road. Wabasha, another Santee leader, also had accepted the inevitable, but both he and Little Crow were determined to oppose any further surrender of their lands.
Farther west on the Great Plains were the Teton Sioux, horse Indians all, and completely free. They were somewhat contemptuous of their woodland Santee cousins who had capitulated to the settlers. Most numerous and most confident of their ability to defend their territory were the Oglala Tetons. At the beginning of the white man’s Civil War, their outstanding leader was Red Cloud, thirty-eight years old, a shrewd warrior chief. Still too young to be a warrior was Crazy Horse, an intelligent and fearless teenaged Oglala.
Among the Hunkpapas, a smaller division of the Teton Sioux, a young man in his mid-twenties had already won a reputation as a hunter and warrior. In tribal councils he advocated unyielding opposition to any intrusion by white men. He was Tatanka Yotanka, the Sitting Bull. He was mentor to an orphaned boy named Gall. Together with Crazy Horse of the Oglalas, they would make history sixteen years later in 1876.
Although he was not yet forty, Spotted Tail was already the chief spokesman for the Brulé Tetons, who lived on the far western plains. Spotted Tail was a handsome, smiling Indian who loved fine feasts and compliant women. He enjoyed his way of life and the land he lived upon, but was willing to compromise to avoid war.
Closely associated with the Teton Sioux were the Cheyennes. In the old days the Cheyennes had lived in the Minnesota country of the Santee Sioux, but gradually moved westward and acquired horses. Now the Northern Cheyennes shared the Powder River and the Bighorn country with the Sioux, frequently camping near them. Dull Knife, in his forties, was an outstanding leader of the Northern branch of the tribe. (To his own people Dull Knife was known as Morning Star, but the Sioux called him Dull Knife, and most contemporary accounts use that name.)
The Southern Cheyennes had drifted below the Platte River, establishing villages on the Colorado and Kansas plains. Black Kettle of the Southern branch had been a great warrior in his youth. In his late middle age, he was the acknowledged chief, but the younger men and the Hotamitaneos (Dog Soldiers) of the Southern Cheyennes were more inclined to follow leaders such as Tall Bull and Roman Nose, who were in their prime.
The Arapahos were old associates of the Cheyennes and lived in the same areas. Some remained with the Northern Cheyennes, others followed the Southern branch. Little Raven, in his forties, was at this time the best-known chief.
South of the Kansas-Nebraska buffalo ranges were the Kiowas. Some of the older Kiowas could remember the Black Hills, but the tribe had been pushed southward before the combined power of Sioux, Cheyenne, and Arapaho. By 1860 the Kiowas had made their peace with the northern plains tribes and had become allies of the Comanches, whose southern plains they had entered. The Kiowas had several great leaders—an aging chief, Satank; two vigorous fighting men in their thirties, Satanta and Lone Wolf; and an intelligent statesman, Kicking Bird.
The Comanches, constantly on the move and divided into many small bands, lacked the leadership of their allies. Ten Bears, very old, was more a poet than a warrior chief. In 1860, half-breed Quanah Parker, who would lead the Comanches in a last great struggle to save their buffalo range, was not yet twenty years old.
In the arid Southwest were the Apaches, veterans of 250 years of guerrilla warfare with the Spaniards, who taught them the finer arts of torture and mutilation but never subdued them. Although few in number—probably not more than six thousand divided into several bands—their reputation as tenacious defenders of their harsh and pitiless land was already well established. Mangas Colorado, in his late sixties, had signed a treaty of friendship with the United States, but was already disillusioned by the influx of miners and soldiers into his territory. Cochise, his son-in-law, still believed he could get along with the white Americans. Victorio and Delshay distrusted the white intruders and gave them a wide berth. Nana, in his fifties but tough as rawhide, considered the English-speaking white men no different from the Spanish-speaking Mexicans he had been fighting all his life. Geronimo, in his twenties, had not yet proved himself.
The Navahos were related to the Apaches, but most Navahos had taken the Spanish white man’s road and were raising sheep and goats, cultivating grain and fruit. As stockmen and weavers, some bands of the tribe had grown wealthy. Other Navahos continued as nomads, raiding their old enemies the Pueblos, the white settlers, or prosperous members of their own tribe. Manuelito, a stalwart mustachioed stock raiser, was head chief—chosen by an election of the Navahos held in 1855. In 1859, when a few wild Navahos raided United States citizens in their territory, the U.S. Army retaliated not by hunting down the culprits but by destroying the hogans and shooting all the livestock belonging to Manuelito and members of his band. By 1860, Manuelito and some Navaho followers were engaged in an undeclared war with the United States in northern New Mexico and Arizona.
In the Rockies north of the Apache and Navaho country were the Utes, an aggressive mountain tribe inclined to raid their more peaceful neighbors to the south. Ouray, their best-known leader, favored peace with white men even to the point of soldiering with them as mercenaries against other Indian tribes.
In the far West most of the tribes were too small, too divided, or too weak to offer much resistance. The Modocs of northern California and southern Oregon, numbering less than a thousand, fought guerrilla-fashion for their lands. Kintpuash, called Captain Jack by the California settlers, was only a young man in 1860; his ordeal as a leader would come a dozen years later.
Northwest of the Modocs, the Nez Percés had been living in peace with white men since Lewis and Clark passed through their territory in 1805. In 1855, one branch of the tribe ceded Nez Percé lands to the United States for settlement, and agreed to live within the confines of a large reservation. Other bands of the tribe continued to roam between the Blue Mountains of Oregon and the Bitterroots of Idaho. Because of the vastness of the Northwest country, the Nez Percés believed there would always be land enough for both white men and Indians to use as each saw fit. Heinmot Tooyalaket, later known as Chief Joseph, would have to make a fateful decision in 1877 between peace and war. In 1860 he was twenty years old, the son of a chief.
In the Nevada country of the Paiutes a future Messiah named Wovoka, who later would have a brief but powerful influence upon the Indians of the West, was only four years old in 1860.
During the following thirty years these leaders and many more would enter into history and legend. Their names would become as well known as those of the men who tried to destroy them. Most of them, young and old, would be driven into the ground long before the symbolic end of Indian freedom came at Wounded Knee in December 1890. Now, a century later, in an age without heroes, they are perhaps the most heroic of all Americans.” Dee Brown, Bury My Heart at Wounded Knee: An Indian History of the American West; Chapter One
Numero Cuatro—“Among Marxian economists ‘monopoly capitalism’ is the term widely used to denote the stage of capitalism which dates from approximately the last quarter of the nineteenth century and reaches full maturity in the period after the Second World War. Marx’s Capital, like classical political economy from Adam Smith to John Stuart Mill, was based on the assumption that all commodities are produced by industries consisting of many firms, or capitals in Marx’s terminology, each accounting for a negligible fraction of total output and all responding to the price and profit signals generated by impersonal market forces. Unlike the classical economists, however, Marx recognized that such an economy was inherently unstable and impermanent. The way to succeed in a competitive market is to cut costs and expand production, a process which requires incessant accumulation of capital in ever new technological and organizational forms. In Marx’s words: ‘The battle of competition is fought by cheapening of commodities. The cheapness of commodities depends, ceteris paribus, on the productiveness of labor, and this again on the scale of production. Therefore the larger capitals beat the smaller.’ Further, the credit system which ‘begins as a modest helper of accumulation’ soon ‘becomes a new and formidable weapon in the competitive struggle, and finally it transforms itself into an immense social mechanism for the centralization of capitals’ (Marx, 1894, ch. 27).
There is thus no doubt that Marx and Engels believed capitalism had reached a turning point. In this view, however, the end of the competitive era marked not the beginning of a new stage of capitalism but rather the beginning of a transition to the new mode of production that would take the place of capitalism. It was only somewhat later, when it became clear that capitalism was far from on its last legs, that Marx’s followers, recognizing that a new stage had actually arrived, undertook to analyze its main features and what might be implied for capitalism’s ‘laws of motion.’
The pioneer in this endeavor was the Austrian Marxist Rudolf Hilferding whose magnum opus Das Finanzkapital appeared in 1910. A forerunner was the American economist Thorstein Veblen, whose book The Theory of Business Enterprise (1904) dealt with many of the same problems as Hilferding’s: corporation finance, the role of banks in the concentration of capital, etc. Veblen’s work, however, was apparently unknown to Hilferding, and neither author had a significant impact on mainstream economic thought in the English-speaking world, where the emergence of corporations and related new forms of business activity and organization, though the subject of a vast descriptive literature, was almost entirely ignored in the dominant neoclassical orthodoxy.
In Marxist circles, however, Hilferding’s work was hailed as a breakthrough, and its pre-eminent place in the Marxist tradition was assured when Lenin strongly endorsed it at the beginning of his Imperialism, The Highest Stage of Capitalism. ‘In 1910,’ Lenin wrote, ‘there appeared in Vienna the work of the Austrian Marxist, Rudolf Hilferding, Finance Capital…. This work gives a very valuable theoretical analysis of “the latest phase of capitalist development,” the subtitle of the book.’
As far as economic theory in the narrow sense is concerned, Lenin added little to Finance Capital, and in retrospect it is evident that Hilferding himself was not successful in integrating the new phenomena of capitalist development into the core of Marx’s theoretical structure (value, surplus value and above all the process of capital accumulation). In chapter 15 of his book (“Price Determination in the Capitalist Monopoly, Historical Tendency of Finance Capital”) Hilferding, in seeking to deal with some of these problems, came up with a very striking conclusion which has been associated with his name ever since. Prices under conditions of monopoly, he thought, are indeterminate and hence unstable. Whenever concentration enables capitalists to achieve higher than average profits, suppliers and customers are put under pressure to create counter combinations which will enable them to appropriate part of the extra profits for themselves. Thus monopoly spreads in all directions from every point of origin. The question then arises as to the limits of “cartellization” (the term is used synonymously with monopolization). Hilferding answers:
The answer to this question must be that there is no absolute limit to cartellization. What exists rather is a tendency to the continuous spread of cartellization. Independent industries, as we have seen, fall more and more under the sway of the cartellized ones, ending up finally by being annexed by the cartellized ones. The result of this process is then a general cartel. The entire capitalist production is consciously controlled from one center which determines the amount of production in all its spheres….It is the consciously controlled society in antagonistic form.
There is more about this vision of a future totally monopolized society, but it need not detain us. Three quarters of a century of monopoly capitalist history has shown that while the tendency to concentration is strong and persistent, it is by no means as ubiquitous and overwhelming as Hilferding imagined. There are powerful counter-tendencies—the breakup of existing firms and the founding of new ones—which have been strong enough to prevent the formation of anything even remotely approaching Hilferding’s general cartel.
The first signs of important new departures in Marxist economic thinking began to appear toward the end of the interwar years, i.e., the 1920s and 1930s; but on the whole this was a period in which Lenin’s Imperialism was accepted as the last word on monopoly capitalism, and the rigid orthodoxy of Stalinism discouraged attempts to explore changing developments in the structure and functioning of contemporary capitalist economies. Meanwhile, academic economists in the West finally got around to analyzing monopolistic and imperfectly competitive markets (especially Edward Chamberlin and Joan Robinson), but for a long time these efforts were confined to the level of individual firms and industries. The so-called Keynesian revolution which transformed macroeconomic theory in the 1930s was largely untouched by these advances in the theory of markets, continuing to rely on the time-honored assumption of atomistic competition.
The 1940s and 1950s witnessed the emergence of new trends of thought within the general framework of Marxian economics. These had their roots on the one hand in Marx’s theory of concentration and centralization which, as we have seen, was further developed by Hilferding and Lenin; and on the other hand in Marx’s famous Reproduction Schemes presented and analyzed in volume 2 of Capital, which were the focal point of a prolonged debate on the nature of capitalist crisis involving many of the leading Marxist theorists of the period between Engels’s death (1895) and the First World War. Credit for the first attempt to knit these two strands of thought into an elaborated version of Marxian accumulation theory goes to Michal Kalecki, whose published works in Polish in the early 1930s articulated, according to Joan Robinson and others, the main tenets of the contemporaneous Keynesian revolution in the West. Kalecki had been introduced to economics through the works of Marx and the great Polish Marxist Rosa Luxemburg, and he was consequently free of the inhibitions and preconceptions that went with a training in neoclassical economics. He moved to England in the mid-1930s, entering into the intense discussions and debates of the period and making his own distinctive contributions along the lines of his previous work and that of Keynes and his followers at Cambridge, Oxford and the London School of Economics. In April 1938 Kalecki published an article in Econometrica (“The Distribution of the National Income”) which highlighted differences between his approach and that of Keynes, especially with respect to two crucially important and closely related subjects, namely, the class distribution of income and the role of monopoly. With respect to monopoly, Kalecki stated at the end of the article a position which had deep roots in his thinking and would henceforth be central to his theoretical work:
The results arrived at in this essay have a more general aspect. A world in which the degree of monopoly determines the distribution of the national income is a world far removed from the pattern of free competition. Monopoly appears to be deeply rooted in the nature of the capitalist system: free competition, as an assumption, may be useful in the first stage of certain investigations, but as a description of the normal state of capitalist economy it is merely a myth.
A further step in the direction of integrating the two strands of Marx’s thought—concentration and centralization on the one hand and crisis theory on the other—was marked by the publication in 1942 of The Theory of Capitalist Development by Paul M. Sweezy, which contained a fairly comprehensive review of the prewar history of Marxist economics and at the same time made explanatory use of concepts introduced into mainstream monopoly and oligopoly theory during the preceding decade. This book, soon translated into several foreign languages, had a significant effect in systematizing the study and interpretation of Marxian economic theory.
It should not be supposed, however, that these new departures were altogether a matter of theoretical speculation. Of equal if not greater importance were the changes in the structure and functioning of capitalism which had emerged during the 1920s and 1930s. On the one hand the decline in competition which began in the late nineteenth century proceeded at an accelerated pace—as chronicled in the classic study by Arthur R. Burns, The Decline of Competition: A Study of the Evolution of American Industry (1936)—and on the other hand the unprecedented severity of the depression of the 1930s provided dramatic proof of the inadequacy of conventional business cycle theories. The Keynesian revolution was a partial answer to this challenge, but the renewed upsurge of the advanced capitalist economies during and after the war cut short further development of critical analysis among mainstream economists, and it was left to Marxists to carry on along the lines that had been pioneered by Kalecki before the war.
Kalecki spent the war years at the Oxford Institute of Statistics whose director, A. L. Bowley, had brought together a distinguished group of scholars, most of them émigrés from occupied Europe. Among the latter was Josef Steindl, a young Austrian economist who came under the influence of Kalecki and followed in his footsteps. Later on, Steindl (1985) recounted the following:
On one occasion I talked with Kalecki about the crisis of capitalism. We both, as well as most socialists, took it for granted that capitalism was threatened by a crisis of existence, and we regarded the stagnation of the 1930s as a symptom of such a major crisis. But Kalecki found the reasons, given by Marx, why such a crisis should develop, unconvincing; at the same time he did not have an explanation of his own. I still do not know, he said, why there should be a crisis of capitalism, and he added: Could it have anything to do with monopoly? He subsequently suggested to me and to the Institute, before he left England, that I should work on this problem. It was a very Marxian problem, but my methods of dealing with it were Kaleckian.
Steindl’s work on this subject was completed in 1949 and published in 1952 under the title Maturity and Stagnation in American Capitalism. While little noticed by the economics profession at the time of its publication, this book nevertheless provided a crucial link between the experiences, empirical as well as theoretical, of the 1930s, and the development of a relatively rounded theory of monopoly capitalism in the 1950s and 1960s, a process which received renewed impetus from the return of stagnation to American (and global) capitalism during the 1970s and 1980s.
The next major work in the direct line from Marx through Kalecki and Steindl was Paul Baran’s book, The Political Economy of Growth (1957), which presented a theory of the dynamics of monopoly capitalism and opened up a new perspective on the nature of the interaction between developed and underdeveloped capitalist societies. This was followed by the joint work of Baran and Sweezy, Monopoly Capital: An Essay on the American Economic and Social Order (1966), incorporating ideas from both of their earlier works and attempting to elucidate, in the words of their introduction, the “mechanism linking the foundation of society (under monopoly capitalism) with what Marxists call its political, cultural, and ideological superstructure.” Their effort, however, still fell short of a comprehensive theory of monopoly capitalism since it neglected “a subject which occupies a central place in Marx’s study of capitalism,” that is, a systematic inquiry into “the consequences which the particular kinds of technological change characteristic of the monopoly capitalist period have had for the nature of work, the composition (and differentiation) of the working class, the psychology of workers, the forms of working-class organization and struggle, and so on.” A pioneering effort to fill this gap in the theory of monopoly capitalism was made by Harry Braverman a few years later (Braverman, 1974), which in turn did much to stimulate renewed research into changing trends in work processes and labor relations in the late twentieth century.
Marx wrote in the preface to the first edition of volume I of Capital that “it is the ultimate aim of this work to lay bare the economic law of motion of modern society.” What emerged, running like a red thread through the whole work, could perhaps better be called a theory of the accumulation of capital. In what respect, if at all, can it be said that latter-day theories of monopoly capitalism modify or add to Marx’s analysis of the accumulation process?
As far as form is concerned, the theory remains basically unchanged, and modifications in content are in the direction of putting even greater emphasis on certain tendencies already demonstrated by Marx to be inherent in the accumulation process. This is true of concentration and centralization, and even more spectacularly so of the role of what Marx called the credit system, now grown to monstrous proportions compared to the small beginnings of his day. In addition, and perhaps most important, the new theories seek to demonstrate that monopoly capitalism is more prone than its competitive predecessor to generating unsustainable rates of accumulation, leading to crises, depressions and prolonged periods of stagnation.
The reasoning here follows a line of thought which recurs in Marx’s writings, especially in the unfinished later volumes of Capital (including Theories of Surplus Value): individual capitalists always strive to increase their accumulation to the maximum extent possible, without regard for the ultimate overall effect on the demand for the increasing output of the economy’s expanding capacity to produce. Marx summed this up in the well-known formula that “the real barrier to capitalist production is capital itself.” The upshot of the new theories is that the widespread introduction of monopoly raises this barrier still higher. It does this in three ways.
(1) Monopolistic organization gives capital an advantage in its struggle with labor, hence tends to raise the rate of surplus value and to make possible a higher rate of accumulation.
(2) With monopoly (or oligopoly) prices replacing competitive prices, a uniform rate of profit gives way to a hierarchy of profit rates—highest in the most concentrated industries, lowest in the most competitive. This means that the distribution of surplus value is skewed in favor of the larger units of capital which characteristically accumulate a greater proportion of their profits than smaller units of capital, once again making possible a higher rate of accumulation.
(3) On the demand side of the accumulation equation, monopolistic industries adopt a policy of slowing down and carefully regulating the expansion of productive capacity in order to maintain their higher rates of profit.
Translated into the language of Keynesian macro theory, these consequences of monopoly mean that the savings potential of the system is increased, while the opportunities for profitable investment are reduced. Other things being equal, therefore, the level of income and employment under monopoly capitalism is lower than it would be in a more competitive environment.
To convert this insight into a dynamic theory, it is necessary to see monopolization (the concentration and centralization of capital) as an ongoing historical process. At the beginning of the transition from the competitive to the monopolistic stage, the accumulation process is only minimally affected. But with the passage of time the impact grows and tends sooner or later to become a crucial factor in the functioning of the system. This, according to monopoly capitalist theory, accounts for the prolonged stagnation of the 1930s as well as for the return of stagnation in the 1970s and 1980s following the exhaustion of the long boom caused by the Second World War and its multifaceted aftermath effects.
Neither mainstream economics nor traditional Marxian theory has been able to offer a satisfactory explanation of the stagnation phenomenon which has loomed increasingly large in the history of the capitalist world during the twentieth century. It is thus the distinctive contribution of monopoly capitalist theory to have tackled this problem head on and in the process to have generated a rich body of literature which draws on and adds to the work of the great economic thinkers of the last 150 years. A representative sampling of this literature, together with editorial introductions and interpretations, is contained in Foster and Szlajfer (1984).” Paul Sweezy, “Monopoly Capital,” in The New Palgrave Dictionary of Economics, 1987, via Monthly Review, 2004
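The Keynesian restatement in the excerpt, that monopoly raises the system’s savings potential while narrowing the outlets for profitable investment, can be illustrated with the simplest textbook income-determination arithmetic, where equilibrium income Y satisfies planned saving equals planned investment, so Y = I / s. A minimal sketch follows; the saving rates and investment figures are hypothetical numbers chosen only to show the direction of the effect, not estimates drawn from any of the works cited.

```python
# Simplest Keynesian-cross arithmetic (all scenario numbers hypothetical).
# With saving S = s * Y and planned investment I, equilibrium requires
# S = I, hence Y = I / s.

def equilibrium_income(investment: float, saving_rate: float) -> float:
    """Income level at which planned saving equals planned investment."""
    return investment / saving_rate

# "Competitive" scenario: lower saving propensity, more investment outlets.
y_competitive = equilibrium_income(investment=150.0, saving_rate=0.15)

# "Monopoly" scenario: a higher surplus/saving rate (points 1 and 2 in the
# excerpt) and restrained capacity expansion (point 3) lower both terms'
# favorable effect on income.
y_monopoly = equilibrium_income(investment=120.0, saving_rate=0.20)

print(y_competitive, y_monopoly)  # 1000.0 600.0
```

Raising the saving rate or trimming investment each depresses equilibrium income on its own; the theory's claim is that monopoly does both at once.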
The facade of ‘neutrality’ and the non-sequitur of ‘objectivity’ nowhere exercise a more mysterious and pernicious primacy than in the arena of media policy and production, the central locus of the creation of culture in the current context of the Internet’s fruition and ‘reality television’s’ rise as aspects of the power of the electromagnetic spectrum. This surreal fetishization of an intellectual impossibility, which holds creators to a rotten standard that one can no more achieve than one can, to coin a phrase, ‘fill God’s shoes,’ serves obvious and yet often overlooked sociopolitical purposes. First, it makes certain that no ‘established’ mediation can easily either ‘take sides’ against establishment machinations or illustrate the biases and truly objective one-sidedness of the ruling perspectives of the day. Second, it makes essentially unattainable any embrace of dialectical thinking or awareness of paradox, since all ‘centrist’ ideation inherently eschews polarity and its necessary revelations of contradiction. Third, it completely sunders the critical connection between the discovery of knowledge and any action to effectuate what such understanding implies as important to do, inasmuch as being neutral means that one must remain indifferent to ‘interested’ outcomes. Such a listing of invidious results might continue, though perhaps one can state the case with sufficient force by saying that maintaining a commitment to fairness is not even slightly incompatible with ridding ourselves of the contemptible concatenations of a detachment that violates both the laws of physics and the potential for any sort of ethical or moral foundation.
Quote of the Day
Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven’t found it yet, keep looking. Don’t settle. As with all matters of the heart, you’ll know when you find it. Steve Jobs
Doc of the Day
1. Jaqueline Smetak, 1994.
2. Fidel Castro, 2008.
3. Judith Butler, 2011.
Numero Uno—“Praise boss when morning work bells chime. Praise him for chunks of overtime.
Praise him whose bloody wars we fight. Praise him, fat leech and parasite.
Whether we are old enough to remember the Vietnam War or not, we all see the images: of the soldiers, of the victims of the war, of those who opposed it. Absent from the pictures are working people who, if they appear, seem strange and alien creatures: the young construction worker captured in a New York Times photo as he attacked a wheelchair-bound antiwar veteran with an American flag, or perhaps those sad and foolish people at the end of The Deer Hunter singing their sad and foolish little patriotic song. American working people, although it was they and their sons and husbands and brothers who were sent off to fight the war, are dim and silent in our memories. Philip Foner, Emeritus Professor at Lincoln University, Pennsylvania, old-time radical and respected labor historian, wrote his book to fill in some gaps and set some records straight. US Labor and the Vietnam War, barely 150 pages, doesn’t, and can’t, tell the whole story. It exists as an outline and a collection of documents, a place to start with what people in Labor did. It doesn’t fully answer the why, and those who are curious will have to search further. But Foner is a teacher, and like all good teachers, he gives us a start. The finish we will have to do ourselves.
Located on a scenic farm near Chicago and Lake Michigan, Taleamor Park hosts two-week and longer residencies for artists seeking quiet time and ample space. The co-directors seek to create a loving, esthetic, and respectful atmosphere that balances needs for creative solitude and for community. There is a simple online application form due six weeks before the start of the month of the proposed residency, but later applications will be considered whenever space remains. However, assistantship applications for any 2017 date are due February 14, 2017.
Pacific Review accepts a variety of work, including comics, visual art, photography, documented performance, hybrid, fiction, memoir and more.
OPB is hiring a Video Producer/Writer to tell compelling stories exploring Oregon history. This salaried, exempt position is a full-time, regular-status position with benefits. For more information and instructions on how to apply, go to OPB’s careers page. OPB is an Equal Opportunity Employer.
Today is Women’s Day in Zoroastrian tradition, Engineer’s Day elsewhere in Iran, Flag Day in Mexico, and National Artists Day in Thailand; in the Eastern Roman Empire seventeen hundred fourteen years ago, Emperor Galerius first promulgated an edict that permitted and encouraged persecution of Christians, a situation that continued for nearly a decade; a hundred eighty-one years past that, in 484, meanwhile, the Vandal King Huneric furthered his support for Arian thinking by removing Christian bishops, banishing some to Corsica and martyring others; more or less exactly a thousand and ninety-eight years thereafter, in 1582, Pope Gregory XIII instituted the Gregorian Calendar to correct the Julian calendar’s accumulated drift from the solar year; precisely a quarter century further along, in 1607, one of the first operas opened its performances as L’Orfeo, in Mantua, Italy; another century and four years onward, in 1711, a George Frideric Handel opera—the first Italian show written for an English stage—premiered in London; twenty-eight years yet later on, in 1739, Persia’s ruler led his empire’s forces in a decisive victory at the Battle of Karnal over the Mughal Emperor in India, setting the stage for the sack of Delhi; two hundred thirty-one years back, a baby boy opened his eyes who would rise as the thinker, writer, and folklorist, Wilhelm Grimm; two hundred fourteen years prior to today, the Supreme Court issued its decision in Marbury v.
Madison, which established the primacy of judicial review as a ‘court of last appeal’ for issues of law and policy; twenty-eight years past that conjunction, in 1831, the Treaty of Dancing Rabbit Creek finalized the ejection of Mississippi’s Choctaw Indians in accordance with the U.S.’s Indian Removal Act; a hundred forty-nine years before the here and now, Andrew Johnson became the first President to face impeachment by the House of Representatives; another thirteen years thereafter, in 1881, China and Russia signed a treaty that settled border and control issues mostly in favor of the Qing Dynasty; fourteen years forward from there, in 1895, insurrection began near Santiago, Cuba, leading soon enough to the transfer of imperial domination from Spain to the United States; thirteen years hence, in 1908, the Supreme Court upheld an Oregon statute that disallowed forcing women to work longer than ten-hour days; four years more on time’s path, in 1912, police in
Lawrence, Massachusetts, viciously beat children and women who had taken their fathers’ and husbands’ place on the picket line; half a decade past that, in 1917, across the Atlantic in London, the U.S. ambassador received the Zimmermann telegram, in which German authorities promised Mexico the return of Texas, New Mexico, and Arizona if that nation would declare war on the United States; seven hundred thirty days henceforth, in 1919, the U.S. Congress passed another in a long line of laws that criminalized child industrial labor, its predecessor having just suffered the fate that all of these laws did till the end of the New Deal, to wit, the Supreme Court found them an unwarranted intrusion on the property rights of the rich; two decades onward, in 1939, the Supreme Court did its part, in a different way, to defend ruling-class profiteering, finding sit-down strikes illegal interference with property’s imprimatur; three years subsequent to that instant, in 1942, Canada’s war administrator authorized internment of all Japanese residents and citizens; thirteen years nearer to now, in 1955, a male child was born who would mature as the wildly successful Steve Jobs; just three hundred sixty-five days after that moment, in 1956, a baby girl entered our midst who would grow up as the incisive theorist of gender and identity and consciousness, Judith Butler; nine years further along the temporal arc, in 1965, the Service Employees
International Union, Local 1199 became the first labor organization to condemn the U.S.’s illegal intervention in Vietnam; two years shy of two decades past that exact second, in 1983, the U.S. Congress’ condemnation of placing Japanese in concentration camps four decades before resulted in a final report about these matters; half a dozen years still closer to now, in 1989, Ayatollah Khomeini put up $3 million for the assassin who would murder the author of The Satanic Verses, Salman Rushdie; just three years less than two decades more proximate to the present pass, in 2006, the magnificent maven of science fiction and novel storytelling, Octavia Butler, closed her eyes for the final time; just two years hence, in 2008, Fidel Castro, after nearly a half century at the helm of Cuba, stepped down from power; four additional years in the direction of now, in 2012, Jan Berenstain, beloved children’s storyteller, breathed her last.