Why did the British “Indian” army come disproportionately from particular regions?

During World War II, the British fielded several armies that included Indians, but the most "Indian" of these was the 14th Army, in which 8 of 13 divisions were predominantly "Indian." A disproportionate number of the battalions in this army were either "Gurkha" (Nepal) or "Punjab" units. These were soldiers from the borderlands of modern India (or from just across the border).

Notably absent, or at least underrepresented, were units from south and central India, that is, the country's "mainstream." Why was that? Did Gandhi's philosophy of non-violence carry greater sway with "most" Indians than with those discussed in the paragraph above?

Gandhi came too late to have any influence on this.

The British rulers of India believed in inherent racial differences. There was considerable scholarly and popular obsession with classifying ethnic groups and attributing characteristics to them.

From the Wikipedia article on the "Martial Races" theory:

The doctrine of 'martial races' postulated that the qualities that make a useful soldier are inherited and that most Indians, with the exception of the specified groups, did not have the requisite traits that would make them warriors. The British recruited heavily from the 'martial races' for service in the colonial army. Sensing the inequalities and fierce loyalty to one's tribe or group of the diverse native peoples of the subcontinent, the British found opportunities to use it to their own great advantage. These already wide divides were a fertile breeding ground to inculcate pride in one's identity based on 'race'. This also forwarded the Divide and Rule Policy of the British.

The rebellion of 1857 was another factor. The Bengal troopers who first revolted were recruits drawn from the martial stock of Bhumihars (my stock, incidentally) and Rajputs of Bihar and eastern Uttar Pradesh. However, the loyal Pashtuns, Punjabis, Gurkhas, Kumaonis and Garhwalis did not join the mutiny; they fought on the side of the British Army and helped to suppress the rebellion. This reinforced the British policy of recruitment.

The British regarded the 'martial races' as valiant and strong but also intellectually inferior, lacking the initiative or leadership qualities to command large military formations. They were also regarded as politically subservient or docile to authority. For these reasons, the martial races theory did not lead to officers being recruited from them; recruitment was based on social class and loyalty to the British Raj. Winston Churchill was reportedly concerned that the theory was abandoned during the war and wrote to the Commander-in-Chief, India that he must "rely as much as possible on the martial races".

Upon its inception, the Indian Army inherited the British Indian Army's organisational structure, which continues to the present. Like its predecessor, an Indian infantry regiment's responsibility is not to undertake field operations but to provide battalions and well-trained personnel to the field formations; as such, it is common to find battalions of the same regiment spread across several brigades, divisions, corps, commands, and even theatres.

Like their British and Commonwealth counterparts, troops enlisted within a regiment are immensely loyal, take great pride in the regiment to which they are assigned, and generally spend their entire career within it. All of the regiments classified erstwhile as 'martial' exist to date, and except for the Gurkhas and the Sikhs, no ethnic or religious preference is exercised in recruitment to any of them. Special mention must be made of the raising of the Gurkha Rifles.

This is an interesting interview.

Sikhs from the Punjab and Nepalese Gurkhas were purposely recruited into the British army because these two regions were the hardest to conquer during the British conquest of the subcontinent.




Black British people

Black British people are British citizens of either African descent or of Black African-Caribbean (sometimes called "Afro-Caribbean") background. [5] The term Black British developed in the 1950s, referring to the Black British West Indian people from the former Caribbean British colonies in the West Indies (i.e., the New Commonwealth), now referred to as the Windrush Generation, and people from Africa who are residents of the United Kingdom and are British. Black people with connections to Liverpool are also placed under this classification. It is also sometimes used to refer to the United Kingdom's Asian population, particularly those from India, Pakistan or Bangladesh. [6] [7]

  1. ^ For the purpose of harmonising results to make them comparable across the United Kingdom, the ONS includes individuals in Scotland who classified themselves in the "African" category (29,638 people), which in the Scottish version of the census is separate from "Caribbean or Black" (6,540 people), [2] in this "Black or Black British" category. The ONS note that "the African categories used in Scotland could potentially capture White/Asian/Other African in addition to Black identities". [3]

The term black has historically had a number of applications as a racial and political label and may be used in a wider sociopolitical context to encompass a broader range of non-European ethnic minority populations in Britain. This has become a controversial definition. [8] Black British is one of various self-designation entries used in official UK ethnicity classifications.

Black residents constituted around 3 per cent of the United Kingdom's population in 2011. The figures have increased from the 1991 census, when 1.63% of the population were recorded as Black or Black British, to 1.15 million residents in 2001, or 2 per cent of the population; this further increased to just over 1.9 million in 2011. Almost 97% of Black Britons live in England, particularly in England's larger urban areas, with most (over a million) living in Greater London.

The resource curse

In the Niger Delta, people’s rights are rarely provided for and frequently violated. Many of those who live in the region experience very poor living standards. The abundance of natural resource wealth from the region’s oil and gas reserves, coupled with weak governance, has produced the ‘resource curse’ (or ‘paradox of plenty’), maintaining this injustice. The combination of natural resource wealth and weak governance enables those in positions of political and other power to benefit disproportionately from oil money, and the oil and gas industry to operate below international standards. Many historic oil spills have never been cleaned up, with severe consequences for local health as well as for local livelihoods: farmland and fishing waters are poisoned when contaminated by oil.

Lack of accountability

As a result, leaders at all levels often lack accountability to the populations they are supposed to serve. Many aspects of government and public services are mismanaged or function ineffectively, and the Niger Delta—and Nigeria as a whole—has seen little benefit or development from its natural resource wealth, principally oil and gas reserves. The democratic system is undermined by political patronage that uses money and violence to achieve and retain power. This is likely to be a major factor in why elections in the Niger Delta have suffered from major fraud and violence. Even between election cycles, the Niger Delta suffers violence perpetrated by those seeking political power and control over resources, high levels of criminality, and ongoing environmental damage caused by oil spills and gas which is flared, or burnt off, as a by-product of oil production.

The rise and fall of militancy

From 2004, escalating armed and violent conflict between gangs and against government and oil company security forces increased the levels of militancy in the region. Particularly well-known militant groups include the Movement for the Emancipation of the Niger Delta (MEND) and the Niger Delta Avengers. In the wider context of underdevelopment, widespread oil pollution, a lack of livelihood options, and a largely unresponsive government, militants targeted oil pipelines and kidnapped foreign oil workers for ransom. This militancy affected oil production to the extent that the price of oil rose above US$100 per barrel for the first time in world history. In 2009, approximately 26,000 militants agreed to a ceasefire, and the government led an amnesty process that makes payments and offers opportunities to ex-militants, intended to prevent a return to militancy. The Presidential Amnesty Programme is still operating to this day, and there is relative stability, but it is fragile, as the original conditions giving rise to the militancy have not been fully addressed. There is also no concrete exit plan for this hugely expensive programme.

Despite the relief from militancy, there is still everyday insecurity within areas of the Niger Delta, where cultism, piracy, corrupt security personnel, and destabilising economic activities are rife. Between 2010 and 2017, 20% of households in the Niger Delta were affected by conflict, with one in twenty experiencing a bereavement due to conflict over that period. The fall in militancy has, however, created the space for a vibrant civil society to develop and for organisations to come into the Niger Delta to encourage a more sustainable peace, political stability, and economic diversity in the region.

Attempts to improve the system

In the face of these challenges, inhabitants of the Niger Delta have few options to seek redress. The justice system is flawed and hard to access, for example in claims for compensation for nearby oil spills. Furthermore, collective action can be met with a disproportionate and intolerant response from government and other actors, including violence. However, some compensation cases against Shell and its subsidiaries have been taken to England, where Shell is headquartered, resulting in out-of-court settlements that amount to many times what was originally offered to the affected communities. Although the legal representation required to take this action is available to very few, it may prove to be an alternative avenue to seek justice where the Nigerian processes are inadequate.

The resulting sense of injustice is a major source of conflict, while the unequal ability to influence change is another key feature. For instance, women and young people are almost entirely excluded from formal politics.

International politics and interests also shape the situation in the region. For example, global demand for oil means that the interests of foreign investors in Nigeria can be favoured over its people. Companies wishing to exploit the weak monitoring and enforcement of standards in Nigeria, therefore, can easily do so—as may be the case with the import of dirty fuels from Europe and elsewhere.

At SDN, we produce research on many of these issues, and implement projects to address them.

Women, Gender, and World War II

The Second World War changed the United States for women, and women in turn transformed their nation. Over three hundred fifty thousand women volunteered for military service, while twenty times as many stepped into civilian jobs, including positions previously closed to them. More than seven million women who had not been wage earners before the war joined eleven million women already in the American work force. Between 1941 and 1945, an untold number moved away from their hometowns to take advantage of wartime opportunities, but many more remained in place, organizing home front initiatives to conserve resources, to build morale, to raise funds, and to fill jobs left by men who entered military service.

The U.S. government, together with the nation’s private sector, instructed women on many fronts and carefully scrutinized their responses to the wartime emergency. The foremost message to women—that their activities and sacrifices would be needed only “for the duration” of the war—was both a promise and an order, suggesting that the war and the opportunities it created would end simultaneously. Social mores were tested by the demands of war, allowing women to benefit from the shifts and make alterations of their own. Yet dominant gender norms provided ways to maintain social order amidst fast-paced change, and when some women challenged these norms, they faced harsh criticism. Race, class, sexuality, age, religion, education, and region of birth, among other factors, combined to limit opportunities for some women while expanding them for others.

However temporary and unprecedented the wartime crisis, American women would find that their individual and collective experiences from 1941 to 1945 prevented them from stepping back into a prewar social and economic structure. By stretching and reshaping gender norms and roles, World War II and the women who lived it laid solid foundations for the various civil rights movements that would sweep the United States and grip the American imagination in the second half of the 20th century.



The wartime arenas where American women witnessed—and often helped to generate—crucial changes and challenges were wage-based employment, volunteer work, military service, and sexual expression. In each of these arenas, women exercised initiative, autonomy, circumspection, caution, or discretion according to their individual needs and the dictates of patriotic duty.

Wage Work and Opportunity

Economic opportunities abounded for women willing and able to seize them. Wage work in war industries offered hourly pay rates much higher than those to which most women had been accustomed, with the best wages paid in munitions plants and the aircraft industry. Women were encouraged to apply for “war work” after President Franklin Roosevelt created the U.S. War Manpower Commission (WMC) to mobilize Americans in various venues for a total war effort. In August 1942, the WMC organized a Women’s Advisory Committee to consider how female employees could be used most effectively toward this end. Late in 1942, the WMC announced a new campaign to recruit women workers after estimating that “the great majority” of some five million new employees in 1943 would have to be women. The WMC also identified one hundred U.S. cities as “Critical War Areas,” with intent to marshal the “widely dispersed” womanpower reserves in these cities. The main targets were local married women who already lived in the designated metropolitan areas, including middle-aged and older individuals who had never worked outside their homes or whose experience was limited to domestic work. A major challenge would be “to remove social stigma attached to the idea of women working,” the WMC literature noted. 1 Since the employment of married women had been a long-standing practice in working-class families and in the middle-class African American community, the WMC propaganda implicitly targeted white middle-class women who had not typically worked for wages.

Madison Avenue advertising agencies designed and produced a variety of propaganda campaigns for the U.S. government, including the WMC’s bold declaration and appeal late in 1942: “Women Workers Will Win the War.” Local U.S. Employment Service offices coordinated efforts to place women in jobs best suited to their skills and family needs. Mothers with children under fourteen were encouraged not to seek employment outside their homes unless other family members or trusted neighbors could offer reliable childcare. 2 The propaganda campaigns generated posters, billboards, films, and radio announcements urging women to join the work force; some touted their domestic skills as advantageous for carrying out defense work, since women were thought to excel at repetitive tasks requiring small operations with fine details. While the images overwhelmingly featured young, white, married women, an occasional entreaty announced, “Grandma’s got her gun,” referring to an elderly worker’s riveting tool. Several corporations with U.S. government contracts proudly sponsored chapters of the War Working Grandmothers of America. In Washington war agencies, the demographic defined as “older” meant “women over 35.” 3 Women of color rarely appeared in advertisements for industrial work, although their accomplishments and workplace awards were widely reviewed in African American newspapers and journals, including the NAACP’s principal publication, The Crisis, and the National Urban League’s Opportunity. Such coverage constituted a vital part of the “Double V” campaign, an effort launched by the black press to defeat racism at home while troops fought fascism abroad. 4

American women became artillery inspectors, aircraft welders, sheet metal assemblers, gear cutters, lathe operators, chemical analysts, and mechanics of all kinds. Length and depth of training varied according to industry, with many forced to learn quickly if not “on the job” itself. By 1944, skilled female workers earned an average weekly wage of $31.21. In spite of federal regulations requiring equitable pay for similar work, their male counterparts in similar positions earned $54.65 weekly. 5 Years of experience in specific jobs accounted for some wage disparity between men and women but could not account for aggregate discrimination during the war years. However unequal their wages compared with men’s, women in defense industries out-earned most “pink collar” employees who held retail, service, or clerical jobs. Constance Bowman, a schoolteacher who spent the summer of 1943 working in a San Diego B-24 bomber factory, earned 68 cents an hour. A beginning sales clerk at the upscale Bullock’s Wilshire Department Store in Los Angeles earned about $20 a week, two thirds of a factory worker’s salary. 6 If women were able to cross boundaries into the “masculinized” workplaces of heavy industry, they would be remunerated more handsomely than women who remained in safely “feminized” spheres of employment, but they would not always see paychecks matching those of their male co-workers, even when they faced the same workplace challenges and hazards.

The Women’s Bureau (WB) at the U.S. Department of Labor sent field representatives to factories throughout the country to scrutinize working conditions. Among the WB administrators’ gravest concerns were endangered female bodies on factory floors, where safety seemed subordinate to management’s production quotas and workers’ personal style preferences. An alarming New York Times story announced in January 1944 that American “industry deaths” since the attack on Pearl Harbor had exceeded the “number killed in war” by 7,500. 7 The Labor Department tried to convince American women to prioritize safety when choosing work apparel: to wear safety shoes or boots rather than ordinary footwear and to don protective caps or helmets rather than bandanas and scarves. A WB analyst reported that “the most distressing accident” in war industry resulted from long hair catching in machinery. In Rhode Island a woman was “completely scalped” after her hair coiled on an assembly line belt. The Office of War Information (OWI), the U.S. government’s chief propaganda agency, produced documents illustrating proper and improper ways to style and wear hair in industrial jobs. The WB urged factories to adopt rules about head coverings as well as safety shoes and slacks. The Labor Department even designed “fashionable” caps and hats in a variety of shapes and colors, since their research concluded that women did not wish to look exactly like one another in the workplace. 8

More shocking than minimal head protection was the use of substandard footwear, which led U.S. Secretary of Labor Frances Perkins to sound a warning bell at a 1943 “Women in War Industries” conference. In her opening address, Perkins noted that most industrial accidents among women were in the “slip, fall, and stumble categories,” leading her to recommend that work uniforms include “shoes devised particularly to help women prevent” such accidents. 9 Perkins and others concerned about occupational safety had to contend with American shoe retailers—and their representatives in Washington—who insisted that women would want to wear their sandals, moccasins, and espadrilles to work. 10 Retail store managers were told they could assist in recruitment and retention of female defense workers by displaying attractive work clothes that promoted safety, neatness, and good health. 11 In spite of U.S. government war agencies’ directives to defense plants to enforce safety standards on all fronts, some Labor Department inspectors found that corporate managers would not comply until threatened with prosecution. 12

Munitions makers and retailers alike were encouraged to take women employees’ “health and beauty” needs seriously, providing them with cosmetics, soaps, and sanitary supplies to use in workplace restrooms and lounges. Such comfort packages would not merely attract employees but also keep them content and more likely to stay after they had been hired. 13 The Labor Department recommended a sufficient number of showers and lockers on site for particular industries, such as shipbuilding, where women preferred to travel to and from work in their “street clothes.” 14 Working women saw magazine advertisements instructing them to pay particularly close attention to skincare and personal hygiene, lest they lose their “femininity” in the much-altered economic and social landscape of wartime America. 15

Job opportunities and steady wages could not offset for many the hardships of fulltime employment: shift work, long commutes, limited childcare options, and inconvenient shopping hours for food and other necessities. Very few grocery and department store owners chose to accommodate women who needed to do their shopping in the late evening or night hours. That women workers got sick more often than men was attributed to the fact that they were doing, “in effect, two fulltime jobs.” 16 U.S. government promises to organize day care centers in war boom areas went largely unfulfilled, meeting the needs of a mere fraction of the large population of working mothers. The public childcare project was not funded until 1943, and “even then, the centers provided care for only 10 percent of the children who needed it.” 17

While limited training, sore muscles, and exhaustion from the home/work double shift discouraged many women, added burdens for women of color included workplace discrimination and harassment. They endured racial slurs and physical attacks in factories, and disproportionately filled the lowest-paid and least appealing jobs, including janitorial work. The Fair Employment Practices Committee (FEPC)—created by Executive Order 8802 in 1941 to address racial discrimination in industry—lacked the funds to handle the wave of complaints engendered by rapid wartime mobilization. When FEPC cases faced delays, black women searching for work or seeking promotions in their current jobs suffered the most. But women of color, like all American women, found their greatest challenge to be reconciling home life and work life during the war years. Opportunity magazine noted that black women in defense jobs grew “much more irritated than men by periods of standing around and doing nothing,” since they knew they could use the down time running errands for their second shift duties at home. One commentator suggested release of workers in factory down periods in order to promote “better morale” and to stem the tide of absenteeism, a significant problem among female employees eighteen months into the war. 18

American women were encouraged to consider every job a war job, however irrelevant a particular position might seem with regard to the military effort. Beyond riveting and welding, other tasks required even more hands and minds nationwide. The United States needed farm laborers, telephone operators, laundry workers, food servers, and bus drivers. Three million women cultivated crops in the federal agriculture program known as the Women’s Land Army. And while women had filled clerical positions for nearly half a century in the United States, the war accelerated the trend. Women took certain places as men vacated them, with the U.S. government offering hundreds of thousands of desk jobs to anyone who could file, type, and take dictation. The expanding bureaucratic structure of war was matched by private sector growth, where American businesses were forced to open their doors and offices to female employees. With the military draft taking its share of male, middle-class clerks and salesmen, openings for women abounded in the consumer economy. Radio stations, insurance firms, and advertising agencies hired more women than ever before. Banking, in particular, saw “feminization” in its employment ranks. At the beginning of the war, some sixty-five thousand women worked in banking, but by the end of 1944, approximately one hundred thirty thousand women were bank employees, constituting nearly one half of the industry’s total personnel. 19

Volunteer Work

Beyond those who earned wages, millions of women donated their time, money, or both, especially in the realm of morale work. Those who cultivated a genuine spirit of volunteerism saw their work bear fruit, even though some groups were criticized for their “charity bazaar” approach. Images circulated of the rich snob who sat at a booth for a few hours a week but remained oblivious to real sacrifice. 20 A government handbook for the American Women’s Voluntary Service (AWVS) clarified the organization’s purpose as well as its diverse membership in many states, where women carried out “real hard work.” They took classes on home repair and first aid, helped children, and learned practical wartime skills such as map reading, convoy driving, clinical photography, and Morse code. The AWVS affected every aspect of wartime culture, sending its members to assist military personnel, distribute ration books, sell war bonds, and collect salvage, as well as to recruit blood donors, nurses, farm workers, and child care workers, and to knit, sew, and recondition clothes for military families and relief agencies. 21

AWVS chapters took pride in their “non-sectarian, non-political, non-profit-making” status to encourage women from many backgrounds to join their ranks. Across the country the AWVS made strides in several socially sensitive areas including interracial cooperation. Indeed, African American women urged others to support the organization, because it “transcend[ed] any consideration of race, or color, or class, or caste.” The AWVS became a place where, through their work together, women could understand “each other’s problems and shortcomings and consciously or unconsciously, [develop] an appreciation of each other’s virtues,” one member reported. Interracial volunteer activities among women spurred optimism for a more inclusive postwar America while stimulating the growth of similar organizations where women could meet and serve a larger cause. 22

In the realm of “morale,” the presumed purview of women, one group enjoyed the spotlight above all others—the United Service Organizations (USO). In assisting and entertaining U.S. military troops, USO volunteers were asked to consider their work the female equivalent of military service. Through gender-defined actions and activities, USO volunteers were expected to assume particular mental and emotional postures when dealing with soldiers and sailors. The ideal USO junior hostess’s femininity quotient was determined in part by her ability to yield to a serviceman’s wishes within the boundaries of middle-class American womanhood. How she presented herself would determine the reactions of soldiers and sailors, she was instructed. Patience, general optimism, and good listening skills were a good hostess’s requisite qualities. Since many USO sites provided games, women played table tennis, checkers, and cards, and often allowed their male opponents to win. Such “gendered emotional work” meant women were not to appear too smart or too competitive; to challenge a serviceman’s masculinity undermined the organization’s purpose of supporting male service members’ morale. As historian Meghan Winchell argues, “If a hostess made a serviceman happy, then she had done her job, and this, not meeting her own interests, theoretically provided her with satisfaction.” Her selflessness would presumably reinforce cultural gender norms and uphold social order in the midst of wartime crisis. 23

This requisite “cheerful selflessness” was matched by the initiative of women who chose to relocate near their spouses’ military installations. In packed trains and buses, often with young children in tow, they made their way cross-country to visit or live near their husbands. One observer called them “the saddest and most predictable feature of the crowded train stations and bus terminals.” 24 War brides on the move could easily identify each other and found comfort in their shared condition. 25 African American army wives who accompanied their husbands to Fort Huachuca, Arizona, lived in a squalid “unconverted barrack” outside the camp’s gates; during the day they served the base as secretaries, janitors, cooks, food servers, launderers, and maids in white officers’ homes. But their main priority, according to a reporter for The Crisis, was “the morale of their menfolk.” 26

Military Service

Women who volunteered for military service posed a great challenge to the collective consciousness about gender and sexual norms and clear gender divisions, especially regarding who could be considered a soldier, sailor, or marine. The women in uniform closest to the front lines were nurses, government-sanctioned “angels of mercy” whose work Americans more readily accepted because it reflected expectations that women were natural caregivers. Precedent also helped to secure the public’s approval of women serving in this capacity; both the army nurse corps and navy nurse corps had existed since the early 20th century, with more than twenty thousand military nurses serving during the First World War, half of them in overseas duty. But female volunteers in military organizations founded during World War II faced tougher scrutiny than nurses; their womanhood and femininity were questioned by many detractors, even though the idea of national service for women was not new. As early as 1940, First Lady Eleanor Roosevelt had recommended a required service responsibility (although not specifically a military duty) for all young American women. 27 Roosevelt did not get her peacetime wish, but after the U.S. declared war in December 1941, the mobilization of women as assistants in the army seemed not merely plausible but imperative. U.S. Congresswoman Edith Nourse Rogers’ bill to that effect had languished since May 1941, but in May 1942, Congress approved it and President Roosevelt signed it, creating the all-volunteer Women’s Army Auxiliary Corps.

Three additional military units followed the creation of a women’s army. The women’s naval organization, Women Accepted for Volunteer Emergency Service (WAVES), was founded in July of 1942; the women’s coast guard, Semper Paratus Always Ready (SPAR), followed in November; and finally, the U.S. Marine Corps Women’s Reserve (USMCWR) was established in February 1943. All four of the women’s military groups were designed to release men who held military desk jobs and other stateside responsibilities for combat duty, something many men resented. In addition, because of the expansive mobilization of the military for the war, thousands of new clerical positions emerged in all branches of the armed services, and this too inspired calls for female military personnel. As one colorful recruitment poster directed at women commanded, “Be A Marine. Free A Marine to Fight.” Recruiters had to proceed cautiously with a message whose logic told women that joining a military service organization would send more men to their deaths. Even so, the message reinforced gender differences—women might wear uniforms, march in formation, and be promoted, but only men could face enemy forces at battle sites. Thus, men continued to dominate the most masculine of human activities—warfare—which was further masculinized by U.S. government propaganda in the 1940s. 28

The Women Airforce Service Pilots (WASP) did not receive military status during World War II, but members participated in the American war effort by ferrying planes from factory sites to defense camps and embarkation points. These female aviators also tested new aircraft, hauled cargo, and assisted male pilots in training exercises. In 1944, U.S. Army Air Forces General Henry “Hap” Arnold publicly declared WASP pilots as capable as their male counterparts. Thirty-eight women died serving in the WASP during its two-year existence (1942–44), yet none of the pilots’ families received government support for their funerals because the organization was not officially militarized. 29

Propaganda aimed at enticing women to join one of the military forces touted substantial base pay in addition to food, lodging, clothing, and medical and dental care. But the Office of War Information (OWI) insisted that recruitment messages refrain from appealing “entirely to the self-interest approach.” Women were not supposed to entertain individual needs or wishes, but instead to join for higher, nobler reasons: “patriotism and the desire to help our fighting men,” the OWI instructed. 30 Even so, years later, many female soldiers, sailors, marines, and pilots admitted to volunteering because they wanted an adventure or independence or both. 31

Figure 1. Recruitment poster created by the Office for Emergency Management, Office of War Information-Domestic Operations Branch, Bureau of Special Services, 1944. U.S. National Archives (44-PA-260A).

In 1943, the women’s army group discarded its “auxiliary” status to become an integral part of the U.S. Army and was renamed the Women’s Army Corps (WAC), a move that generated an outpouring of criticism, concern, and derision. Male GIs carried out a smear campaign against the organization. They spread rumors that WAC volunteers served as prostitutes for male soldiers, reinforcing a notion that army life encouraged promiscuity. Some wondered whether incorporating the WAC into the regular army meant that its members would—like their male counterparts—be issued condoms. Would army life encourage sexual activity among female volunteers? 32 Women’s sexual autonomy was judged not simply in ethical terms; it was considered transgressive behavior that aligned them too closely with men in uniform, whose masculinity was often measured by their sexual prowess and emphasized during the war years. 33 The blurring or crossing of gender and sexual lines in this realm implied a social disorder that many Americans could not abide.

Worries about women’s sexual independence also inspired rumors of a “lesbian threat” in the WAC. In the 1940s, both American medical opinion and public opinion associated female sexual “deviance” as much with a woman’s appearance as with her actions. Androgyny or, in wartime language, a “mannish” way, could mark a woman as suspect, since she challenged the rules of femininity that grounded heterosexuality and secured a traditional social order. As women stepped into previously all-male venues during the war years, gender “disguise” could be interpreted as dangerous. Acutely aware of this, WAC director Colonel Oveta Culp Hobby ordered army women “to avoid rough or masculine appearance which would cause unfavorable public comment.” 34 In the spring of 1944, female mechanics at Ellington Air Base, Texas, attended lectures about “proper dress for work” with a warning not to “roll up” the legs or sleeves of their coveralls. One Ellington mechanic wrote to her parents, “We are now buttoned and covered from tip to toe.” The OWI instructed advertisers and illustrators to show female soldiers in “complete G. I. uniform” and never “smoking or drinking alcoholic beverages,” concerns not voiced about men in uniform. These rules of propriety indicated the preeminent role that clothing played in assigning gender and sexual identities during the war. Even the appearance of impropriety could be grounds for dismissal and a dishonorable discharge. 35

Beyond appeals to patriotic duty, the U.S. government’s preeminent recruitment message emphasized gender, declaring: “Women in uniform are no less feminine than before they enlisted.” In fact, officials hoped to appeal to women’s sartorial interests by using fashion plate graphic designs in recruitment literature. Illustrations of female soldiers posing as atelier models and department store mannequins displayed the numerous stylish items in a military wardrobe—from foundations to outerwear—together worth about $250. The idea was not only to recruit women but also to counter critics who railed against the idea of women’s military organizations in the United States. The tactics worked: many volunteers admitted joining one organization or another because they liked the uniforms. 36

Enlistment criteria, training, and job assignments varied widely by organization. The WAC accepted volunteers with a minimum of two years of high school, while the WAVES required a high school diploma, with college “strongly recommended.” Female marines in the women’s reserve (WRs) needed at least two years of college credit. Their respective training models also bespoke their differences. While WAC recruits trained, lived, and worked at army camps, WAVES and WRs took instruction on college campuses. As a result of the varying minimum standards for enlistment in the women’s services, the WAC became home to a more ethnically and racially diverse population, and it enlisted women from a wider range of socio-economic backgrounds, including those who could not afford to attend college. More age-diverse as well, the WAC welcomed women between the ages of 20 and 50 who had no children under 14 years, whereas the WAVES, SPAR, and USMCWR limited their volunteer base to women between the ages of 20 and 36 who had no children under 18. Of the four women’s military services, only the WAC allowed its members to serve overseas. 37

To alert women to the army’s variety of needs and encourage them to volunteer, the WAC advertised “239 kinds of jobs.” Many recruits received specialized army training in radio, chemistry, mechanics, and other fields, while others brought previously honed skills, such as foreign language training, into the army. Bilingual Latinas, for example, were recruited specifically for cryptology and interpretation; a special unit composed of two hundred Puerto Rican WAC volunteers served at the New York Port of Embarkation and other locations dedicated to the shipment of U.S. troops. Nevertheless, some female soldiers were given tasks considered “women’s work” rather than jobs they had been promised or trained to do. WAC officer Betty Bandel discovered low morale among troops whose expectations about their roles were not met. The army had given them domestic tasks, similar to those they had held in civilian life, or it had failed to utilize the professional expertise they brought with them into service. Disappointed at what she and her colleagues interpreted as gender discrimination, Bandel confided to her mother that some Army Air Force units had even requested that Wacs do the pilots’ laundry and provide “troop entertainment.” 38

Women of color who wished to join military units faced steep discrimination. Excluded from the WAVES and SPAR until November 1944, and barred altogether from the wartime marines and the WASP, sixty-five hundred African American women joined a segregated women’s army. As one of the first female African American army officers, Charity Adams experienced vicious discrimination at Ft. Des Moines on several occasions. Early in her training, a higher-ranking white male officer—a fellow South Carolinian—excoriated Adams for appearing at the officers’ club one evening. Throughout his lengthy peroration, Adams stood silently at attention while the colonel reminded her about segregation laws, the southern past, racialized slavery, and her “place” in this scheme. 39 Adams persevered at the Iowa base, rising in the ranks to major and commanding an all-black battalion of eight hundred fifty women assigned to a postal unit in Great Britain and France in 1945. But she spent many hours at Ft. Des Moines tending to “extra” duties that fellow soldiers expected of her because she was black; one of those tasks was cultivating the small Victory Garden at their barracks. Other women of color in uniform were assaulted at southern railway stations, denied access to facilities and dining cars on trains, and treated with disdain in towns near their bases and well beyond. 40

Japanese American women, initially barred from joining the Women’s Army Corps, were admitted beginning in November 1943, but organization officials preferred that news outlets not publicize the inductions of Nisei women. 41 The WAVES, the second largest women’s military organization, did not accept Japanese American volunteers during the war. The pervasiveness of anti-Japanese sentiment adversely affected U.S. citizens of Japanese ancestry, many of whom strove to prove their loyalty in the face of embedded racism and a nationwide hatred that took even deeper root among white supremacists as the 1940s wore on. 42

Sex, Marriage, and Motherhood

Loosening sexual mores, skyrocketing marriage rates, and a burgeoning baby boom characterized the war years. Casual sexual relations among the unmarried startled many Americans, who blamed young women—especially those who worked outside their homes—for shifting standards. Government propaganda associated the spread of sexually transmitted diseases, such as syphilis and gonorrhea, with women rather than men by casting disease carriers as female. 43 Among the most vulnerable to infected women, official media suggested, were America’s men in uniform. Posters warned: “She May Look Clean—But” and, in 1941, before the United States entered the war, the May Act declared prostitution near U.S. defense camps a federal crime. Yet the vast wartime mobilization effort combined with the cultural politics of the early 1940s gave American women wide latitude to express and enjoy sexual intimacy in the name of patriotism. Many who migrated to war boom cities and military installations left behind constraints on sexual behavior that had guided them in their home communities. As circumstances “opened up new sexual possibilities,” women more freely explored their erotic desires. 44 For example, lesbians socialized, fell in love, and “began to name and talk about who they were,” contributing to one of the war’s significant legacies, the establishment and reinforcement of lesbian and gay communities. 
45 At the same time, shifting social standards made more women open targets for sexual innuendo and unwelcome invitations from strangers. San Diego factory worker Constance Bowman wrote about catcalls and whistles and, on one occasion, a marine stalking her down a street with the persistent entreaty, “How about a little war work, Sister?” 46 The intersections of rapid defense mobilization, loosened social constraints, and greater female sexual autonomy created a home front where women became a “suspect category, subject to surveillance for the duration of the war,” Marilyn Hegarty argues. 47

Paradoxically, in the midst of wartime fear and surveillance of women’s sexuality, female allure and glamour were used to sell everything from laundry detergent to soda pop to troop morale. The World War II years marked the heyday of the “pin-up girl” and an unprecedented display of American women’s bodies: movie stars such as Betty Grable, Rita Hayworth, and Lana Turner posed seductively for photographers and other artists, whose prints, posters, and calendars were reproduced in the millions and circulated widely. Ordinary American women copied these poses in photographs that they sent stateside to military camps and overseas to battlefronts. 48 And many women took the next logical step by literally offering their bodies—out of patriotic duty, to cap a brief encounter, or to seal a romantic relationship. 49

High U.S. marriage rates during World War II created a “Wartime Marriage Boom.” Between 1940 and 1943, some 6,579,000 marriages took place, yielding over 1.1 million more marriages than rates in the 1920s and 1930s would have predicted. 50 A “bridal terror” had emerged soon after the Selective Service Act of 1940 initiated the United States’ first peacetime draft, and a rumored “man shortage” took hold of the American imagination midway through the war. Early on it was unclear how marriage and parenthood might affect military deferments, leading couples to tie the knot with expectations of securing extra time. In addition, with the wartime draft extending to males between the ages of 18 and 45, the pool of eligible men for marriage had presumably shrunk. By 1944, rising U.S. casualty figures also contributed to the alarm. In large cities and defense camp areas, where soldiers and sailors congregated before deployment, “the urge to send men away happy meant numerous intimate liaisons, quick marriages, or both.” Many couples barely knew each other before taking their vows. A 1944 U.S. Census Bureau survey revealed that more than 2.7 million young, married women had husbands away in the armed services. The following year, the U.S. Census Bureau reported that more marriages had occurred “in each of the past four years than in any prior year in the history of the United States.” 51 War mobilization encouraged many couples to marry sooner than they had planned and others to marry soon after meeting each other. Many of these long-distance relationships unraveled over the war years, and the high wartime marriage rates were followed by the highest divorce rates in U.S. history. 52

A baby boom accompanied the marriage boom, and many young mothers were left alone to care for their children and make ends meet. The more resourceful of them pooled their funds by “tripling up” in apartments, splitting the rent and food costs, and sharing childcare and housekeeping responsibilities. 53 Others found childcare where they could in order to take advantage of defense industry jobs. These working mothers received limited assistance from federally sponsored childcare facilities that had been authorized under the 1940 Lanham Act, an extension of the Depression-era public works projects. Underfunded and concentrated primarily in war boom areas, federal childcare centers served some six hundred thousand children during the war years; yet at their greatest use, they served only 13 percent of children who needed them. Americans’ steadfast belief in a mother’s responsibility to remain at home with her children persisted during World War II; even the war emergency failed to temper this deeply entrenched, middle-class standard. 54 The notable exception to otherwise meager organized childcare assistance came on the west coast, where the Kaiser Shipbuilding Company provided its female employees in Washington, Oregon, and California with reliable, well-staffed facilities. The Richmond shipyards in the San Francisco Bay area oversaw approximately fourteen hundred children daily. 55

Figure 2. Josie Lucille Owens, Kaiser Shipyards, Richmond, California.

Working mothers were forced to make difficult choices during the war years. Some chose second shifts or night shifts so they could be with their children during the day and work while the children slept. Others who worked day shifts were criticized for leaving their children. In several defense boom areas, social workers and school staff speculated that women entering the work force were spurred by “additional income and a too great readiness to evade full responsibility for their children” rather than “patriotic motives.” 56 Pressure on mothers to assume full responsibility for their children intensified during the war years, as reports of increasing juvenile delinquency appeared in magazines and newspapers. In A Generation of Vipers (1942), Philip Wylie criticized “Mom” for many “social discomforts and ills,” particularly the problems of American youth. FBI Director J. Edgar Hoover instructed mothers to stop “the drift of normal youth toward immorality and crime,” telling them not to take war jobs if their employment meant “the hiring of another woman to come in and take care of [their] children.” American society, in spite of the wartime emergency, barely budged on its expectations of working mothers. 57

Figure 3. “And then in my spare time . . .” Bob Barnes for the Office of War Information, ca. 1943. Prints and Photographs Division, Library of Congress (LC-USZ62-97636), digital ID: cph 3b43729.

Mobility, Sacrifice, and Patriotic Duty

Women’s growing independence during World War II was visibly characterized by their mobility. The cities, towns, and camps attracting them were located on both coasts and everywhere in between—Washington, DC, Seattle, Portland, Mobile, Detroit, St. Louis, and numerous other places where the prospects of war work, steady wages, or other opportunities beckoned. Some traveled occasionally to see their sweethearts, sons, and husbands, while others took to the road daily or weekly to punch time clocks in defense factories. Extending and expanding the Great Migration from the rural south to urban, industrial America, black women entered shipyards, ordnance plants, and bomber factories in unprecedented numbers.

Industrial growth and military mobilization allowed women to crisscross the nation in trains and buses, but their new mobility left many Americans uneasy and discontented. Women who traveled or lived alone were viewed with suspicion, while those who crowded into teeming defense areas, with or without their families, were often treated with scorn by local residents. In Portland, Oregon, community women criticized female shipyard workers who came into town “dirty and tired” at the end of their shifts. In Mobile, Alabama, a woman berated newcomers as “the lowest type of poor whites, these workers flocking in from the backwoods. They prefer to live in shacks and go barefoot . . . Give them a good home and they wouldn’t know what to do with it.” Many were met with the Depression-era epithet, “Okies.” In addition to the contempt they endured, migrants had to tolerate conditions that posed health risks: overcrowded boarding houses, makeshift accommodations, brimming sewers, limited water supplies, and hard-pressed local schools. 58

In the nation’s capital, thousands of women who answered the persistent calls for office workers—a “Girls for Washington Jobs” campaign—created a “spectacle” that “staggered the imagination.” The women arrived in the city to find substandard lodging, if they found it at all. Construction on U.S. government residence halls that had been promised to unmarried female workers lagged months behind schedule, forcing women to find rooms in boardinghouses run by mercenary landlords or strict matrons. 59

Testing a woman’s conscience about her full participation in the war effort was commonplace in home front propaganda. She was supposed to want to undertake defense work, fill volunteer positions, or join a women’s military organization in order to support combat troops and out of a sense of patriotic duty. To use such positions to launch personal independence of any kind—especially financial—could be viewed as selfish or even reckless. African American sociologist Walter Chivers observed in 1943 that black women who thought they had left domestic work behind by seizing defense jobs would once again “have to seek employment in the white woman’s home.” An appeal for more military nurses late in the war asked: “Is Your Comfort as Important as the Lives of 15 Wounded Soldiers?” 60

Women were advised to spend their extra coins and dollars on war bonds or other U.S. government initiatives. The 1942 handbook Calling All Women advised that a ten-cent war stamp would purchase “a set of insignia for the Army” or “five .45 cartridges for the Marine Corps.” The 6th War Bond Drive in 1944 included a “Pin Money War Bond” promotion for women who previously had been unable to afford to buy bonds; whether unemployed or underemployed, they could spend pennies and nickels to fill a “stamp” album that would eventually convert to a war bond. Eleanor Sewall, a Lockheed Aircraft employee whose husband was captured on Bataan, was heralded by the company for her decision to contribute 50 percent of her salary in payroll deductions toward war bonds. Beyond such an investment’s practical value in assisting the government, less disposable income for women would limit paths to financial independence that could be viewed as self-serving. Sacrifice in the cause of patriotic duty would temper desires for—and achievement of—personal autonomy. 61

Among the many American women who sacrificed during the war were those who served near the front lines or had family members in military service. The sixty-six nurses who were captured by the Japanese on Corregidor spent three years in Santo Tomas prison camp in Manila. Besides sharing scarce food and limited supplies with three thousand other American and British prisoners, they shared three showers and five toilets with the five hundred other women there. 62 American mothers, wives, sisters, and sweethearts together lost more than four hundred thousand loved ones—the nation’s wartime death toll—during the war. The writer Zelda Popkin noted that some women became “widows before they were really wives.” 63

Lasting Changes

Amidst sacrifice and loss, many American women clung to the opportunities extended to them during World War II. Prewar gender expectations had been tested and found wanting. Susan B. Anthony II, great-niece and namesake of the women’s suffrage fighter, argued in 1944 that women had proven their abilities in every field and therefore deserved “equal pay for equal work, a right grudgingly acceded” them during the war. Having worked all three shifts as a grinder in the Washington Navy Yard machine shop, while her fifty-six-year-old mother worked at a Pennsylvania radar factory, Anthony was confident that war’s end would “mark a turning point in women’s road to full equality.” 64

If the Allies’ fight for “freedom” meant personal independence, then American women had embraced it in the early 1940s. Of the “Four Freedoms” articulated by President Roosevelt in January 1941, “freedom from want” and “freedom from fear” went a long way toward explaining why some American women enjoyed the financial, social, and emotional rewards of the war years. The large number of those who developed skills and carried out new work, who put on military uniforms, married quickly, engaged in sexual activity freely, or moved several hundred miles away from home—or all of these—did so inside the grander framework of national and global crisis. Out of crisis, the most meaningful transformations emanated from the confidence they developed and the independence they felt and exercised. Many feared these would fade or be retracted after the war, and their fears were justified. From popular culture to social commentary to political leadership, powerful voices urged women to “go back home to provide jobs for service men,” despite the fact that the jobs many held were not available to servicemen before the war and that many returning servicemen had not worked for wages regularly in the 1930s. 65 Numerous surveys and polls of female workers found that most wanted to remain in the work force rather than return to their prewar employment conditions. 66 Efforts to “contain” women during the late 1940s and convince them to embrace a middle-class dream where they would play starring roles as domestic goddesses in their own homes eventually backfired. 67 Their wartime experiences combined with collective memory not only affected their daughters, sisters, and friends directly, but also reinforced the deep foundations of the equality crusades—from civil rights to women’s rights to workers’ rights to gay and lesbian rights—that would take center stage in the postwar generations.

Discussion of the Literature

Women featured in a few early histories of the Second World War, but they did not receive much scholarly notice as a group until the late 1970s, after the women’s movement and the field of women’s history had gained traction. The simultaneous influence of the social sciences on history contributed to the heightened interest in women as subjects—they could be counted, plotted on graphs, and studied in the aggregate, especially as war workers. Thus the earliest scholarship highlighted women’s contributions to U.S. success in World War II, particularly through their work as builders and inspectors of military equipment. Leila J. Rupp’s book Mobilizing Women for War: German and American Propaganda, 1939–1945 (1978) focused on the U.S. government propaganda campaigns to get women into the factories and other places of employment and to keep them there for the duration. 68

In the 1980s, four landmark works appeared, establishing the vital role of American women in the Second World War and positing an essential question: How did women’s work for wages affect their abilities as wives, mothers, and homemakers? In Wartime Women: Sex Roles, Family Relations, and the Status of Women during World War II (1981), Karen Anderson focused on three of the fastest-growing industrial areas for war production: Detroit, Baltimore, and Seattle. Anderson unveiled the underside of these burgeoning urban workplaces, with their racial tensions and violence, age discrimination, and unfulfilled government promises to working homemakers who needed assistance with shopping, meal preparation, and child care. Susan Hartmann’s The Home Front and Beyond: American Women in the 1940s (1982) launched Twayne’s American Women in the Twentieth Century series, a chronological history organized by decade. That Hartmann analyzed the 1940s, whole and entire, allowed readers to see the social and political forces operating to encourage the maintenance of traditional, clearly defined gender duties in postwar America (1945–1949), namely homemaking and motherhood for women. 69

In 1984, D’Ann Campbell published the cleverly titled Women At War With America: Private Lives in a Patriotic Era, a work that approached various groups of American women in terms of their roles and resources. Using the rich material produced by social scientists and their organizations during the war, Campbell combined the techniques of both social scientist and humanist to show that military women, homemakers, stateside service wives, and female industrial laborers, among others, fared much worse on all fronts than the one group singled out and heralded because its work fit within acceptable gender parameters: nurses. All of these groups had gone to war, many answering the numerous calls to assist however they could, but Campbell demonstrated that American women remained at war with a nation that extended opportunities to them while simultaneously reining them in. 70

The fourth significant book published in the 1980s, Maureen Honey’s Creating Rosie the Riveter: Class, Gender, and Propaganda during World War II (1984), revealed how high-circulation magazines aimed at particular audiences sought to appeal to women on the basis of class status and values. In addition to these four important works, Alice Kessler-Harris and Ruth Milkman also conducted studies in the 1980s on the challenges women faced as laborers during World War II. By the end of the decade these historians and other scholars generally agreed that the war had offered myriad and measurable opportunities to women of all races and at all socioeconomic levels, but the options proved temporary, resulting in little significant redefinition of the cultural gender norms that had cast women primarily as wives and mothers. 71

This early scholarship was enriched by oral history projects begun in earnest in the 1980s, notably Sherna Berger Gluck’s interviews of southern California war workers in Rosie the Riveter Revisited: Women, the War and Social Change (1987), a collection that encouraged scholars to follow Gluck’s lead in focusing on the personal narratives of women who now seemed comfortable talking candidly about their wartime experiences. Oral history projects would flourish in the 1990s, as fiftieth anniversary commemorations of U.S. involvement in World War II not only marked specific events but also prompted an urgency to record aging participants’ stories. Scholars’ concentration on particular locales or geographic regions, as well as on specific groups of women or the jobs they carried out, became organizing principles for a succession of oral history collections, some available online and others in print, such as Cindy Weigand’s Texas Women in World War II (2003) and Jeffrey S. Suchanek’s Star Spangled Hearts: American Women Veterans of World War II (2011). 72

While oral history projects flourished in the 1990s and beyond, Judy Barrett Litoff and David Smith began soliciting, collecting, and publishing as many wartime letters as possible. Their quest, begun in 1990, continues a generation later, with an amassed total of over 30,000 letters written by women. Litoff and Smith’s edited collections remain a starting point for any scholar pursuing the voices of ordinary American women who corresponded during the war. 73

The emerging field of cultural studies influenced scholarship from the 1990s forward, bringing gender and sexuality to the fore. The questions raised by cultural studies required scholars to consider the intersections of race, ethnicity, class, and sexuality as central elements in how women were viewed and what they experienced as a result. In Abiding Courage, Gretchen Lemke-Santangelo surveyed African American women who had migrated to northern California’s East Bay area, where employment in the shipyards and auxiliary industries offered economic opportunities unavailable in the Jim Crow south. Leisa D. Meyer’s Creating GI Jane revealed the myriad challenges, both real and imaginary, posed by a women’s army—notably Americans’ views on who could and should be a soldier and what that meant for a social order dependent on clear-cut gender norms; Meyer was one of the first to analyze lesbian Wacs during the war. Maureen Honey’s edited collection of primary sources, Bitter Fruit: African American Women in World War II (1999), investigated how women of color were depicted in popular culture, including the African American press, and how they negotiated these characterizations in addition to the challenges of wartime mobility, displacement, and opportunity. 74

In recent years, scholars examining American women during World War II have synthesized and built on the foundations laid by the previous generation, taking further the equations linking gender, sexuality, personal autonomy, and the media’s role in guiding individual and collective self-awareness, behavior, and cultural values. The historians’ titles reveal not only the characterizations of wartime women but also the pressures brought to bear on them during the crisis: Marilyn Hegarty’s Victory Girls, Khaki-Wackies, and Patriotutes: The Regulation of Female Sexuality during World War II (2008), Meghan K. Winchell’s Good Girls, Good Food, Good Fun: The Story of USO Hostesses during World War II (2008), and Melissa A. McEuen’s Making War, Making Women: Femininity and Duty on the American Home Front, 1941–1945 (2011) all pose research questions that uncover uneasy truths about the measured oversight and careful management of American women during a U.S. war inspired by and fought to defend “freedom.” Similar questions remain today as historians still seek to understand how U.S. propaganda agencies, and American media in general, depicted women during the war, and what this meant to them, to those conducting the war effort, and to the nation at large. 75

Primary Sources

Primary sources depicting or targeting American women during World War II—including photographs, posters, cartoons, advertisements, letters, government documents, and oral history interviews—are available in several major collections, most notably at the Library of Congress, the National Archives at College Park, Maryland, and Duke University’s Rubenstein Library.

A good place to initiate any study of women on the home front is with “Rosie Pictures,” a selection of images of wartime workers from the Library of Congress, Prints and Photographs Division. The representative sampling in “Rosie Pictures” hints at what may be found among the library’s vast holdings of visual images, including the invaluable Farm Security Administration-Office of War Information Collection, comprising 175,000 photographs taken by U.S. government photographers who traveled throughout the nation between 1935 and 1944. The collection has been carefully curated, with each item fully described and contextualized, and nearly all of the items digitized.

The National Archives Library Information Center (ALIC) has organized information on women topically, so that the subject of war may be pursued from several angles and according to themes such as “women in the military” or “African American women.” Links to a variety of websites containing women’s history materials—though not necessarily items housed in the National Archives—may be found at the ALIC’s reference hub on Women. Millions of the U.S. government’s paper records not yet digitized are available at the College Park research facility, including documents produced by federal agencies created during the Second World War for specific objectives, such as the Office of War Information, the War Manpower Commission, and the War Production Board. At the U.S. Department of Labor, the Women’s Bureau generated countless pages of reports during the war, and all are available to researchers who visit the National Archives.

Duke University’s Rubenstein Library houses a variety of primary source materials in several major collections, including the War Effort Mobilization Campaigns Poster Collection, 1942‐1945, and the extensive Guide to the J. Walter Thompson Company. World War II Advertising Collection, 1940‐1948. Additional collections located in the John W. Hartman Center for Sales, Advertising, and Marketing History at the Rubenstein Library offer such resources as roadside billboard advertisements and department store window displays, designed to appeal to female consumers in the 1940s. Finally, among Duke University Libraries’ Digital Collections is Ad Access, a database of magazine and newspaper advertisements that features over 1,700 items from the war years, including official propaganda and many promotions directed specifically at women.

Three other significant primary source collections deserve attention and offer scholars insight into women’s lives and experiences during World War II. Interview transcripts and video excerpts of interviews conducted for the “Rosie the Riveter WWII American Home Front Project” by the Regional Oral History Office at the University of California, Berkeley, are available at the Bancroft Library site. Northwestern University Library’s World War II Poster Collection contains 338 items, thoroughly identified and contextualized, many of them featuring women; images are available as high-resolution files to facilitate close analysis. For wartime correspondence, there is no better starting point than the U.S. Women and World War II Letter Writing Project, developed by Professor Judy Barrett Litoff at Bryant University and housed there in 175 boxes. Several hundred letters are available as PDFs on the project site, along with a helpful Finding Aid to the entire collection, prepared by Litoff.

A number of museums and special exhibits devoted to American women’s roles and contributions in World War II contain valuable primary sources and historical analysis. These include: The Farm Labor Project: Brooklyn College Oral Histories on World War II and the McCarthy Era, Brooklyn College; “Focus on: Women at War,” See & Hear Collections, The National World War II Museum, New Orleans; National WASP World War II Museum, Sweetwater, Texas; “Partners in Winning the War: American Women in World War II,” National Women’s History Museum, Alexandria, Virginia; “Women Come to the Front,” Library of Congress; “WAVES, World War II, Establishment of Women’s Reserve,” Naval History and Heritage Command; and “World War II: Women and the War,” Women in Military Service for America Memorial Foundation, Arlington, Virginia.

Why the U.S. Military Is So Southern

Because in some parts of America, the call to duty is heard more clearly than in others.

Much has been written in recent years about the growing gulf between the worlds inhabited by America’s civilians and the members of its military, whose 1.34 million active-duty personnel, as of May 2016, account for just 0.4 percent of the population. One of the more remarkable, but less noticed, aspects of this divide is its geographical bent.

According to the U.S. Department of Defense’s most recent Population Representation in the Military Services report:

Almost 44 percent of all military recruits came from the South.

That’s despite the region having only about 36 percent of the nation’s relevant population. And Southerners have been similarly overrepresented in military recruiting classes for decades now — since 1990, they have consistently made up more than two-fifths of America’s enlistees.

Are the young Americans inhabiting the 16 states and the District of Columbia that make up the U.S. Census’ South Region somehow more patriotic than their counterparts in the other 34 states? Why — now that we are more than 150 years removed from the Civil War — is the Union’s army so disproportionately Southern?

Many throughout history have observed the martial qualities of the Southerner. As he camped in Mississippi in September 1863, Union General William T. Sherman observed in a letter of the “young bloods of the South” he had been fighting:

War suits them, and the rascals are brave, fine riders, bold to rashness … and they are the most dangerous set of men that this war has turned loose upon the world. They … must all be killed or employed by us before we can hope for peace.

More recently, in a 1997 interview, former Senator and Secretary of the Navy Jim Webb observed of his own Southern, Scotch-Irish heritage, that “we have been soldiers for 2,000 years. The military virtues have been passed down at the dinner table.” Research suggests, as Harvard psychologist Steven Pinker chronicles in The Better Angels of Our Nature, that the South has a distinct history of violence and a culture of honor that can be traced back to the Scotch-Irish herders who settled there — more belligerent than farming communities because they must protect their flocks — and that still persists today. Could this culture of honor and military tradition help explain the Southerners’ disproportionate numbers in the U.S. armed forces?

While there may be some regional difference in the propensity to serve, says Anita Hattiangadi, a research team leader at the Center for Naval Analyses, which prepares the PopRep report for the DOD, the higher military recruitment numbers in the South can likely be attributed to a number of other factors as well, including regional differences in “veteran populations, school accessibility, recruiter distribution and goaling, and the location of military bases.”

Indeed, it is important to remember that the South is not only where many members of the military are from but also where they live and work. Georgia, North Carolina, Virginia and Texas are home to a number of military bases, and if you add in California, then almost half of the nation’s active-duty service members reside in those five states alone. Given that roughly 80 percent of those who serve have, like Webb, a parent or sibling in the military, it makes sense that the children growing up around these bases where their parents work will be more likely to enlist.

This factor is supported by the historical data. In the first decade and a half that the DOD did its PopRep report (1973–87), the South’s representation among enlistees was closer to its share of the population (33 to 36 percent), but after military base closures and realignment during the late 1980s and early 1990s moved forces away from the West and Northeast and concentrated them more heavily in the Southern and coastal “Gun Belt” states, the Southern enlistee numbers followed suit.

The result today, in many ways, is a separate warrior class, concentrated in the South and living in isolated military communities and installations like Fayetteville, North Carolina. The families making up these communities may have a proud tradition of service, but it can be a double-edged saber they carry. “When you see other cultures having strengths that don’t require you to go out and get your butt shot off,” observes Webb, “this particular cultural strength seems thankless and kind of a curse, but it’s there.”

The Military Defeat of the South Africans in Angola by Horace Campbell (Monthly Review)

In Angola in the spring of 1988 the armed forces of apartheid South Africa and the US-backed mercenaries of Jonas Savimbi were defeated by the combined force of the Cuban military, the Angolan army, and the military units of the liberation movements of South Africa and Namibia. This led directly to the independence of Namibia and then to the fall of the apartheid regime in South Africa itself. Cuba’s heroic role is the outstanding example of principled anti-imperialist internationalism in the last decades of the twentieth century.

We celebrate the twenty-fifth anniversary of these events by reprinting the account by Horace Campbell that appeared in Monthly Review in April 1989, with some pride at having published so sharp an analysis of current events—events largely ignored by the mass media then and since. We then present a military-focused historical analysis by Monthly Review Press author Ronnie Kasrils, who had the extraordinary fate to have headed ANC military intelligence in the battle alongside the Cubans, and then to have served for five years as Deputy Minister of Defense in the post-apartheid South African government—in regular contact with officers who had commanded the opposing forces. —The Editors


Ten years after United Nations Resolution 435 laid the basis for an independent Namibia, the South Africans agreed to withdraw from the territory they still occupied in defiance of international opinion. In a ceremony at UN headquarters in New York on December 22, 1988, an agreement was signed by Angola, Cuba, and South Africa, with the United States ostensibly acting as mediator. This accord was a major step toward self-determination for the peoples of Southern Africa, for it finally gave the United Nations Transition Assistance Group the go-ahead to implement steps for the withdrawal of South African troops from Namibia, the return of refugees, elections, and independence for the territory. This historic agreement came not because of the tenacious negotiating of U.S. Assistant Secretary of State for African Affairs Chester Crocker, but because of the decisive military defeat of the South African forces at Cuito Cuanavale in Angola (see map on page 43).

Between October 1987 and June 1988, in the fiercest conventional battles on African soil since Erwin Rommel was defeated at El Alamein, the South African Defence Forces (SADF) fought pitched tank and artillery battles with the Angolan army (FAPLA, the People’s Armed Forces for the Liberation of Angola) and its Cuban supporters at Cuito Cuanavale. This small base located in southeastern Angola became important in the military history of Africa, for there the South African army, supposedly the best on the continent, was trapped with its tanks and artillery and held down more than 300 miles from its bases in Namibia. Failing to take Cuito Cuanavale with over 9,000 soldiers, even after announcing that it had done so, losing air superiority, and faced with mutinies among black troops and a high casualty rate among whites, the South Africans reached such a desperate situation that President Botha had to fly to the war zone when the operational command of the SADF broke down.

With Cuban reinforcements, the Angolans withstood major assaults on January 23, February 25, and March 23. The South Africans were repulsed with heavy losses, and the Angolan/Cuban forces seized the initiative. For the first time since 1981, the Angolan army was able to reoccupy the area adjacent to Namibia. So confident were the Angolans and Cubans that in the space of less than three months they built two air strips to consolidate their recapture of the southern province of Cunene. Trapped by the rainy season, bogged down by the terrain, and encircled, the South Africans made one desperate attempt to break out on June 27 and were again defeated. One South African newspaper called the defeat “a crushing humiliation.”

These episodes of war were followed by diplomatic initiatives that the South Africans had previously been able to block. After the March 23 reversals at Cuito Cuanavale, the South Africans started talks that culminated in the December 22 agreement. For the Angolans, who had been fighting continuously since 1961, the war and diplomacy were focused not only on the limited question of the South African withdrawal from Angola, but also on ending South African destabilization of the region and on independence for Namibia. Diplomatic initiatives accelerated after the South Africans failed to break out of their encirclement at Tchipa on June 27. Only then could the frontline combatants and the United States agree on the basis for withdrawal of the South Africans from Angola.

The Militarization of Africa

To understand the war in Angola and Southern Africa, it is crucial to comprehend militarization both at the basic level of arms transfers, weapons systems, military expenditures, and armed intervention, and also at the broader level of state power. Militarization in Southern Africa is the process by which the South African state attempts to solve its political contradictions by means of force. Its fetishism of weapons systems has become interwoven with the mystique of white superiority, as the South African army has spread all kinds of warfare across Southern Africa. It is always necessary to bear in mind the larger issues underlying militarization in Southern Africa, so that the implications of the military defeat of the South Africans for the political, social, and economic transformation of Africa can be seen.

War has always speeded the transformation or regression of society. Many times a particular battle like Cuito Cuanavale becomes decisive and becomes the basis for a change in the overall struggle of which it is a part. Von Clausewitz spoke of defense as a higher form of warfare when both combatants have the same means. This principle was important in the context of the military defeat of the South Africans, for in the siege of Cuito Cuanavale, the Angolans were not only defending their own sovereignty but also fighting for the self-determination of the African people.

Von Clausewitz also said that “no one starts a war without being clear in their mind what the real objectives are, and what they intend to achieve. The first is the political purpose and the second is the operational objective.” In the context of the war in Southern Africa, the South Africans confused their political objectives with their operational objectives and with what was actually possible given the limitations of the form of organization of South African society. Because of what South Africa represents, the fetishism of weapons became interwoven with the mystique of white superiority. But the South African militarists underestimated the capacity of the African people to resist apartheid, both outside and inside South Africa.

Militarization in Southern Africa stems from two sources: (1) Africa’s ties to the global armaments culture, and (2) the unfinished decolonization process in Africa. It encompasses all the techniques of modern military warfare, but the principal cause of its persistence is the attempt by South Africa to perpetuate a form of government that has become outmoded.

In all societies, Marx said, there is a point where the changes in the material relations can be measured with the precision of natural science. In Southern Africa, these changes have undermined the idea of separate development. At the political, philosophical, and cultural level men and women have come to understand what has happened and are fighting to transform society. In Southern Africa, there is a war between the old idea of white domination and the new idea of black liberation, as well as a clash between weapons systems, forms of political organization, and the cultures of Europe and Africa. The point where these contradictions are concentrated is the state of South Africa, where the white ruling class can no longer rule by normal means.

In the townships, there are 39,000 troops that enforce the subjugation of the African people. This brutality is so well documented that there is a United Nations arms embargo against South Africa. But this militarization of state and society has failed to crush the resistance of the democratic forces. The state of emergency—involving a ban on all political groups, the imprisonment even of nine-year-old children, and the holding of an entire society under siege—reflects the failure of apartheid to legitimate itself through ideological means.

No society can be ruled by force on a day-to-day basis. As Napoleon once said, “One can do anything with bayonets but sit on them.” Political stability requires that there be a coherent, widely accepted ideology that gives the oppressors confidence in their right to rule and resigns the oppressed to their subordination. In South Africa, the ideology of white supremacy has fallen apart. To postpone the inevitable reorganization of the region, the South Africans carry war to those societies that have articulated a higher form of social organization, at both the level of racial democracy and that of the planned use of resources for the majority of the population. This desperation of the South Africans in the face of challenges to their racist order explains the intensity of the wars in Angola and Mozambique, wars that have cost these societies thousands of lives and more than $30 billion.

It is important to understand the scope and nature of the spread of the apartheid war machine across Southern Africa:

  1. The South Africa-backed contra war in Mozambique has devastated the country. More than a million Mozambicans have been driven from their communities, over 250,000 have been killed or maimed, and the whole economy of Mozambique has been irreparably damaged. The territorial integrity of Mozambique is upheld by the intervention of Zimbabwean troops (and, until IMF pressures forced their withdrawal, by Tanzanian troops).
  2. There is a war of economic destabilization against the nine states of the Southern Africa Development Coordinating Conference—Angola, Botswana, Lesotho, Malawi, Mozambique, Swaziland, Tanzania, Zambia, and Zimbabwe. South Africa supported a coup in Lesotho in 1986 and backed an unsuccessful mercenary intervention in the Seychelles in 1981. It was behind a coup attempt in Tanzania in 1983 and has provided continuous support for armed elements in Zimbabwe since independence. The South Africans have carried out raids on Maputo (Mozambique), Harare (Zimbabwe), and Gaborone (Botswana), and attacked refugees in Swaziland.
  3. There is a counterinsurgency war in Namibia. Here the South Africans have over 120,000 troops, making it one of the most militarized spaces on the earth. This war has spilled over into a conventional war in Angola.

The Struggle For Independence in Angola

Angola does not border South Africa. This point is important to understanding why the South Africans became militarily involved there. A major reason was that the Angolan economy is not as integrated into the South African economy as the other economies of Southern Africa are.

Angola is potentially one of the wealthiest countries in Africa, and international capital is still involved in a fierce competition for its resources. It is one of Africa’s leading oil producers and, before 1973, was a major diamond and coffee source; with a population of just over 8 million, it has been underpopulated since the time of the slave trade. Portugal was so poor that it could never fully exploit the resources of its colony and opened it to capital from other exploiters. U.S. investments dominated in petroleum, British capital in diamonds and the Benguela railway, German capital in the iron mines at Kassinga, and Japanese, French, and other capital in other resources. This multinational character of capital in Angola meant that Portugal was de facto administering its colony on behalf of the European Economic Community and the United States.

The anticolonial struggle took international form from the outset, because the external orientation of the Portuguese colonies influenced the evolution of the liberation movements: their origins, ideological outlook, political orientation, and external support.

Three movements emerged out of the fragmented colonial situation:

  1. The MPLA (Popular Movement for the Liberation of Angola) was the oldest and was rooted in the urban working class. This party, led by Agostinho Neto, was linked to the intelligentsia, the educated mulattoes, and the workers in the segregated ghettos of Luanda. It was this party that largely carried out the anticolonial war and effectively thwarted foreign intervention on the eve of independence.
  2. The UPA (Union of Peoples of Angola) was originally formed as part of the attempt of sections of the Congolese aristocracy to link up with the rebelling masses of the regions adjacent to Zaire. Founded by Holden Roberto, it changed its name to the National Front for the Liberation of Angola (FNLA) as part of an attempt to go beyond a tribal basis and reach out to groups not previously in the UPA. The FNLA never seriously fought the Portuguese, and information is now coming out about the treachery of this organization, which the South Africans tried to place in power in 1975.
  3. UNITA (National Union for the Total Independence of Angola) was formed in 1966 by the foreign minister of the FNLA, Jonas Savimbi. He broke away from the FNLA, saying that its leadership was tribalist and captive to the CIA. Whatever political capital UNITA may have had (and this is all quite questionable now, given the revelation that Savimbi had been linked to the Portuguese army) was certain to evaporate once UNITA allied with the South Africans.

The First Defeat of South Africa, 1975–1976

The anticolonial struggle in Angola, which led to the 1974 coup in Portugal, was a turning point for Southern Africa. South Africa intervened with CIA support by land, sea, and air to stop the MPLA from coming to power. The Angolans asked for Cuban help to defeat the invasion led by South Africans, Zairian regulars, and CIA mercenaries. The South African invasion was turned back outside Luanda. South Africa never accepted its defeat; the 1987–1988 siege of Cuito Cuanavale was only the most recent of a number of stages in the ensuing military buildup. But the 1976 defeat at Luanda, and that in Mozambique a year earlier, also inspired the generation that was maturing in the South African townships. The 1976 Soweto uprisings and their aftermath formed an important watershed in the militarization of the state and society in Southern Africa.

It was after the first defeat in Angola and the uprisings in Soweto that the generals of the SADF formulated the “Total Strategy,” a multidimensional preparation for war involving a political strategy (the support of dissident groups to oppose liberation movements all over the region), an economic strategy (creating dependence on South African transport, communications, air traffic, rails, harbors, agriculture, and mining equipment—in effect, ensuring that the region remain open to South African capital), psychological warfare (promoting the idea that Africans cannot rule themselves, that Africans are inferior), and a military strategy. Their intention was to have the Total Strategy be primarily political, economic, and psychological, making the military effort secondary.

After the defeat in Angola and the uprisings in Soweto, a number of stages led up to the South African humiliation at Cuito Cuanavale.

The War in 1976–1980

In this period, the South Africans were on the defensive politically and diplomatically, but were staging a massive military buildup in Namibia. They began conscription, constructed new military bases, and made raids against the Southwest African People’s Organization (SWAPO), which had moved its headquarters to Luanda from Dar es Salaam after 1976. Thousands of youths dodged conscription in the tribal regiments and joined SWAPO.

It was at this point that the South Africans organized UNITA, which had previously been wandering in Angola without a clear mission. The role of UNITA changed drastically when the Carter administration persuaded the Chinese to give it 800 tons of weapons. This kind of weaponry enabled UNITA to wage conventional war; its officers were trained in Morocco, and it was thoroughly integrated into the military strategy of the SADF. This was a strange twist of history, for UNITA got its first weapons from SWAPO, to fight against the Portuguese. UNITA was now used to track SWAPO while the South African air force bombed its concentrations in Angola. It was in one such raid that the South Africans carried out the Kassinga massacre, in which over 800 people were killed at a SWAPO refugee camp in 1978. The ensuing international outcry led to the adoption of UN Resolution 435 in 1978, detailing steps leading to the independence of Namibia: briefly, the withdrawal of South African troops, the return of Namibian refugees, UN-supervised elections, and the “granting” of independence.

Phase Two: 1981–1984

The 1980 Republican electoral victory in the United States emboldened the South African government. Washington and Pretoria vowed that there should be no red flag over Windhoek. In this climate, the South Africans began a major conventional war in Angola, and the United States developed its policy of “constructive engagement and linkage.” In simple terms, this policy was a way for the South Africans to buy time to deepen apartheid structures in Namibia while Chester Crocker used the international media to divert attention from South African atrocities by linking the independence of Namibia to the withdrawal of the Cubans from Angola.

From 1981 to 1988, the SADF occupied the provinces of Cunene and Cuando Cubango in Angola. FAPLA, the Angolan army, was not prepared for this massive invasion of over 11,000 troops with the most sophisticated artillery pieces available. The South African command closely coordinated its air force and army. If the army found resistance, the air arm came in with surgical bombing strikes and then the ground troops moved forward. The provincial capital of Ngiva was sacked. Over 100,000 peasants fled their homes. The southern provinces of Angola were occupied until December of 1981, and the SADF did not withdraw even after the UN Resolution condemning the invasion. The SADF used this occupation to put elements of UNITA in place on the Namibian-Angolan border.

A major South African objective was to destabilize Angola so that the reconstruction of its economy would be postponed. UNITA carried out attacks on economic targets, especially railways, and kidnapped expatriate workers. UNITA’s headquarters was moved to Jamba, near the Namibian border, in order to be more closely integrated into the South African command structure.

The Lusaka Accord of 1984

The next major South African invasion took place in August 1983, when UNITA announced it had taken Cangamba. The South African air force had destroyed the town and turned the rubble over to UNITA to show off to journalists flown in from Zambia. South Africa wanted UNITA to take Cuvelai so that the front of the war could be driven northward. Its self-confidence was heightened by the U.S. invasion of Grenada, when anticommunist rhetoric in the United States reached an incredible peak. The South African government intended an all-out attack on Luanda, the capital of Angola. This was a case in which operational objectives were confused with the political rhetoric of anticommunism. The South African generals said that it was operationally impossible to take Luanda, but the Magnus Malan faction within the State Security Council wanted to intensify the war. The Angolans were getting more experienced, and the South Africans’ Operation Askari failed. This failure led the United States to intercede on behalf of South African troops encircled in Angola. The resulting accord was named after the Zambian capital, Lusaka. It set up a joint military commission to oversee the withdrawal of South African troops.

South Africa was increasingly caught in a complex contradiction. The conscription of blacks into its armed forces was limited by the racism of the white ruling class. The army, therefore, had to be mainly white, and the domestic labor force to be mainly black. But black factory workers knew they were producing weapons to be used against their sisters, brothers, and children, and they resisted. Only an end to racist practices could have resolved the military dilemma of the whites, but in that event there would be no political dilemma remaining to be resolved by military means. This fact should be uppermost in the minds of those who want to conceptualize the nature of the military in South Africa after apartheid.

The Third Period: 1984–1987

In September, FAPLA forces started to drive against Jamba, near the Namibian border. South Africa intervened, but with the uprisings in the townships it could not carry the battle and called upon the United States to help. The United States supplied Stinger missiles to UNITA and $15 million in additional aid. UNITA itself lacked the administrative and military infrastructure to manage this assistance, which in fact went indirectly to the South Africans. During the siege of Cuito Cuanavale, Savimbi complained that the South Africans worked out the cost of the battle and told him he had to “ask his friends to pay.” The United States also reactivated the base at Kamina in Zaire, where the CIA was dropping supplies for the South Africans via UNITA. The U.S. support for UNITA, and in essence the SADF, led to the final stage of the war.

The Defeat of the SADF

Operations Moduler and Hooper were launched to seize Menongue and set up a provisional UNITA government as a pretext for increased U.S. support. Building the roads and transporting heavy equipment for over 9,000 SADF regulars took six months.

The Angolans launched an offensive against Savimbi’s base areas in southeastern Angola, and the battle at the Lomba River was the preamble to the big battle at Cuito Cuanavale, where the Angolans decided to set up a defensive line. The SADF started its siege in November of 1987. When they faced stiff resistance from the Angolans, the operational command of the SADF broke down. It was at this point that President Botha had to boost the morale of his troops in person. This visit prompted the fortification of the Angolan position by the Cubans, who had been out of direct fighting since 1981. The Cuban command calculated that if the FAPLA defensive line broke the Cuban forces themselves would be threatened. The siege of Cuito Cuanavale now involved all the combatants of the Angolan theater of the war: the Angolans, the Cubans, SWAPO, and the ANC on one side and the SADF, the Americans, and UNITA on the other.

Supported by radar on the ground, Angolan and Cuban MiG-23s proved superior to the South African Air Force. With its air force grounded and its tanks stopped by mines and difficult terrain, the besieging force was reduced to shelling Cuito Cuanavale at long range for three months. In major ground battles in January, February, and March, the South Africans failed to take it.

By the time of the March attack, the conditions of battle had begun to turn against the SADF. First there was a mutiny by the conscripted troops of the Southwest African Territorial Force. The South Africans were racist even in military tactics, and placed black troops in front of the white troops to bear the brunt of the fighting. Second, the heavy equipment bogged down on the eastern bank of the Cuito during the rainy season. Most important, without air support, the South Africans were outgunned by the Angolans. By the end of March the South African siege was over and the South Africans themselves were trapped and under siege.

The war became more and more unpopular in South Africa when young whites began coming home in body bags. This intensified the End Conscription Campaign in South Africa and forced the South Africans to take steps leading to the talks among the principal combatants: the Angolans, the Cubans, the South Africans, and the United States. (It is important to see the United States as a combatant, and not as a peacemaker, as the Western media have suggested.) So confident were the Cubans and Angolans after repulsing the South Africans that in the space of two months they built two airfields to consolidate their control of the southern provinces. At this point the United States attempted to open a new front in the north with UNITA. The calculation was that as long as UNITA was integrated into the SADF there would be little popular support for it in the United States. The U.S. military carried out exercises called Operation Flintlock in May to drop supplies for UNITA, hoping to relieve the trapped South African forces.

The reversal of the South Africans’ military fortunes was sealed at Tchipa on June 27, 1988. Here the SADF tried to open a new front to relieve the troops trapped at Cuito Cuanavale. In this decisive battle, the FAPLA forces confirmed their air superiority. When the news of the SADF’s defeat at Calueque Dam reached South Africa, more young whites protested against the draft. One South African newspaper called the battle of Tchipa “a crushing humiliation.” It said, “The SADF resembled the trenches of the Somme, rather than the troops of a mobile counterinsurgency force.”

The Conference Table

The talks and jockeying about independence in Namibia should be seen as an attempt to win at the conference table what South Africa had lost in battle. But in reality the South Africans had only two genuine choices: to negotiate a capitulation or to surrender openly. The siege of Cuito Cuanavale ended after the SADF agreed to withdraw from Namibia. There was dithering at the diplomatic level as the prime minister of South Africa tried to get Zaire to continue the war, the Americans tried through third parties to pressure Angola to form a government of national unity with UNITA, and the Western press tried to link the South African retreat to the withdrawal of the Cuban troops from Angola. The United States has since used its influence in the UN Security Council to water down Resolution 435 by limiting the deployment of UN troops in Namibia. At the same time, the South Africans are deploying former commandos of Koevoet, a death squad-type organization, in an attempt to prevent a massive victory by SWAPO. But the siege of Cuito Cuanavale was a turning point in the process of militarization in Africa. It opened the way for the genuine decolonization of Namibia.

Our focus on the military has been guided by the way in which militarism has compounded the crisis of reconstruction in Africa. Angola has suffered disproportionately, and its economy has had to postpone reorganization in order to meet the South African invasion. In the past three years, more than 50 percent of the Angolan budget had to be spent on weapons. The Angolans will inherit refugees, amputees, demolished homes, and a destroyed economic infrastructure after this war. The political and economic challenges facing Angolan society will be as formidable as the military battles with the SADF were.

War has profound effects on any society, and the impact of Cuito Cuanavale is still unfolding in Southern Africa. In Namibia, SWAPO and others struggling for independence now have renewed confidence. The siege of Cuito Cuanavale has changed the military balance in Southern Africa on the side of liberation: self-determination, not white domination, is the agenda of Africa today. But the cost of the war also highlighted the tremendous burdens that have been placed on the African peoples who bear the brunt of capitalist oppression. The political and economic battles they now face will be as demanding as the military struggles from which they have so recently emerged victorious. These tasks confront a generation that has matured in the post-independence period. The unfolding of this process will have repercussions well beyond Africa.

Brief History Of Oligarchy (Full Version)

Oligarchy is a word derived from the Greek language. It means the rule of the few. We have come to associate it with rich Russian businessmen, but American society, behind the facade of democracy, has come more and more to resemble an Oligarchy. Oligarchs use their enormous influence to bend the country to their own benefit, to the exclusion of its other members. A key characteristic of Oligarchy is that the relationships between the various Oligarchs form the basis of rule.

Oligarchs tend to associate only with other Oligarchs. They see themselves as the movers and shakers of this world. They fund think tanks and lobbyists and develop proteges, all to steer politics to their benefit. This allows the majority to have little, if any, real say in the body politic. As Winston Churchill once said:

“The best argument against democracy is a five-minute conversation with the average voter.”

Oligarchs have a superiority complex, and a belief that only in their hands can the best for their country be achieved. They have synonymised the country’s interests with their own. This is the trick they use to convince the majority that they work on behalf of all. Meanwhile, they siphon off wealth into their own pockets. This increases inequality and fuels the sense of an unjust society.

Oligarchy does have its weaknesses. An elite clan can become very insular. They can become divorced from the reality of day-to-day life and unable to respond to growing popular discontent. Their very lack of diversity and generally conservative outlook can mean missed opportunities. New and exciting developments, like blockchain and cryptocurrencies for example, are seen as threatening to Oligarchical power rather than embraced as liberating. Finally, Oligarchies are anathema to free markets and free trade. Fixing is the order of the day: manipulation, insider trading, high levels of sycophancy and corruption.

Famous Oligarchies of history have included Sparta, the Roman Republic, the Venetian Republic and the British Empire. Plato viewed Oligarchy as “a constitution teeming with many ills”. One of these he identified as greed. He saw the ruling passion of Oligarchs as the accumulation and preservation of wealth. The transition to Oligarchy from other forms of rule is marked by the amassing of great private fortunes. Income inequality has grown markedly around the world, especially over the last ten years. According to a report by Oxfam last year, the world’s eight richest people hold as much wealth as the poorest 50%.

Much of the wealth, though, is hidden in our society. We can clearly see the influence of the Walton family and the Koch Brothers. Through political donations they openly seek to have laws passed in their favour. However, both the giant banking families, like the Rockefellers, and the remnants of the aristocracy are also owners of vast undisclosed wealth. Even companies like Forbes, which put these rich lists together, admit that the amounts used are on the low side. The admiration of the rich and the worship of money become the essential heart of an Oligarchic society.

Of course Oligarchs are happy to rule and get along when the going is good. The Bilderberg meetings, exposed by the intrepid reporting of Daniel Estulin and Jim Tucker, are yearly gatherings where the wealthy elite meet to discuss an agenda of important topics in global finance and politics. Papers that have since been released reveal discussions and resolutions for action well in advance of events. Bilderberg attendee Etienne Davignon admitted that the group was instrumental in setting up the Euro currency. In 2002 a discussion was held at the Bilderberg meeting about invading Iraq. An article from 2008 notes that the Bilderbergers had discussed the global financial meltdown and rescue measures back in 2006. These once-secret meetings clearly indicate an Oligarchical tendency.

In Russia today we can clearly see an Oligarchy, with Putin at its head. It is an entrenched ruling class that has evolved from the nomenklatura, the Communist bureaucrats of the Soviet Union. Following the fall of the Berlin Wall and the collapse of the USSR, Western business had hoped to enter Russia to steal and plunder her resources. However, they found that the Russians had already organised themselves into an Oligarchy capable of plundering their own nation’s wealth. The same process can be identified in China, where previously state-owned enterprises, under the guiding hand of the old Communist ruling class, are transferred into the ownership of the new post-Communist ruling class, who just happen to be the same people.

In ancient China Confucius stood up against Oligarchy. One of the main problems China faced in his time was the constant warfare driven by the private needs of the Chou dynasty. Confucius regarded many of the nobility as useless parasites and idlers. He promoted education on the basis of ability and a government filled with advisers (to the leader) who answered not to feudal loyalty but to higher moral principles. Confucius stood in opposition to the entrenched privilege that Oligarchy maintains, and was opposed to domination by a caste of rulers with no intrinsic merits of their own.

Machiavelli, who lived in Italy during the Renaissance, wrote about the body politic. Drawing on his country’s history, he described the nobility as living on the abundant revenue derived from their estates, and called them pests in his Discourses on Livy. He was of the opinion, though, that class conflict between the rich few and the many have-nots was a good thing for political development. He identified five classes: the feudal nobility; bankers and merchants; the middle class; the poor urban mass; and the country peasantry. The key was that no one group’s power became excessive. Machiavelli believed that a healthy republic required a series of checks and balances.

The idea of a series of checks and balances was enshrined at the heart of the US Constitution. The American Founding Fathers were concerned with preventing Tyranny, but they were also aware of the dangers of Oligarchy. They attempted to build safeguards into the Constitution to prevent the US from falling prey to this type of rule. Their ideal can be summed up in the slogan, “Of the People, By the People, and For the People.” Have the Oligarchs now taken over the United States? More and more commentators who have focused on the widening inequality gap, like Thomas Piketty in his book Capital in the Twenty-First Century, are drawing this conclusion. The US is being run by Oligarchs.

Part 2

The word Oligarchy, meaning rule of the few, came from the Ancient Greek civilisations that rose up in the millennia following the collapse of the mighty Egyptian Empire. The later Archaic and Classical periods saw the emergence of Greek city states like Sparta, founded around the 10th century BCE. Greece was the birthplace of Western philosophy and produced the thinkers Socrates, Plato and Aristotle. In Plato’s Republic, we can see the focus given to questions of political government.

Sparta was ruled, unusually for this time, by a group of 28 powerful men plus two kings. This group of thirty was called the council of elders, or gerousia, and it drafted resolutions that were put to the vote of an assembly of “free” men. A board of five overseers, or ephors, chosen from the “free” men, was used to counterbalance the council. These elected ephors were charged with maintaining the law, even if that meant charging an Oligarch. Spartan citizens themselves spent their time hunting, fighting and politicking. The lower orders, the workers, were excluded from government, and it was their regular uprisings that seriously undermined Sparta’s fighting capacity. The Oligarchy is also a reason for Sparta’s reputation as a conservative city state, slow to make decisions. The Spartans can be contrasted with their main rival, Athens, which was at this time experimenting with democracy.

In 411 BCE, after a hundred years of democracy, there was a revolution in Athens. This eventually resulted in a Spartan victory over Athens and the imposition of rule by the “Thirty Tyrants”. Numerous atrocities were committed by this Oligarchy before it was overthrown in 403 BCE. Socrates, who had taught some of the Oligarchs, was sentenced to death, an irony, since Socrates believed that his teachings were freely available to everyone. His student Plato developed a philosophy that fit the aristocratic outlook on the world and its desire to turn its back on reality and deny the turbulence all around, a philosophy that downplayed change in favour of stability. Later in Greek civilisation philosophy was to give way to the more popular astrology.

One of the main factors that undermined both the Spartan Oligarchy and the Athenian democracy was the growth of slavery. The same factors can also be detected in Ancient Rome. Following Roman victories in the Carthaginian Wars there were vast tracts of land that required distributing. The Gracchus brothers, Tiberius and Gaius, both high-ranking nobles, began to argue for citizenship for Rome’s Italian allies, and against slavery. In 133 BCE Tiberius proposed a series of far-reaching land reforms aimed at creating independent farmers with citizenship rights and responsibilities. However, the reforms met with fierce resistance from the patricians, or wealthy aristocrats.

The aristocrats saw the land reforms as an attack on their powers and therefore on the Republic. Tiberius took the bill directly to the people, which further outraged the Senators. On the day of the vote, the rich landowners simply had the voting urns removed. The Gracchus brothers were persuaded to bring the matter before the Senate, but by this point the Oligarchs, about 2,000 wealthy families, had matters sewn up. Rome was now split between the Senate and the Popular Assembly.

The ruling Oligarchy, fearful of losing wealth and influence, accused Tiberius Gracchus of wanting to be a tyrant. Whilst publicly adopting a defensive stance to protect the Republic, they incited violence to bring down the “tyrants”. In 133 BCE Tiberius sought to be re-elected as tribune but was set upon and killed by a group of Senators. Supporters of Tiberius were also hunted down and killed. However, this did not bring an end to the dispute. Gaius was then elected Tribune in 124 BCE and continued agitating for land reform. The Oligarchy was forced to buy peace with bribes. To pay for this they taxed the Italian countryside, which only resulted in an exodus to the cities, where more of the dispossessed added their voices to demands for change.

In both Sparta and the Roman Republic the Oligarchies were undermined by the dispossessed. It is an essential feature of Oligarchs that they siphon off wealth into their own pockets, leaving less for everyone else. They then use this power to keep the balance of forces in society stable and ripe for the picking. The growth of cheap slave labour achieved by military victories benefited only the wealthy landowners; for the peasants and workers it created only complete immiseration. Following the defeat of the Gracchus brothers and their land reforms, Rome turned from a society where slavery was marginal into one supported by vast armies of slaves.

The world centre of oligarchism and monetarism from about 1100 to 1700 was the Venetian Republic. Unlike many of the Italian feudal city-states, Venice was dominated by Oligarchs from its inception as a trading and mercantile port. It was ruled by the Council of Ten, an all-powerful and secret body. It was also a strong state that publicly owned the ships, which were then leased to the merchants and traders. The Oligarchs worked both to crush any commune sympathies and to prevent any Doge, or Duke, from making himself a tyrant or dictator.

In 1355 the Oligarchy imposed its collective will on Doge Marin Faliero, who was charged with attempting to overthrow the constitution in favour of the type of autonomy favoured by other Italian states at that time. In marshalling the support of the common people against the Venetian aristocrats he was accused of wanting to become a dictator. He was deposed and beheaded. The Oligarchs were keen to make clear that the Doge was never above their law.

In 1527 there were about 2,700 patricians entitled to vote in the Grand Council. Many were so poor that they made their living from selling their vote. A feature of Oligarchy is that many deals are struck and compromises made, and many of these discussions took place at the Doge’s Palace. This type of intrigue and diplomacy became a way of life for the Oligarchs. The Doge, meanwhile, was subject to the same strict restrictions faced by those who were not Oligarchs. The average age of the Doge was around 70, as the Oligarchs feared the creation of strong dynasties that would challenge their collective rule. This gave the Venetian regime a certain air of old age, bordering on senility at times, and reinforced its conservative outlook.

During the 1600s the wealthy Venetian bankers were backing and transferring their assets to the nascent British Empire. In addition they spread their ideas, as can be seen in the popularity of Paolo Sarpi, the founder of modern empiricism, materialism and determinism. You can trace the influence of Sarpi, and of his successor Antonio Conti, in the works of Hobbes, Locke, Newton and later Bentham. Utilitarianism, a form of moral and ethical arithmetic, became the centerpiece of the British Empire’s ideology. The trick, once again, of the Oligarchs was to present their own good as synonymous with the good of the greatest number.

In 1705 a pamphlet of doggerel poetry was published that became the most talked-about philosophical poem ever. In what came to be called The Fable of the Bees, Bernard Mandeville put forward the idea that private vices such as greed and vanity produce publicly beneficial results. He focused his attention on self-interested passions:

“The desire to create a purely virtuous society was based on a vain UTOPIA seated in the Brain: fancying that a nation can, with virtues like honesty, attain great wealth and success, when in fact it is the desire to improve one’s material condition in acts of self-indulgence that lies at the heart of economic productivity” (The Fable, Vol. I, pg. 36).

In Great Britain a profound respect emerged for money, and for money for money’s sake. The growth of the British Empire was largely driven by this pursuit, with the British East India Trading Company leading the charge, funded by the new financiers of the City of London. The monarchy was no longer one of absolute rule, and Parliament represented the wealthiest people in the country exclusively. There has been a long history in Britain, over the nineteenth and twentieth centuries, of movements to extend Parliamentary democracy. However, even today politics and governance are still dominated by an elite group with privileged backgrounds and selective education.

The singular pursuit of money enshrines an Oligarchical outlook on the world. Adopting the idea that improving one’s own material condition will result in benefits for wider society justifies the selfish pursuit of wealth. It was enshrined in the twentieth century as the American Way, and given a huge boost by the post-war boom that saw unprecedented improvements in living standards for the mass of people. Through a booming middle class and a leaning to democracy from the outset, there was a lessening of Oligarchical influence in American politics. However, Oligarchs in America have always been present: think of the Forbes 400 and the Bostonians of Henry James. This is unsurprising given the close Anglo-American relations that existed.

Part 3

Robert Michels, a German sociologist and pro-anarchist, formulated what he called the Iron Law of Oligarchy. In Political Parties he argued that organised bureaucracy always resulted in Oligarchy. He based his study on the socialist parties of Europe, arguing that the Oligarchical structures he found proved this to be an organic tendency which every organisation experiences: the more complex the organisation, the stronger the tendency to Oligarchy. However, his assumption that socialist parties should have been free of Oligarchs may itself have been faulty. Democratic Centralism, as it is called, gives the appearance of democracy whilst in actuality a small elite group makes the decisions. Is this the true nature of the complex society we live in today?

It is a stark reality today that we are living in a period of massive wealth disparity. This has only accelerated since the 2008 financial crisis. The richest 1% have seen their wealth increase on average by 6% a year, whilst the rest have averaged 3%. This increase is due to the money printing, or QE, that has been propping our economies up for the last decade. It has sent bond and stock markets soaring and kept housing bubbles inflated. Those who are asset rich have done disproportionately well. Meanwhile, due to high inflation, even those lucky enough to see a 3% increase will in real terms have been losing money. This point is further underlined by the fact that most of that 3% gain comes from the rising price of the family home. If this trend continues, the top 1% will hold 64% of the world’s wealth by 2030. Given that much wealth in our society is hidden, the true figure will probably be much higher.

This concentration of income and wealth makes democracy a ‘demockery’. We are witnessing an upward redistribution of money from the bottom 80% to the upper 1%, if not to the 0.1% who are the real elite in our society. It is a trend Karl Marx noted more than 150 years ago, predicting it would only get worse. This 0.1% are America’s corporate leaders. Many have described the US as a Corptocracy, meaning total control by corporate interests. They control politicians and the media to ensure political decisions are made that promote corporate interests: fracking, tax cuts, deregulation, etc. Are these corporate leaders the new Oligarchs of our time?

In a paper from Princeton University in 2014 the authors argued that the US political system is geared towards serving the interests of special interest groups. They concluded that US government policies rarely align with the preferences of the majority. The French economist Thomas Piketty, who has written extensively about Oligarchy in Russia, has pointed the finger at the US, charting the rise of inequality and its impact on democracy in his book Capital in the Twenty-First Century. Of course his conclusion that a more progressive tax system is needed is met with horror. There are also those who have made livings out of claiming that wealth inequality is not a bad thing, or even that it is a good thing.

It is in the nature of Oligarchs to believe they are specially imbued with the qualities to govern. Their wealth, for example, may have given them a superior education and experience in the world fitting them to hold political office. There are now more than 2,000 billionaires in the world that we know of. Many families’ fortunes are deliberately diffuse and obscure, so we cannot be certain. Nicholas Shaxson estimated in Treasure Islands that at least $20 trillion could be hidden off-shore. Inequality tends to convince elites that they have unequal abilities in other areas. Oligarchs are fearful of the masses, and take steps to ensure that their wealth and privilege are protected first and foremost. It is for this reason that they stack the judicial system, seek to control freedom of speech, movement and association, feather their own nests and engage in criminal behaviour, including murder.

Oligarchy also has its base in the US, as can be seen in the types of wealthy people inhabiting the novels of Henry James: the Bostonian families of the Forbeses, Lowells, Perkinses, Cabots and so on. The financiers from the City of London, the Peabodys and the Morgans, were also Oligarchs. Together they formed the so-called Eastern Establishment. They made up the original Forbes list of 400 rich families, the number that could fit into a doyenne’s ballroom. Then there were the Robber Barons of the nineteenth century, known for their monopolisation of huge industries and their ruthless, even murderous, accumulation of massive wealth. Key players included Carnegie, Vanderbilt, Rockefeller, Gould and Astor.

Oligarchy can be seen clearly in the US South amongst the large slave-holding plantation owners. They formed an elite class who created their own ideas of gentility, honour, and the Southern way of life. Their attitude towards those at the bottom was expressed in racial terms, but it displays the same superiority we find in Oligarchs throughout history:

“[I]t is clear the Athenian democracy would not suit a negro nation, nor will the government of mere law suffice for the individual negro. He is but a grown up child and must be governed as a child . . . The master occupies towards him the place of parent or guardian.” George Fitzhugh

We also see over time a familiar situation in which the wealthy got wealthier. By 1860 the top 0.1% owned the vast majority of the slaves; Nathaniel Heyward, for example, owned more than eighteen hundred. Even after the abolition of slavery certain intermarried families became dominant in cities, tending to hold numerous posts such as governor, mayor, newspaper editor, bank president and local businessman. Even today many parts of the US South still consist of regions where feudal relations, especially in relation to the black population, are a better descriptor than democracy.

Oligarchies are concerned with maintaining the status quo, and they do this by an emphasis on conformity and consensus. Spartan virtues were harmony, simplicity and strength. The virtues of the Roman Republic were known as the ‘Via Romana’ and were seen as the source of the moral fortitude to conquer the known world. They included dignity, fortitude, dutifulness, and respectability. In Venice they cultivated compromise, self-effacement and conformity, all virtues that perpetuated the Venetian model. In Britain the virtues of the Oligarchs came to be associated with the Victorian era. Virtues such as fairness, diligence, conformity and enterprise were seen as the gift that Britain brought to the world through her “civilizing” Empire.

The country run by the few is one based on conformity and acceptance. Adherence to the national qualities or virtues that are held to make the country “great” is a must. Anything that seeks to challenge this mainstream is seen as threatening. It is why Julius Caesar was brutally murdered in the Senate. Caesar, although from an old aristocratic family, was regarded by the Oligarchic families who dominated the Senate as a mortal threat, because he sought to carry out progressive reforms and leaned on the plebeian masses for support. He did not conform to the Oligarchs’ values of acceptance of the status quo and disdain for the plebs.

It has been noted by many observers that the response to the 2008 crash was very different from the response to the comparable crash of 1929. In 2008 Congress took the step of bailing out the banks. This was largely accepted by the majority because banking interests had been synonymised with those of the mass of people; the banks were labeled “too big to fail”. After the 1929 crash, however, US President Franklin D Roosevelt (FDR) enacted a series of legislative measures throughout the 1930s aimed at assisting the mass of people and restricting the banks. The question in a crisis is always who pays. The answer for the Oligarchs is always: not us.

FDR refused to be that doddering Venetian Doge. Despite coming, naturally, from a very wealthy family himself, he saw that America was best served by a pro-active, labour-oriented solution. FDR saw that reform from above was needed to prevent revolution from below. His hand was immeasurably strengthened by the coalition of the trade union movement and the various political parties representing workers, such as the Communist Party, which exerted enormous pressure for reform upon the ruling elite. His coalition also extended to black voters, the middle class, farmers, intellectuals and the army. The wealthy Oligarchs hated him and mobilised the Supreme Court to try to block his New Deal measures such as the alphabet agencies. Indeed, there is still much hatred shown towards these two New Deals, as they were called, expressed in attacks on “entitlements” as the bane of the virtues of entrepreneurship and self-reliance. FDR’s policies appealed directly to the man at the bottom of the pyramid, and resulted in an unprecedented four terms. Indeed, it was this success that resulted in the 22nd Amendment to the Constitution, proposed in 1947, limiting presidents to two terms.

The US had another anti-Oligarchical president in John F Kennedy (JFK). Again coming from a very wealthy family, he went against the established norms. This was very much part of the times, with a general increase in political participation amongst wider sections of the US population. As president JFK attempted to break up the Federal Reserve, bring an end to the Cold War, get out of Vietnam, and compelled steel executives to roll back price increases. Basically, Kennedy refused to listen to the Harvard elite of foreign policy advisers after the Cuban missile crisis brought the world to the brink of World War Three. After the Cuban Missile Crisis of October 1962 in particular he displayed an autonomy that was unsettling to the elite that surrounded him. He was assassinated in November 1963, undoubtedly at the behest of the Oligarchy, and the crime was whitewashed by the Warren Commission. Together with the other prominent political assassinations of the 1960s, those of Martin Luther King and Malcolm X, it created a permanent scepticism within large sections of the American public towards official narratives.

Since then the grooming and selection of US presidents has been concentrated into channels that the Oligarchy believe they can control. However, each president presents his own challenge. Nixon, for example (not from a wealthy background), was seen as Kissinger’s man. Kissinger, himself a talented protege of Nelson Rockefeller, was a key player. However, they did not like Nixon’s authoritarian tendencies nor his Quakerism. Impeachment proceedings began in February 1974 and he resigned in August. Gerald Ford was seen in inner circles as dumb. Jimmy Carter, rumoured to have been selected as a candidate at the Trilateral Commission (Rockefeller’s private discussion club for the Oligarchy) in 1974, was content to put his administration under the control of the Federal Reserve and Paul Volcker. Ronald Reagan couldn’t have resembled a doddering old Doge more if he had acted the role; in the background Vice President George Bush Senior ran the show.

Bill Clinton, on the other hand, was not the Oligarchs’ first choice, which shows that the wealthy elite are adept at making compromises in order to keep power. Clinton’s advisers drew attention back to economic populism rather than social issues: “It’s the economy, stupid!” Lee Atwater, the Republican strategist, had suggested that US politics divided into populist and elitist issues. Clinton reluctantly took the populist path, promising to restore the hopes of the middle classes – reluctantly, because as a Yale law graduate he craved the approval of the liberal intelligentsia. The old labour leaders of FDR’s New Deal coalition had been eviscerated and totally compromised. Though praised for its diversity in sexuality and ethnicity, Clinton’s cabinet – Robert Rubin, Andrew Cuomo and Madeleine Albright among them – resembled a British Conservative government’s: full of millionaires. He ended his presidency facing sexual scandals and impeachment charges.

Despite appearing to be the most important job in the land, the US presidency has been largely stripped of its economic and financial power, which now lies with the Federal Reserve – itself a private organisation serving banking interests, which originated in secret at a meeting on Jekyll Island in 1910. Further, the Department of Justice and the FBI, not to mention the CIA and the Department of Defense, all operate as separate fiefdoms. The media is owned by and run on behalf of large conglomerates; key players are Murdoch, Disney and General Electric. Gore Vidal, America’s greatest writer and an astute insider who was close to the Kennedys, wrote extensively about the “Washington Establishment” and how it was in effect an “invisible” Oligarchy. His insights are illuminating:

“You know, I’ve been around the ruling class all my life, and I’ve been quite aware of their total contempt for the people of the country.”

As the number of obscenely wealthy people has grown, we have entered what some call a golden age of philanthropy, and the wealthy are praised for giving back to society through charitable foundations. These foundations, however, are classic vehicles not only for tax avoidance but for shaping culture and politics. Hidden behind educational foundations is often a complete disdain for what Reagan famously called black “welfare queens”: the problem with Americans is simply that they are not educated – let us benevolently help you. A popular ideology today is social Darwinism, survival of the fittest, which implies that the winners – those who top the public rich lists – constitute a meritocracy. This ignores the very real class divisions that operate in the US but are vehemently denied, divisions expressed politically as tax cuts for the rich and welfare cuts for the poor. This policy track has only accelerated since 2008, as the Oligarchy seeks to put the burden of the financial crisis on the poor.

Part 4

Oligarchs are not nation builders. They are concerned first and foremost with family. The super wealthy of New York, Moscow or Mumbai have more in common with each other than with their own countrymen and women. As war correspondent Chris Hedges writes, we have had

“A slow-motion coup by a corporate state has cemented into place a neofeudalism in which there are only masters and serfs.”

This neo-feudalism can be seen as discussed previously in the various divisions in the US that operate as fiefdoms such as the CIA and the Military Industrial Complex. It can be seen in Russia where the collapse of the Soviet Union has led to the creation of a new class of medieval boyars who have carved up its economy and dominate the political elite. The globalist nature of the Oligarchy can be traced through various international bodies, including the United Nations and the World Economic Forum.

The origins of the term ruling class date back far beyond Marx and Engels, but it was they who placed class divisions at the centre of any analysis of society. We may begin with the principle that all men (and women) are created equal, but if we do not see the very real class divisions that operate to keep the “rich man in his castle / the poor man at his gate”, we will retain a deluded view of how things really work.

Oligarchies prefer anonymity to celebrity, conformity to deviance, and a belief in their own superiority supported by philosophical and ideological underpinnings. British utilitarianism lives on in the words of economist Milton Friedman: “The world runs on individuals pursuing their own self interests.” While governments survey the ruled and collect vast quantities of information about them, the inner workings of the state are shrouded in mystery: nearly always for reasons of ‘national security’, information – and therefore accountability and oversight – is weak to non-existent. Living in the British state is the equivalent of living in a Panopticon, the prison designed by English philosopher Jeremy Bentham, in which a guard at a central point can observe every cell while the inmates never know whether they are being watched at any given moment. As a result, anyone lifting the veil on Oligarchy will be branded a conspiracy theorist by those in power.

Immediately labelled a conspiracy theorist, US radio host Alex Jones has long argued on his show, with copious amounts of evidence, that clandestine societies secretly control national governments from behind the scenes. Organisations like the Trilateral Commission, the Bilderberg Group, the Council on Foreign Relations and the World Economic Forum are used by Oligarchs to exert undue influence over world affairs. These bodies are supra-national, reflecting the fact that capitalism has far outstripped the usefulness of the nation state to them. There is a clear, publicly stated desire to move towards international co-operation – or a ‘new world order’, as it is more often called. Understood in the context of a society run by Oligarchs, though, this new world order takes on a more sinister cast: the imposition of a rule that forever cements neo-feudal relations.

Let us take the European Union (EU) as an example. At the centre of the EU, directing operations like a spider in its web, is the European Central Bank (ECB). The monetary and credit policy of Europe is now controlled by this autonomous, unelected, self-perpetuating body, which answers not to the various nation states but to the financiers. Economic historian Nomi Prins has documented in detail how the ECB has colluded globally with the US Federal Reserve, the Bank of Japan and the Bank of England to pursue Quantitative Easing for the 1% and austerity for the people. They have caused untold misery to the Greek people by imposing harsh austerity measures and forcing the Greek government to sell national assets. Under cover of helping Greece, the reality is that the ECB has ‘bailed out’ Greece so it can pay the extortionate interest owed to the European banks that lent it money. The further tragedy for Greek workers is that the country only entered the eurozone thanks to fraudulent accounting by Goldman Sachs and co. to meet the Maastricht Treaty qualifications.

The United Nations has since its inception had an Oligarchical base drawn from the ranks of diplomats trained in British diplomacy. Regarding itself as above nation states, it became a talking shop and exhibition centre for Oligarchy. The non-governmental organisations (NGOs) founded by the philanthropic billionaires of this world play an important role behind the scenes, presenting themselves on the world stage as the representatives of global public opinion, issues and concerns. While purporting to legislate for mankind, in reality they represent narrow Oligarch interests that have been made synonymous with those of all humanity.

The Trilateral Commission was founded by David Rockefeller and Zbigniew Brzezinski in 1973 to discuss foreign relations and to create a “new international economic order”. As Brzezinski, an astute analyst of capitalism and geo-politics, once remarked:

“International banks and multinational corporations are acting and planning in terms that are far in advance of the political concepts of the nation state.”

Many members of the discussion group have gone on to hold important government posts: Timothy Geithner (Obama’s Treasury secretary), Paul Volcker (Fed chair) and Dennis Blair (head of US intelligence), to name but a few.

These days the Bilderberg meeting of elites can no longer be held in the secrecy it once enjoyed. Even though it is packed with leading government officials, business leaders, super wealthy individuals, members of the aristocracy and the odd celebrity, we the people are denied access. Last summer a Guardian journalist collared Ryanair’s CEO Michael O’Leary as he left the conference – a man with a racehorse named “Rule The World”, and one who calls environmentalists “lying wankers”. He revealed that the group had been discussing Trump’s presidency and that “he has more to do”. As the Guardian article goes on to point out, Bilderberg is at heart a machine of the financial interests: the conference chairman is a director of HSBC, the treasurer is the head of Deutsche Bank, and the administration is run by Goldman Sachs. But I’m sure these private banking interests have all our welfare at heart. When the CEO of a major business says attending Bilderberg is “very useful”, it translates directly as “financially beneficial to me and my shareholders”.

The Council on Foreign Relations, founded in 1921 as another private think tank, has its roots in the Royal Institute of International Affairs, or Chatham House, a London-based policy institute. Both bodies seek, through intellectual propaganda, to promote Oligarchical thinking and their new world order. In the 1980s they publicly discussed the necessary conditions for a new international economic order, developing a strategy known as “controlled disintegration” – what we have been sold as deregulation – although even then the more foresighted writers for the Council warned of the instability it could bring. Its main proponent was Paul Volcker, who through instruments such as the International Monetary Fund (IMF) and the Federal Reserve carried through massive interest rate rises, forcing many economies around the world into recession.

The World Economic Forum, set up in 1971, meets annually in Davos, Switzerland – a country synonymous with banking, and one whose self-proclaimed neutrality in World War Two in practice made it a destination for Nazis hiding stolen Jewish gold. The Forum describes itself as an international organisation for promoting public-private cooperation; in other words, a privately funded think tank aimed at ensuring – or, as they say, “shaping” – public bodies to carry out agendas and policies that fit the needs of the Oligarchy. Attended by the likes of Christine Lagarde, head of the IMF, this body clearly carries a lot of weight: a private think tank that this year required British Prime Minister Theresa May to present her Brexit plans to it. High on the list of discussion this year was the rise of populism and nationalism, both seen as threats to the new world order.

The word Oligarchy has in our time become synonymous with Russia. The rise of the mafia-capitalist class there was phenomenal once the Soviet Union collapsed in 1991. Following the death of Stalin in 1953, the rule of Russia had been taken over by a collective of officials known as the nomenklatura; Khrushchev, Brezhnev, Gorbachev and finally Yeltsin all became leaders on behalf of a Russian Oligarchy that carved out private fiefdoms in different spheres of the economy. Yeltsin was handpicked as leader by the financiers at Davos, the World Economic Forum, in 1996. The financiers and the mafia combined have essentially dominated society and government in Russia for decades, and Russia’s economic life is now firmly under the control of financiers, while the Russian people themselves are widely regarded as lumpens (after the Marxist term lumpenproletariat). The elite of Russia are interested in one thing only: money, and the power and privileges that great wealth brings. Putin himself came from a working class background, advanced his career in the intelligence services (the KGB), and was taken under the wing of the entourage known as the “Family” that surrounded Yeltsin. He is a nationalist rather than an Oligarch, but according to the Council on Foreign Relations (the Oligarch-funded think tank) he made a deal with the Oligarchs, beginning with granting Yeltsin immunity from prosecution. Putin is a nationalist leader with authoritarian tendencies, but he quite clearly operates with the full approval of the Russian Oligarchy, and he is now reportedly one of the richest men in the world.

Now in the US we have President Trump, one of the richest men ever to become president. Like Putin, whom he admires, Trump leans more towards populism and nationalism. He has tapped into the anti-globalist, anti-new-world-order sentiment of so many ordinary people, who blame that order for the decline in their living standards. The British Brexit vote can in part be seen in the same context: a revolt by millions of ordinary people against an indifferent establishment that presided over the destruction of their living standards and communities. It explains why Trump praised Brexit supporters in the UK and so openly criticised the current plans of May’s government. Oligarchs like to keep things – their deals and compromises – secret; Trump goes against the established order with his blunt openness. Hillary Clinton was the Establishment’s preferred candidate, and they were visibly shocked by Trump’s victory. They were complacent, and have since been undermining his presidency to a degree that many regard as an open coup.

The entire Trump/Russia narrative of the establishment media has openly revealed the separate fiefdoms that operate within the US ruling elite. That a president should be in open conflict with his own intelligence services is unprecedented and reveals a deep split in the US ruling class. Trump was supported into office by a group at the Pentagon who recognise the military failure that is Afghanistan and Syria and wish to withdraw. His recent deal with Putin to abandon the Sunni jihadis in southern Syria – the deal that prompted Israel’s rescue of the infamous White Helmets – is confirmation of this.

In the press Trump is continually ridiculed (admittedly not hard) for his constant faux pas and his inability to put intelligent sentences together. This is part of the tactic of ostracism that Oligarchs use: the individual is punished for not conforming to established norms. In many respects Trump recalls the parvenu at the doyenne’s ballroom, inflating his wealth to get onto the Forbes rich list. This is not to say he does not represent the elite – he is as much a man of the bankers as any politician in Washington who hopes to get anywhere. Trump’s tax cuts have benefited the wealthy and his austerity cuts have hurt the poor, and to protect his own financial interests and those of his family he will make deals with the Establishment.

Today the word Oligarchy, originally Greek, has become attached to Russia; in the West we simply call our ruling class “elites”, implying some special ability. It is more accurate to recognise that our society is run and controlled by a few – which is, technically, an Oligarchy. Individual Oligarchs come and go, but families that build wealth and learn to hang on to it form the heart of Oligarchical rule. They do not operate in public, but through privately funded think tanks and foundations we do now glimpse elite concerns. Certain names – Rothschild, Rockefeller, Morgan, Koch, Walton, Murdoch, Bezos, Roberts – are prominently involved in shaping public discourse. If that is not the wealthy having undue influence, I don’t know what is. Or you can do what they want you to do: shake your head and call it all a conspiracy theory.

U.S. in the 1960s | HSTA 324 01

Instructor: Michael Mayer
Credits: 3

Delivery Method: Face-to-Face
Days and times: MWF 11:00-11:50

Course Description: Examines the political, social, cultural, and intellectual developments of America in the 1960s. Particular emphasis is placed on cultural history.

Additional Details: CRN 74631

Hidden world

Eastern Kentucky falls within that part of Appalachia that has come to epitomise the white underclass in America ever since president Lyndon Johnson sat down on the porch of a wood cabin in the small town of Inez in 1964 and made it the face of his War on Poverty.

The president arrived virtually unannounced at the home of Tom Fletcher, a 38-year-old former coalminer who had not held a full-time job in two years and was struggling to feed eight children. The visit offered the rest of the US a disturbing glimpse into a largely hidden world where houses routinely lacked electricity and indoor plumbing, and children habitually failed to get enough to eat. The 1960 census records that one in five adults in the region could neither read nor write.

Half a century later, while poverty levels have fallen dramatically in some other parts of the country – in good part thanks to Johnson – the economic gap between the region and much of the rest of America is as wide as ever. And its deprivation is once again largely invisible to most of the country.

Beattyville’s median household income is just $12,361 (about £8,000) a year, making it the third-lowest-income town in the US, according to the Census Bureau’s 2008-12 survey.

Nationally, the median household income was $53,915 in 2012. In real terms, the income of people in Beattyville is lower than it was in 1980.
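The “in real terms” claim rests on deflating nominal dollars by a price index before comparing years. As a rough sketch of that calculation – the CPI-U annual averages below are approximate figures from BLS tables, and the function name is my own – the conversion looks like this:

```python
# Deflating a nominal 2012 income into 1980 dollars using the CPI.
# The CPI-U annual averages are approximate BLS figures (assumption).
CPI_1980 = 82.4
CPI_2012 = 229.6

def to_1980_dollars(nominal_2012):
    """Convert a nominal 2012 dollar amount into 1980 dollars."""
    return nominal_2012 * CPI_1980 / CPI_2012

beattyville_2012 = 12_361   # median household income cited in the article
print(round(to_1980_dollars(beattyville_2012)))  # prints 4436
```

So Beattyville’s $12,361 in 2012 is worth only about $4,400 in 1980 money; if the town’s median income in 1980 exceeded that figure, real incomes have indeed fallen.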

The town’s poverty rate is 44%, far above the national average. Half of its families live below the poverty line, including three-quarters of those with children, with the attendant consequences. More than one-third of teenagers drop out of high school or leave without graduating. Just 5% of residents have college degrees.

Surrounding communities are little better. Beattyville is the capital of Lee County, named after the commander of the Confederate army of Northern Virginia in the civil war, General Robert E Lee.

Five of the 10 poorest counties in the US run in a line through eastern Kentucky and they include Lee County. Life expectancy in the county is among the worst in the US, which is not unconnected to the fact that more than half the population is obese. Men lived an average of just 68.3 years in 2013, a little more than eight years short of the national average. Women lived 76.4 years on average, about five years short of national life expectancy.

An abandoned truck in Beattyville. Photograph: David Coyle/Team Coyle for the Guardian

A few months before he visited eastern Kentucky, Johnson said in his State of the Union address: “Our aim is not only to relieve the symptoms of poverty, but to cure it and, above all, to prevent it.”

Over time, the focus of that effort shifted to inner-city poverty and many of the programmes Johnson launched came to be seen as aimed at minorities, even though to this day white people make up the largest number of beneficiaries.

But when the president sat on Fletcher’s porch in Inez, he had in mind rural poverty of an almost exclusively white region where the coal industry – which for a while provided jobs but not the much-promised prosperity – was already receding and people struggled for more than a basic income from the land.

Television pictures of Johnson’s visit presented Americans with a hardness of living in the midst of some of the greatest beauty the US has to offer. Life in a log cabin buried in the forest from which it was hewed is romantic until you have to collect water by bucket in the dead cold of winter.

The War on Poverty did relieve many of the symptoms. Food stamps and housing grants, healthcare for the poor and older people and improved access to a decent education have kept millions from struggling with the deprivations Johnson encountered in Inez. There are few homes in eastern Kentucky without electricity and indoor toilets these days. But the promised cure for poverty never materialised.

Lyndon B Johnson’s Poverty Tours

Three decades after Johnson’s visit, Fletcher was still unemployed but receiving disability benefits. His first wife had died of cancer. His second had been convicted of murdering their three-year-old daughter and attempting to kill their four-year-old son with a drug overdose to claim the life insurance.

A film of Johnson’s visit describes joblessness in the region as primarily attributable to “lack of industrialisation and losses in the coalmining industry”.

People in eastern Kentucky still call it “coal country”, even though the decline continued largely unabated and the number of jobs in the industry fell with the passing of each presidency. There were 31,000 under Bill Clinton but fewer than 14,000 by the time George W Bush left power.

The number of people employed in mining in eastern Kentucky has fallen by half since Barack Obama came to power, although the long history of decline has been conveniently set aside in the clamour to blame the current president. The more cautious critics say Obama is anti-coal because of his environment policies. But a no less popular view in the region is that it is part of president Obama’s war on white people.

Beattyville and Lee County did well out of oil, too, until the 1980s. A decade later, the largest employers in the town were a factory making uniforms, a data company and a private jail holding prisoners from Vermont. Now, the garment and computer businesses are gone and Vermont has just moved its prisoners to Michigan, where it is cheaper to house them.

The office of the Sturgeon Mining Company, Main Street, Beattyville. Photograph: David Coyle/Team Coyle for the Guardian

The largest employer in the county is now the school system. There are five times as many healthcare workers in eastern Kentucky as miners. “Coal country” is today little more than a cultural identity.

The office of Ed Courier’s Sturgeon Mining Company is on the high street. Its few remaining mines involve people digging coal out of hillsides. “I’ve been in the coal business since ’78 and the last five years I’ve been trying to get out of the coal business. There’s no future for it here,” he said.

Courier’s office is an old storefront on Beattyville’s Main Street. He nodded towards the window and commented caustically on how many former shops in the once bustling town centre were given over to payday loan companies and charities. One gave away what is popularly known as the “Obama Phone”, a free mobile available to anyone on food stamps or other assistance that provides 250 minutes of calls per month.

“Things were really good when I came here in ’72 and I ended up staying. When I came here there were three new car dealerships. There hasn’t been a new car dealership here since ’89,” he said. “There’s no future here. I have a sense of sadness. I wish people had a better life.”

The War on Poverty lives on through federal grants. Food stamps, employment programmes and disability allowance have cushioned many people from the harshest effects of the retreat of jobs from the region. Some families still struggle to put enough food on the table but their children are fed – if not well in the sense of healthily – at school.

Federal money also built Vivian Lunsford a new house – a spacious wooden bungalow with a balcony on two sides and forest to the back, constructed in a ravine just outside Beattyville. The narrow road from the town winds past simple log cabins buried in the trees.

“They’ve probably been there since the early 1900s,” she said. “I don’t know how people live in them. They’re real basic. Their only running water is the stream. But people just keep staying there. They don’t want to leave. It’s the pride. The heritage of that land.”

Trailers in Beattyville. Photograph: Sean Smith for the Guardian

Before getting the house Lunsford, 38, was unemployed and homeless. Her mother applied for a grant and a cut-rate mortgage on her daughter’s behalf without telling her, in order to build a more modern and spacious version of the old wood cabins. Lunsford repays the mortgage at $389 a month, less than it would cost to rent.

“There’s so much grant money went toward it that so long as I live there for 10 years I don’t have to pay that grant money back,” she said.

Lunsford was also able to land a job with the Beattyville housing association that built her home, which she shares these days with her partner and his school-age daughter.

“This place is notably poorer. You can’t just go out and get a job in McDonald’s. A Walmart is an hour away. I can go to my daddy’s in Florida and the world is like a different place. Here is more stuck in time,” she said.

“Our homeless situation is really different to a big city. It’s couch surfing. You’ve got lower income people, grandparents with their children and spouses living there with the grandchildren. They’re all crammed into this one house. There’s a lot of them.”

Others on the waiting list for new homes – wooden bungalows or trailers – are what she calls “burn downs”: people whose homes were destroyed by fire from candles, kerosene heaters or pot-belly stoves. Many of those fires occur in homes disconnected from electricity and other utilities to save money.

“Utility bills are outrageous in a trailer because they lack insulation. I have a little lady I’ve been helping with, Miss Nelly. She’s in her late 70s. Her electric bill in the wintertime here runs about $400 a month. She can’t afford that. Trailers don’t heat good,” she said. “Some people choose not to connect to utilities to save money. A lot of people here, their income is like between $500 and $700 a month. That’s all they get. That’s not a lot, especially if you’ve got kids and the price of gas and car insurance and you’ve got all these things that have to be paid.”

Sheriff Wendell ‘Bug’ Childerts. Photograph: Sean Smith for the Guardian

Still, the rehousing programme is not without its issues. Bob Ball built Lunsford’s home. He also built one for a man in his early 20s called Duke and his wife, both of whom were unemployed and had been living in a caravan.

Ball has since hired Duke as a worker. Federal money keeps the builder’s business alive, but he still commented with a hint of disapproval about the government funding homes. “He got a new house so young. We all paid for that,” said Ball.

Through much of the 19th century, this part of the Bluegrass State was romanticised in stories of rugged frontiersmen and courageous hunters as the epitome of American self-reliance. None more so than Daniel Boone, a hunter and surveyor at the forefront of settling Kentucky. A good part of Lee County carves into a national forest named after him.

“Cultural heritage here is important,” said Dee Davis, whose family was from Lee County, though he grew up in a neighbouring county where he heads the Center for Rural Strategies. “The first bestselling novels were about this region. It was at one time the iconic America. This kind of frontier: white, noble. This was the iconography.”

By the time Johnson arrived a different image had taken hold – that of the anti-modern, moonshine swilling, gun toting, backwards “hillbilly”. The stereotype was perpetuated on television by a popular 1960s comedy show, The Beverly Hillbillies, in which unsophisticated mountain folk find oil on their land, get rich and move with their guns, bibles and Confederate sympathies to live among California’s millionaires.

In 2003, Davis led a campaign against a CBS plan to remake the comedy as reality television by setting up a poor Appalachian family in a Beverly Hills mansion. One mocking CBS executive remarked on the potential: “Imagine the episode where they have to interview maids.”

Davis beat back CBS but said the planned programme reflected a sense that white people living in poorer communities were blamed for their condition.

“There’s this feeling here like people are looking down on you. Feeling like it’s OK to laugh at you, to pity you. You’re not on the same common ground for comparison as someone who’s better off or living in a better place. That doesn’t mean it’s always true, it just means we feel that burden quickly. We’re primed to react to people we think are looking down on us. That they judge us for our clothes, judge us for our car, judge us for our income, the way we talk,” he said.

“This is the poorest congressional district in the United States. I grew up delivering furniture with my dad. No one ever said they were in poverty. That’s a word that’s used to judge people. You hear them say, I may be a poor man but we live a pretty good life for poor people. People refer to themselves as poor but they won’t refer to themselves as in poverty.”

Karen Jennings encountered the prejudice when she first left Beattyville.

“When I went to Louisville as a teenager to work in Waffle House I had this country accent. They laughed at me and asked if we even had bathrooms where I come from. People here are judged in the bigger cities and they resent that,” she said. “The difference is the cities hide their problems. Here it’s too small to hide them. There’s the drugs, and the poverty. There’s a lot of the old people come in here for food. The welfare isn’t enough. Three girls in my granddaughter’s class are pregnant. This is a hard place to grow up. People don’t hide it but they resent being judged for it.”


The establishment of the Argyll Colony in the Cape Fear valley of the Carolinas in 1739 was an important event in the history of Scotland and America. It was for some time the largest settlement of Scottish Highlanders anywhere outside of Scotland. Why did people leave their native homes in Scotland for an unknown land? What did they think about their experiences in America?

When examining any immigrant group that has formed over several generations, we need to allow for the diversity of the experiences and backgrounds of the people themselves. No group is entirely homogenous. All societies have different social strata, specialized professions, gender roles, and so on. New customs, foods, technologies and ideas can enter a society through a neighboring community, or through an internal institution (such as schools or churches). The resulting changes can impact a society in many unforeseeable ways.

This is particularly true in the case of the immigrant community of Scottish Highlanders of the Cape Fear that was founded in the late 1730s and continued to take in new immigrants into the early decades of the 1800s. During that time, Highland society in Scotland was undergoing a dramatic transformation in nearly all of its aspects: economics, education, language, religion, and social structure.

Despite this, there are some generalizations we can make that cut across the divisions of Highland society and describe the motivations for leaving Scotland and the impressions of life in their newly adopted country, America. These observations are supported by their own words in prose and poetry in English and in their own native tongue, Scottish Gaelic.

Although the mass of Scottish Highlanders never had any reason to leave their home communities where their ancestors had lived beyond the recall of tradition, there has always been an elite segment of Gaelic society which has been highly educated and well traveled. The cattle droving business developed in the Highlands during the seventeenth century, allowing some members of the native middle class to increase their wealth by gathering huge droves of cattle from every corner of the Highlands and accompanying them to markets in the Scottish Lowlands.

New links across the Irish channel were forged by the Plantation of Ulster, for Gaelic-speaking Highlanders were among the many Protestant colonists settled by the Crown’s plan for the subjugation of Ireland. Looking further back in time, there were centuries of travel between the western Highlands and Ireland: Highland chieftains hired out their extra military might as “redshanks” (mercenaries) to Gaelic chieftains in Ireland, and also recruited members of the professional learned classes who had been trained in the bardic schools of Ireland. In addition, the Scots in general had always had ties of various sorts to continental Europe, whether related to religion, military or education. An estimated 30,000+ Scots left for Poland between 1600 and 1650 alone.

As the Highland elite became increasingly influenced by and integrated into the centralized British state, they began to adopt the values and practices of the English-speaking world. The biggest impact was social and economic. Highland society had been a pastoral subsistence economy, loyalty was local and familial, and kinship an overriding factor in the running of the community. The Highland elite, however, was becoming involved in a cash economy, which required them to generate a surplus product for export to markets outside the Highlands. This led to raising rents and minimizing overheads, which ultimately meant abandoning the human population which in previous ages had been the foundation of their status and wealth.

It was the men of the middle class — the “tacksmen” as they were called in English, or fir-bhaile in their native Gaelic — who were the first to respond to these social and economic destabilizations. They knew that they were, as a class, being squeezed out of existence. Beyond this, they tended to be literate in English and were used to being administrators and leaders. As they anticipated that the new regime would bring about a reduction in their quality of life, they had the greatest incentive to organize migrations which would take them and their subtenants to new lands.

As early as 1729, Archibald Campbell, an agent for the Argyll estate, observed that tenants in Kintyre were being inspired by their Irish neighbors to leave for America. The island of Islay (or Ìle as it is properly called by its native inhabitants) came into the possession of Daniel Campbell, a Glasgow merchant and member of Parliament. It is little wonder that a man of this background would mount plans for a massive restructuring of the island with profit in mind. Rather than join into competition for leases, some of the tacksmen decided that they would rather risk their fate in America.

It was such a group as this who established the Argyll Colony in 1740 around modern Fayetteville, North Carolina. Highlanders in Scotland remained hesitant about departing their native shores, and their doubts were fomented by the anti-emigration propaganda of landlords wishing to keep a servile population. In time, however, the Highland immigrant community flourished, and beckoned their relations in Scotland to join them. We know, for example, that Alexander McAlester of New Troy, Cumberland County, was writing to his brother back on the isle of Arran by 1747. The Reverend Allan Macqueen, writing in the 1790s, recalled:

copies of letters from persons who had emigrated several years before to America, to their friends at home, containing the most flattering accounts of the province of North Carolina, were circulated among them. The implicit faith given to these accounts made them resolve to desert their native country, and to encounter the dangers of crossing the Atlantic to settle in the wilds of America.

In about 1767 a second and larger wave of migration began to flood the Cape Fear valley with Highlanders. This group was from a larger area of Argyll, including the islands of Arran, Jura, Islay and Gigha. With worsening economic conditions in the 1770s, the impulse to emigrate spread much further throughout the Highlands and Outer Hebrides. About 20% of the population of the isle of Skye was lured away by tacksmen during this period, much to the annoyance of the landlords.

Highland landlords (for by this time they could no longer be called “chieftains” in any meaningful sense of the word) possessed powers less restricted than those of any other landlords in Europe, and many of them used this power to exploit their tenants. While economic conditions were poor throughout Britain, only in the Scottish Highlands were landlords able to use their authority to force people to leave their homes. From the late 1700s to 1850, Highlanders made up a disproportionately large part of the Scottish exodus.

Numerous sources confirm that Highlanders were inextricably tied to their native land but left because of the economic, cultural, and social oppression that seemed impossible to remedy in the Highlands. Alexander McAlester wrote to his brother in 1770:

In short, our lairds’ or landlords’ oppression will soon help to plant your colony … This part of North Carolina will soon be a New Scotland for within these three or four years there is an immense number come in to this place.

The anonymous tract Informations Concerning the Province of North Carolina, printed in 1773, corroborates that it was the breaking of traditional social ties that had unwoven the fabric of Highland society and resulted in such a massive hemorrhage of population:

The natives of the Highlands and Isles have always been remarkable for the strongest attachment to the place of their nativity, and for the highest respect towards their masters and superiors. In these, they were wont to find kind patrons and protectors, and cherishing, indulgent fathers to themselves and their families. This endeared them to a soil and climate to which nature has not been very liberal of its favours, insomuch, that they have ever shewn the utmost aversion at leaving their country, or removing to happier regions, and more indulgent climates. That this is true of the Highlanders, in general, will be acknowledged by those who are in the least acquainted with them. The cause, then, that could induce a people of this cast, to forsake their native lands, in such numbers, and make them seek for habitations in countries far distant and unknown, must, doubtless, be very cogent and powerful. And, here, let the present land-holders and proprietors consider, whether, of late, they are not greatly to blame? Whether they have not begun to shake the iron rod of oppression too much over them? … They are spurned away with looks expressive of the utmost contempt, or dismissed with a volley of oaths and curses, in a language strange and unknown to the most of them.

This is, indeed, the same view expressed in the Gaelic poetry of the period. The following translation of a short extract of a song by Iain mac Mhurchaidh (“John MacRae” in English) of Kintail urges his neighbors to embark for the Cape Fear after receiving glowing reports in letters from the Reverend John Bethune:

I received a letter from John Bethune
Which pleased a man who didn’t even see it!
A few of my fellow countrymen
Have emigrated to where they will find wealth in plenty.

Let us all depart!
I care not for the weak-hearted!
Better that than to remain under lords
Who won’t tolerate their own tenants.

John MacCodrum, addressing emigrants leaving for the Cape Fear in the early 1770s, similarly rebukes the landlords for their abandonment of the traditional values of Highland society and for betraying their kin in this translation of an excerpt of a Gaelic song:

Raise your spirits and joy,
Be merry and make music
And put your hopes
In the help of the high-king, God.
Because you must sail away
And it is not your desire to do so,
To a kingdom alien to you
As your relations have begun to do.
Because they will not suffer you to live
In the lands which are familiar to you
It is better for you to leave willingly
Than to descend like slaves …

What is the point of telling it
Given that the nobles have become so stingy
That they would neuter an insect
If it would turn a profit.
The mighty leaders have gone
Who were thirsty for the Truth
Who were fond of their followers
And hard on their enemies.

Once they reached America, they found a life and land quite alien to them. Gaelic poetry of the period frequently refers to America as the “land of the never-ending forest.” Although some parts of Argyll are wooded, there are very few trees on most of the Western Isles. Not only was the landscape strange and forbidding, but the Highlanders also encountered many wild animals hostile to them. The colonists who had settled the land before them — English, Scotch-Irish, Germans, Huguenots, Swiss, and others — were similarly foreign and unwelcoming, wearing clothing and speaking languages unfamiliar to the Highlanders.

These themes of exile and uncertainty appear in one of the early Gaelic poems from North America, an excerpt of which is given here in translation:

I wish I were in the cattle-rich glens [of Scotland],
Where I was raised as a child,
Where there are gorgeous hillsides and young calves bellowing
And the herring fleet that sails by so winsomely.

We are now in America
At the edge of the never-ending forest
When the winter is over and the warmth returns
Pears and apples will grow fragrantly.

The people I see here are not attractive
Wearing hemp cloaks and huge hats
Wearing open and unsashed khaki trousers,
Lacking kilt and hose: it is a pity.

Of all of the aspects of their identity as Highlanders, most fundamental was their language. It was through Gaelic that they expressed their experiences as a community and as individuals, and had access to centuries of oral tradition in the form of song, poetry, legend, folktale, lineage, and so on, which embodied their collective past as a people. The vast majority of the emigrants spoke only Gaelic, although the upper ranks of Highland society were typically fluent in English.

The leaders of the Argyll Colony petitioned the Presbytery of Inveraray for a Gaelic-speaking minister as they prepared for their departure in 1739:

In consequence of your and our resolution of going to the continent of America in summer next and fixing a Colony there, and our design of having a clergyman that can speak the Highland language since from that country [the Scottish Highlands] all our servants are to be, many of which cannot speak any other language …

Such petitions were repeated from time to time until the area was finally supplied with the Reverend James Campbell in 1758. The Earl of Argyll had been actively promoting and developing the Protestant faith in the area of the Highlands over which he had influence since the 1560s. Protestantism was an important marker of loyalty to the British Crown and the majority of Highlanders in Argyll had been nominally converted to Protestantism by the time of the migrations.

Despite the political significance of Protestantism, Highlanders of this period were noted, in contrast to their Lowland neighbors, as being rather indifferent to religion on the whole; as their cultural crisis deepened, however, people turned increasingly to religion for leadership and hope. Familiar as they were with Biblical stories, poets compared their plight with that of the children of Israel, as does the poet Domhnall MacMhathain (“Donald Matheson” in English) when a shipload of his countrymen departed for the Carolinas circa 1768:

I see a reflection
Of the things of long ago
When the people of Israel
Were in pain in Egypt

With strong force he took them
Away from the Pharaoh
He parted the sea for them
When the Pharaoh rushed after them.

The Highlanders were likewise departing across the ocean to escape oppression and hardship. Religion did not just provide solace: it was the church in particular which was to act as a major catalyst of change for Gaelic society both in the Scottish Highlands and in the immigrant communities of America.