
Poorhouses Were Designed to Punish People for Their Poverty


When Anne Sullivan came to Tewksbury, she wasn’t yet the renowned “miracle worker” who would teach Helen Keller to communicate. It was 1876, and 10-year-old Annie was a nearly blind child living in abject poverty. Her years at the poorhouse—a facility designed to house poor people in a time before social services—were “a crime against childhood,” she later remembered.

Residents at the Massachusetts poorhouse milled about like forgotten animals. As Anne and her brother slept on the institution’s iron cots in a gigantic dormitory, rats ran up and down the spaces between beds.

In 1883, a massive investigation exposed the conditions at Tewksbury—but the institution was far from unique. Throughout the 19th and early 20th centuries, poorhouses were a reality for society’s most vulnerable people. These locally run institutions filled a need in the era before Social Security, Medicaid and Section 8 housing existed. They also exposed the stigma and shame society placed on those who were unable to support themselves.

The concept of the poorhouse originated in England during the 17th century. Municipalities were expected to care for their poor, and made a distinction between people who were old and unable to care for themselves and the able-bodied. People who were able to work were expected to do so—and could be imprisoned if they refused.

They lived in workhouses, bare-bones facilities designed to make poverty seem even less attractive. In these facilities, poor people ate thrifty, unpalatable food, slept in crowded, often unsanitary conditions, and were put to work breaking stones, crushing bones, spinning cloth or doing domestic labor, among other jobs.

In the United States, the idea emigrated along with English colonists. In 1660, Boston built its first workhouse—a brick building intended for “dissolute and vagrant persons.” Massachusetts’ poor people had more than the workhouse to fear: Towns could also banish poor people or even auction them off to the lowest bidder. “Warning out” allowed towns to exile poor newcomers or make it clear they were not willing to pay to support them.

The vendue system allowed cities to auction off poor individuals to private bidders. The individual who bought the poor person then put them to work in exchange for reimbursement of what it cost to clothe and feed them. Sometimes, people had another option—asking the Overseer of the Poor, a town official, for relief. In some cases, the overseer would provide them with town-sponsored food, clothing or firewood.

By the early 19th century, the poorhouse system had won out over warning out and vendue—and the construction of poorhouses coincided with an increasingly negative attitude toward poor people. These facilities were designed to punish people for their poverty and, hypothetically, make being poor so horrible that people would continue to work at all costs. Being poor began to carry an intense social stigma, and increasingly, poorhouses were placed outside of public view.

Paupers struck a difficult bargain in exchange for shelter and food. As historian Debbie Mauldin Cottrell writes, many states required them to take an oath “swearing to their lack of worldly goods and to their need for assistance,” a ritual designed to weed out those who didn’t want to publicly swear that they were poor. Once they became “inmates” of the facility, they had to submit to often draconian control over what they ate and wore and how they worked and acted. “Consequently, it was the most desperate, those with the least pride, who often populated poor facilities,” Cottrell notes.

Often, conditions in poorhouses were alarming. However, life in the poorhouse was not always miserable. Historians have documented the ways poor people used workhouses and poor farms as places in which to build community during their most vulnerable moments. As historian Ruth Wallis Herndon has noted, many women returned to the Boston Almshouse again and again and maintained connections to the outside world while inside the poorhouse. “For most men, on the other hand, the Almshouse was an unfamiliar place in an unfamiliar city,” she writes.

For men on the move, there was an alternative to the poorhouse: the tramp house. These tiny, temporary homes were erected for vagrants and itinerant people—often men—passing through communities. Often amounting to little more than shacks, these houses provided the bare basics, like mattresses and firewood, to people in towns near railroads, and generally weren’t publicized since communities didn’t want to advertise their charity toward vagrants.

As society tried to hide institutions designed to help the poor, poorhouses took on another form: the poor farm. Like poorhouses, these institutions were carefully regulated places for indigent people to live and work. However, poor farms were located in rural areas and on the outskirts of cities instead of in city centers. Rather than performing industrial or domestic labor, residents did farm work.

The poorhouse faded out around the time of the Great Depression as the federal government became more involved with social welfare. Most remaining poor farms and poorhouses closed in the 1930s and 1940s, though a few remained in places like Texas until the 1970s.

Though poorhouses are gone, their memory is preserved in testimony by people like Anne Sullivan. “I doubt if life, or eternity for that matter, is long enough to erase the errors and ugly blots scored upon my brain by those dismal years,” she wrote later.


The workhouse at the dawn of the 19th century was a very different place to the later institution that would come to occupy such a strong place in popular memory. Although shunned and dreaded by many, it was a gateway to various means of support for the disabled and destitute.

The early workhouse was seen as an all-purpose parish institution. It combined an emergency ward, orphanage, geriatric ward and home of last resort. Used for parish business and meetings, it also housed the parish fire engine and morgue. Most had a 'shell' (a form of stretcher) to collect accident victims and dispensaries to treat disabled and sick residents.

Living arrangements were very simple but relatively informal, with disabled and non-disabled people living together. The workhouse could be built on a domestic scale, like the parish workhouse in Aldenham, Hertfordshire, or with considerable style, like Gressenhall in Norfolk and the Harrogate workhouse - the so-called 'pauper palaces'.


Sustainable Social Policy: Fighting Poverty Without Poverty Programs

The history of social policy has a clear lesson. Programs that benefit all citizens do more to reduce poverty than programs targeted to the poor. So a new strategy for family security makes more sense than another War on Poverty.

What to do about poverty is, once again, on the public agenda in the United States. A decade ago, social researchers and research-funders, stung by the backlash against the War on Poverty, averted their attention from race-related social ills. Then Charles Murray's right-wing broadside against social programs in Losing Ground (1984) provoked critics to reenter the fray, and William Julius Wilson's The Truly Disadvantaged (1987) revalidated discussion of "the underclass" by progressives.

This renewal of controversy is good news for citizens interested in doing more to fight poverty. But there are also reasons to worry. Public discussion today, while less optimistic than in the 1960s, is repeating many themes and assumptions of the War on Poverty and Great Society. Policy makers then attributed poverty in part to behavioral problems and cultural deficiencies that they hoped special training and community action programs for the poor could correct. Similarly, the welfare reform consensus of the mid-1980s quickly converged on the notion that mandated work and job training could best alleviate poverty. To be sure, there are differences: policy makers today are more willing to make welfare contingent on work. As ever, conservatives want to discipline welfare clients, while liberals want to deliver more training, health care, and child care to the underprivileged. But almost everyone seems to think, now as before, that programs targeted to the poor are the best way to proceed.

Universal versus Targeted Policies
Amidst the chorus of welfare reformers, a few voices sing a different melody. They tell us that social provision in the United States should emphasize universal programs -- that is, programs that benefit all citizens, not only racial minorities or the poor.

William Julius Wilson argues this position forcefully in The Truly Disadvantaged, a book that embodies a paradoxical dual message that needs to be heard in its entirety. The first part of the message has gotten across: Wilson calls for renewed attention to the multiple pathologies and special problems of the inner-city black underclass, who constitute about ten percent of Americans below the official poverty line. But the second part of Wilson's message does not follow simple-mindedly from the first. In his book and in the pages of this journal (see "Race-Neutral Programs and the Democratic Coalition," TAP, Spring 1990), Wilson sharply criticizes racially specific measures to aid blacks exclusively as well as redistributive programs that help only lower-income people. Racially targeted policies primarily aid socially advantaged blacks, Wilson argues, while benefits or services restricted to the poor cannot generate sustained political support. Instead, Wilson advocates improving "the life chances of groups such as the ghetto underclass by emphasizing programs ... [to] which the more advantaged groups of all races can positively relate."

Supporters of targeted antipoverty policies criticize such calls for universal programs as being expensive and politically unrealistic. The American public, the critics say, will not pay taxes to finance such programs. Furthermore, universal programs provide the greatest benefits or services to the middle class or the people with low incomes who are already best prepared to improve themselves. According to the targeters, America's poorest people, especially female-headed black families in inner cities, face dramatically different circumstances from the rest of us; therefore, only highly concentrated programs, devised specially for them, can succeed.

Rarely, however, do proponents of targeted, custom-tailored social services explain how they will obtain sustained majority support. They simply have not faced up to the hard political questions:

  • Why should people just above the poverty line, struggling without benefit of health coverage, child care, or adequate unemployment insurance, pay for programs that go exclusively to people below the poverty line?
  • Why might not many Americans from the working and middle classes simply write off troubled inner-city people, and just call for the police and prisons to contain their threatening behavior?

Whether we like it or not, some voters clearly prefer to punish the underclass rather than to help it. Many of those who make it on their own and "earn their own way" define themselves in self-conscious contrast to the poor. In their minds programs targeted to the poor take on all the negative stereotypes of the poor themselves. That may well be the reason why cross-national research on social expenditures finds that in democracies universal programs are more sustainable, even if more expensive, than policies targeted to the poor or other "marginal" social groups.

Are we left, then, with a stand-off, in which advocates of universalism and advocates of targeting each explain cogently why the other's solutions are likely to be unsuccessful or unattainable? So far as speculative arguments are concerned, there certainly is a stand-off. Yet I think we can draw two conclusions from the history of social provision in the United States. First, targeted antipoverty efforts have generally been inadequately funded, demeaning to the poor, and politically unsustainable. Second, some kinds of (relatively) universal social policy have succeeded politically. And within the framework of universal programs, less privileged people have received extra benefits without stigma. I call this pattern "targeting within universalism" and suggest it could become the basis for a revitalized strategy against poverty.

The Travails of Targeting
Without plunging into a detailed discussion of all governmental approaches to poverty in U.S. history, we can look quickly at some of the most important initiatives. In each case -- poorhouses in the nineteenth century, mothers' pensions in the early twentieth century, and the War on Poverty and subsequent reforms of the late 1960s and early 1970s -- the high hopes of reformers soon collided with the rock-hard realities of politics.

The Rise and Fall of the Poorhouse. Nineteenth-century debates about antipoverty policy often concerned a choice between "indoor relief" in poorhouses and "outdoor relief," that is, assistance to the poor living outside institutional walls. Reformers before the Civil War often preferred indoor relief because of its presumed potential to reduce costs and improve the behavior of the poor. Like the mental asylum, the poorhouse was an institution aimed at reforming the habits and morals of defective people as well as preventing such abuses as the shunting of paupers from town to town. Adults would be put to character-building work and children sent to school. Fueled by these hopes, reformers peppered most of the settled United States, except the rural South, with almshouses.

By the 1850s, however, poorhouses were clearly not working as intended. As Michael Katz puts it in his recent history of poverty policy, In the Shadow of the Poorhouse, a "preoccupation with order, routine, and [low] cost replaced the founders' concern with the transformation of character and social reform." In practice it turned out to cost more to maintain people in institutions than in homes, and most poorhouses were never adequately funded in the first place. They failed to help needy inmates and quickly became prey to corrupt managers who made special deals with merchants and doctors. Appropriate work was often not devised for able-bodied paupers, and the old, the insane, and children were simply shut up and often preyed upon by others. Furthermore, poorhouses did not always "discipline" poor people, who sometimes learned to come and go as they wished in response to the ebb and flow of outside opportunities.

Before long, poorhouses lost broad support, and further movements were launched to abolish all forms of outdoor and indoor public assistance for the able-bodied and to create more specialized institutions for subgroups such as orphans and the insane. Surviving poorhouses served mainly as miserable, negative warnings to working people to avoid dependency at all costs. By the turn of the century, poorhouses were principally old-age homes for those unlucky enough to lack resources and family ties. Meanwhile, destitution and dependency proliferated along with industrialism.

Pensions for Mothers. Mothers' pension laws swept across forty states between 1911 and 1920 and were enacted by four more before 1931. These laws allowed localities to give benefits to widowed mothers, and sometimes other caretakers, who had charge of dependent children. Federations of elite and middle-class married women were the impetus behind mothers' pensions. The National Congress of Mothers, the General Federation of Women's Clubs, and the Women's Christian Temperance Union argued that impoverished widowed mothers should not have to accept low-wage labor to survive, nor should they be forced to give up their children to custodial institutions. Above all, they should not be stigmatized as paupers. Instead, the women's associations argued, all mothers should be honored for their child-nurturing service and adequately supported by government, like disabled veteran soldiers, when their ordinary sources of support -- breadwinner husbands -- were not available.

Despite the generous intentions and broad popular support behind mothers' pensions, they evolved into one of the most socially demeaning and poorly funded public welfare programs. Mothers' pensions were implemented only in some predominantly urban, local jurisdictions, leaving many widowed mothers, including most nonwhites, unable even to apply for benefits. Where established, the local programs were starved for funds by communities reluctant to spend taxpayers' money on the poor. With meager benefits, many clients could not avoid working for wages or taking in (male) boarders, even though doing so opened them to charges of child neglect and immorality. Social workers, whose organizations had originally opposed mothers' pensions, became local administrators and caseworkers after the laws passed. Hypersensitive to possible accusations of political corruption and lacking sufficient resources to help all needy applicants, the social workers applied eligibility rules and carried out "proper home" investigations with a vengeance.

When mothers' pensions were federalized as Aid to Dependent Children (ADC) under the Social Security Act of 1935, benefit levels and administrative procedures remained decentralized and the tradition of low funding continued. At first, the federal government offered only one-third matching funds to the states. The 1939 Amendments to the Social Security Act increased the federal proportion to one half, but also removed from ADC the very "worthy widows" who had originally embodied reformers' hopes to make this an honorable form of social provision. Henceforth, surviving dependents of contributing wage-workers were to be covered by Social Security's Old Age and Survivors Insurance, leaving poorer caretakers of children, increasingly women without conventional family histories, as the residual clients to be helped by ADC (later renamed Aid to Families with Dependent Children, or AFDC). Nationwide citizen support never emerged for this program as it did for Social Security. Benefits remained stingy and geographically uneven, and surveillance by social workers became even more intrusive once blacks and unmarried mothers became the program's major beneficiaries. Directly contradicting the original sponsors' plans, mothers' pensions evolved into the core program of what today is pejoratively known as "welfare."

The Limits of the War on Poverty. Antipoverty efforts reached unprecedented visibility and scale between the early 1960s and middle 1970s. Though the elderly gained more from the social policy changes of this era, the non-elderly poor were the more visible concern, especially of the widely trumpeted War on Poverty. As before in American history, antipoverty reformers dreamed of educating the poor to take advantage of economic opportunities, especially by reforming juvenile delinquents, giving children a "head start," improving the schools, and offering job training to adults. The Great Society also sought to improve access to medical care for both the poor and the elderly. Under President Nixon, income transfers to the poor also grew significantly. Between 1969 and 1974, aid targeted to the poor almost tripled, much of it channeled through programs such as Food Stamps, Medicaid, and housing subsidies. Throughout this period, too, Aid to Families with Dependent Children expanded, as states eased eligibility rules in response to changing federal regulations and incentives.

Our evaluation of the era's initiatives against poverty must be mixed. Many people certainly were helped. Elderly Americans, including the impecunious elderly, benefited enormously from the enactment of Medicare, increases in Social Security, and the nationalization of need-based, old-age assistance with the adoption of the Supplemental Security Income (SSI) program in 1974. Community action projects launched many local black activists into political careers. Many poor women and their families, often from minority groups, were newly sustained by the expanded programs of this period. Overall, according to John Schwarz's estimates in his study America's Hidden Success, economic expansion between 1965 and 1972 lifted out of poverty only about one-tenth of the 21.3 percent of Americans who were below the poverty line in 1965. Government programs, on the other hand, lifted above the poverty line more than half of the remaining number.

Yet both the services and income transfers of the period failed to reduce poverty rates greatly among Americans under age 65. And the programs certainly failed to reverse such worrisome trends as the rise in out-of-wedlock births and mother-only family units. Antipoverty warriors can argue that nowhere near enough was ever done or spent to make either services or transfers sufficient to end poverty or to reduce pathologies among the severely disadvantaged.

But that rejoinder only brings us to the more fundamental difficulty. The antipoverty services and increased expenditures on the non-elderly poor of the 1960s and early 1970s very soon generated a political backlash that blocked their further extension. The Community Action Program helped to mobilize poor people, especially blacks, but these efforts were quickly de-emphasized by President Johnson in response to angry local Democratic leaders. President Nixon had no interest in enlarging the flow of federal money to groups and local governments hostile to his administration. Surviving social service programs were also highly vulnerable to charges of corruption in unfavorable political climates.

Even the broader income transfers emphasized during the later Johnson and Nixon years ended up backfiring politically against lower-income Americans, blacks, and the Democratic Party. In due course they fell victim to Jimmy Carter's retrenchments, intense conservative intellectual and political attacks, and the cuts of the Reagan era. During the 1970s public opinion polls recorded decreasing levels of support for government efforts to aid minorities and for public social spending. Support declined especially for stigmatized service programs popularly identified with poor blacks. Electorally, blacks in general remained staunchly Democratic and in favor of strengthened governmental social programs. But union members, white urban ethnics, and white Southerners moved away from the Democratic Party, especially in presidential elections. The perceived Democratic Party position on racial and welfare issues contributed to these defections.

This political situation was rooted in a split between people who benefited most from policy changes and people who saw themselves as burdened with higher taxes. Surely many working- and middle-class families have elderly parents or grandparents who gained from Medicare and increases in Social Security, but higher "welfare" transfers to the poor produced no gain for them. Meanwhile, these working- and middle-class families faced rising tax burdens from government at all levels. Little wonder, then, that many found appealing Ronald Reagan's tax cuts and his generalized attacks on government's social role. Although Reagan's efforts were not as successful as often supposed, the political and intellectual discourse of the 1980s has scarcely reversed the wide hostility toward "big government" and "throwing money" at poor people. Redistributive benefits or targeted services for the poor alone are highly unlikely to regain favor at this point. We still live amidst the political backlash against the War on Poverty and the Great Society.

Universal Programs that Reach the Poor
While targeted programs generate forces that undo their aims, social policies that deliver benefits across classes and races generate broad, cross-class political coalitions that sustain and protect the policies. What is more, universalistic programs have sustained moral imageries that allow the programs to redistribute income and deliver special services to disadvantaged Americans without risking public disaffection and political backlash. For much of American history, universal, locally supported public education has helped poor as well as more privileged children. Here I focus on federal social policies.

Civil War Benefits: Aid for a Worthy Generation. Benefits to Civil War veterans are not often considered in histories of public social provision in the United States. But between the 1870s and early 1900s, veterans' pensions, disability, and survivors' benefits evolved into a massive, de facto system of public support for an aging generation of Northern men who could demonstrate even minimal service in the Union armies. Unrestricted by any means test, Civil War pensions absorbed from one-fifth to one-third of the federal budget between the 1880s and the 1910s. By 1910 approximately 29 percent of American men over age 65 (along with approximately 8 percent of elderly women and various other younger women, children, and other dependents of deceased men) were receiving benefits that were remarkably generous by contemporary international standards. While German old-age pensions were averaging only about 18 percent of annual earnings, U.S. military pensions in 1910 averaged about 30 percent.

To be sure, ethnic and class differences showed up in the distribution of benefits. Civil War pensions went to native-born Northerners and to northern and central Europeans who had come to the North prior to the 1860s. By the late nineteenth century, the Union veterans were disproportionately likely to include farmers, skilled workers, and members of the middle class. Left out of the pension system were Southern whites and most Southern blacks as well as most immigrants from southern and central Europe, who came to America after the Civil War and were then disproportionately low-skilled urban workers. Nevertheless, black veterans and their survivors did qualify for full benefits, and the number eligible was significant because more than 186,000 blacks had served in the Union armies. Even among whites, veterans' pensions helped many who were economically disadvantaged as well as those who had done well during their working lives but then became impoverished in old age.

Within the overall system of Civil War benefits, moreover, special aid beyond the federal pensions was available to the neediest veterans and their dependents. Some states, such as Massachusetts, offered generous public assistance to needy veterans in their own homes. Starting in 1888, the federal government offered subsidies for state-run veterans' homes. By 1910, 31,830 Union veterans, or about five percent of those still living, were being housed in veterans' old-age homes across the country. These men had typically been skilled workers; few were middle-class.

Conceived as a repayment for service to the nation and in explicit opposition to poor relief, these veterans' benefits were unequivocally honorable. Broad political coalitions agitated for benefit increases. Though the Republican Party generally led these campaigns, they also gained support from many Northern Democrats, who could not afford to let Republicans outbid them for votes. And since Civil War benefits were a badge of honor rather than disgrace, it was easy for individual recipients to accept public assistance, or a place in an old-age home, during what was supposedly the preeminent era of "rugged individualism."

"An old soldier," Commissioner of Pensions Green B. Raum explained in 1891, "can receive a pension as a recognition of honorable service with a feeling of pride, while he would turn his back with shame upon an offer of charity." So popular were Civil War benefits that they eventually reached over 90 percent of surviving veterans. Despite vociferous elite attacks against the "political corruption" that pension expenditures supposedly expressed, the benefits did not recede until the generation of men who received them died out.

Health Education Services for Mothers and Babies. During the early twentieth century, many programs to help mothers and children were enacted in America. While mothers' pensions were targeted on the poor alone, others were universal efforts, including the federal Children's Bureau established in 1912 and the Sheppard-Towner program enacted in 1921 to provide health education to pregnant women and new mothers throughout the United States.

With a mandate to look into "all matters pertaining to the welfare of children and child life among all classes of our people," the chief of the Children's Bureau, Julia Lathrop, astutely mobilized women's associations and reformers on behalf of improved maternal and child health. Even though her aim was to reach out to underprivileged mothers, especially in remote, rural areas, Lathrop deliberately decided against a narrowly targeted program and insisted her efforts had nothing to do with charity. If the services were not open to all, Lathrop felt, they would degenerate into stigmatized poor relief.

After the passage of Sheppard-Towner, the Children's Bureau was able to reach a broad cross-section of American mothers, just as it had in its earlier programs. By 1929, according to a study by Molly Ladd-Taylor, the bureau could claim that its childrearing information had benefited half of the babies born in the United States. The bureau had coordinated efforts that distributed "over twenty-two million pieces of literature, conducted 183,252 health conferences, established 2,978 prenatal centers, and visited over three million homes." Ladd-Taylor notes, "Women from every geographic region, social class, and educational background wrote to the Bureau as many as 125,000 letters a year." Yet while the bureau was reaching so wide a population, it was also effectively targeting less educated, white and non-white mothers in rural areas for special help through Sheppard-Towner-sponsored clinics, conferences, and consultations with public health nurses. While allowing great state-to-state variation in program design, the bureau prodded all states to improve official birth statistics and to channel resources toward places where infant and maternal mortality rates were highest.

Politically, however, the Sheppard-Towner program was not an unequivocal success. As the 1920s ended, Congress refused to make the program permanent. Sheppard-Towner's chief opponents, private physicians, wanted to take over pre- and post-natal health counseling themselves, and their local associations affiliated with the American Medical Association were able to kill the program through congressional maneuvers. Sheppard-Towner, however, never experienced a democratic political backlash. It remained broadly popular with American women, and most of the elite and middle-class women's associations that had backed the original law in 1921 continued to lobby Congress on behalf of its extension throughout the decade. Many states continued Sheppard-Towner programs after the federal matching funds disappeared, and a few years later the federal program itself was revived in a new form under the Social Security Act of 1935.

Sheppard-Towner itself was politically vulnerable because the 1921 legislation had not established any entitlement to benefits; as a discretionary program, it was subject to the annual appropriations process. The broad political support that follows from a universalistic program structure is clearly not the only factor affecting the survival of social policies. Entitlement status has also been important in ensuring the longevity of social policies. The most successful measures, such as Civil War pensions and Social Security, have, in fact, been those that ensured entitlements to cross-class categories of beneficiaries.

Economic Security for the Elderly. Over the past half century, the national, contributory social insurance programs chartered by the Social Security Act of 1935 have evolved into a broad and, by international standards, generous set of income supports and medical services for retired American workers and their dependents. How Social Security particularly aids poorer elderly people within the framework of its universal benefits offers us a powerful lesson about the wisdom of targeting within universalism.

Social Security was far from an antipoverty policy in its early years. For the first three decades, its originators and early administrators concentrated on building mainstream support, expanding the program's scope, and working out effective relationships with congressional committees. They sought to rein in public assistance to the elderly poor and to deflect conservative and populist demands for non-contributory need-based benefits or universal flat-rate pensions. They astutely propagated a public imagery that portrayed Social Security as a system of individual "accounts" that received "contributions" as workers built up "earned" benefits for themselves and their families.

Despite the imagery, of course, early beneficiaries got back much more than they put in. They reaped windfalls because they had not paid taxes for many years before receiving benefits and because payroll tax increases were repeatedly deferred in the early years while benefits were increased.

As Civil War pensions had once been contrasted to charity and poor relief, so Social Security was morally defined in symbolic contrast to public assistance. Step by step, new categories of beneficiaries and taxpayers were brought into the contributory insurance system, until it encompassed more than 90 percent of the labor force in the early 1970s. New benefits were added, partially filling the programmatic vacuum left by the absence of national health insurance in America. The original Old Age Insurance (OAI) program of 1935 became Old Age and Survivors Insurance (OASI) in 1939, and eventually Old Age, Survivors and Disability Insurance (OASDI) in 1956. Congress added Medicare in 1965.

Social Security has always favored, not the neediest Americans, but the stably employed and the middle class. The system pegs pensions to earnings received during an employee's working years. As is well known, payroll taxes are regressive because they are set at a flat rate up to an income ceiling.

Nevertheless, from the start Social Security gave retirement benefits to lower-income workers that, in proportion to their past wages, were relatively higher than those received by higher-wage retirees. More important, once Social Security was fully established, its administrators sought increased benefits for everyone and relatively better benefits for the less privileged, to bring Social Security pensions closer to a sufficient retirement income.

Taking advantage of windows of political opportunity during the 1960s and early 1970s, Social Security's promoters gained presidential and congressional backing for major jumps in services and benefits for all of the elderly. They followed a strategy that Hugh Heclo describes as "helping the poor by not talking about them." Amidst the fuss about the War on Poverty, prior plans for Medicare were brought to fruition. Social Security administrators then began to work for higher retirement benefits, and in 1969, 1971, and 1972 increases adopted by Congress and approved by President Nixon far outpaced growth in the cost of living. At that time, Republicans could not afford politically to propose less for the elderly than the Democrats were proposing. In 1972 they pressed for indexing benefits to the consumer price index, unwittingly tying benefits to prices rather than wages just prior to a period when prices were to rise faster than wages. "Replacement rates in 1975," Martha Derthick explains, referring to the ratio of Social Security benefits to wages, "were approximately 67 percent for a married man earning average wages and 92 percent for a married man earning the federal minimum wage -- up from 50 percent and 67 percent, respectively, a decade earlier, on the eve of the drive for expansion." So while benefits increased for everyone, they jumped even more for poorer beneficiaries, and Social Security became by far modern America's most effective program for lifting people out of poverty.

In general, social insurance does much more than means-tested income transfers to raise American families from below to above the officially defined poverty line. Social Security is the most effective poverty-reducing program even for non-elderly families, who receive disability, unemployment, and survivors' benefits. But there is no question that the elderly benefit most of all. Social Security pensions and Medicare were chiefly responsible for cutting poverty among the elderly, as officially measured, from 35 percent in 1959 to 14 percent by 1983, while the rate for other Americans fell only from 22 to 15 percent.

The gains achieved for Social Security programs during the 1960s and early 1970s proved durable even in the face of the Reagan administration's retrenchments. While public support for "welfare" declined sharply during the 1970s, 95 percent or more of people polled continued to agree that "the government spends too little or about the right amount on the elderly." When the first Reagan administration discussed cuts in Social Security, it faced an immediate public uproar and soon backed down (except that it continued for a time to use administrative regulations to cut people from the disability rolls). Even though targeted public assistance programs for low-income people accounted for less than 18 percent of federal social spending, they took the brunt of the first Reagan administration's cuts. Impoverished mothers and children suffered, but the elderly on Social Security, including many who would otherwise have been poor, preserved their improved economic standing. And so they have to this day. Even in a generally conservative period, Social Security continues to be protected by its cross-class constituency and to be championed by congressional representatives of all partisan and ideological stripes.

Current Possibilities
The message of history is clear. Those who want to help the poor should not try to devise new programs finely targeted to low-income people or the "underclass." They should forget about reforming means-tested public assistance programs like AFDC. Rather, they should aim at bypassing and ultimately displacing "welfare" with new policies that address the needs of the less privileged in the context of programs that also serve middle-class and stable working-class citizens. Newly devised policies must speak with a consistent moral voice to all Americans, whether as recipients or as taxpayers, reinforcing rather than undermining fundamental values such as rewards for work, opportunities for individual betterment, and responsibility for the care of children and other vulnerable people.

Establishing politically sustainable principles for social provision is more important than securing major new budgetary commitments up front. Even if new measures start small and give significant proportions of their benefits to families who seem less needy than the most desperately poor, advocates for the poor should realize that, before long, such measures could create new symbolic and political space for more targeted efforts on behalf of disadvantaged people. In contrast, measures that start out small and are narrowly focused on the poor may soon lose political support or never get the resources to reach more than a fraction of the needy people they are supposed to help.

To supplement Social Security programs for the aged, the United States could develop a Family Security program for all of its working-aged citizens and children. This program would express and reinforce shared American values of work and family responsibility. I have in mind the following new policies, which would at least partially replace AFDC, unemployment insurance, and Medicaid: child support assurance, parental leave and child care assistance, labor market (job training and relocation) assistance, and more universally available health benefits.

All of the elements in this program have separately received discussion elsewhere. Together, however, they form the basis for a coherent strategy for a new generation of sustainable social policy.

Child support assurance, first outlined by Irwin Garfinkel and Elizabeth Uhr, would establish nationwide guidelines requiring all absent parents (most of whom are fathers) to pay fixed proportions of their wages as child support. In turn, the program would provide some income support to all custodial parents, most of whom are mothers. As a substitute for the current haphazard system of judicially awarded child support, payments would be more or less automatically fixed and collected through wage withholding, exactly as income and payroll taxes are now. The government would send checks to custodial parents in amounts nearly equal to collections from absent parents, except in cases where support payments could not be collected or fell below a minimum benefit needed to raise children. In that case, the custodial parent would get the minimum.

A child support assurance program along these lines would express the nation's interest in helping single parents raise the approximately half of America's children who are now growing up in such families. The problems of single-parent families now cross class and racial lines, since about half of marriages today end in divorce and only about half of divorced mothers actually receive child support. (Separated and never-married mothers fare even worse.) Because the same program would process child support payments for women of all income levels, participation would not be stigmatizing. At the same time, an adequate minimum benefit could do much for the most needy. Child support, unlike welfare payments, would remain after the needy mother went to work. All custodial parents would, therefore, be free to earn as much as they could. If a mother received the minimum benefit because the father of her child was contributing nothing, she would have her benefit trimmed back only gradually as her earnings increased. She would have an incentive to work to make a better life for her children and herself.
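To make the arithmetic of such a scheme concrete, here is a minimal sketch in Python of how a child support assurance payment along these lines might be computed. The withholding rate, minimum benefit, and phase-out rate below are purely hypothetical placeholders; the article proposes the mechanism but specifies no figures.

```python
# A minimal sketch, not the article's specification: every number below is a
# hypothetical placeholder, since the proposal names no rates or dollar amounts.

WITHHOLDING_RATE = 0.20    # hypothetical fixed share of the absent parent's wages
MINIMUM_BENEFIT = 4000.0   # hypothetical guaranteed annual minimum per family
PHASE_OUT_RATE = 0.25      # hypothetical rate at which the minimum shrinks with earnings


def assured_child_support(absent_parent_wages: float,
                          custodial_parent_earnings: float) -> float:
    """Annual payment to the custodial parent under this sketch."""
    # Withheld from the absent parent's wages, like income and payroll taxes.
    collected = WITHHOLDING_RATE * absent_parent_wages

    # If collections meet the minimum, the check simply mirrors what was withheld.
    if collected >= MINIMUM_BENEFIT:
        return collected

    # Otherwise the family is topped up to a minimum benefit that is trimmed
    # back only gradually as the custodial parent's own earnings rise, so
    # going to work never wipes out the benefit all at once.
    phased_minimum = max(MINIMUM_BENEFIT - PHASE_OUT_RATE * custodial_parent_earnings, 0.0)
    return max(collected, phased_minimum)


if __name__ == "__main__":
    # Absent parent pays nothing; custodial parent earns $6,000 a year:
    # payment = max(0, 4000 - 0.25 * 6000) = $2,500 rather than an abrupt cutoff.
    print(assured_child_support(0.0, 6000.0))
```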

Parental leave and child care assistance respond to the broad need of American families struggling to balance jobs and parental responsibilities. Across social classes, more married women, including mothers of young children, are working outside the home. Yet current policies in the United States do little by international standards to buffer families from the extra stresses of childbirth or adoption, or to help families find and finance adequate child care while both parents work. Paid parental leaves for families of newborns or new adoptees should be one of our long-run objectives. Legally mandated unpaid leaves, including guaranteed reinstatement afterwards, could serve as a first step in that direction. We should also seek to direct more assistance than we do now for child care expenses through refundable tax credits.

Help to parents raising children is one leg of an overall Family Security Program. The other leg must be assistance in securing jobs to make it easier for two-parent families to form and for parents to support their children. Such aid is consonant with American values because adults would be self-supporting, not put on a permanent public dole. As William Wilson and his collaborators have argued, grim job prospects for unskilled young black males are directly linked to the failure of poor men and women to marry, especially in economically depressed and socially disrupted inner cities. At the same time, the present welfare system traps many single mothers into a stark choice between low-paid, unskilled work and welfare benefits available only if the mother does not work (a situation that may encourage her to have more children). The existing welfare system rarely helps single mothers to gain new employment skills. Nor does it urge and help them to relocate, if necessary, to take advantage of decent jobs and housing in new locations.

A new program of labor market assistance would correct these failures. It would provide help and incentives to low-income, unskilled women and men within the same non-stigmatizing program that offers aid to more advantaged people who lose their jobs and cannot easily find similar ones. The labor market program would identify jobs and regions in need of new workers. It would provide transitional unemployment benefits and perhaps housing subsidies to help workers and their families relocate geographically, if necessary. Most important, the program could train or retrain people for jobs.

Some analysts believe that a new labor market program should also guarantee public jobs at the minimum wage as a last resort. But in that case, this new effort might get stereotyped as a "make-work" program. An emphasis on training and relocation, on the other hand, would make public labor market policies more appealing. If existing labor markets cannot generate enough employment, other policies, such as infrastructure development, should be used to create new jobs. Furthermore, the institution of a higher Earned Income Tax Credit (EITC) to subsidize the incomes of low-wage workers might indirectly serve to make more self-supporting jobs available to newly trained workers. Although this is a targeted measure, it qualifies as targeting within universalism because it channels subsidies through the income tax system.

Finally, more universally available health benefits are essential. The present patchwork system of health insurance discourages labor mobility and job redefinitions within industries and workplaces because medical coverage for many workers and their dependents is tied to particular jobs and firms. The present system provides Medicaid coverage to AFDC recipients and some other categories of poor people, but it leaves uncovered many working people, especially low-wage workers. This situation discourages welfare recipients from moving off welfare and arouses taxpayer resistance to improved public health provision. The present uneven system should be replaced either with Canadian-style universal health insurance or with publicly mandated private insurance. The example of other countries indicates that such systems can, in fact, hold down national health expenditures to levels lower than in the United States.

A universalistic Family Security Program would require new budgetary commitments from the federal government -- but isn't this impossible in the present political climate? Before concluding that Family Security is pie-in-the-sky, we should realize that some of the policies discussed here would generate new revenues and save taxpayers' money as well as spend it. Child support assurance would be accompanied by wage withholding from absent parents, and the minimum benefit would allow us to phase out much of what we now call "welfare." The system of job training and relocation assistance would replace today's unemployment insurance. What is more, the new child support and labor market policies would encourage people to enter the paid work force and perform more efficiently in it, generating new tax revenues.

To be sure, child care assistance would require significant new funding, and health benefits well might, depending on whether more comprehensive reforms were introduced at the same time. Still, the history of Social Security demonstrates that American citizens will accept taxes that they perceive as "contributions" toward public programs yielding benefits to their own families, not just to others. Americans are also willing to pay for social programs that are morally worthy and economically productive. Perhaps the introduction of new programs could be accompanied by a Family Security payroll tax, collected up to a higher wage base than are Social Security taxes to avoid their worst regressive features. In any event, new universalistic programs and new sources of revenue -- to be collected from virtually the entire population, not just subgroups as with the unsuccessful catastrophic illness surcharge -- should be discussed in close connection with one another.

Can the new Family Security Program I have outlined really help the American poor? Wouldn't this set of policies principally benefit the middle and working classes, along with the most privileged and least troubled of the poor? Initially, that might happen. But we should not hesitate to attack the pathologies of poor communities by doing the most, soonest, to help adults who are willing to work and care responsibly for their children. Once genuinely new and non-stigmatizing incentives, social supports, and job opportunities were in place, the example of hardworking people taking advantage of new programs and forging better lives for themselves might spread. Less privileged families and communities would have more hope and avenues for betterment through individual efforts augmented by non-stigmatizing public aid. Some people would take longer to get the message than others, but over time most would. By publicly supporting work and family responsibility, universalistic social policies would facilitate the moral reintegration of poor people into the mainstream of national life.

Like Social Security, new Family Security policies could stimulate broad political alliances reinforcing the programs and willing to accept extra efforts for less privileged citizens as part of the overall package. With their own values and needs recognized, larger numbers of middle-class Americans might be prepared to go the extra mile for the needy, including the truly disadvantaged blacks of our most desolate urban ghettos. The barely privileged would not have to help the poor with no benefit to themselves, and the more privileged would see that support was consonant with national values. Past experience teaches us that targeting the poor alone fails; targeting within universal programs, however, can be both effective and politically sustainable.

A different version of this article with full references is forthcoming in The Urban Underclass, edited by Paul E. Peterson and Christopher Jencks. This paper was originally commissioned for the Conference on the Truly Disadvantaged, held October 19-21, 1989, in Evanston, Illinois, co-sponsored by the Committee for Research on the Urban Underclass of the Social Science Research Council and the Center for Urban Affairs and Policy Research of Northwestern University.


Poor Relief in Early America

Early American patterns of publicly funded poor relief emerged mainly from the English heritage of early settlers. The policies and practices of aiding the poor current in England when the Pilgrims landed at Plymouth, Massachusetts were shaped primarily by the Elizabethan Poor Laws of 1594 and 1601, and the Law of Settlement and Removal of 1662. The English poor laws classified poor/dependent people into three major categories and established a requirement for “residency” before aid was provided. Dependent persons were categorized as: the vagrant, the involuntarily unemployed and the helpless. In effect, the poor laws separated the poor into two classes: the worthy (e.g., orphans, widows, the handicapped, the frail elderly) and the unworthy (e.g., drunkards, the shiftless, the lazy). The poor laws also set down the means for dealing with each category of needy persons and established the parish (i.e., local government) as the responsible agent for administering the law. Parish officials were given the authority to raise taxes as needed and to use the funds to build and manage almshouses; to supply food and sustenance to the aged and the handicapped (e.g., the blind, the crippled) in their own homes; and to purchase materials necessary to put the able-bodied to work. If vagrants or able-bodied persons refused to work, they could be put in jail.

The preamble to the English Law of Settlement and Removal of 1662 claimed that large numbers of indigent persons were moving to those rural communities where more liberal poor relief was provided to the needy. This law was enacted to permit local authorities to “eject” from their parish an individual or family who might become dependent. The law also authorized local authorities to “pass along” or “remove” persons who could not prove they had contributed to the well-being of the parish by their labor or by paying taxes.

In time, colonial legislatures and later State governments adopted legislation patterned after these English laws, establishing the American tradition of public responsibility for the care of the destitute while also requiring evidence of legal residence in a particular geographic locality (i.e., town, municipality, county) as a prerequisite for receiving assistance. The most popular means for caring for the poor in early American communities using public funds included: the contract system, auction of the poor, the poorhouse, and relief in the home, or “outdoor relief.” The contract system placed dependent persons under the care of a homeowner or farmer who offered to care for them for a lump sum. The process of “auctioning” the destitute resulted in an individual or family being placed with a local couple or family bidding the lowest amount of public funding needed to care for them. It should be noted the contract system and auctioning the poor were not prevalent outside rural or lightly populated areas. Part of the reason was evidence that the practice of entrusting the care of the poor to the lowest bidder essentially legalized abusive behavior and near starvation existence.

The most prevalent means of caring for the poor with public funds in early America were poorhouses and outdoor relief. The major advantages for a locality funding a poorhouse (sometimes labeled an almshouse or workhouse) to care for dependent persons were that the necessity of working every day would be a deterrent for able-bodied persons who were simply lazy or shiftless, and that the regimen of daily life in a congregate setting would instill habits of economical and virtuous living in persons who were destitute because of moral weakness or self-indulgence. In fact, only a small proportion of residents were able-bodied, and then usually in the winter months when jobs were scarce. In many areas, poorhouses became a refuge for the sick, the severely disabled, the frail elderly and homeless children who were unable to work and had no one to care for them. Complicating the use of a poorhouse for the care of all destitute persons was the necessary mixing of the worthy and the unworthy poor. Often living in the same congregate setting were able-bodied adults as well as dependent persons such as children, the aged, the sick and the disabled. Eventually, separate facilities were established to care for the different populations, with the able-bodied being placed in a “workhouse” or “poor farm.”

By the middle of the nineteenth century, the conditions and reputation of poorhouses had deteriorated significantly. There was growing evidence of unseemly rates of death and disease, illicit births, lack of discipline, graft, and mismanagement. Perhaps worst of all, the costs of maintaining poorhouses increased beyond the expectations and promises of public officials. It soon became apparent to some public officials that it would be less expensive to provide some public assistance to the community’s dependent persons living with friends or relatives, or while living in their own homes.

Outdoor Relief

The nature and amount of outdoor relief varied widely in early America, but it was seldom generous or widely available. The concept of public assistance conflicted with Calvinist values and was sometimes viewed as impinging on the personal gratifications derived from private works of charity. The beginning of a more general acceptance of the role of outdoor relief was revealed by an 1857 report of a New York Senate “Select Committee to Visit Charitable and Penal Institutions.”

“…A still more efficient and economical auxiliary in supporting the poor, and in the prevention of absolute pauperism consists in the opinion of the committee, in the proper and systematic distribution of “out door” relief. Worthy indigent persons should, if possible, be kept from the degradation of the poor house, by reasonable supplies of provisions, bedding, and other absolute necessaries, at their own homes. Half the sum requisite for their maintenance in the poor house would often save them from destitution, and enable them to work in their households and their vicinity, sufficiently to earn the remainder of their support during the inclement season when indigence suffers the most, and when it is most likely to be forced into the common receptacles of pauperism, whence it rarely emerges without a loss of self respect and a sense of degradation…”

Despite the societal and religious values prevalent in this period of American history, opponents of outdoor relief found it difficult to argue that poorhouses were a more suitable solution for relieving the economic distress of the aged, the severely handicapped, widows, and orphaned children. Further contributing to the acceptance of public assistance in the form of outdoor relief was the emergence of urban areas as centers of labor during the 19th century. Several economic depressions and other business downturns left large numbers of the able-bodied unemployed, with no money to buy needed food and clothing for themselves or their families. Large numbers of unemployed workers often participated in demonstrations and protests; some turned violent. It was during these episodes of unrest that local public officials responded with various types of public employment programs, soup kitchens, and other forms of publicly financed charity designed to quell the protests or stabilize the environment.


Falmouth, Mass., Poorhouse

The Falmouth poorhouse was built as a tavern in 1769. Then the War of 1812 caused a depression across Cape Cod, and the tavern owner packed up and left for Cincinnati. The town moved the tavern 4-1/2 miles next to the Methodist Cemetery, and it became the Falmouth Poor House.

The town fenced the land and later built a barn, which allowed it to reclassify the poorhouse as a work house. Anyone who refused to work would be confined in a cell or farmed out indefinitely.

In 1878 the poorhouse turned into a poor farm, which actually supported itself by selling food. It wasn’t closed until 1960. The seven people remaining in the poorhouse went on welfare and moved to hospitals. The poorhouse now stands empty.


A Brief Timeline of the History of Disabilities: The Shameful Treatment of People with Disabilities

The ADA was passed on July 26, 1990, so this year marks its 28th anniversary. Here is the ADA anniversary website, beyond their regular site, if you want more anniversary information: https://www.adaanniversary.org/

As Paul Longmore and Lauri Umansky recount, when George Bush signed the Americans with Disabilities Act he proclaimed, “Let the shameful wall of exclusion finally come tumbling down.” (p. 1)

The history of disabilities and the ADA did not begin on July 26, 1990 at the signing ceremony at the White House. It did not begin in 1988 when the first ADA was introduced in Congress. The struggles for equality go back much farther.

Looking back, our approach to people with disabilities has been less than desirable; there is a long history of abuse, discrimination, and a lack of compassion and understanding. One could also argue that throughout history there have been many forms of discrimination against those with disabilities, including discrimination against such perceived “disabilities” as being of a minority race, female, or poor.

For the early tribes and nomads, survival was paramount. It is not known for sure how a person with a disability was cared for, though some research indicates they were cared for. Then again, early life was a matter of survival: if individuals were unable to join in hunting and gathering, they were of no use, and as the tribe moved on the disabled were left behind if they could not keep up.

The following is a summary timeline:

Greek Empire: The Greeks were obsessed with human perfection.

· They believed beauty and intelligence were intertwined. This may have laid the groundwork for future beliefs.

· Societies attempted to rationalize disabilities with ideas such as the one Maggie Shreve described in her research: “people who were deaf could not learn because communication was essential to learning.” (para. 3)

Roman Empire: Developed a similar attitude to the Greeks.

· Romans considered the disabled inferior.

· There was no Latin word for “disabled”, so the Romans used the word “monstrum” which was the word used for mythical monsters.

· According to the Disability History Exhibit web site: “response to disability: Abandonment, Exposure, Mutilation.” (Alaskan D.H. & S.S. 2011)

· A disability was a mark of the gods’ wrath.

· In many cases it was up to the father if a disabled child was to live or die. The disabled child was often taken and left naked in the woods.

· The attitude was to kill quickly.

· In Sparta, children were the property of the state, not the parents, and by law abandonment of a disabled child was mandatory.

Fall of Rome: The rise of Christianity.

· This was a time of increased sympathy and pity towards the disabled.

· This period of compassion would soon be replaced by fear.

Middle Ages: (approx. 1060-1490) A time of social oppression, economic hardship and intellectual decline.

· Describing society, Shreve wrote that it “became fearful of people with disabilities as their attraction to the supernatural increased.” (para. 5)

· Those born with disabilities became outcasts and homeless beggars.

· Those born with disabilities were considered evil and a disability was the work of the devil as punishment for sins.

· Institutions developed more to hide the disabled than to treat, and conditions were less than humane.

Renaissance: (approx. 1400’s through 1600’s) Increase in religious influence

· There was a change in the treatment of the disabled with the introduction of medical care and institutionalization.

· Many of the disabled were cared for by monks and religious organizations.

· Shreve claims: “People with disabilities were ridiculed, such as a court jester who was actually someone with a humped back.” (para. 5)

· Under King James I, creation of Bethlem Royal Hospital, Europe’s first psychiatric hospital

1700’s: Disabilities were commonplace.

· Start of industrial revolution.

· Long hours, in dangerous working conditions. No job safety regulations.

· Bloody battles, leaving many with permanent disabilities.

In an article by Stephen Weisman, he writes: “in the 18th century, having a disability was a death sentence in some instances.” Weisman goes on to report: “those who weren’t able to work were often left destitute and without other options aside from begging.”

· Many could not afford to have assistive devices made.

· They were the subjects of public mockery and scorn.

1800’s: Those unable to fend for themselves were left to wander the streets, becoming beggars.

· The number of people homeless and living on the streets became a social issue and new policies developed. As described by the Alaskan Department Health and Social Services, Disability History Exhibit: “Moral Viewpoint – Persons who lived in extreme poverty, including many with physical or mental disabilities were often put into poorhouses or almshouses. Such establishments, supported by public funds, began in the Middle Ages as a means of removing economic outcasts from society.”(panel 6)

· Social and moral changes during this time brought training schools for the disabled bringing some relief to families of the disabled.

At the end of the Civil War, job opportunities for those completing training dwindled.

Twentieth Century: Rhonda Neuhaus, Cindy Smith, Molly Burgdorf on the American Bar Association web site wrote: “the laws of the United States devalued persons with disabilities as society as a whole viewed such persons as a group of people to be pitied, ridiculed, rejected, and feared, or as objects of fascination. Persons with disabilities were seen as objects of charity or welfare or as needing to be subjected to medical treatment or cure.”

o People found by an examining surgeon to be mentally or physically defective, affecting their ability to earn a living, were not permitted to enter the country, often even with family members.

o There was a fear of making the U.S. a country of defectives.

· Late in the nineteenth century and into the early twentieth century a number of cities passed what were known as “Ugly Laws”. These laws made it illegal for any person diseased, maimed, mutilated or disfigured in any way to show themselves in public view. The last city to repeal ugly laws was Chicago in 1974.

· At the turn of century many families who had a child with a disability kept them hidden or they were sent to an institution and they were often forgotten. A disability in many cases was a family embarrassment.

o With the advent of industrialization of the world, more than not the disabled were institutionalized. As Shreve explains: “society believed people with disabilities might be educated, but usually in “special” schools, far from urban or heavily populated areas.” (para. 7)

· In the Midwestern U.S., county fairs commonly entertained spectators by presenting oddities such as “bearded women, the dog faced boy, giants, dwarves, five legged calves and learned pigs.” Fair promoters quickly learned that fairgoers were so captivated by this mangled miscellany of human and animal “oddities” that the sideshow was born; it became instrumental to the fairs and popularized the terms “Freaks” and the “Freak Show.” Longmore and Umansky wrote of “so-called exhibitions, displays of the disfigured, disabled, or exaggerated bodies masquerading under the guise of scientific enlightenment.” (p. 178)

o It should be noted that one plus to the sideshows was that this gave people with disabilities a source of income and a feeling of community.

· 1930’s: The rise of Hitler and the so-called “Master Race”

o Perhaps the greatest abuse of the disabled.

o Those with physical deformities and in particular those with mental disabilities were subject to brutal experimentation and used as human guinea pigs.

o There is no record of how many were put to death. Shreve reports: “Jews, Gays and Lesbians and other minorities and their supporters and people with disabilities were put to death by Hitler’s concentration camp staff” (para. 8): anyone who failed to fit Hitler’s view of the superior race.

· 1933 Franklin Roosevelt became president.

o Roosevelt tried to hide his disability, but most people knew and he inspired many people with disabilities.

o Roosevelt created a nonprofit foundation, the Georgia Warm Springs Foundation

· 1938 Passage of the Fair Labor Standards Act, which allowed employers to pay lower wages to employees whose productivity is limited due to physical or mental disability.

· 1946 The National Mental Health Foundation helped expose the abusive conditions at state mental institutions and became an early advocate for people with disabilities living in community settings rather than institutions, a precursor to de-institutionalization.

· 1947 First ever meeting of the President’s Committee on National Employ the Physically Handicapped Week. It began an ad campaign emphasizing the competence of people with disabilities.

· 1953 In-home care for adults with polio, introduced as a cost-saving measure, began in Los Angeles County.

· 1954 The Office of Vocational Rehabilitation provided federal funds for over 100 university-based rehabilitation programs.

· 1958 The Rehabilitation Gazette began publication, focusing on disability rights across the US. Many of its articles were written by disabled writers on their experiences.

· 1961 The American National Standards Institute (ANSI) issued an accessibility standard that became the basis for architectural accessibility codes.

· 1962 The President’s Committee on Employment of the Physically Handicapped became the President’s Committee on Employment of the Handicapped, reflecting increasing interest in employment for people with cognitive disabilities and mental illness.

Ed Roberts sued and gained admission to Berkeley. Having survived polio, Roberts used a wheelchair and an iron lung. He became the first university student admitted with a significant disability.

· 1964 Passage of the Civil Rights Act, which became the inspiration for future disability rights legislation.

Invention of the “acoustic coupler,” leading to the telephone modem, which allowed typewritten messages to be sent over telephone lines and in turn paved the way for today’s TTY communication, allowing the deaf and hard of hearing to use the telephone.

· The National Technical Institute for the Deaf was created at Rochester Institute of Technology in Rochester, NY.

· 1967 The National Theatre of the Deaf was established.

· 1968 Passage of the Architectural Barriers Act, which required all federally owned or leased buildings to be accessible to the disabled.

· 1970 The Urban Mass Transit Act required all new mass transit vehicles to be equipped with a wheelchair lift.

The Physically Disabled Students Program (PDSP) was founded by John Hessler, Ed Roberts, Hale Zukas and others at Berkeley. It promoted community living, personal assistance programs and political advocacy, and was the foundation for the nation’s first Center for Independent Living.

Judith Heumann founded the Disabled in Action in New York City after a successful lawsuit against the city’s public school system for employment discrimination.

Passage of the Developmental Disabilities Services and Facilities Construction Amendments, which contained the nation’s first definition of “developmental disabilities” and provided grants for facilities for the rehabilitation of people with developmental disabilities.

· 1971 WGBH, a public television station in Boston, began providing closed-captioned programming for deaf viewers.

In Wyatt v. Stickney, the US District Court for the Middle District of Alabama decided that people in residential state schools and institutions have a constitutional right to receive such individual treatment as will give them a realistic opportunity to be cured or to improve their mental condition. Disabled people could no longer be locked away in institutions without treatment or education.

· 1972 The US District Court for the District of Columbia ruled the school system could not exclude disabled children from attending public schools. In the same year, the District Court for Eastern Pennsylvania struck down a number of state laws used to exclude children with disabilities from public schools. These rulings helped inspire passage of the Education for All Handicapped Children Act of 1975.

Disabled activists in New York City and other cities around the country held demonstrations to protest Nixon’s veto of the Rehabilitation Act.

Outraged by a television broadcast from Willowbrook State School on Staten Island, parents of the residents filed suit to end the deplorable conditions at the institution. Thousands of people were relocated to community-based living arrangements.

The Center for Independent Living was founded in Berkeley, California. It is recognized as the first center for independent living.

The Judge David L. Bazelon Center for Mental Health Law was founded in Washington, DC. Its mission is to provide legal representation and advocate for the rights of people with mental illness.

· 1973 Passage of the Rehabilitation Act. The law states, “No otherwise qualified handicapped individual in the United States, shall solely by reason of his handicap, be subject to discrimination under any program or activity receiving federal financial assistance.” This was intended to prohibit discrimination in federally funded programs and services.

Passage of the Federal-Aid Highway Act, which provided government funding for the construction of curb cuts.

The first handicapped parking sticker was introduced in Washington, DC.

· 1974 The first Client Assistance Projects (CAPs) were established to advocate for clients of state vocational rehabilitation agencies.

The first convention of People First was held in Salem, Oregon. Led by people with cognitive disabilities, People First became the largest organization of its kind.

· 1975 Passage of the Education for All Handicapped Children Act established the right of disabled children to be integrated into a public school environment. The law was later renamed the Individuals with Disabilities Education Act (IDEA).

Parent Training and Information Centers (PTIs) were established to assist parents of disabled children in exercising their rights under the Education for All Handicapped Children Act.

The American Coalition of Citizens with Disabilities was founded. It became an important cross-disability rights organization of the 1970s by pulling together disability rights groups representing blind, deaf, physically disabled, and developmentally disabled people.

The Supreme Court ruled that people could not be held against their will, or placed in a psychiatric institution, unless they were proven to be a threat to themselves or to others. (O’Connor v. Donaldson)

· 1976 Amendments to the Higher Education Act of 1972 provided for services for physically disabled students entering college.

The Cerebral Palsy telethon in New York City was picketed by Disabled in Action, which called such telethons “demeaning and paternalistic shows which celebrate and encourage pity.”

· 1977 Disability rights activists in ten cities staged demonstrations and occupations of the offices of the federal Department of Health, Education and Welfare (HEW) to force the Carter Administration to issue regulations implementing Section 504 of the Rehabilitation Act of 1973.

The demonstration in San Francisco lasted nearly a month. On April 28, HEW Secretary Joseph Califano signed the regulations.

The first-ever White House Conference to discuss federal policies towards people with disabilities was held. The conference brought together 3,000 individuals with disabilities and marked the beginning of grassroots disability rights organizing.

· 1978 A sit-in demonstration by disability rights activists was held, blocking the Denver Regional Transit Authority buses due to the inaccessibility of the mass transit system. The year-long protest forced the Denver Transit Authority to invest in wheelchair lift buses.

Passage of Title VII of the Rehabilitation Act Amendments provided the first federal funding for Centers for Independent Living (CILs) and created the National Council of the Handicapped within the Dept. of Education.

· 1979 The National Alliance for the Mentally Ill (NAMI) was founded in Madison, Wisconsin, for parents of persons with mental illness.

· 1980 Congress passed the Civil Rights of Institutionalized Persons Act, allowing the Justice Department to file suit on behalf of institutionalized people whose rights may have been violated.

The US Supreme Court ruled that Southeastern Community College must make “reasonable modifications” for qualified disabled individuals while receiving federal funding. This was a test of Section 504 of the Rehabilitation Act of 1973 and established an important principle of disability rights law.

· 1981 The International Year of Disabled Persons began with speeches before the United Nations General Assembly. During the year, governments were encouraged to sponsor programs bringing people with disabilities into the mainstream of their societies.

· 1985 The U.S. Supreme Court ruled, City of Cleburne v. Cleburne Living Center, that localities cannot use zoning laws to prohibit group homes for people with developmental disabilities from opening in a residential area because its residents are disabled.

· 1986 The National Council on the Handicapped issued “Toward Independence.” The report outlined the legal status of Americans with disabilities, documented the existence of discrimination, and cited the need for federal civil rights legislation (what would eventually be passed as the Americans with Disabilities Act of 1990).

· 1988 The Fair Housing Amendments Act added people with disabilities to those groups protected by federal fair housing legislation. It established minimum standards of adaptability for newly constructed multiple-dwelling housing.

· 1989 The original version of the Americans with Disabilities Act, introduced into Congress the previous year, was redrafted and reintroduced. Disability organizations across the country advocated on its behalf.

· 1990 The Americans with Disabilities Act (ADA) was signed into law. It prohibits discrimination and guarantees that people with disabilities have the same opportunities as everyone else to participate in the mainstream of American life—to enjoy employment opportunities, to purchase goods and services, and to participate in state and local programs and services.

· 1995 The U.S. Court of Appeals ruled that continued institutionalization of a disabled Pennsylvania woman, when not medically necessary and when there is the option of home care, was a violation of her rights under the Americans with Disabilities Act (Helen L. v. Snider).

Disability rights advocates viewed this ruling as a landmark decision regarding the rights of people in nursing homes to instead live in their own home and receive personal assistance services.

· 2004 The first ever Disability Pride Parade was held in Chicago and other communities around the country.

· 2006 The first bill requiring that students in a K-12 public school system be taught the history of the disability rights movement was passed, largely due to the efforts of 20 young people with disabilities from the state of West Virginia.

· 2010 According to the FBI, 1.5% of hate crimes were due to bias against people with disabilities.

Rosa’s Law, which changed references in many federal statutes from “mental retardation” to “intellectual disability,” became law in the U.S.

· 2011 Delta was fined for violating rules protecting air travelers with disabilities.

· 2011 On March 15, new Americans with Disabilities Act rules came into effect. These rules expanded accessibility requirements for recreational facilities such as swimming pools, golf courses, exercise clubs, and boating facilities. They also set standards for the use of wheelchairs and other mobility devices like Segways in public spaces, and changed the standards for things such as selling tickets to events and reserving accessible hotel rooms. The new rules also clearly defined “service animal” as “…any dog that is individually trained to do work or perform tasks for the benefit of an individual with a disability, including a physical, sensory, psychiatric, intellectual, or other mental disability.” This portion of the law also states that the services the service animal provides must be “directly related to the handler’s disability” and that dogs that provide only emotional support or crime deterrence cannot be defined as service animals.

· 2014 An executive order raising the minimum wage for employees of federal service and concession contractors to $10.10 an hour included workers with disabilities who had been paid less than the minimum wage under Section 14(c) of the Fair Labor Standards Act.

· 2016 An ethics rule of the American Bar Association now forbids comments or actions that single out someone on the basis of disability, as well as other factors.

Disclaimer: We acknowledge there are events which may not have been included on this timeline.

Alaska Department of Health & Social Services. (2001). Disability History Exhibit. Retrieved June 2018 from: http://hss.state.ak.us/gcdse/history/HTML_Content_Main.htm

Longmore, Paul K., & Umansky, Lauri. (2001). The New Disability History: American Perspectives. New York: New York University Press.

Neuhaus, Rhonda, Smith, Cindy, & Burgdorf, Molly. (2014). Equality for People with Disabilities, Then and Now. Vol. 31, No. 6. Published by the American Bar Association.

Shreve, Maggie. (1982). The Movement for Independent Living: A Brief History: Attitudes Started It All. ILUSA. Retrieved from: http://www.ilusa.com/articles/mshreve_article_ilc.htm

Additional information gathered from:

Brignell, Victoria. (2008). Ancient World. From: The New Statesman website.

Gracer, Bonnie L.(2003). What the Rabbis Heard: Deafness in the Mishnah. From: Disability Studies Quarterly website.


Walkabout: Alms for the Poor, Part 2

Kings County Almshouse, 1900. Collection of the Museum of the City of New York.

History


In my last post, I introduced you to the Kings County Almshouse, a 70 acre farm complex located in Flatbush. It was established in 1830, as Brooklyn’s population was swelling, and so too were the numbers of poor and indigent people who could not take care of themselves. Aside from people who were just out and out destitute and poor, the 19th century American almshouse was also designed to take in those who were mentally ill, developmentally slow or impaired, orphans, the blind, deaf and mute, and elderly people who had no families to take care of them.

The first two groups were generally called “lunatics” and “idiots”, and the understanding of their conditions was a long ways away. The understanding of the conditions of poverty was also long in coming, and arguably, we still haven’t figured out what to do with it, or the people affected by it. The Victorians knew, and acted accordingly. The world, after all, was simply filled with the “deserving” and “undeserving” poor. They attempted to help one group. The other one was on their own.

Many of our American social ideas come from our time of being an English colony. The British Isles have had a long history of social institutions that they felt would combat poverty. Prisons were high on this list, and in the case of those who could not pay their debts to their creditors, debtor’s prison was the sentence that was often imposed on entire families.

So were workhouses, where the poor could pay for their room and board by toiling in the many factories that represented the new Industrial Revolution. While we took on many aspects of British society here in America, debtor’s prison was never popular. Probably because a large percentage of Americans of British descent had come here as indentured servants and transported felons themselves, shipped to America from those very debtor’s prisons and penitentiaries of London and other cities.

But the social problems of an industrialized country were the same here as in England: what to do with the rising numbers of poor people, who were a drain on the normal levels of charity? The first idea was called “outdoor relief.” The county would auction off the care of the poor to the lowest bidder, who would take the poor person or family under their care and have them work to earn their keep.

The county would help with small amounts of money, as well as fuel and other necessities. As one can imagine, this system, and the people in it, were ripe for abuse. A central institution giving care was then thought to be the answer.

By the beginning of the nineteenth century, social policy towards the poor was centered on the almshouse, or poorhouse. This would continue in some fashion, until the New Deal. The idea was that a central institution, established on a county-wide level, would be the best and most efficient way to help the poor. Ironically, this system was based on the dual policy of both helping the poor and deterring them from asking for that help. Because tax dollars supported the county poorhouse, this also was seen as the answer to private charity.

Most counties throughout the country had a poorhouse. It was usually located far from the general population, and ideally had land around it suitable for farming, so that it could be self-sustaining, and generate income. Farm labor would come from the inmates, giving them the opportunity to help pay for their keep. Larger, and often more urban facilities might have small factories where inmates worked, as well.

The poorhouse was obligated to take everyone that society rejected, yet was seldom given enough resources to do so, especially as America grew in population, and it didn’t take long for these almshouses to become synonymous with hell. Although every child, even today, is told by their parents that they will be the cause of them going there, no one really wanted to go to the poorhouse. And for good reason: they were awful. This contradiction meant that any lofty goals of helping people were realized only partially, at best.

On the one hand the counties wanted to deter people from asking for help, on the other hand, they were providing that humanitarian help. But almshouses were never able to be self-sustaining. They cost a lot to run, and the capacity of the inmates to pay for their own keep by working at the farm, or working at the almshouse itself, was greatly overestimated. There wasn’t enough staff, and facilities were not kept up. And the poor kept coming.

Remember the “deserving and undeserving poor”? The general societal outlook in the 19th century, and frankly, we still haven’t lost it, was that poverty was the fault of the poor. Helping the widows and orphans, the blind and the old was one thing; they were the “deserving poor.” The “undeserving poor” were another story. It was thought that institutionalizing the poor would rehabilitate them, and train them to be productive citizens.

Their incarceration would teach them discipline, which was obviously all that was lacking and the reason for their poverty in the first place. To this end, once in the poorhouse, children were separated from their parents and put in separate orphanages or sent away, husbands and wives were segregated into workhouses, and not allowed to even speak to each other, and conditions could be so awful that some would rather starve in the street than go to the almshouse. Irish and Negro families were especially targeted for separation and incarceration, as both groups were seen by many in the upper classes to be at the bottom of the social order, and responsible for their own conditions.

In 1857, the Kings County Almshouse in Flatbush housed 380 people. The almshouse nursery had 350 babies and children, the hospital was caring for 430 patients, and the attached lunatic asylum had a population of 205. In total, there were 674 males and 691 females, of whom 870 were foreign born and 475 native born, including 424 children under sixteen years of age. All of these people were under the care of only one keeper, aided by three male and four female assistants. The sexes were kept completely separated from each other at all times.

Everyone there was required to work, either on the farm, or on the complex. Children over twelve were bound out by the superintendent, and could be “rented out” to factories or other facilities. Elderly inmates were not exempt, either, unless too infirm to do chores or factory work.

The 1857 report that details the conditions and population of the Kings County Almshouse goes on to admit that the place was overcrowded and understaffed. Yet, as justification of conditions, the report on the facility states that two-thirds of the inmates were forced to accept this public charity because of inebriation, which seems hardly likely, due to the numbers of children and inmates of the insane asylum, alone.

The inmates of the asylum were in especially horrible conditions. The insane were seen by many to be just a burden on society, as well as a danger. At Kings County, they were incarcerated in a separate facility, and either allowed to wander around uncared for, or were placed in various kinds of restraints. There were more women than men here. The facility was designed to hold 150 patients, but in 1857 had a population of 205. As horrible as conditions here were, this report also noted that the asylum was now under a new administration. The previous one had abused the patients, resulting in the permanent crippling of several and leaving obvious marks of chains and confinement on others. The report was happy to state that this was no longer the case.

The Civil War actually began the change in poorhouse policies. As the above report from Kings County and other poorhouse records began to show, the poorhouse model was not working. Almshouses were expensive to run, were not self-sustaining, as designed, and the numbers of the poor were overwhelming the facilities. It was becoming obvious that laziness and drunkenness were not the only contributing factors to poverty, and as deserving or undeserving as the poor may be, there were far too many to manage by incarcerating them in poorhouses.

On top of that, the war produced widows and orphans, parents, grandparents and children without the male breadwinner, and a great deal of temporary and permanently disabled men who were not able to work. Yet very few of these families were put into poorhouses, because the government began to take a hand, establishing pension plans for veterans, and temporary aid to veterans and families by means of direct “outdoor relief” in the form of monetary payments. Governmental social services had begun.

Poorhouses continued into the 20th century, but were changing. By the turn of that century, many large poorhouse facilities had divided into their separate parts. The Kings County Almshouse also changed. Although it operated into the early 20th century, the hospital part of the complex became the nucleus of present day Kings County Hospital. The rest of the buildings were eventually torn down in the 1930’s to expand the hospital, today a huge complex.

The almshouses got out of the mental health field, and separate facilities, generally known as insane asylums, were built, usually as far away from everyone else as possible. Mental health care still has a long way to go. Separate orphanages became more expedient for children, a role that has since changed into in-home foster care, and most almshouses became homes for the growing number of senior citizens who could no longer take care of themselves or had no one who would take them in.

By the beginning of the 20th century, almshouses were old age homes. The Museum of the City of New York has a series of fascinating photos of the Kings County Almshouse taken in 1900. The majority of the inmates are elderly. (Link not functioning. Go to the MCNY site and search for Kings County Almshouse.)

Governmental aid to the poor and elderly changed as well. Social Security was established in the New Deal era of 1935, during the Great Depression. Workers’ compensation, unemployment compensation, welfare, and other social safety net programs would follow. By the 1950’s, the last of the poorhouses had closed. The institutions may have closed, but the challenge of finding the resources to deal with the root causes remains. Poverty remains one of society’s largest, most complicated, and most contentious problems. Alms for the poor, unfortunately, are needed more than ever.

Kings County Almshouse, Shoe shop, 1900. Photo: Museum of the City of New York


The Police Were Created to Control Working Class and Poor People, Not ‘Serve and Protect’

We shouldn’t expect the police to be something they’re not. (David Shankbone / Flickr)

In most of the liberal discussions of the recent police killings of unarmed black men, there is an underlying assumption that the police are supposed to protect and serve the population. That is, after all, what they were created to do.

If only the normal, decent relations between the police and the community could be re-established, this problem could be resolved. Poor people in general are more likely to be the victims of crime than anyone else, this reasoning goes, and in that way, they are in more need than anyone else of police protection. Maybe there are a few bad apples, but if only the police weren’t so racist, or didn’t carry out policies like stop-and-frisk, or weren’t so afraid of black people, or shot fewer unarmed men, they could function as a useful service that we all need.

This liberal way of viewing the problem rests on a misunderstanding of the origins of the police and what they were created to do.

The police were not created to protect and serve the population. They were not created to stop crime, at least not as most people understand it. And they were certainly not created to promote justice. They were created to protect the new form of wage-labor capitalism that emerged in the mid- to late-19th century from the threat posed by that system’s offspring, the working class.

This is a blunt way of stating a nuanced truth, but sometimes nuance just serves to obfuscate.

Before the 19th century, there were no police forces that we would recognize as such anywhere in the world. In the Northern United States, there was a system of elected constables and sheriffs, much more responsible to the population in a very direct way than the police are today. In the South, the closest thing to a police force was the slave patrols.

Then, as Northern cities grew and filled with mostly immigrant wage workers who were physically and socially separated from the ruling class, the wealthy elite who ran the various municipal governments hired hundreds and then thousands of armed men to impose order on the new working class neighborhoods.

Class conflict roiled late-19th century American cities like Chicago, which experienced major strikes and riots in 1867, 1877, 1886, and 1894. In each of these upheavals, the police attacked strikers with extreme violence, even if in 1877 and 1894 the U.S. Army played a bigger role in ultimately repressing the working class. In the aftermath of these movements, the police increasingly presented themselves as a thin blue line protecting civilization (by which they meant bourgeois civilization) from the disorder of the working class. This ideology of order that developed in the late 19th century echoes down to today — except that today, poor black and Latino people are the main threat, rather than immigrant workers.

Of course, the ruling class did not get everything it wanted, and had to yield on many points to the immigrant workers it sought to control. This is why, for instance, municipal governments backed away from trying to stop Sunday drinking, and why they hired so many immigrant police officers, especially the Irish. But despite these concessions, businessmen organized themselves to make sure the police were increasingly isolated from democratic control, and established their own hierarchies, systems of governance, and rules of behavior.

The police increasingly set themselves off from the population by donning uniforms, establishing their own rules for hiring, promotion and firing, working to build a unique esprit de corps, and identifying themselves with order. And despite complaints about corruption and inefficiency, they gained more and more support from the ruling class, to the extent that in Chicago, for instance, businessmen paid out of their own pockets to buy the police rifles, artillery, Gatling guns, and buildings, and donated money to establish a police pension.

There was never a time when the big city police neutrally enforced “the law,” or came anywhere close to that ideal. (For that matter, the law itself has never been neutral.) In the North, they mostly arrested people for the vaguely defined “crimes” of disorderly conduct and vagrancy throughout the nineteenth century. This meant that the police could arrest anyone they saw as a threat to “order.” In the post-bellum South, they enforced white supremacy and largely arrested black people on trumped-up charges in order to feed them into convict labor systems.

The violence the police carried out and their moral separation from those they patrolled were not the consequences of the brutality of individual officers, but were the consequences of careful policies designed to mold the police into a force that could use violence to deal with the social problems that accompanied the development of a wage-labor economy.

For instance, in the short, sharp depression of the mid-1880s, Chicago was filled with prostitutes who worked the streets. Many policemen recognized that these prostitutes were generally impoverished women seeking a way to survive, and initially tolerated their behavior. But the police hierarchy insisted that the patrolmen do their duty whatever their feelings, and arrest these women, impose fines, and drive them off the streets and into brothels, where they could be ignored by some members of the elite and controlled by others.

Similarly, in 1885, when Chicago began to experience a wave of strikes, some policemen sympathized with strikers. But once the police hierarchy and the mayor decided to break the strikes, policemen who refused to comply were fired. In these and a thousand similar ways, the police were molded into a force that would impose order on working class and poor people, whatever the individual feelings of the officers involved.

Though some patrolmen tried to be kind and others were openly brutal, police violence in the 1880s was not a case of a few bad apples — and neither is it today.

Much has changed since the creation of the police — most importantly the influx of black people into the Northern cities, the mid-twentieth century black movement, and the creation of the current system of mass incarceration in part as a response to that movement. But these changes did not lead to a fundamental shift in policing. They led to new policies designed to preserve fundamental continuities. The police were created to use violence to reconcile electoral democracy with industrial capitalism. Today, they are just one part of the “criminal justice” system which continues to play the same role. Their basic job is to enforce order among those with the most reason to resent the system — who in our society today are disproportionately poor black people.

A democratic police system is imaginable — one in which police are elected by and accountable to the people they patrol. But that is not what we have. And it’s not what the current system of policing was created to be.

If there is one positive lesson from the history of policing’s origins, it is that when workers organized, refused to submit or cooperate and caused problems for the city governments, they could back the police off from the most galling of their activities.

Murdering individual police officers, as happened in Chicago on May 3, 1886 and more recently in New York on December 20, 2014, only reinforced those calling for harsh repression — a reaction we are beginning to see already. But resistance on a mass scale could force the police to hesitate. This happened in Chicago during the early 1880s, when the police pulled back from breaking strikes, hired immigrant officers, and tried to re-establish some credibility among the working class after their role in brutally crushing the 1877 upheaval.

The police might be backed off again if the reaction against the killings of Eric Garner, Michael Brown, Tamir Rice and countless others continues. If they are, it will be a victory for those mobilizing today, and will save lives — though as long as this system that requires police violence to control a big share of its population survives, any change in police policy will be aimed at keeping the poor in line more effectively.

We shouldn’t expect the police to be something they’re not. We ought to know that origins matter, and the police were created by the ruling class to control working class and poor people, not help them. They’ve continued to play that role ever since.


How it became a crime to be poor in America

In the United States, a system of modern peonage – essentially, a government-run loan shark operation – has been going on for years. Beginning in the 1990s, the country adopted a set of criminal justice strategies that punish poor people for their poverty. Right now in America, 10 million people, representing two-thirds of all current and former offenders in the country, owe governments a total of $50bn in accumulated fines, fees and other impositions.

The problem of “high fines and misdemeanors” exists across many parts of the country: throughout much of the South, in states ranging from Washington to Oklahoma to Colorado, and of course in Ferguson, Missouri, where, in the wake of the killing of Michael Brown, revelations about the systematic criminalization of the city’s poor black residents brought these issues to national attention.

As a result, poor people lose their liberty and often lose their jobs, are frequently barred from a host of public benefits, may lose custody of their children, and may even lose their right to vote. Immigrants, even some with green cards, can be subject to deportation. Once incarcerated, impoverished inmates with no access to paid work are often charged for their room and board. Many debtors will carry debts to their deaths, hounded by bill collectors and new prosecutions.

Mass incarceration, which has disproportionately victimized people of color from its beginning in the 1970s, set the scene for this criminalization of poverty. But to understand America’s new impulse to make being poor a crime, one has to follow the trail of tax cuts that began in the Reagan era, which created revenue gaps all over the country.

The anti-tax lobby told voters they would get something for nothing: the state or municipality would tighten its belt a little, it would collect big money from low-level offenders, and everything would be fine.

Deep budget cuts ensued, and the onus of paying for our justice system – from courts to law enforcement agencies and even other arms of government – began to shift to the “users” of the courts, including those least equipped to pay.

Exorbitant fines and fees designed to make up for revenue shortfalls are now a staple throughout most of the country. Meanwhile, white-collar criminals get slaps on the wrist for financial crimes that ruin millions of lives. Though wealthy scofflaws owe a cumulative $450bn in back taxes, fines and fees from the justice system hit lower-income people – especially people of color – the hardest.

“Broken windows” law enforcement policy – the idea that mass arrests for minor offenses promote community order – aided and abetted this new criminalization of poverty, making the police complicit in the victimization of the poor. Community policing turned into community fleecing. Enforcing “quality of life” rules was touted as a way to achieve civic tranquility and prevent more serious crime. What it actually did was fill jails with poor people, especially because those arrested could not pay for bail.

Budget cuts and the new criminalization have inflicted other cruelties as well. Under “chronic nuisance” ordinances created by underfunded police departments, women in some poor communities can be evicted for calling 911 too often to seek protection from domestic abuse.

Public school children, particularly in poor communities of color, are arrested and sent to juvenile and even adult courts for behavior that not long ago was handled with a reprimand. The use of law enforcement both to criminalize homelessness and to drive the homeless entirely out of cities is increasing, as municipalities enact ever more punitive measures due to shortages of funds for housing and other services.

In addition, low-income people are deterred from seeking public benefits by threats of sanctions for made-up allegations of benefits fraud. As elected officials have moved to the right, laws designed to keep people from seeking assistance have grown more common. Budget cuts have also led to the further deterioration of mental health and addiction treatment services, making the police the first responders and jails and prisons the de facto mental hospitals, again with a special impact on minorities and low-income people.

Racism is America’s original sin, and it is present in all of these areas of criminalization, whether through out-and-out discrimination, structural and institutional racism, or implicit bias. Joined together, poverty and racism have created a toxic mixture that mocks our democratic rhetoric of equal opportunity and equal protection under the law.

A movement to fight back is showing signs of developing. Organizers and some public officials are attacking mass incarceration, lawyers are challenging the constitutionality of debtors’ prisons and money bail, judicial leaders are calling for fair fines and fees, policy advocates are seeking repeal of destructive laws, more judges and local officials are applying the law justly, and journalists are covering all of it.

The Obama administration’s Department of Justice stepped into the fray on a number of fronts. Ferguson was a spark that turned isolated instances of activism into a national conversation and produced numerous examples of partnerships between advocates and decision-makers.

Now we must turn all of that into a movement. The ultimate goal, of course, is the end of poverty itself. But as we pursue that goal, we must get rid of the laws and practices that unjustly incarcerate and otherwise damage the lives of millions who can’t fight back. We must fight mass incarceration and criminalization of poverty in every place where they exist, and fight poverty, too.

We must organize – in neighborhoods and communities, in cities and states, and nationally. And we must empower people to advocate for themselves as the most fundamental tool for change.


How Grave Robbers And Medical Students Helped Dehumanize 19th Century Blacks And The Poor

The history of autopsy and dissection of human bodies in the United States may seem like an innocuous topic, a necessary means to study life and its inevitable end. But in the 19th century, the vast majority of people who were dissected and autopsied were socially and economically marginalized groups. Bioarchaeologists have recently turned their attention to the skeletal remains of people who were dissected or autopsied, revealing evidence that socioeconomic inequality experienced during life extended into death.

Postmortem Trauma Or Violence?

Discussions of postmortem trauma to human remains are made within the boundaries of a concept called structural violence. This sociological idea explains, for example, the relationship between poverty and chronic infectious diseases like tuberculosis in modern groups as well as the relationship between skeletal markers of health stress and social status in ancient populations. Sometimes, structural inequalities not only persist within a society but are codified into law, so the elite stay that way and the non-elite become more down-trodden. Within 19th century America, the adoption of new anatomy laws essentially made the lives and deaths of the poor and socially marginalized groups worse.

Bioarchaeologist Ken Nystrom of SUNY New Paltz has been researching structural violence for years and became interested in the history of dissection in the U.S. In a recently published article, he says that he “would characterize the development and passing of anatomy laws that made it legal to dissect unclaimed bodies from almshouses and the resulting psychological stress associated with the fear of dissection as manifestations of structural violence.”

Criminals And Resurrectionists

But what’s the harm in dissection? Nystrom explains that “in the early 16th and 17th centuries in both Britain and its North American colonies, dissection was meted out as postmortem punishment for executed criminals.” In Britain, King Henry VIII offered executed criminals to surgeons in the 1540s, and the colony of Massachusetts was the first to draft statutes that allowed dissection of criminals in 1641. A law written in 1789 in New York state specifically said that a judge could add dissection as a penalty for sentencing in particular crimes.

The 19th century saw a boom in medical education, with the number of American med schools increasing from 4 to 160. This meant the legal supply of cadavers was suddenly insufficient and created demand for “resurrected” bodies in both Britain (think: Burke and Hare) and the U.S. “The public anxiety and fear generated by grave robbing boiled over on a number of occasions,” Nystrom says, and led to a series of “anatomy riots”–at least 20 between 1785 and 1855.

A group of free blacks in 1787 petitioned the city of New York to stop people from robbing bodies from their graves in the Negro Burying Ground but were unsuccessful. When the body of a white woman was discovered stolen a year later, New Yorkers took action and rioted. “The City Hospital was ransacked,” Nystrom notes, “medical students took refuge in the city jail, and six people were killed in confrontations.”

New York, Connecticut, Massachusetts, Maine, and Ohio all passed anti-grave robbing laws in the first half of the 19th century. They didn’t work. Even worse, the laws were particularly ineffective “for those groups generally targeted for such activity: African Americans, Native Americans, immigrants, and the poor,” Nystrom explains. In an attempt to distance themselves from grave-robbing, medical professionals instead became interested in the growing social issue of poverty.

Laminectomy on a thoracic vertebra from a skeleton buried in the Erie County Poorhouse cemetery. (Photo used with kind permission of Ken Nystrom.)

Dissection Of The American Poor

By the 19th century, these anatomical riders to legal punishments were extended to the poor and indigent, who formed a new group of people whose bodies could be exploited. Early in this century, widespread industrialization and urbanization contributed to an unprecedented rise in unemployment and cycles of poverty throughout the U.S. Many populous states began to treat poverty as a social problem, splitting people into “able-bodied” and “impotent” poor. While the latter were considered worthy of pity and help, the former were considered lazy. Almshouses were set up to support the “worthy” poor. At first, these institutions seemed to work, but by the mid-1800s, Nystrom says, “living conditions were deplorable” and “exposed the poor to conditions that directly and negatively affected [their] health.”

Enter Jeremy Bentham. The famed 19th century philosopher saw a link between changing laws regarding the poor and the study of anatomy. He argued that poorhouses should be punitive, designed to dissuade people from seeking assistance, and that the only value in a dead body is in its use to the living. Laws passed at this time to stop grave robbing and advance medical science in essence reframed the idea of dissection as a deterrent against indigence.

The British Warburton Anatomy Act of 1832 allowed the unclaimed bodies of people who died in taxpayer-funded institutions–almshouses, mental institutions, jails, charity clinics–to be given to anatomical schools for dissection, Carlina de la Cova, a bioarchaeologist at the University of South Carolina who also studies dissection, told me. “This British law made its way across the pond and became the standard anatomical legislation in the U.S. for over 60 years.” Dissection of the poor and marginalized was quickly popularized.

Several factors specific to this historical time and place converged in the practice of dissection of the poor, according to Nystrom. “The changing societal perception of poverty,” he says, “coupled with the demand for cadavers generated by the emergence and expansion of the medical profession in the United States, created a context in which the poor became vulnerable to the postmortem violence of dissection.”

View of the side of Onondaga County Poorhouse (1827-1978). This building is typical of 19th century poorhouses in NY state. (Image in public domain, via Wikimedia Commons.)

Almshouse Cemetery Research

Nystrom has been working with two cemeteries from almshouses in order to get a clearer picture of dissection practices in the 19th century. The Albany County (NY) Almshouse was associated with the Albany Medical Center from 1826 to 1926. Almshouse inmates joined in death individuals from local hospitals and prisons, as well as unclaimed bodies from the city. Nystrom learned that 51 individuals’ skeletons – two-thirds of whom were male – revealed cut marks suggestive of dissection, surgical practice, or experimentation.

The Erie County (NY) Poorhouse opened in 1851 and was associated with the Buffalo Medical Center. This cemetery included nearly 100 empty coffins or wood logs in place of bodies. Perhaps families eventually claimed the bodies, but “it is also possible that these bodies were disinterred for dissection,” Nystrom says. Similar to the findings in Albany, the Erie County Poorhouse cemetery has more dissected males than females, but this cemetery included a dissected juvenile skeleton as well. At both cemeteries, the number of individuals who had been dissected is about 5-6%.

Nystrom is quick to note that dissection, autopsy, and amputation are not always easy to distinguish from one another in archaeologically recovered human skeletons. Most of the evidence he found was in the form of craniotomies, or the removal of a portion of the skull, which is clear from cut marks. Some indications of postmortem trauma take the form of cutting long bones or vertebral elements. Long bone cuts may represent failed amputations, while craniotomies could be done in service of an autopsy rather than dissection. The difference is subtle but important.

“Dissection was widely regarded as a violation of the body and was generally punitive in nature,” Nystrom says, because it “transformed the body into an object.” De la Cova adds that “to dismember the body conflicted with 19th century religious beliefs about resurrection in the afterlife. If one was dissected, this was tantamount to eternal purgatory, a fate no one, rich or poor, wished to succumb to.” Autopsies, on the other hand, “marked an individual as important enough to warrant an investigation of their death,” Nystrom explains. Dissection can therefore be evidence of structural violence, while autopsy generally would not be.

For skeletons with only a few postmortem cut marks, autopsy is a viable explanation. Ten people from the Erie County Poorhouse who had evidence only of craniotomies, for example, may very well have been autopsied rather than dissected. “This is not to say,” Nystrom reiterates, “that these individuals did not suffer from structural violence in life, just that in these particular instances we cannot unequivocally extend this violence into death.”

Bioarchaeologist Debra Martin of the University of Nevada Las Vegas, whose research deals with violence in the past, suggests that Nystrom’s work “gives insight into the ways that institutionalized violence and implementation of laws manifest in poor health and early death for vulnerable people” and that it highlights the “culturally-sanctioned ways that the uncared-for and unprotected dead are also subject to mistreatment.”

Additional evidence of craniotomy from the Erie County Poorhouse cemetery. (Photo used with kind permission of Ken Nystrom.)

Black Bodies Stolen And Dissected

The history of medicine in the U.S. was built on the bodies of the poor and the marginalized, particularly in the 18th and 19th centuries. But this social inequality in death extended into the early 20th century as well. “The historical data that is available on dissection in the late 19th and early 20th century America clearly indicates a bias toward African Americans,” de la Cova notes, “with more black bodies being anatomized than white bodies.”

De la Cova works with three American anatomical collections composed of people whose bodies were not claimed at death. All three collections, which are from Cleveland, St. Louis, and Washington D.C., include “poor Euro-Americans, immigrants, and African Americans who lived on the margins of society” and who were dissected, she says. The Euro-Americans were largely from hospitals for the indigent, although many Euro-American women were institutionalized for decades and were not claimed by family members after they died.

A large segment of the African Americans whom de la Cova studied were part of the Great Migration, black Southerners who moved far away from their homes and families to escape racial discrimination in the early 20th century. De la Cova has found that “African Americans who lived during this time period had significantly higher rates of skeletal tuberculosis than Euro-Americans. Not only does structural violence appear in the form of dissection,” she points out, “but it also is seen in regard to the actual skeletal health of these individuals” as blacks were marginalized into “poor neighborhoods with substandard, overpriced, and crowded tenement housing” where infectious disease could spread rapidly.

Archaeologist James Davidson of the University of Florida studied a black cemetery from the turn of the 20th century in Dallas, Texas, and describes secret agreements between medical schools and local government officials for procuring dead bodies.

Davidson writes that African Americans were a particularly marginalized group in the early 20th century South, and “their bodies were used in disproportionate numbers to serve as cadavers for medical instruction.” At the time, medical science had created 'evidence' that African Americans were biologically inferior to Euro-Americans to justify social systems like slavery and segregation. “In the field of medicine,” Davidson says, “black bodies, despite these claims of inferiority, were often used as instructional materials to aid white doctors training to treat white patients.”

Gate to the Freedman's Cemetery in Dallas. (Image via Flickr user QuesterMark, used under a CC BY-SA 2.0 license.)

The Freedman’s Cemetery in Dallas, which was excavated in the 1990s, contained the remains of over 1,000 people, nearly all African American. The cemetery grew out of the “Beautification of Death Movement,” a religiously influenced idea that became a broader social phenomenon and encouraged larger, more expensive funerals. There was a “palpable fear in the black community of dying without the necessary funds to provide for a ‘proper funeral,’ and instead being reduced to a pauper or indigent’s grave—losing one’s identity and perhaps falling prey to the Resurrection Men,” Davidson writes.

The coffins and other accoutrements of the burials at Freedman’s Cemetery reveal that African Americans in Dallas at the turn of the 20th century invested heavily in “proper funerals.” But the archaeology also reveals “unmistakable evidence of the illegal use of African American bodies as cadavers in medical dissections, robbing these persons of their identity as individuals, and of even the charade of ‘a decent and proper burial’,” Davidson concludes.

A double burial at the Freedman's Cemetery, that of two adult men in their 40s, includes clear evidence of dissection in the form of cut marks on the skulls and other bones. Unlike the evidence from the Erie County Poorhouse or Albany County Almshouse that Nystrom examined, the placement of the two men's bodies in the coffin was unusual, including “placing one man’s body on his back in the superior position, and then positioning the other’s body prone on his stomach, where his head (if present) would have been face down in the groin of the other dead man,” Davidson found. The missing head’s whereabouts are unknown, but Davidson suspects the skull went into a study collection or onto a physician’s desk.

But similar to the finds at the Erie County Poorhouse cemetery, the Freedman's Cemetery had 16 graves that contained no skeletal remains but whose coffin lids had been seriously damaged. Davidson notes that the bodies may have been dug up for the legitimate purpose of relocating burials, but grave robbing, which would have happened soon after death, is an equally plausible explanation.

Freedman's Cemetery closed in 1907, so some of the exhumations may represent the legal transfer of remains to a new cemetery. But considering there was a dramatic increase in medical student enrollment in Dallas that same year, Davidson thinks that many of those empty boxes are the only tangible remains of 20th-century grave robbing. The remains themselves, procured through illegal activity, would not have been reburied where they were taken from; they were likely dumped unceremoniously elsewhere in Dallas or cremated in the city's trash incinerator, much as dead animals were.

"An Interrupted Dissection." Wood engraving by William A. Rogers for Harper's Magazine 1882. . [+] Protesters of exploratory dissection trying to force their way into room where dissection is being performed. (Image via Library of Congress.)

American Medicine And Structural Violence

While the medical profession has moved beyond antiquated but long-held ideas about race-based differences in human bodies, the legacy of structural violence is not gone from American culture. Beyond the practice of dissection, violence done historically to black bodies in service of medicine ranges from the Tuskegee syphilis experiment to the HeLa cell line, taken from a black woman and commoditized. A history of distrust of the medical establishment “has profound implications for African American health,” Davidson says, adding that today “African Americans are 60% less willing to consider whole body donation than whites of similar backgrounds.”

Nystrom also notes that, historically, we can see inequality “embodied as health disparities in the living but also ‘disembodiment’ and the treatment and fate of the dead body.” His work does not directly address postmortem examination and structural violence in African American skeletal collections, but he writes that “given the deep history of direct and indirect violence perpetrated against people of African descent, this would be a useful perspective in which to interpret such evidence.” De la Cova agrees that “working with human skeletons allows us to see and visualize the impact that structural violence has on people. The flesh may not remain, but cultural activities and disease leave their marks on the bones and give us a window into the biological impact of discrimination.”
