
When Whiskey Was the Backbone of the US Economy


Whiskey—a liquor whose origins in medieval Scotland or Ireland remain murky—once was an uncommon, exotic drink in the 13 colonies, where rum, gin and brandy were the strong drinks of choice. But during and immediately after the Revolutionary War, that all changed. Whiskey became a popular—and profitable—drink, and more than that, a crucial commodity in the economy of the new United States of America.

Whiskey’s rise as an American liquor was due in large part to the fact that it didn’t have to be imported. Unlike rum, which was made from sugarcane and molasses shipped from British-controlled islands in the Caribbean to distilleries in New England, whiskey could be distilled in this country from domestically sourced raw ingredients. Corn, in particular, was plentiful in America.

“The British cut off molasses and rum from the West Indies during the Revolution,” explains William Rorabaugh, author of the 1979 book The Alcoholic Republic: An American Tradition, which explores the nation’s drinking habits prior to Prohibition. “Domestic whiskey originally was a wartime substitute.”

But even after the hostilities ended, the rum trade wasn’t reestablished to the degree it had existed before the war, Rorabaugh says. Scotch-Irish immigrants, who were used to drinking whiskey in their home countries, took to American whiskey readily. An even bigger factor in whiskey’s rise, he says, was that as Americans migrated westward, they gained access to more land on which to grow more corn.

“Surplus corn would rot, unless it was preserved by turning it into whiskey,” Rorabaugh says. A whiskey boom was born and soon farmers were shipping crates of it East for cash.

In Europe, liquor was commonly made from potatoes, wheat, rye or barley, so American corn-made whiskey was unique, although it had to be made in small batches. “Corn scorched in large stills but worked fine in small ones,” Rorabaugh says. “Most American distillers found that mixing a bit of rye or wheat (both more expensive than corn) made distilling easier. The new small stills also became more efficient, and the bite of raw whiskey was solved by aging whiskey in barrels.”

New Tax Spurs Whiskey Rebellion

Whiskey generated so much income that when the new nation struggled under the weight of Revolutionary War debt, Treasury Secretary Alexander Hamilton proposed a tax on domestic liquor as a means of paying it off. Congress passed the legislation, but as Loyola University-trained historian Peter Kotowski explains, the tax soon met strident opposition.

To small farmers and distillers on the frontier in western Pennsylvania, whiskey was a means of financial survival, and they weren’t about to share their hard-earned money with the federal government. They refused to pay, and began tarring and feathering tax collectors and seizing their records at gunpoint in what became known as the Whiskey Rebellion.

President Washington—who himself later made whiskey in a distillery at Mount Vernon after he left office—initially tried to quell the uprising with a 1792 proclamation that admonished the farmers to comply. But two years later, after the malcontents set fire to the Pittsburgh home of a tax official, Washington didn’t have much choice but to respond with force.

He organized a federal militia of nearly 13,000 men and marched to western Pennsylvania, where 150 of the whiskey rebels were arrested. Only two were tried and convicted of treason. Washington, who’d made his point about the federal government’s authority, eventually pardoned them.

Whiskey Drunk as Water, Medicine

Despite the brouhaha, whiskey itself remained popular, especially on the frontier. As Kate Hopkins details in 99 Drams of Whiskey: The Accidental Hedonist's Quest for the Perfect Shot, at a time when the nation lacked clean drinking water, whiskey provided a sanitary substitute.

“Whiskey was drunk on a daily basis, as a means to start the day or complete a deal, or even as prescribed medicine,” she writes. In the West, it was even used as an alternative currency in places where actual dollars and cents were difficult to come by.

In 1802, President Thomas Jefferson eliminated the unpopular tax on whiskey, which gave the whiskey industry a renewed boost. Whiskey became an integral part of daily American life, with many drinking it at meals as well as starting and ending each day with a swallow or two. James Madison was said to have drunk a pint of whiskey daily. “By the 1820s, whiskey sold for twenty-five cents a gallon, making it cheaper than beer, wine, coffee, tea, or milk,” Rorabaugh writes.

Whiskey Profits Dip With Rise of Industrial Revolution

Even so, whiskey’s economic importance eventually subsided, as the nation evolved. “After 1830 the U.S. economy became more industrial in the modern sense with mass production of textiles, shoes, books and other goods, as well as a transportation revolution that started with canals and steamboats and quickly turned to railroads,” Rorabaugh says. “These new giant industries were highly capitalized in a way that whiskey was not.”

Another factor that made whiskey less lucrative was the rise of the American temperance movement, which was strongly aligned with the nascent crusade for women’s rights. To early feminists, alcohol abuse by men, who sometimes drank up their pay at the tavern and left their families struggling, had the effect of oppressing women.

In an increasingly industrialized, rapidly growing America, whiskey also got in the way of factory profits. Factory work required Americans to stay awake and alert for long periods during the Industrial Revolution. That’s when another strong drink became popular across the country—coffee.


In June 1794, innkeeper John Lynn agreed to sublet part of his rented house in western Pennsylvania to John Neville. Neville was an excise inspector whose job it was to make sure that the federal tax on whiskey was collected from the backwoods frontiersmen. When the news circulated that Lynn was sheltering a tax collector in his home, however, a dozen armed men went to the inn. The men kidnapped Lynn, carried him into the woods, stripped him naked, shaved off his hair, and coated him with hot tar and feathers. After extracting a promise from Lynn not to allow his house to be used as a tax office and not to reveal their identities to the authorities, the men tied him to a tree and left him overnight in the middle of the forest. Although Lynn kept his promise, the notoriety he gained from his association with the tax ruined his business. Events like Lynn's kidnapping threatened federal tax collectors all across western Pennsylvania during the summer of 1794. They marked the beginning of what became known as the Whiskey Rebellion — the largest and most serious challenge to federal authority yet faced by the new United States.

The Whiskey Rebellion had its roots in the period around the American Revolution (1775 – 1783). Before the war hundreds of families crossed the Appalachian Mountains, searching for better, cheaper land. They were accompanied by an equal number of land speculators, who were working for rich colonial interests. The speculators laid claim to hundreds of thousands of acres of the best farm land in the name of men who already owned thousands of acres in Virginia, Pennsylvania, New York, and elsewhere. George Washington (1789 – 1797), who had trained as a surveyor, was one of the largest buyers of land. He owned more than 63,000 acres in western Pennsylvania by the time he became president. Absentee landowners like Washington claimed most of the best land, and the poor farmers were forced to survive on the remnants.

Separated from the older colonies by the mountains, the frontiersmen were forced to rely on themselves for protection and assistance. They were threatened by hostile Native Americans and hampered by lack of money, but they were especially frustrated by transportation problems. All the major colonial markets for their grain were on the other side of the Appalachians, and the costs of transporting their produce across the mountains were very high. The Spanish, who controlled the mouth of the Mississippi River, blocked an alternate route down the Ohio and Mississippi rivers. In order to turn a profit on their excess grain, the frontiersmen built private stills and converted it into whiskey. Whiskey was easier to sell than raw grain, and it held its value better.

When the U.S. Constitution was ratified in 1788 the new federal government agreed to assume the outstanding war debts of the former colonies. In order to pay these debts, President Washington's Secretary of the Treasury Alexander Hamilton (1755 – 1804) pushed a tax on whiskey and other alcoholic beverages through Congress in 1791. Many congressional delegates from the West were opposed to the whiskey tax. For his part, Hamilton believed that such a tax was the fairest way of spreading the costs of the American Revolution and the maintenance of the federal government across the population.

What Hamilton failed to consider was how strongly the settlers across the Appalachian Mountains felt about paying the tax. The western frontiersmen believed that they were maintaining their rights against the distant federal government in the same way their predecessors had done against the British government during the 1760s and 1770s. They felt betrayed by John Jay's (1745 – 1829) negotiations with the Spanish from 1785 to 1786 that kept them from shipping their grain down the Mississippi. Further, after two major defeats of federal troops by Miami and Shawnee tribesmen, the frontiersmen believed that the federal government was even unable to protect them.

During September 1791, representatives of the four westernmost Pennsylvania counties — Washington, Fayette, Allegheny, and Westmoreland — assembled at Pittsburgh, Pennsylvania, to discuss how to persuade Congress to repeal the whiskey tax. Although Hamilton would later portray them as radical anti-federalists, they held moderate views about the national government. Other westerners were not so tolerant. By the summer of 1794 what little patience they had was exhausted. Early in the morning of July 16, 1794, some 50 men armed with rifles approached the house where John Neville was staying. They demanded that Neville resign his position as excise inspector and turn over to them all the information he had collected on distilling in the area. Neville and the armed men exchanged shots; five of the besiegers were wounded, one of them fatally. The next day a mob of hundreds of local residents surrounded Neville's property. Neville, who had been reinforced by several soldiers from Fort Pitt, escaped without injury, but several soldiers were wounded and died, as did three or four of the attackers. The mob burned Neville's home and property to the ground.

The attack on John Neville marked the beginning of the Whiskey Rebellion. Throughout August and September threats of violence against tax collectors and inspectors spread out of the western districts of Pennsylvania and into Maryland, Virginia, Ohio, and Kentucky. In most cases, the rioters got their way through intimidation, and little blood was shed. The largest assembly came outside Pittsburgh on August 1, 1794, where about 7,000 frontiersmen gathered — mostly poor people who did not own property or even a still and were not directly affected by the tax. "Not surprisingly, then," wrote historian Thomas Slaughter in The Whiskey Rebellion: Frontier Epilogue to the American Revolution (1986), "their grievances were primarily economic in character; their victims were primarily members of wealthier commercial classes, and the property they envied was often the object of violence." However, the townspeople managed to defuse much of the threat by welcoming the frontiersmen into their houses and making whiskey freely available. They also convinced them not to burn property in the town and allowed them to expel some of the most obnoxious townsmen. The presence of the soldiers at nearby Fort Fayette also helped keep the rioters in check. Within a few weeks the whiskey rebels had dispersed and returned to their homes.

At the same time the whiskey rebels near Pittsburgh were beginning to disperse, the federal government was preparing to take action. President Washington called a meeting of his Cabinet to consider what action to take regarding the rebels. He found himself in agreement with Treasury Secretary Hamilton that the rebellion was a serious threat to the Constitution and the federal government. A proclamation was issued instructing the rebels to disperse by September 1. By that date, however, Hamilton had already begun to assemble a 12,950-man army that he believed would crush the rebellion and teach his political opponents a lesson. Although cooler heads had already prevailed among the leaders of the westerners, Hamilton's army marched at the end of September.

The Whiskey Rebellion trickled to a halt without much bloodshed. There were only two fatalities in western Pennsylvania, both of them accidental — one boy was shot by a soldier whose gun went off accidentally, and a drunken rebel supporter was stabbed with a bayonet while resisting arrest. By November 19 the federal army had managed to round up only 20 accused "leaders" of the Whiskey Rebellion. Eighteen of the accused were later acquitted in the courts; the other two were convicted of treason but were later given a presidential pardon.

The Whiskey Rebellion ended not because of the threat posed by Hamilton's army, but because many of the concerns of the frontiersmen were finally addressed. On August 20, 1794, an American army under General "Mad" Anthony Wayne decisively defeated a confederation of Native Americans at the battle of Fallen Timbers, outside modern Toledo, Ohio. The Treaty of Greenville (1795) that Wayne negotiated with the Native Americans opened the Ohio country to settlement. The Jay Treaty (1794) with Great Britain, and the Pinckney Treaty (1795) with Spain moved foreign troops away from western American borders and opened the Mississippi River to American shipping. Perhaps the most significant factor, however, was the fact that a political party with sympathies toward the frontier position, the Jeffersonian Republicans, came into power in the election of 1800. One of the first actions of President Thomas Jefferson's (1801 – 1809) administration was to strike down the Whiskey Tax and other internal taxes.

See also: Appalachian Mountains, Jay Treaty, Pinckney Treaty

Defining the U.S. Economy

The term market economy describes an economy in which the forces of supply and demand dictate the way in which goods and resources are allocated and what prices will be set. The opposite of a market economy is a planned economy, in which the government determines what will be produced and what prices will be charged. In a market economy, producers anticipate what products the market will be interested in and at what price, and they make decisions about what products they will bring to market and how these products will be produced and priced. Market economies foster competition among businesses, which typically leads to lower prices and is generally considered beneficial for both workers and consumers. By contrast, a planned economy is directed by a central government that has a far greater degree of influence over prices and production, as well as a tighter regulation of industries and manufacturing procedures. The United States has a mixed economy, which combines aspects of a market economy with some central planning and control to create a system with a high degree of market freedom along with regulatory agencies and social programs that promote the public welfare.

This mixed economy did not develop overnight. It has evolved over more than two centuries and has been shaped by American experiences at various times with hardship, war, peace, and prosperity.

Regulation Nation

The history of whiskey in America is a contentious one. Remember the Whiskey Rebellion of 1794, when small whiskey producers rioted against an excise tax on whiskey? You don’t remember that? OK, well, it happened. And then there’s Prohibition, which drove whiskey production underground from 1920 to 1933.

In the early 1800s there were an estimated 14,000 distilleries in the U.S. That number plummeted during Prohibition, and the businesses that bounced back after the 1933 repeal weren’t the small producers.

From the 1791 excise tax on whiskey to the distribution rules passed by Congress after Prohibition, the government has tended to make it easier to be a large producer of whiskey than a small-batch producer. That hasn’t dampened the American tradition of small distilleries, though. By some estimates, nearly 900 small producers are licensed to make whiskey in the States today.

It’s not just federal regulation that can tie small producers in knots. States impose their own rules, particularly those states that have monopolies on the sale of liquor. There are also sin taxes on the sales of alcohol in some places.

Still, that tangle of regulation didn’t stop bourbon production from jumping from 13,137,000 9-liter cases in 2002 to 19,357,000 9-liter cases in 2014. In 2000, there were 24 craft distilleries in the country. As of 2014, there were more than 430.
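The growth implied by those figures is easy to check with a quick back-of-the-envelope calculation (a sketch using only the case counts and distillery counts quoted above):

```python
# Figures quoted in the text above.
cases_2002 = 13_137_000        # bourbon, 9-liter cases, 2002
cases_2014 = 19_357_000        # bourbon, 9-liter cases, 2014
distilleries_2000 = 24         # craft distilleries, 2000
distilleries_2014 = 430        # craft distilleries, 2014

# Relative growth in bourbon shipments over the 12-year span.
case_growth = (cases_2014 - cases_2002) / cases_2002
print(f"Bourbon case growth, 2002-2014: {case_growth:.0%}")

# How many times over the craft-distillery count multiplied.
multiple = distilleries_2014 / distilleries_2000
print(f"Craft distilleries multiplied roughly {multiple:.0f}x")
```

That works out to roughly a 47 percent jump in bourbon volume and an eighteenfold increase in craft distilleries, despite the regulatory tangle.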

Slavery and the History of US Economic Growth

Slavery was both a set of economic arrangements and a raw, authoritarian human rights violation.

It's unsurprising that there has been long-standing controversy over the relationship: for example, did slavery in the United States boost economic growth or hold it back? Gavin Wright revisits these issues in "Slavery and Anglo‐American capitalism revisited" (Economic History Review, May 2020, 73:2, pp. 353-383, subscription required). The paper was also the subject of the Tawney lecture at the Economic History Society meetings in 2019, and the one-hour lecture is freely available online.

Wright frames his discussion around the "Williams thesis," based on Eric Williams's 1944 book Capitalism and Slavery, which focused on the United Kingdom. Williams argued that while slavery played an important role in British capitalism in the 18th century--in particular, the brutalities of slave labor were central to the production of sugar and thus to Britain's international trade--by early in the 19th century the British economy and exports had evolved toward the manufacture of industrial products, in such a way that slave labor was no longer vital. Wright argues that as the US economy of the 19th century evolved, slavery tended to hold back US economic growth.

To set the stage, let's be clear that the economic activity of slavery was deeply entangled with capitalism. Wright offers an example that will resonate with those of us working in higher education:

The prominence of slave-based commerce for the Atlantic economy provides the background for the arresting connections reported by C. S. Wilder in his book Ebony and ivy, associating early American universities with slavery. The first five colleges in British America were major beneficiaries of the African slave trade and slavery. ‘Harvard became the first in a long line of North American schools to target wealthy planters as a source of enrollments and income’. The reason for what might seem an incongruous liaison is not hard to identify: ‘The American college was an extension of merchant wealth’. A wealthy merchant in colonial America was perforce engaged with the slave trade or slave-based commerce.

However, as numerous writers have pointed out over time, the coexistence of slavery with British and American capitalism of the 17th century does not prove that slavery was necessary or sufficient for an emerging capitalism. Consider historical slavery across what we now call Latin America. At that time, Spain and Portugal (among others) were also active participants in the slave trade, yet their economies did not develop an industrial revolution like that of the UK. Countries all over Latin America were recipients of slaves, like the area that became the US, but those countries did not develop a US-style economy. Clearly, drawing a straight line from slavery to capitalism of the Anglo-American variety would be wildly simplistic.

Wright argues that slavery did seem essential to sugar plantations: "Sugar plantations required slave labour not because of any efficiency advantage associated with that organizational system, but because it was all but impossible to attract free labour to those locations and working conditions." But Wright argues that when it came to cotton (or tobacco or other crops), slavery did not have any particular advantage over free labor. Thus, US cotton plantations run by slave labor did not come into being because they had an economic advantage, but rather because slaveowners saw it as a way to benefit from owning slaves.

The Atlantic economy of the eighteenth century was propelled by sugar, a quintessential slave crop. In contrast, cotton required no large investments of fixed capital and could be cultivated efficiently at any scale, in locations that would have been settled by free farmers in the absence of slavery. Early mainland cotton growers deployed slave labour, not because of its productivity or aptness for the new crop, but because they were already slave owners, searching for profitable alternatives to tobacco, indigo, and other declining crops. Slavery was, in effect, a ‘pre-existing condition’ for the nineteenth-century American South.

It's true that a lot of pro-slavery writers in the 1850s boasted that cotton was essential to the US economy, as a way of arguing that their own role as slave-owners was also essential. But slave-holders also argued that wage labor was exploitative and that slavery represented true Christian morality and the Golden Rule. Rather than listening to the explanations of those trying to justify evil, it's more useful to look at what actually happened in history. If it was true that slave-produced cotton was essential to US economic growth, then the end of slavery should have wiped out US economic growth. But it didn't. Wright points to some research literature looking back at the US economy in the 1830s: "Cotton production accounted for about 5 per cent of GDP at that time. Cotton dominated US exports after 1820, but exports never exceeded 7 per cent of GDP during the antebellum period. The chief sources of US growth were domestic. . . . [The] cotton staple growth theory has been overwhelmingly rejected by economic historians as an explanation for US growth in the antebellum era."

Similarly, if it was true that slave plantations were the most efficient way of growing cotton, then the end of slavery should have caused the price of cotton to rise on world markets. But it didn't.

The best evidence that slavery was not essential for cotton supply is what happened after slavery’s demise. The wartime and postwar years of ‘cotton famine’ were times of great hardship for Lancashire, only partially mitigated by high-cost imports from India, Egypt, and Brazil. After the war, however, merchants and railroads flooded into the south-east, enticing previously isolated farm areas into the cotton economy. Production in plantation areas gradually recovered, but the biggest source of new cotton came from white farmers in the Piedmont. When the dust settled in the 1880s, India, Egypt, and slave-using Brazil had retreated from world markets, and the price of cotton in Lancashire was back to its antebellum level.

Again, slave labor on US cotton plantations was for the benefit of the slaveholders, not the US economy as a whole. Indeed, as the 19th century evolved, the US South consistently underperformed as a cotton supplier. Wright points out three reasons.

First, "[t]he region closed the African slave trade in 1807 and failed to recruit free labour, making labour supply inelastic." Why were slaveowners against having more slaves? As Wright points out: "After voting for secession in 1861 by 84 to 14, the Mississippi convention voted down a re-opening resolution by 66 to 13. The reason for this ostensible contradiction is not difficult to identify: to re-open the African trade was to threaten the wealth of thousands of slaveholders across the South." In short, bringing in more slaves would have reduced the price of existing slaves--so existing slaveowners were against it. In addition, immigrants to the US from, say, 1820 to 1880 overwhelmingly went to free states. Slave states in the southwest "displayed net white outmigration, even during cotton booms, at times when one might have expected a rush of immigration. One result was low population density and a level of cotton production well below potential."

Second, "[s]laveholders neglected infrastructure, so that large sections of the antebellum South were bypassed by the slave economy and left on the margins of commercial agriculture." The middle of the 19th century was a time when the US had a vast expansion of turnpikes, railroads, canals, and other infrastructure, often built by state-chartered corporations. However, almost all of this construction occurred in the northern states. Not only were the southern states uninterested, they actively blocked national-level efforts along these lines: "Over time, however, the slave South increasingly assumed the role of obstructer to a national pro-growth agenda. . . . [S]outhern presidents vetoed seven Rivers & Harbors bills between 1838 and 1860, frustrating the ambitions of entrepreneurs in the Great Lakes states."

Third, "the fixed-cost character of slavery meant that even large plantations aimed at self-sufficiency in foodstuffs, limiting the overall degree of market specialization." One main advantage of slavery in cotton production was that it guaranteed having sufficient labor available at the two key times of the year for cotton: planting and harvesting. But during the rest of the year, most cotton plantations grew other crops and raised livestock.

The shortcomings of the South as a cotton producer during this time were clear to some contemporary observers. Wright says: "Particularly notable are the views of Thomas Ellison, long-time chronicler and statistician of cotton markets, who observed in 1858: ‘That the Southern regions of the United States are capable of producing a much larger quantity of Cotton than has yet been raised is very evident; in fact, their resources are, practically speaking, almost without limit’. What was it that restrained this potential supply? Ellison had no doubt that the culprit was slavery."

In short, the slave plantations of the American South were a success for the slaveowners, but not for the US economy. From a broader social perspective, slavery was a policy that scared off new immigrants, ignored infrastructure, and blocked the education and incentives of much of the workforce. These policies are not conducive to growth. As Wright puts it: "Slavery was a source of regional impoverishment in nineteenth-century America, not a major contributor to national growth."



The story of Chinatown is the story of a neighborhood: an American neighborhood, an old neighborhood, an immigrant neighborhood, where the old country still lives inside the new one. The past and the present are inseparably woven together in this neighborhood defined by Broadway, California, Kearny and Powell streets.

In the mid-1840s, following China's defeat by Britain in the first Opium War, a series of natural catastrophes occurred across China, resulting in famine, peasant uprisings and rebellions. Understandably, when news of gold and opportunity in faraway Gum San (Golden Mountain, the Chinese name for America) reached China, many Chinese seized the opportunity to seek their fortune.

The Chinese were met with ambiguous feelings by Californians. In 1850, San Francisco Mayor John W. Geary invited the "China Boys" to a ceremony to acknowledge their work ethic. However, as the American economy weakened, the Chinese labor force became a threat to mainstream society. Racial discrimination and repressive legislation drove the Chinese from the gold mines to the sanctuary of the neighborhood that became known as Chinatown. The only ethnic group in the history of the United States to have been specifically denied entrance into the country, the Chinese were also prohibited by law from testifying in court, owning property, voting, having their families join them, marrying non-Chinese, and working in institutional agencies.

The success and survival of Chinatown depended a great deal on the family and district benevolent associations, which served as political and social support systems for newcomers. The members strove to meet the basic needs of the community, and represented a united voice in the fight against discriminatory legislation.

"CHINATOWN" offers a revealing look at how a group of people bound geographically, culturally, linguistically and economically during hostile times has flourished to become a vibrant, courageous and proud community for Chinese Americans and greater San Francisco, referred to as Dai Fao (Big City) in Chinese.



Depression followed the completion of the railroad. In 1869 twenty thousand Chinese were suddenly out of work. Many traditional means of wage earning were inaccessible to the Chinese.

Their farm laboring skills produced superior varieties of rice, oranges, apples, cherries and peaches. The Chinese filled the need for domestic services in white homes and developed laundry businesses. They became successfully involved in the restaurant business, fishing and shrimping industries, and leather goods manufacturing. As soon as their new businesses flourished, they were targeted as unwelcome competition to the struggling economy of San Francisco.

The Burlingame Treaty of 1868 encouraged the Chinese to emigrate to the United States in greater numbers. Reacting to America's fear of the "yellow peril," in 1877 Denis Kearney organized the Workingmen's Party with the rallying cry, "The Chinese Must Go!" which led to the looting and burning of many Chinese businesses.

More than thirty anti-Chinese laws were enacted during the 1870s at both the state and local levels. (See legislation section) The result of this codified racism was to exclude Chinese from many occupations and to deprive them of full participation in a society they had helped to build. This discriminatory legislation culminated in the Chinese Exclusion Act of May 6, 1882, which suspended the immigration of Chinese laborers for ten years.



The American flag was raised in Portsmouth Square on July 9, 1846. The small frontier town rapidly grew into a city after the discovery of gold. Portsmouth Square served as a cow pen, surrounded by tents and adobe huts, in 1848, and was ringed by brick and stone buildings, hotels, business offices, shops, gambling places and restaurants by the late 1850s. At that time hundreds of Chinese strategically chose to locate their laundries, restaurants and shops close to Portsmouth Square, the center of the city, to cater to mining-related needs. They were established on or within a block of the square, and gradually branched out to Dupont (present-day Grant) and Kearny Streets. The area, referred to as "Little Canton," had thirty-three retail stores, fifteen pharmacies/Chinese herbalists and five restaurants. In 1853 the neighborhood was given the name "Chinatown" by the press. The first Chinese hand laundry was started at the corner of Washington and Dupont Streets in 1851. By 1870 some 2,000 Chinese laundries were in the trade, growing to 7,500 in 1880. Merchants and peddlers provided fresh fruits, vegetables and flowers. As San Francisco became a recreation center, the Chinese seized opportunities to provide festive activities. In addition, an entire theater building was imported from China and erected in Chinatown to house the Chinese theatrical troupe.

Chinatown's twelve blocks of crowded wooden and brick houses, businesses, temples, family associations, rooming houses for the bachelor majority (in 1880 the ratio of men to women was 20 to 1), opium dens, and gambling halls were home to 22,000 people. The atmosphere of early Chinatown was bustling and noisy, with brightly colored lanterns, three-cornered yellow silk pennants denoting restaurants, calligraphy on signboards, flowing costumes, hair worn in queues and the sound of Cantonese dialects. In this familiar neighborhood the immigrants found the security and solidarity to survive the racial and economic oppression of greater San Francisco.



On April 18, 1906, San Francisco was devastated by a huge earthquake. As fires raged, Chinatown was leveled. It seemed that what the city and country had wanted for fifty years, nature had accomplished in forty-five seconds. Ironically, because the immigration records and vital statistics at City Hall had been destroyed, many Chinese were able to claim citizenship, then send for their children and families in China. Legally, all children of U.S. citizens were automatically citizens, regardless of their place of birth. Thus began the influx of "paper sons" and "paper daughters" - instant citizens - which helped balance the demographics of Chinatown's "bachelor society." Finally, Chinatown had what it had been missing for so long - children.

The city fathers had no intention of allowing Chinatown to be rebuilt in its own neighborhood, on valuable land next to the Financial District. While they were deciding where to relocate the Chinese, a wealthy businessman named Look Tin Eli developed a plan to rebuild Chinatown in its original location. He obtained a loan from Hong Kong and designed the new Chinatown to be more emphatically "Oriental" in order to draw tourists. The old Italianate buildings were replaced by Edwardian architecture embellished with theatrical chinoiserie. Chinatown, like the phoenix, rose from the ashes with a new facade - dreamed up by an American-born Chinese man, built by white architects, looking like a stage-set China that does not exist.



Angel Island, the immigration station on San Francisco Bay, opened in 1910 to enforce the Chinese Exclusion Act of 1882. There, two hundred fifty thousand Chinese immigrants were processed. The average detention was two weeks; the longest was twenty-two months. Conditions on Angel Island were harsh: families were isolated, separated, and interrogated. Detainees were questioned in great detail about who they were and why they were claiming the right to enter the United States. Those whose answers were unacceptable to the officers were denied admission. To prepare for the questions, immigrants often relied on coaching papers, which contained details on the background of individuals who could legally claim American citizenship. Typically such papers were purchased as part of a package of tickets and information about entering the United States.

Angel Island Station was closed in 1940 after a fire destroyed many of the buildings. The Exclusion Act was repealed in 1943, and in 1962 most of Angel Island was converted to a state park.



As with the Great Quake and fire of 1906, the catastrophic events of World War II and its aftermath benefited the Chinese in America. The Japanese attack on Pearl Harbor (December 7, 1941) became a vehicle of opportunity for Chinese Americans. China became an ally in the war against Japan, and public sentiment in favor of America's Chinese allies surged. For the first time, Chinese aliens entered the mainstream of American society. Chinese Americans wore the same uniform as other American soldiers and fought side by side with them under the American flag. Labor shortages on the homefront opened jobs previously closed to them.

The most important declaration came on December 17, 1943, halfway through the war, when President Roosevelt signed the repeal of the Chinese Exclusion Act, ending more than sixty years of legalized racism and discrimination. This did not guarantee instant acceptance by the dominant society. But after the repeal of the Exclusion Act and the enactment of the War Brides Act, acculturation and assimilation began to take place. The once-bachelor society began to shift toward a new American Chinese community filled with families and children. Finally, Chinese immigrants were legally allowed to become citizens and to own property.



Today's Chinatown is a unique neighborhood defined by its people, its institutions and its history - a history of welcome, rejection and acceptance. Chinese-style buildings and the narrow bustling streets give Chinatown its character. Beyond the gilded storefronts you will find tenements crowded with elderly people and new immigrants struggling with problems left by years of exclusion and discrimination - unemployment, health problems and substandard housing. Core Chinatown itself, limited by its capacity to grow, no longer serves as the major residential area for the Chinese of San Francisco. Many have moved out of crowded Chinatown to the Richmond and Sunset districts.

In 1977, the Chinatown Resource Center and the Chinese Community Housing Corporation launched a comprehensive improvement program striving to find solutions for land use changes. Since 1895 the Chinese American Citizens Alliance has fought against disenfranchisement of citizens of Chinese ancestry and sponsored a number of community projects.

Today, San Francisco's Chinatown has developed cultural autonomy which sustains many activities: dance, music groups, a children's orchestra, artists, a Chinese Culture Center, and the Chinese Historical Society of America. A result of the community's commitment to excellence in education is its involvement in the legal debates of affirmative action vs. school desegregation for Asian-American youth.

"Viewed within the context of the City of San Francisco, Chinatown is one of many culturally distinct neighborhoods that together make up the backbone of the City. Viewed within the context of America, Chinatown is an American working class community that has been a partner in building this nation with every other American working class community. Like all other American neighborhoods, Chinatown has been developed by the will and energies of immigrants."*

* Elaine Joe, "American Communities Built on Multiculturalism," Neighborhood Bulletin, A Newsletter of the Chinatown Resource Center and Chinese Community Housing Corporation vol.17, no.4 (Fall 1995).


The U.S. Economy:
A Brief History

The modern American economy traces its roots to the quest of European settlers for economic gain in the 16th, 17th, and 18th centuries. The New World then progressed from a marginally successful colonial economy to a small, independent farming economy and, eventually, to a highly complex industrial economy. During this evolution, the United States developed ever more complex institutions to match its growth. And while government involvement in the economy has been a consistent theme, the extent of that involvement generally has increased.
North America's first inhabitants were Native Americans -- indigenous peoples who are believed to have traveled to America some 20,000 years ago across a land bridge from Asia, where the Bering Strait is today. (They were mistakenly called "Indians" by European explorers, who thought they had reached India when first landing in the Americas.) These native peoples were organized in tribes and, in some cases, confederations of tribes. While they traded among themselves, they had little contact with peoples on other continents, even with other native peoples in South America, before European settlers began arriving. What economic systems they did develop were destroyed by the Europeans who settled their lands.
Vikings were the first Europeans to "discover" America. But the event, which occurred around the year 1000, went largely unnoticed; at the time, most of European society was still firmly based on agriculture and land ownership. Commerce had not yet assumed the importance that would provide an impetus to the further exploration and settlement of North America.
In 1492, Christopher Columbus, an Italian sailing under the Spanish flag, set out to find a western passage to Asia and discovered a "New World." For the next 100 years, English, Spanish, Portuguese, Dutch, and French explorers sailed from Europe for the New World, looking for gold, riches, honor, and glory.
But the North American wilderness offered early explorers little glory and less gold, so most did not stay. The people who eventually did settle North America arrived later. In 1607, a band of Englishmen built the first permanent settlement in what was to become the United States. The settlement, Jamestown, was located in the present-day state of Virginia.

Early settlers had a variety of reasons for seeking a new homeland. The Pilgrims of Massachusetts were pious, self-disciplined English people who wanted to escape religious persecution. Other colonies, such as Virginia, were founded principally as business ventures. Often, though, piety and profits went hand-in-hand.
England's success at colonizing what would become the United States was due in large part to its use of charter companies. Charter companies were groups of stockholders (usually merchants and wealthy landowners) who sought personal economic gain and, perhaps, wanted also to advance England's national goals. While the private sector financed the companies, the King provided each project with a charter or grant conferring economic rights as well as political and judicial authority. The colonies generally did not show quick profits, however, and the English investors often turned over their colonial charters to the settlers. The political implications, although not realized at the time, were enormous. The colonists were left to build their own lives, their own communities, and their own economy -- in effect, to start constructing the rudiments of a new nation.
What early colonial prosperity there was resulted from trapping and trading in furs. In addition, fishing was a primary source of wealth in Massachusetts. But throughout the colonies, people lived primarily on small farms and were self-sufficient. In the few small cities and among the larger plantations of North Carolina, South Carolina, and Virginia, some necessities and virtually all luxuries were imported in return for tobacco, rice, and indigo (blue dye) exports.
Supportive industries developed as the colonies grew. A variety of specialized sawmills and gristmills appeared. Colonists established shipyards to build fishing fleets and, in time, trading vessels. They also built small iron forges. By the 18th century, regional patterns of development had become clear: the New England colonies relied on shipbuilding and sailing to generate wealth; plantations (many using slave labor) in Maryland, Virginia, and the Carolinas grew tobacco, rice, and indigo; and the middle colonies of New York, Pennsylvania, New Jersey, and Delaware shipped general crops and furs. Except for slaves, standards of living were generally high -- higher, in fact, than in England itself. Because English investors had withdrawn, the field was open to entrepreneurs among the colonists.
By 1770, the North American colonies were ready, both economically and politically, to become part of the emerging self-government movement that had dominated English politics since the time of James I (1603-1625). Disputes developed with England over taxation and other matters; Americans hoped for a modification of English taxes and regulations that would satisfy their demand for more self-government. Few thought the mounting quarrel with the English government would lead to all-out war against the British and to independence for the colonies.
Like the English political turmoil of the 17th and 18th centuries, the American Revolution (1775-1783) was both political and economic, bolstered by an emerging middle class with a rallying cry of "unalienable rights to life, liberty, and property" -- a phrase openly borrowed from English philosopher John Locke's Second Treatise on Civil Government (1690). The war was triggered by an event in April 1775. British soldiers, intending to capture a colonial arms depot at Concord, Massachusetts, clashed with colonial militiamen. Someone -- no one knows exactly who -- fired a shot, and eight years of fighting began. While political separation from England may not have been the majority of colonists' original goal, independence and the creation of a new nation -- the United States -- was the ultimate result.

The New Nation's Economy
The U.S. Constitution, adopted in 1787 and in effect to this day, was in many ways a work of creative genius. As an economic charter, it established that the entire nation -- stretching then from Maine to Georgia, from the Atlantic Ocean to the Mississippi Valley -- was a unified, or "common," market. There were to be no tariffs or taxes on interstate commerce. The Constitution provided that the federal government could regulate commerce with foreign nations and among the states, establish uniform bankruptcy laws, create money and regulate its value, fix standards of weights and measures, establish post offices and roads, and fix rules governing patents and copyrights. The last-mentioned clause was an early recognition of the importance of "intellectual property," a matter that would assume great importance in trade negotiations in the late 20th century.
Alexander Hamilton, one of the nation's Founding Fathers and its first secretary of the treasury, advocated an economic development strategy in which the federal government would nurture infant industries by providing overt subsidies and imposing protective tariffs on imports. He also urged the federal government to create a national bank and to assume the public debts that the colonies had incurred during the Revolutionary War. The new government dallied over some of Hamilton's proposals, but ultimately it did make tariffs an essential part of American foreign policy -- a position that lasted until almost the middle of the 20th century.
Although early American farmers feared that a national bank would serve the rich at the expense of the poor, the first National Bank of the United States was chartered in 1791; it lasted until 1811, after which a successor bank was chartered.
Hamilton believed the United States should pursue economic growth through diversified shipping, manufacturing, and banking. Hamilton's political rival, Thomas Jefferson, based his philosophy on protecting the common man from political and economic tyranny. He particularly praised small farmers as "the most valuable citizens." In 1801, Jefferson became president (1801-1809) and turned to promoting a more decentralized, agrarian democracy.

Movement South and Westward
Cotton, at first a small-scale crop in the South, boomed following Eli Whitney's invention in 1793 of the cotton gin, a machine that separated raw cotton from seeds and other waste. Planters in the South bought land from small farmers who frequently moved farther west. Soon, large plantations, supported by slave labor, made some families very wealthy.
It wasn't just southerners who were moving west, however. Whole villages in the East sometimes uprooted and established new settlements in the more fertile farmland of the Midwest. While western settlers are often depicted as fiercely independent and strongly opposed to any kind of government control or interference, they actually received a lot of government help, directly and indirectly. Government-created national roads and waterways, such as the Cumberland Pike (1818) and the Erie Canal (1825), helped new settlers migrate west and later helped move western farm produce to market.
Many Americans, both poor and rich, idealized Andrew Jackson, who became president in 1829, because he had started life in a log cabin in frontier territory. President Jackson (1829-1837) opposed the successor to Hamilton's National Bank, which he believed favored the entrenched interests of the East against the West. When he was elected for a second term, Jackson opposed renewing the bank's charter, and Congress supported him. Their actions shook confidence in the nation's financial system, and business panics occurred in both 1834 and 1837.
Periodic economic dislocations did not curtail rapid U.S. economic growth during the 19th century. New inventions and capital investment led to the creation of new industries and economic growth. As transportation improved, new markets continuously opened. The steamboat made river traffic faster and cheaper, but development of railroads had an even greater effect, opening up vast stretches of new territory for development. Like canals and roads, railroads received large amounts of government assistance in their early building years in the form of land grants. But unlike other forms of transportation, railroads also attracted a good deal of domestic and European private investment.
In these heady days, get-rich-quick schemes abounded. Financial manipulators made fortunes overnight, but many people lost their savings. Nevertheless, a combination of vision and foreign investment, combined with the discovery of gold and a major commitment of America's public and private wealth, enabled the nation to develop a large-scale railroad system, establishing the base for the country's industrialization.

Industrial Growth
The Industrial Revolution began in Europe in the late 18th and early 19th centuries, and it quickly spread to the United States. By 1860, when Abraham Lincoln was elected president, 16 percent of the U.S. population lived in urban areas, and a third of the nation's income came from manufacturing. Urbanized industry was limited primarily to the Northeast; cotton cloth production was the leading industry, with the manufacture of shoes, woolen clothing, and machinery also expanding. Many new workers were immigrants. Between 1845 and 1855, some 300,000 European immigrants arrived annually. Most were poor and remained in eastern cities, often at ports of arrival.
The South, on the other hand, remained rural and dependent on the North for capital and manufactured goods. Southern economic interests, including slavery, could be protected by political power only as long as the South controlled the federal government. The Republican Party, organized in 1856, represented the industrialized North. In 1860, Republicans and their presidential candidate, Abraham Lincoln, were speaking hesitantly on slavery, but they were much clearer on economic policy. In 1861, they successfully pushed adoption of a protective tariff. In 1862, the first Pacific railroad was chartered. In 1863 and 1864, a national bank code was drafted.
Northern victory in the U.S. Civil War (1861-1865), however, sealed the destiny of the nation and its economic system. The slave-labor system was abolished, making the large southern cotton plantations much less profitable. Northern industry, which had expanded rapidly because of the demands of the war, surged ahead. Industrialists came to dominate many aspects of the nation's life, including social and political affairs. The planter aristocracy of the South, portrayed sentimentally 70 years later in the film classic Gone with the Wind, disappeared.

Inventions, Development, and Tycoons
The rapid economic development following the Civil War laid the groundwork for the modern U.S. industrial economy. An explosion of new discoveries and inventions took place, causing such profound changes that some termed the results a "second industrial revolution." Oil was discovered in western Pennsylvania. The typewriter was developed. Refrigerated railroad cars came into use. The telephone, phonograph, and electric light were invented. And by the dawn of the 20th century, cars were replacing carriages and people were flying in airplanes.
Parallel to these achievements was the development of the nation's industrial infrastructure. Coal was found in abundance in the Appalachian Mountains from Pennsylvania south to Kentucky. Large iron mines opened in the Lake Superior region of the upper Midwest. Mills thrived in places where these two important raw materials could be brought together to produce steel. Large copper and silver mines opened, followed by lead mines and cement factories.
As industry grew larger, it developed mass-production methods. Frederick W. Taylor pioneered the field of scientific management in the late 19th century, carefully plotting the functions of various workers and then devising new, more efficient ways for them to do their jobs. (True mass production was the inspiration of Henry Ford, who in 1913 adopted the moving assembly line, with each worker doing one simple task in the production of automobiles. In what turned out to be a farsighted action, Ford offered a very generous wage -- $5 a day -- to his workers, enabling many of them to buy the automobiles they made, helping the industry to expand.)
The "Gilded Age" of the second half of the 19th century was the epoch of tycoons. Many Americans came to idealize these businessmen who amassed vast financial empires. Often their success lay in seeing the long-range potential for a new service or product, as John D. Rockefeller did with oil. They were fierce competitors, single-minded in their pursuit of financial success and power. Other giants, in addition to Rockefeller and Ford, included Jay Gould, who made his money in railroads; J. Pierpont Morgan, in banking; and Andrew Carnegie, in steel. Some tycoons were honest according to the business standards of their day; others, however, used force, bribery, and guile to achieve their wealth and power. For better or worse, business interests acquired significant influence over government.
Morgan, perhaps the most flamboyant of the entrepreneurs, operated on a grand scale in both his private and business life. He and his companions gambled, sailed yachts, gave lavish parties, built palatial homes, and bought European art treasures. In contrast, men such as Rockefeller and Ford exhibited puritanical qualities. They retained small-town values and lifestyles. As church-goers, they felt a sense of responsibility to others. They believed that personal virtues could bring success; theirs was the gospel of work and thrift. Later their heirs would establish the largest philanthropic foundations in America.
While upper-class European intellectuals generally looked on commerce with disdain, most Americans -- living in a society with a more fluid class structure -- enthusiastically embraced the idea of moneymaking. They enjoyed the risk and excitement of business enterprise, as well as the higher living standards and potential rewards of power and acclaim that business success brought.
As the American economy matured in the 20th century, however, the freewheeling business mogul lost luster as an American ideal. The crucial change came with the emergence of the corporation, which appeared first in the railroad industry and then elsewhere. Business barons were replaced by "technocrats," high-salaried managers who became the heads of corporations. The rise of the corporation triggered, in turn, the rise of an organized labor movement that served as a countervailing force to the power and influence of business.
The technological revolution of the 1980s and 1990s brought a new entrepreneurial culture that echoed the age of tycoons. Bill Gates, the head of Microsoft, built an immense fortune developing and selling computer software. Gates carved out an empire so profitable that by the late 1990s his company had been taken to court by the U.S. Justice Department's antitrust division, accused of intimidating rivals and creating a monopoly. But Gates also established a charitable foundation that quickly became the largest of its kind. Most American business leaders of today do not lead the high-profile life of Gates. They direct the fate of corporations, but they also serve on boards for charities and schools. They are concerned about the state of the national economy and America's relationship with other nations, and they are likely to fly to Washington to confer with government officials. While they undoubtedly influence the government, they do not control it -- as some tycoons in the Gilded Age believed they did.

Government Involvement
In the early years of American history, most political leaders were reluctant to involve the federal government too heavily in the private sector, except in the area of transportation. In general, they accepted the concept of laissez-faire, a doctrine opposing government interference in the economy except to maintain law and order. This attitude started to change during the latter part of the 19th century, when small business, farm, and labor movements began asking the government to intercede on their behalf.
By the turn of the century, a middle class had developed that was leery of both the business elite and the somewhat radical political movements of farmers and laborers in the Midwest and West. Known as Progressives, these people favored government regulation of business practices to ensure competition and free enterprise. They also fought corruption in the public sector.
Congress enacted a law regulating railroads in 1887 (the Interstate Commerce Act), and one preventing large firms from controlling a single industry in 1890 (the Sherman Antitrust Act). These laws were not rigorously enforced, however, until the years between 1900 and 1920, when Republican President Theodore Roosevelt (1901-1909), Democratic President Woodrow Wilson (1913-1921), and others sympathetic to the views of the Progressives came to power. Many of today's U.S. regulatory agencies were created during these years, including the Interstate Commerce Commission, the Food and Drug Administration, and the Federal Trade Commission.
Government involvement in the economy increased most significantly during the New Deal of the 1930s. The 1929 stock market crash had initiated the most serious economic dislocation in the nation's history, the Great Depression (1929-1940). President Franklin D. Roosevelt (1933-1945) launched the New Deal to alleviate the emergency.
Many of the most important laws and institutions that define America's modern economy can be traced to the New Deal era. New Deal legislation extended federal authority in banking, agriculture, and public welfare. It established minimum standards for wages and hours on the job, and it served as a catalyst for the expansion of labor unions in such industries as steel, automobiles, and rubber. Programs and agencies that today seem indispensable to the operation of the country's modern economy were created: the Securities and Exchange Commission, which regulates the stock market; the Federal Deposit Insurance Corporation, which guarantees bank deposits; and, perhaps most notably, the Social Security system, which provides pensions to the elderly based on contributions they made when they were part of the work force.
New Deal leaders flirted with the idea of building closer ties between business and government, but some of these efforts did not survive past World War II. The National Industrial Recovery Act, a short-lived New Deal program, sought to encourage business leaders and workers, with government supervision, to resolve conflicts and thereby increase productivity and efficiency. While America never took the turn to fascism that similar business-labor-government arrangements did in Germany and Italy, the New Deal initiatives did point to a new sharing of power among these three key economic players. This confluence of power grew even more during the war, as the U.S. government intervened extensively in the economy. The War Production Board coordinated the nation's productive capabilities so that military priorities would be met. Converted consumer-products plants filled many military orders. Automakers built tanks and aircraft, for example, making the United States the "arsenal of democracy." In an effort to prevent rising national income and scarce consumer products from causing inflation, the newly created Office of Price Administration controlled rents on some dwellings, rationed consumer items ranging from sugar to gasoline, and otherwise tried to restrain price increases.

The Postwar Economy: 1945-1960
Many Americans feared that the end of World War II and the subsequent drop in military spending might bring back the hard times of the Great Depression. But instead, pent-up consumer demand fueled exceptionally strong economic growth in the postwar period. The automobile industry successfully converted back to producing cars, and new industries such as aviation and electronics grew by leaps and bounds. A housing boom, stimulated in part by easily affordable mortgages for returning members of the military, added to the expansion. The nation's gross national product rose from about $200,000 million in 1940 to $300,000 million in 1950 and to more than $500,000 million in 1960. At the same time, the jump in postwar births, known as the "baby boom," increased the number of consumers. More and more Americans joined the middle class.
The need to produce war supplies had given rise to a huge military-industrial complex (a term coined by Dwight D. Eisenhower, who served as the U.S. president from 1953 through 1961). It did not disappear with the war's end. As the Iron Curtain descended across Europe and the United States found itself embroiled in a cold war with the Soviet Union, the government maintained substantial fighting capacity and invested in sophisticated weapons such as the hydrogen bomb. Economic aid flowed to war-ravaged European countries under the Marshall Plan, which also helped maintain markets for numerous U.S. goods. And the government itself recognized its central role in economic affairs. The Employment Act of 1946 stated as government policy "to promote maximum employment, production, and purchasing power."
The United States also recognized during the postwar period the need to restructure international monetary arrangements, spearheading the creation of the International Monetary Fund and the World Bank -- institutions designed to ensure an open, capitalist international economy.
Business, meanwhile, entered a period marked by consolidation. Firms merged to create huge, diversified conglomerates. International Telephone and Telegraph, for instance, bought Sheraton Hotels, Continental Banking, Hartford Fire Insurance, Avis Rent-a-Car, and other companies.
The American work force also changed significantly. During the 1950s, the number of workers providing services grew until it equaled and then surpassed the number who produced goods. And by 1956, a majority of U.S. workers held white-collar rather than blue-collar jobs. At the same time, labor unions won long-term employment contracts and other benefits for their members.
Farmers, on the other hand, faced tough times. Gains in productivity led to agricultural overproduction, as farming became a big business. Small family farms found it increasingly difficult to compete, and more and more farmers left the land. As a result, the number of people employed in the farm sector, which in 1947 stood at 7.9 million, began a continuing decline; by 1998, U.S. farms employed only 3.4 million people.
Other Americans moved, too. Growing demand for single-family homes and the widespread ownership of cars led many Americans to migrate from central cities to suburbs. Coupled with technological innovations such as the invention of air conditioning, the migration spurred the development of "Sun Belt" cities such as Houston, Atlanta, Miami, and Phoenix in the southern and southwestern states. As new, federally sponsored highways created better access to the suburbs, business patterns began to change as well. Shopping centers multiplied, rising from eight at the end of World War II to 3,840 in 1960. Many industries soon followed, leaving cities for less crowded sites.

Years of Change: The 1960s and 1970s
The 1950s in America are often described as a time of complacency. By contrast, the 1960s and 1970s were a time of great change. New nations emerged around the world, insurgent movements sought to overthrow existing governments, established countries grew to become economic powerhouses that rivaled the United States, and economic relationships came to predominate in a world that increasingly recognized that military might could not be the only means of growth and expansion.
President John F. Kennedy (1961-1963) ushered in a more activist approach to governing. During his 1960 presidential campaign, Kennedy said he would ask Americans to meet the challenges of the "New Frontier." As president, he sought to accelerate economic growth by increasing government spending and cutting taxes, and he pressed for medical help for the elderly, aid for inner cities, and increased funds for education. Many of these proposals were not enacted, although Kennedy's vision of sending Americans abroad to help developing nations did materialize with the creation of the Peace Corps. Kennedy also stepped up American space exploration. After his death, the American space program surpassed Soviet achievements and culminated in the landing of American astronauts on the moon in July 1969.
Kennedy's assassination in 1963 spurred Congress to enact much of his legislative agenda. His successor, Lyndon Baines Johnson (1963-1969), sought to build a "Great Society" by spreading benefits of America's successful economy to more citizens. Federal spending increased dramatically, as the government launched such new programs as Medicare (health care for the elderly), Food Stamps (food assistance for the poor), and numerous education initiatives (assistance to students as well as grants to schools and colleges).
Military spending also increased as America's presence in Vietnam grew. What had started as a small military action under Kennedy mushroomed into a major military initiative during Johnson's presidency. Ironically, spending on both wars -- the war on poverty and the fighting war in Vietnam -- contributed to prosperity in the short term. But by the end of the 1960s, the government's failure to raise taxes to pay for these efforts led to accelerating inflation, which eroded this prosperity. The 1973-1974 oil embargo by members of the Organization of Petroleum Exporting Countries (OPEC) pushed energy prices rapidly higher and created shortages. Even after the embargo ended, energy prices stayed high, adding to inflation and eventually causing rising rates of unemployment. Federal budget deficits grew, foreign competition intensified, and the stock market sagged.
The Vietnam War dragged on until 1975, President Richard Nixon (1969-1974) resigned under a cloud of impeachment charges, and a group of Americans were taken hostage at the U.S. embassy in Teheran and held for more than a year. The nation seemed unable to control events, including economic affairs. America's trade deficit swelled as low-priced and frequently high-quality imports of everything from automobiles to steel to semiconductors flooded into the United States.
The term "stagflation" -- an economic condition of both continuing inflation and stagnant business activity, together with an increasing unemployment rate -- described the new economic malaise. Inflation seemed to feed on itself. People began to expect continuous increases in the price of goods, so they bought more. This increased demand pushed up prices, leading to demands for higher wages, which pushed prices higher still in a continuing upward spiral. Labor contracts increasingly came to include automatic cost-of-living clauses, and the government began to peg some payments, such as those for Social Security, to the Consumer Price Index, the best-known gauge of inflation. While these practices helped workers and retirees cope with inflation, they perpetuated inflation. The government's ever-rising need for funds swelled the budget deficit and led to greater government borrowing, which in turn pushed up interest rates and increased costs for businesses and consumers even further. With energy costs and interest rates high, business investment languished and unemployment rose to uncomfortable levels.
In desperation, President Jimmy Carter (1977-1981) tried to combat economic weakness and unemployment by increasing government spending, and he established voluntary wage and price guidelines to control inflation. Both were largely unsuccessful. A perhaps more successful but less dramatic attack on inflation involved the "deregulation" of numerous industries, including airlines, trucking, and railroads. These industries had been tightly regulated, with government controlling routes and fares. Support for deregulation continued beyond the Carter administration. In the 1980s, the government relaxed controls on bank interest rates and long-distance telephone service, and in the 1990s it moved to ease regulation of local telephone service.
But the most important element in the war against inflation was the Federal Reserve Board, which clamped down hard on the money supply beginning in 1979. By refusing to supply all the money an inflation-ravaged economy wanted, the Fed caused interest rates to rise. As a result, consumer spending and business borrowing slowed abruptly. The economy soon fell into a deep recession.

The Economy in the 1980s
The nation endured a deep recession throughout 1982. Business bankruptcies rose 50 percent over the previous year. Farmers were especially hard hit, as agricultural exports declined, crop prices fell, and interest rates rose. But while the medicine of a sharp slowdown was hard to swallow, it did break the destructive cycle in which the economy had been caught. By 1983, inflation had eased, the economy had rebounded, and the United States began a sustained period of economic growth. The annual inflation rate remained under 5 percent throughout most of the 1980s and into the 1990s.
The economic upheaval of the 1970s had important political consequences. The American people expressed their discontent with federal policies by turning out Carter in 1980 and electing former Hollywood actor and California governor Ronald Reagan as president. Reagan (1981-1989) based his economic program on the theory of supply-side economics, which advocated reducing tax rates so people could keep more of what they earned. The theory was that lower tax rates would induce people to work harder and longer, and that this in turn would lead to more saving and investment, resulting in more production and stimulating overall economic growth. While the Reagan-inspired tax cuts served mainly to benefit wealthier Americans, the economic theory behind the cuts argued that benefits would extend to lower-income people as well because higher investment would lead to new job opportunities and higher wages.
The central theme of Reagan's national agenda, however, was his belief that the federal government had become too big and intrusive. In the early 1980s, while he was cutting taxes, Reagan was also slashing social programs. Reagan also undertook a campaign throughout his tenure to reduce or eliminate government regulations affecting the consumer, the workplace, and the environment. At the same time, however, he feared that the United States had neglected its military in the wake of the Vietnam War, so he successfully pushed for big increases in defense spending.
The combination of tax cuts and higher military spending overwhelmed more modest reductions in spending on domestic programs. As a result, the federal budget deficit swelled even beyond the levels it had reached during the recession of the early 1980s. From $74,000 million in 1980, the federal budget deficit rose to $221,000 million in 1986. It fell back to $150,000 million in 1987, but then started growing again. Some economists worried that heavy spending and borrowing by the federal government would re-ignite inflation, but the Federal Reserve remained vigilant about controlling price increases, moving quickly to raise interest rates whenever inflation seemed a threat. Under chairman Paul Volcker and his successor, Alan Greenspan, the Federal Reserve retained the central role of economic traffic cop, eclipsing Congress and the president in guiding the nation's economy.
The recovery that first built up steam in the early 1980s was not without its problems. Farmers, especially those operating small family farms, continued to face challenges in making a living, particularly in 1986 and 1988, when the nation's mid-section was hit by serious droughts, and several years later when it suffered extensive flooding. Some banks, particularly those known as savings and loan associations, faltered from a combination of tight money and unwise lending practices; after being partially deregulated, the savings and loans went on a spree of imprudent lending. The federal government had to close many of these institutions and pay off their depositors, at enormous cost to taxpayers.
While Reagan and his successor, George Bush (1989-1993), presided as communist regimes collapsed in the Soviet Union and Eastern Europe, the 1980s did not entirely erase the economic malaise that had gripped the country during the 1970s. The United States posted trade deficits in seven of the 10 years of the 1970s, and the trade deficit swelled throughout the 1980s. Rapidly growing economies in Asia appeared to be challenging America as economic powerhouses. Japan, in particular, with its emphasis on long-term planning and close coordination among corporations, banks, and government, seemed to offer an alternative model for economic growth.
In the United States, meanwhile, "corporate raiders" bought various corporations whose stock prices were depressed and then restructured them, either by selling off some of their operations or by dismantling them piece by piece. In some cases, companies spent enormous sums to buy up their own stock or pay off raiders. Critics watched such battles with dismay, arguing that raiders were destroying good companies and causing grief for workers, many of whom lost their jobs in corporate restructuring moves. But others said the raiders made a meaningful contribution to the economy, either by taking over poorly managed companies, slimming them down, and making them profitable again, or by selling them off so that investors could take their profits and reinvest them in more productive companies.

The 1990s and Beyond
The 1990s brought a new president, Bill Clinton (1993-2000). A cautious, moderate Democrat, Clinton sounded some of the same themes as his predecessors. After unsuccessfully urging Congress to enact an ambitious proposal to expand health-insurance coverage, Clinton declared that the era of "big government" was over in America. He pushed to strengthen market forces in some sectors, working with Congress to open local telephone service to competition. He also joined Republicans to reduce welfare benefits. Still, although Clinton reduced the size of the federal work force, the government continued to play a crucial role in the nation's economy. Most of the major innovations of the New Deal, and a good many of the Great Society, remained in place. And the Federal Reserve system continued to regulate the overall pace of economic activity, with a watchful eye for any signs of renewed inflation.
The economy, meanwhile, turned in an increasingly healthy performance as the 1990s progressed. With the fall of the Soviet Union and Eastern European communism in the late 1980s, trade opportunities expanded greatly. Technological developments brought a wide range of sophisticated new electronic products. Innovations in telecommunications and computer networking spawned a vast computer hardware and software industry and revolutionized the way many industries operate. The economy grew rapidly, and corporate earnings rose rapidly. Combined with low inflation and low unemployment, strong profits sent the stock market surging; the Dow Jones Industrial Average, which had stood at just 1,000 in the late 1970s, hit the 11,000 mark in 1999, adding substantially to the wealth of many -- though not all -- Americans.
Japan's economy, often considered a model by Americans in the 1980s, fell into a prolonged recession -- a development that led many economists to conclude that the more flexible, less planned, and more competitive American approach was, in fact, a better strategy for economic growth in the new, globally-integrated environment.
America's labor force changed markedly during the 1990s. Continuing a long-term trend, the number of farmers declined. A small portion of workers had jobs in industry, while a much greater share worked in the service sector, in jobs ranging from store clerks to financial planners. If steel and shoes were no longer American manufacturing mainstays, computers and the software that makes them run were.
After peaking at $290,000 million in 1992, the federal budget deficit steadily shrank as economic growth increased tax revenues. In 1998, the government posted its first surplus in 30 years, although a huge debt -- mainly in the form of promised future Social Security payments to the baby boomers -- remained. Economists, surprised at the combination of rapid growth and continued low inflation, debated whether the United States had a "new economy" capable of sustaining a faster growth rate than seemed possible based on the experiences of the previous 40 years.
Finally, the American economy was more closely intertwined with the global economy than it ever had been. Clinton, like his predecessors, had continued to push for elimination of trade barriers. A North American Free Trade Agreement (NAFTA) had further increased economic ties between the United States and its largest trading partners, Canada and Mexico. Asia, which had grown especially rapidly during the 1980s, joined Europe as a major supplier of finished goods and a market for American exports. Sophisticated worldwide telecommunications systems linked the world's financial markets in a way unimaginable even a few years earlier.
While many Americans remained convinced that global economic integration benefited all nations, the growing interdependence created some dislocations as well. Workers in high-technology industries -- at which the United States excelled -- fared rather well, but competition from many foreign countries that generally had lower labor costs tended to dampen wages in traditional manufacturing industries. Then, when the economies of Japan and other newly industrialized countries in Asia faltered in the late 1990s, shock waves rippled throughout the global financial system. American economic policy-makers found they increasingly had to weigh global economic conditions in charting a course for the domestic economy.
Still, Americans ended the 1990s with a restored sense of confidence. By the end of 1999, the economy had grown continuously since March 1991, the longest peacetime economic expansion in history. Unemployment totaled just 4.1 percent of the labor force in November 1999, the lowest rate in nearly 30 years. And consumer prices, which rose just 1.6 percent in 1998 (the smallest increase except for one year since 1964), climbed only somewhat faster in 1999 (2.4 percent through October). Many challenges lay ahead, but the nation had weathered the 20th century -- and the enormous changes it brought -- in good shape.

Moonshine Is Growing in the U.S., and Big Whiskey Wants a Taste

Moonshine, the outlaw hooch made famous in backwoods Appalachia, is now regulated by the government and is sold at Walmart. The spirit has grown so popular that even the industry's biggest distilleries are getting in the game

Eastern Tennessee is now home to several moonshine distilleries, including Ole Smoky. Several big-name whiskey distillers began releasing their own white whiskeys this year


For decades, most people had never even seen a jar of moonshine, let alone tasted it. These days, you can find it at stores and restaurants around the country thanks to loosened liquor laws and changing consumer preferences. Even the industry’s biggest distilleries are experimenting with moonshine.

Moonshine has been distilled in backwoods Appalachia since the 1800s. By its most traditional definition, the term means “illegal spirit,” and many families in that historically independent-minded, libertarian-leaning area of the U.S. made a living off making it — partly because the liquor could be produced and sold quickly, as it didn’t require years of aging in barrels. (That, by the way, is also what gives the hooch its oftentimes harsh character.) Today, moonshine is generally used as a catchall term for unaged white whiskeys, many of which are made in Tennessee and North Carolina.

Another difference with modern-day moonshine is that the people distilling it aren’t operating outside the law. Making moonshine is now legal in Tennessee and is quickly gaining popularity around the country.

When the recession hit in 2008 and 2009, a number of states looked for ways to generate employment and keep tax revenue rolling in. One way to accomplish both goals was to loosen laws regulating distilleries. For years, the production of distilled spirits was legal only in a handful of Tennessee counties. But in 2009, the state legislature opened dozens of other counties to the business, including several in eastern Tennessee that had been home to unlawful moonshine production for decades. One of the biggest operations is Ole Smoky Moonshine Distillery, which opened in 2010 in Gatlinburg, Tenn. Roughly 250,000 to 280,000 cases of moonshine were sold in 2012, a jump from 50,000 in 2010 and 80,000 in 2011, according to food-and-beverage-analysis firm Technomic. (A case holds 12 750-ml jars.) Ole Smoky accounted for 100,000 of the cases sold in 2012.

Ole Smoky founder Joe Baker expects the company to sell 250,000 cases (3 million jars) this year. Baker attributes Ole Smoky’s growth to a number of big-box stores, including Walmart and Sam’s Club, deciding to carry the spirit “because it’s an American-made product from a small family business and because it was a well-known product that had been previously unavailable.” Ole Smoky is now available in 49 states.

While the very existence of distilleries like Ole Smoky can be credited to loosened liquor laws, the popularity of the product can be attributed to increasing consumer demand for products that are distinctive, novel and perceived as local. “Consumers are looking for a unique drink, that unique flavor you can’t get anywhere else, not something people are drinking all the time,” says David Henkes of Technomic.

Ole Smoky, for example, comes in Ball mason jars, the way moonshine did (and still does in some places) when it was sold illegally. It’s distilled right in the moonshine heartland, and the product’s outlaw backstory alone piques consumer interest. “When I went off to college, one of the first questions I was asked by anyone who found out I was from east Tennessee was, ‘Well, can you get us some moonshine?’” Baker says. “That interest in the culture of the area where I was raised kind of pushed me along to embrace it.”

Ole Smoky’s unique flavors also play into another reason for the product’s popularity: 65% of its moonshine sales are flavored, and the distillery has even advertised flavored moonshines as Mother’s Day gifts. The company’s lineup includes apple pie, blackberry, peach and cherry flavors — all of which, Baker says, are authentic to the spirit’s heritage. “We tried to embrace the rich knowledge and expertise of this area instead of just basing it on my granddad’s recipe,” Baker says. “We took the best of a lot of different recipes and came up with a product that we think best represents the area.”

Frank Coleman, senior vice president of the Distilled Spirits Council trade group, says the recent distillery legalization in states like Tennessee, coupled with the popularity of small-batch distilleries elsewhere in the U.S., has led to the recent explosion of moonshine distilleries. Ole Smoky is just one of a number of distilleries to have popped up in the Appalachian region in recent years, including East Tennessee Distillery, Short Mountain Distillery and Asheville Distilling Company in neighboring North Carolina. “You’ve had a lot of people come into the business,” Coleman says. “There’s a little bit of a gold-rush mentality.”

The growth of those distilleries has even gotten the attention of Big Whiskey, despite the fact that moonshine represents just 1% of American whiskey sales. Earlier this year, Jack Daniel's released its own white whiskey, Unaged Tennessee Rye, and Jim Beam released Jacob's Ghost, a white whiskey that has been aged for only a year. (True straight moonshine is unaged. Regular Jim Beam bourbon, by contrast, is aged for four years in charred white-oak barrels, according to the company, which is what gives Beam and other aged whiskeys their golden brown color.)

Bill Newlands, the North America president of Beam Global, admits that the company’s new white whiskey is a direct response to the popularity of distilleries like Ole Smoky. “We certainly saw that moonshine had quite a pickup,” he says. “The question that we had around it is, ‘How broad-based would the interest be?’”

Newlands says his company isn’t quite convinced that white whiskey is the next big thing, but sales are being closely watched.

The growth in moonshine is somewhat akin to what’s happened in the beer industry over the past decade, during which big-brewery sales of beers like Bud Light and Miller Lite have been flat or declining while craft breweries like Deschutes, Brooklyn Brewery and Dogfish Head continue growing at a fast clip. That’s leading the big breweries to introduce their own “crafty” beers, like MillerCoors’ Blue Moon.

As moonshine creeps into the mainstream, however, there are some in Appalachia who question whether a spirit that's aboveboard -- and regulated and taxed by the government -- can truly be considered moonshine. It may be unaged whiskey. But is it really good ol' 'shine?

“I think there are people out there who feel that if you’re paying taxes on it, it’s not moonshine,” says Ole Smoky’s Baker. “And sure, if you pay taxes, you lose a little bit of credibility. But I think most folks — certainly people who are familiar with how we make our products and people who have been to our distillery — they see that we do it the same way that it’s been done around here forever.”

Updated, July 26: A previous version of the story stated that 130,000 cases of moonshine were sold in 2012. According to updated numbers by Technomic, between 250,000 and 285,000 cases were sold last year. Piedmont Distillers, which makes Junior Johnson’s Midnight Moon, sold roughly 130,000 cases alone in 2012.


Again, slave labor on US cotton plantations was for the benefit of the slaveholders, not the US economy as a whole. Indeed, as the 19th century evolved, the US South consistently underperformed as a cotton supplier. Wright points out three reasons.

Second, "[s]laveholders neglected infrastructure, so that large sections of the antebellum South were bypassed by the slave economy and left on the margins of commercial agriculture." The middle of the 19th century was a time when the US saw a vast expansion of turnpikes, railroads, canals, and other infrastructure, often built by state-chartered corporations. However, almost all of this construction occurred in the northern states. Not only were the southern states uninterested, they actively blocked national-level efforts along these lines: "Over time, however, the slave South increasingly assumed the role of obstructer to a national pro-growth agenda. ... [S]outhern presidents vetoed seven Rivers & Harbors bills between 1838 and 1860, frustrating the ambitions of entrepreneurs in the Great Lakes states."

Third, "the fixed-cost character of slavery meant that even large plantations aimed at self-sufficiency in foodstuffs, limiting the overall degree of market specialization." One main advantage of slavery in cotton production was that it guaranteed having sufficient labor available at the two key times of the year for cotton: planting and harvesting. But during the rest of the year, most cotton plantations grew other crops and raised livestock.

The shortcomings of the South as a cotton producer during this time were clear to some contemporary observers. Wright says: "Particularly notable are the views of Thomas Ellison, long-time chronicler and statistician of cotton markets, who observed in 1858: 'That the Southern regions of the United States are capable of producing a much larger quantity of Cotton than has yet been raised is very evident; in fact, their resources are, practically speaking, almost without limit.'" What was it that restrained this
