Various History

Discussion in 'History' started by StrangerInAStrangeLand, Jun 17, 2014.

  1. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396

    200-Year-Old Vaginal Douche Found Under City Hall In Manhattan
25 February 2014, 12:00:44
    NEW YORK (CBSNewYork) — This is an archaeological find you don’t hear about often.

    According to Livescience.com, while sifting through a 19th-century trash heap buried below City Hall Park in 2010, archaeologists found a dirt-caked tube that was finely carved out of bone and had a perforated, threaded screw cap.

    It wasn’t until recently that archaeologists discovered it was a vaginal syringe used for douching.

    The syringe was found along with alcohol bottles, smoking pipes, fine pottery and the bones of sheep, cows, fish and even turtles — then considered a delicacy — that were likely served for dinner during a party around the time City Hall was built 200 years ago, according to Live Science.

    “We think the trash deposit feature was from a single event, possibly a celebratory event,” Alyssa Loorya, who heads the Brooklyn-based Chrysalis Archaeological Consultants, told Live Science.


The pile was discovered during excavations four years ago as part of a City Hall rehabilitation project.

Archaeologist Lisa Geiger made the douche connection while volunteering at Philadelphia’s Mütter Museum, which houses various medical oddities.

Geiger, who presented her research at the Society for Historical Archaeology meeting in Quebec City in January, says she hasn’t found any other syringes made from bone.

    This device is also notable for its careful construction and skilled craftsmanship, Geiger told Live Science.

Although such syringes weren’t uncommon at the time, talking about feminine hygiene was taboo.



  2. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    New York Times

    Op-Ed Contributor

    Cruel and Unusual History
    By GILBERT KING

    Published: April 23, 2008


    Correction Appended

    THE Supreme Court concluded last week, in a 7-2 ruling, that Kentucky’s three-drug method of execution by lethal injection does not violate the Eighth Amendment’s prohibition on cruel and unusual punishment. In his plurality opinion, Chief Justice John Roberts cited a Supreme Court principle from a ruling in 1890 that defines cruelty as limited to punishments that “involve torture or a lingering death.”

    But the court was wrong in the 19th century, an error that has infected its jurisprudence for more than 100 years. In this nation’s landmark capital punishment cases, the resultant executions were anything but free from torture and prolonged deaths.

    The first of those landmark cases, the 1879 case of Wilkerson v. Utah, was cited by Justice Clarence Thomas, in his concurring opinion in the Kentucky case. The court “had no difficulty concluding that death by firing squad” did not amount to cruel and unusual punishment, Justice Thomas wrote.

Wallace Wilkerson might have begged to differ. Once the Supreme Court affirmed Utah’s right to eradicate him by rifle, Wilkerson was led into a jailyard, where he declined to be blindfolded. A sheriff gave the command to fire and Wilkerson braced for the barrage. He moved just enough for the bullets to strike his arm and torso but not his heart.

    “My God!” Wilkerson shrieked. “My God! They have missed!” More than 27 minutes passed as Wilkerson bled to death in front of astonished witnesses and a helpless doctor.

    Just 11 years later, the Supreme Court heard the case of William Kemmler, who had been sentenced to death by electric chair in New York. The court, in affirming the state’s right to execute Kemmler, ruled that electrocution reduced substantial risks of pain or “a lingering death” when compared to executions by hanging. Kemmler, had he lived through the ensuing execution (and he nearly did), might too have disagreed.

    After a thousand volts of current struck Kemmler on Aug. 6, 1890, the smell of burnt flesh permeated the room. He was still breathing. Saliva dripped from his mouth and down his beard as he gasped for air. Nauseated witnesses and a tearful sheriff fled the room as Kemmler’s coat burst into flames.

    Another surge was applied, but minutes passed as the current built to a lethal voltage. Some witnesses thought Kemmler was about to regain consciousness, but eight long minutes later, he was pronounced dead.

    Perhaps the most egregious case came to the court more than 50 years later. “Lucky” Willie Francis, as the press called him, was a stuttering 17-year-old from St. Martinville, La. In 1946, he walked away from the electric chair known as “Gruesome Gertie” when two executioners (an inmate and a guard) from the state penitentiary at Angola botched the wiring of the chair.

    When the switch was thrown, Francis strained against the straps and began rocking and sliding in the chair, pleading with the sheriff and the executioners to halt the proceedings. “I am n-n-not dying!” he screamed. Gov. Jimmie Davis ordered Francis returned to the chair six days later.

    Francis’ lawyers obtained a stay, and the case reached the Supreme Court. Justice Felix Frankfurter defined the teenager’s ordeal as an “innocent misadventure.” In the decision, Louisiana ex rel. Francis v. Resweber, the court held that “accidents happen for which no man is to blame,” and that such “an accident, with no suggestion of malevolence” did not violate the Constitution.

    Fewer than 24 hours before Francis’ second scheduled execution, his lawyers tried to bring the case before the Supreme Court again. They had obtained affidavits from witnesses stating that the two executioners from Angola were, as one of the witnesses put it, “so drunk it would have been impossible for them to have known what they were doing.” Although the court rejected this last-minute appeal, it noted the “grave nature of the new allegations” and encouraged the lawyers to pursue the matter in state court first, as required by law.

    Willie Francis was executed the next morning. Because his case never made it back to the Supreme Court, the ruling lingers, influencing the decisions of today’s justices. In his plurality opinion last week, Chief Justice Roberts called Louisiana’s first attempt at executing Francis an “isolated mishap” that “while regrettable, does not suggest cruelty.”

    Justice Clarence Thomas, writing separately, also mentioned the Francis case: “No one suggested that Louisiana was required to implement additional safeguards or alternative procedures in order to reduce the risk of a second malfunction.” In fact, Louisiana did just that. Two weeks after the botched execution of Willie Francis, its Legislature required that the operator of the electric chair “shall be a competent electrician who shall not have been previously convicted of a felony.” This law would have prohibited both executioners from participating in Francis’ failed execution.

    The court’s majority opinion in the Willie Francis case acknowledged, “The traditional humanity of modern Anglo-American law forbids the infliction of unnecessary pain in the execution of the death sentence.” Yet the Supreme Court continues to flout that standard.

    In its ruling last week, the court once more ignored the consequences of its rulings for men like Wallace Wilkerson, William Kemmler and Willie Francis. The justices cited and applied Wilkerson’s and Kemmler’s cases as if their executions went off without a hitch.

    And 60 years after two drunken executioners disregarded the tortured screams of a teenage boy named Willie Francis, the Supreme Court continues to do so.


    Gilbert King is the author of “The Execution of Willie Francis: Race, Murder and the Search for Justice in the American South.”


    This article has been revised to reflect the following correction:

    Correction: April 29, 2008
    An Op-Ed article on Wednesday, on executions, misstated the status of Chief Justice John Roberts’s opinion in a recent Supreme Court capital punishment case, Baze v. Rees. His opinion, joined by two other justices in the seven-justice majority, was the plurality opinion. It was not the majority opinion.


     
  3. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
The American Civil War, like all wars, is a part of our country’s history, but so often history concentrates on names, dates and places and overlooks the more anecdotal aspects, some amusing and some tragic, associated with an event. Such is the case of the “Confederate secret weapon.” While nothing more than white phosphorus dissolved in a solvent and poured into small bottles, those little bottles struck fear into the hearts of more than one Union soldier. When a bottle was thrown at an object and broken, the object would burst into flames. This was because white phosphorus ignites on contact with air, and once the solvent evaporated, the phosphorus was exposed to air. There are numerous stories of Confederate commanders intimidating Union troops with a quick demonstration of their “secret weapon.”

It has been said by some historians that the Civil War broke out in Wilmer McLean’s front yard and ended in his parlor. At the time of the First Battle of Bull Run (Manassas), generally recognized as the first major battle of the Civil War, McLean was a fairly wealthy farmer in Manassas, Virginia. During the battle, much of the fighting occurred in his front yard and around his home. This proved a dangerous situation, so McLean decided to move his family to Appomattox Court House, Virginia, for their safety. Being a wealthy man, he bought one of the finest homes in the town. On April 9, 1865, a battle was fought at Appomattox Court House that ended in a Union victory, prompting Confederate General Robert E. Lee to agree to meet Union General Ulysses S. Grant about surrendering his army and ending the war. Once it became clear that Lee would meet with Grant, a place for the meeting was needed, and McLean was approached about using his home because it was so elegant. McLean agreed to the request, and that same day, April 9, 1865, the meeting took place in his parlor. Lee surrendered the Army of Northern Virginia, which effectively ended the war in Virginia and, for the most part, ended the American Civil War.


Once the surrender meeting was over, members of the Army of the Potomac began taking the tables, chairs, and various other furnishings in the house—essentially, anything that was not tied down—as souvenirs. They simply handed the protesting McLean money as they made off with his property.[4] Major General Edward Ord paid $40.00 (equivalent to $616.26 in today's dollars)[5] for the table Lee had used to sign the surrender document, while Major General Philip Sheridan got the table on which Grant had drafted the document for $20.00 (equivalent to $308.13 in today's dollars) in gold.[6][7] Sheridan then asked George Armstrong Custer to carry it away on his horse.[7] The table was presented to Custer's wife and is now on exhibit at the Smithsonian's National Museum of American History. McLean's second home is now part of Appomattox Court House National Historical Park, operated by the National Park Service of the United States Department of the Interior.

Unable to keep up the mortgage payments, McLean and his family sold the house in 1867 and returned to their home in Manassas.[9] They later moved to Alexandria, Virginia, where he worked for the Internal Revenue Service from 1873 to 1876.

    McLean died in Alexandria and is buried there at St. Paul's Episcopal Cemetery.




Bull Run was the largest and bloodiest battle in American history up to that point. Union casualties were 460 killed, 1,124 wounded, and 1,312 missing or captured; Confederate casualties were 387 killed, 1,582 wounded, and 13 missing.[32] Among the Union dead was Col. James Cameron, brother of President Lincoln's first Secretary of War, Simon Cameron.[33] Among the Confederate casualties was Col. Francis S. Bartow, who was the first Confederate brigade commander to be killed in the Civil War. Brig. Gen. Barnard Bee was mortally wounded and died the following day.

    The Northern public was shocked at the unexpected defeat of their army when an easy victory had been widely anticipated. Both sides quickly came to realize the war would be longer and more brutal than they had imagined. On July 22 President Lincoln signed a bill that provided for the enlistment of another 500,000 men for up to three years of service.[36] On July 25, eleven thousand Pennsylvanians who had earlier been rejected by the U.S. Secretary of War, Simon Cameron, for federal service in either Patterson's or McDowell's command arrived in Washington, D.C., and were finally accepted.


    In popular media

The First Battle of Bull Run is mentioned in the novel Gods and Generals, but is depicted more fully in its film adaptation. It also appears in the first episode of the second season of the mini-series North and South and in the first episode of the miniseries The Blue and the Gray. Manassas (1999) is the first volume in the James Reasoner Civil War Series of historical novels. The battle is described in Rebel (1993), the first volume of Bernard Cornwell's The Starbuck Chronicles series of historical novels. The battle is described from the viewpoint of a Union infantryman in Upton Sinclair's novella Manassas, which also depicts the political turmoil leading up to the Civil War. The battle is also depicted in John Jakes's The Titans, the fifth novel in The Kent Family Chronicles, which features the fictional Confederate cavalry officer Gideon Kent. The battle is the subject of the Johnny Horton song "Battle of Bull Run". Shaman, second in the Cole family trilogy by Noah Gordon, includes an account of the battle. The battle is also depicted in the song "Yankee Bayonet" by the indie-folk band The Decemberists. In Murder at 1600, Detective Harlan Regis (Wesley Snipes) has built a scale relief model of the battle, which plays a part in the plot.

    Sesquicentennial

Prince William County staged special events commemorating the 150th anniversary of the Civil War through 2011. Manassas was named the No. 1 tourist destination in the United States for 2011 by the American Bus Association for its efforts in highlighting the historical impact of the Civil War. The cornerstone of the commemoration was a reenactment of the battle on July 23–24, 2011. Throughout the year, there were tours of the Manassas battlefield and other battlefields in the county, along with a number of related events and activities.

    The City of Manassas commemorated the 150th anniversary of the battle July 21–24, 2011.
     
  4. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Indian Treaties and the Removal Act of 1830

    The U.S. Government used treaties as one means to displace Indians from their tribal lands, a mechanism that was strengthened with the Removal Act of 1830. In cases where this failed, the government sometimes violated both treaties and Supreme Court rulings to facilitate the spread of European Americans westward across the continent.

    As the 19th century began, land-hungry Americans poured into the backcountry of the coastal South and began moving toward and into what would later become the states of Alabama and Mississippi. Since Indian tribes living there appeared to be the main obstacle to westward expansion, white settlers petitioned the federal government to remove them. Although Presidents Thomas Jefferson and James Monroe argued that the Indian tribes in the Southeast should exchange their land for lands west of the Mississippi River, they did not take steps to make this happen. Indeed, the first major transfer of land occurred only as the result of war.

In 1814, Major General Andrew Jackson led an expedition against the Creek Indians, climaxing in the Battle of Horseshoe Bend (in present-day Alabama near the Georgia border), where Jackson’s force soundly defeated the Creeks and destroyed their military power. He then forced upon the Indians a treaty whereby they surrendered to the United States over twenty million acres of their traditional land—about one-half of present-day Alabama and one-fifth of Georgia. Over the next decade, Jackson led the way in the Indian removal campaign, helping to negotiate nine of the eleven major treaties to remove Indians.

Under this kind of pressure, Native American tribes—specifically the Creek, Cherokee, Chickasaw, and Choctaw—realized that they could not defeat the Americans in war. Since the appetite of the settlers for land would not abate, the Indians adopted a strategy of appeasement: they hoped that if they gave up a good deal of their land, they could keep at least some part of it. The Seminole tribe in Florida resisted, fighting the Second Seminole War (1835–1842) and the Third Seminole War (1855–1858). In the end, however, neither appeasement nor resistance worked.

    From a legal standpoint, the United States Constitution empowered Congress to “regulate commerce with foreign nations, and among the several States, and with the Indian tribes.” In early treaties negotiated between the federal government and the Indian tribes, the latter typically acknowledged themselves “to be under the protection of the United States of America, and of no other sovereign whosoever.” When Andrew Jackson became president (1829–1837), he decided to build a systematic approach to Indian removal on the basis of these legal precedents.

To achieve his purpose, Jackson encouraged Congress to adopt the Removal Act of 1830. The Act established a process whereby the President could grant land west of the Mississippi River to Indian tribes that agreed to give up their homelands. As incentives, the law provided the Indians financial and material assistance to travel to their new locations and start new lives, and guaranteed that the Indians would live on their new property under the protection of the United States Government forever. With the Act in place, Jackson and his followers were free to persuade, bribe, and threaten tribes into signing removal treaties and leaving the Southeast.

In general terms, Jackson’s government succeeded. By the end of his presidency, he had signed almost seventy removal treaties, the result of which was to move nearly 50,000 eastern Indians to Indian Territory—defined as the region belonging to the United States west of the Mississippi River but excluding the states of Missouri and Iowa as well as the Territory of Arkansas—and to open millions of acres of rich land east of the Mississippi to white settlers. Despite the vastness of the Indian Territory, the government intended that the Indians’ destination would be a more confined area—what later became eastern Oklahoma.

The Cherokee Nation resisted, however, challenging in court the Georgia laws that restricted their freedoms on tribal lands. In his 1831 ruling in Cherokee Nation v. Georgia, Chief Justice John Marshall declared that “the Indian territory is admitted to compose a part of the United States,” and characterized the tribes as “domestic dependent nations” whose “relation to the United States resembles that of a ward to his guardian.” The following year, however, in Worcester v. Georgia, the Supreme Court reversed itself and ruled that Indian tribes were indeed sovereign and immune from Georgia laws. President Jackson nonetheless refused to heed the Court’s decision. He obtained the signature of a Cherokee chief agreeing to relocation in the Treaty of New Echota, which the Senate ratified in 1836 over the protests of Daniel Webster and Henry Clay. The signing party represented only a faction of the Cherokee, and the majority followed Principal Chief John Ross in a desperate attempt to hold onto their land. That attempt faltered in 1838 when, under the guns of federal troops and Georgia state militia, the Cherokee were forced onto the dry plains across the Mississippi. The best evidence indicates that between three and four thousand of the fifteen to sixteen thousand Cherokees died en route from the brutal conditions of the “Trail of Tears.”

With the exception of a small number of Seminoles still resisting removal in Florida, by the 1840s no Indian tribes resided in the American South between the Atlantic and the Mississippi. Through a combination of coerced treaties and the contravention of both treaties and judicial rulings, the United States Government succeeded in paving the way for westward expansion and the incorporation of new territories into the United States.


     
  5. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    History of Hawaii/Missionaries Sugar Immigration


    SUGAR PLANTATIONS

Sugarcane has historically been an important source of income for Hawaii. The colonial powers brought capitalism with them to Hawaii, and with it, the production of sugar. Sugar exportation became a central component of the Hawaiian economy in a short period of time due to the exploitative way in which imperial powers handled the land and its population. These events were a major turning point in Hawaiian history.

In the early 19th century, sugarcane agriculture was very limited on the Hawaiian Islands. The first commercial sugar plantations were developed in the 1830s under the reign of Hawaiian King Kamehameha III. The plantations in Hawaii were unlike those that existed elsewhere in the world at that time, such as in Jamaica, Cuba, Puerto Rico and Haiti; the main difference was that Hawaiian plantation owners paid their laborers. Some plantation owners leased land from the King to harvest sugarcane, paying a flat rate each year. One such plantation, called the Koloa Plantation, was operated by three American businessmen who founded Ladd & Company. The Koloa Plantation was built on 980 acres of land leased from King Kamehameha III for 50 years at a rate of $300.00 annually. The plantation grew from only 25 staff in September of 1835 to 100 by March of 1838. Male Chinese workers were often recruited to work in the mill with the Hawaiian natives. Within a year of being established, the Koloa Plantation contained twenty-five acres of cane under cultivation and many buildings, including twenty houses for native workers, a house for the superintendent, a carpenter’s shop, a blacksmith’s shop, a mill dam, a sugar house, a boiling house and a sugar mill.

    Life on the Koloa Plantation involved labor for both male and female workers. Laborers were assigned to living quarters and allowed to take Fridays off to maintain their own food crops, and Saturdays for cooking and preparing meals. The workers on the plantation were paid in the form of coupons which could be redeemed at the plantation store.

    The plantation was managed by twenty-six-year-old William Hooper of Boston, Massachusetts. Hooper instilled a free-labor, capitalist system on the Islands by creating a wage-earning labor force, as well as a consumer class that was dependent on a market of sugar exports. Hooper is credited with helping set the pattern of good owner-worker relationships in Hawaii. His successful development and organization of the Koloa Plantation ensured that even after he departed the islands in 1839, his legacy and institutions would remain and flourish. Hooper’s most important contribution was instigating the development of a corporate-dominated sugar economy in Hawaii.

During the early years of sugar production, commerce between Hawaii and the United States was relatively limited. The California gold rush, however, would change that. The gold rush had a significant impact on the Hawaiian economy because it increased settlement on the west coast of the United States, which led to rapid agricultural and plantation development in Hawaii. American miners began sending their soiled laundry to Hawaii because it was less expensive than getting it laundered in the States. Mining companies began importing Hawaiian food, clothing, and other supplies from across the Pacific rather than haul them over the American interior. With increased revenue to Hawaii came increased opportunity for sugar plantation owners to expand. While in 1859 the Hawaiian Islands’ annual sugar production was only about 1.8 million pounds, by the end of the 1860s sugar exports from Hawaii had increased ten-fold, with annual exports of over 18 million pounds in 1868. This increase in production led to a high demand for laborers to assist the farmers: “The sugar industry increased from 10 plantations in 1858 to 22 plantation[s] operating in 1861, and sugar farmers continued to request additions to the labor force.” This growth corresponds almost perfectly with the California gold rush, which occurred during the years 1848–1855. By the end of the 19th century, Hawaii’s sugar exports would skyrocket to hundreds of millions of pounds of sugar each year.

As the California gold rush demonstrates, the success of the sugar industry in Hawaii was largely tied to events in America. The American Civil War, which began in 1861, is another example of this relationship. The Civil War spurred the sugar industry in Hawaii because the Union drastically reduced its imports from the Southern states; Northern buyers had to seek sugar elsewhere, giving Hawaii new markets. This demonstrates how the Hawaiian sugar industry was strongly influenced by broader economic conditions in the United States.

As the century progressed, many plantation owners (some of whom were missionaries) became very wealthy and powerful. Their influence on both the economy and religion of the islands allowed them to manipulate the fledgling government. Sugar plantation owners dominated the capitalist system, and this allowed for significant influence in both public and private spheres of society. Firstly, the growth of the sugar industry was the major phenomenon stimulating population growth in the form of immigrant workers, and with these people came their respective cultures. Secondly, the money that sugar sales brought in financed industrial development, along with the many other changes that come with wealth. It is therefore likely that the sugar industry had a significant impact on Hawaiian lifestyle and culture.

Another issue that resulted from the commercial production of sugar cane was its environmental impact on the islands. The development of more efficient methods of cultivation allowed for greater yield per acre, but the increase in production led to immense environmental degradation and deforestation. This altered both the resources and the landscape of the islands as a whole.

The plantations were harsh environments; however, they allowed natives to escape traditional life on the islands, which consisted of hard labor for the chiefs of the King, where failure to perform or complete work could sometimes result in death. People lived in “chronic fear” of the chiefs, and most jumped at any opportunity to escape these norms and work on a plantation. The California Gold Rush, the Great Mahele of 1848 (which dismantled the traditional system of land ownership in Hawaii), and the signing of the Reciprocity Treaty with the United States in 1875 were all factors in the growth of outside investment in Hawaii and in its economic growth. With increased investment came increased exports: sugar production on the islands increased from 30 tons in Hooper’s time in 1838 to 375 tons, and by the turn of the 20th century exports had climbed all the way to 298,544 tons.

The rapid increases in sugar exports seen towards the end of the 19th century were also in part due to reciprocity agreements between Hawaii and the United States. In 1856, the King of Hawaii commissioned the Hon. E. H. Allen to act as Minister Plenipotentiary and Envoy Extraordinary in Washington and to negotiate an agreement that would allow Hawaiian goods to enter United States ports free of duty. Although the proposed agreement was initially received favorably by the United States federal government, it was heavily opposed by senators from southern states such as Louisiana that also relied on sugar production as a source of income. As a result, the agreement was initially rejected.

Finally, in 1875, the United States and Hawaii were able to reach agreeable trading terms. The Reciprocity Treaty of 1875 allowed for the admission of a number of products into the United States free from duty. Products listed in the treaty as being free from duty included muscovado, brown and all other unrefined sugar, commonly known as “Sandwich Island sugar,” syrups of sugarcane, and molasses. By the end of the 19th century, sugar had fully emerged as the dominant export in Hawaiian industry, and many of the richest Hawaiians were those involved with the sugar industry.

    IMMIGRATION


During the nineteenth century, Hawaii saw a high rate of immigration. At the time, many people were working on farms producing sugar cane, a driving force in the Hawaiian economy. The sugar cane and pineapple industries provided many pull factors for potential immigrants.

Hawaii then looked to Puerto Rico for laborers. Puerto Ricans came to Hawaii seeking employment in the many sugar cane fields, drawing on their previous experience in Puerto Rico. Two hurricanes had struck Puerto Rico, completely destroying its sugar cane plantations and leaving many without work; a major producer of sugar cane was thus eliminated from the equation, and Hawaii stepped in as a major producer. After many of the new immigrants’ work contracts began to expire, some returned home or moved to mainland America to try to establish a life there. Many others stayed, however, and established communities, building schools, churches and even a stronger economy.

In order for the sugar industry to be commercially profitable, it was necessary to import foreign laborers, because the native population had been decimated by diseases introduced by Westerners to which the natives had no immunity. The elite class in Hawaii needed a working class, so it allowed foreigners to migrate to Hawaii. Hawaii began accepting large numbers of new immigrants, who were not necessarily paid well in the sugar fields. Around 1864, King Kamehameha V decided that a Board of Immigration was needed to help control the importation of foreign labor, because the existing process was inadequate. During the 1900s the demands of these two industries in Hawaii’s economy created a huge need for unskilled workers. According to an article called “Dual Chain Migration: Post-1965 Filipino Immigration to the United States,” “The Hawaiian sugar planters deliberately recruited illiterate men who were either single or willing to leave their family behind, and by 1931, about 113,000 Filipinos, mainly from the Ilocano provinces, had migrated to Hawaii.” This helps illustrate the large number of willing workers who immigrated in order to make a living. The Board of Immigration failed to consider the needs of the immigrants it was accepting, from China specifically. Five hundred Chinese men were brought over to Hawaii to serve as additional workers; however, no women were brought over with them, which led to complaints of prostitution and sexual perversion. The Board of Immigration was later able to bring Chinese women to the islands in order to limit prostitution.

Japanese immigration to the Hawaiian islands began in 1868, but the systematic immigration of contract workers did not begin until 1884, when the Japanese government finally approved it. Prior to 1884 the Japanese government opposed sending its citizens to Hawaii because it did not want Japan to be perceived as another “coolie storehouse,” or reserve of manual labor, like nations such as China. The Japanese government had also formed a negative impression of Hawaii from correspondence concerning the behavior of American representatives there. Hawaiian Foreign Minister Robert Crichton Wyllie, himself a plantation owner in Hawaii and therefore motivated by his own need for plantation workers, wrote to an American businessman in Japan, Eugene M. Van Reed, who arranged for contract workers from Japan to fill the many positions available at sugar plantations in Hawaii. This communication and the sugar industry on the islands were the main catalysts for mass Japanese immigration. Van Reed’s arrangements resulted in 148 Japanese people arriving in Hawaii in 1868, which angered the Japanese government, as Van Reed had not obtained its official permission during treaty negotiations to begin immigration. The offense the Japanese government took at Van Reed’s conduct halted Japanese immigration to Hawaii for the next seventeen years.

From 1778 to 1872, the overall population of the islands dropped from 300,000 to 50,000, due to a series of epidemics. It is estimated that over 46,000 Chinese were brought to Hawaii as laborers, mainly between 1876–1885 and 1890–1897, which shows the large contribution the Chinese labor force made to the Hawaiian economy. This mass immigration of Chinese into Hawaii came to a close in the 1900s. The annexation of Hawaii meant that Hawaii became part of the United States and was therefore subject to U.S. law. This had vast implications for Chinese immigration in Hawaii: the Chinese Exclusion Act could now be enforced there, which meant the legal end of large-scale Chinese immigration. The Act stopped the supply of Chinese immigrants to Hawaii and forced plantations to seek workers from elsewhere; since Hawaii could no longer rely on the Chinese to supply its labor force, it had to encourage other peoples to immigrate. In early 1885, Japanese people again started coming to the islands in large numbers as contract workers, with many of them returning to Japan at the end of their three-year contracts. At first, they comprised a “low caste of Japanese gathered from the riff raff of the cities,” but as time passed the immigrants were said to have started coming from higher classes. In that year, two ships (one arriving on February 8th and the other on June 17th) brought over 900 Japanese to Hawaii, and immigration continued at a steady pace from then onwards. In fact, over 9,000 Japanese contract workers and farmers came to the islands in 1885–86. The first Japanese immigrants in 1885 lived in makeshift huts that they had to build themselves once they arrived.

The sugar industry and later the pineapple industry were and are Hawaii’s chief commodities and have substantially affected the state both politically and economically. In order for these two industries to become commercially profitable they had to rely on cheap labor. Since the native population had been decimated by disease brought by Westerners, plantation owners needed to import foreign workers. The Hawaiian native population went from 800,000 in 1778 to 40,000 in 1878, and the islands became a hub for foreigners willing to relocate and work. Hawaii was the destination of the earliest and largest Asian immigrations to America, beginning in the mid-19th century with many Asians flocking in to find work. The main ethnic groups were the Chinese, Korean, Japanese and Filipino. The plantation owners would take on only men, since women were deemed useless for the work. Most Asian women were illiterate, since education for a female child was deemed irrelevant and even thought to jeopardize her chance for a good marriage. Through this immigration, Native Hawaiians became a minority in their own home. By the year 1884, Chinese laborers constituted about a quarter (22.6 percent) of the total population of Hawaii. Native Hawaiians were being replaced by Asian workers willing to uproot their lives and work for next to nothing on the plantations. This immigration continued, and allowed the sugar and pineapple industries to prosper, until 1934, when the Depression heightened racial animosity towards Asians. It was in this year that the Tydings-McDuffie Act restricted the entry of Filipinos into the United States to fifty persons a year; the act also changed the status of Filipinos from American nationals to alien immigrants. As the years passed, the racial tension faded and the tight immigration policies for Asians were loosened. If it were not for the immigration of Asians willing to work for almost nothing, the sugar and pineapple industries would not have been able to prosper, and Hawaii would not be the prosperous and respected state that it is today.

    By the 1896 census, Japanese people comprised a quarter of the population in the Hawaiian islands. By 1910, they encompassed 40% of the population.

    HAWAIIAN PIDGIN


As plantation owners sought outside labor, many immigrants arrived to work in Hawaii. This immigration, sparked by the sugar companies, had a lasting effect on Hawaiian culture, creating a multicultural society along with a new language: Hawaiian Pidgin. The language emerged as immigrants on plantation farms struggled to communicate with one another. In seeking a common language, they produced a hybrid influenced primarily by the Hawaiian, English, Japanese, Chinese and Portuguese languages. The language is often referred to as “Hawaii Creole” or “Hawaii Creole English,” due to its similar appearance to English. It has historically been deemed a sub-standard form of English, though many linguists argue that it stands as a language in its own right. While English and Hawaiian are the two official languages of the state, Hawaiian Pidgin is still commonly heard in advertisements, in neighborhood conversation and even sometimes in Hawaiian school systems. The language possesses its own specific spelling system, though it can also be found spelled out as in English. Hawaiian Pidgin also has a very distinctive intonation, with word rhythms quite different from those found in English.


     
  6. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    MISSIONARIES

The industry was originally tightly controlled by “The Big Five,” five major corporations that started within the sugar industry. These five companies, started by missionary families, were Castle & Cooke, Alexander & Baldwin, C. Brewer & Co., American Factors, and Theo H. Davies & Co. By paying their workers very low wages, these companies were able to prosper. In the early nineteenth century, Protestant missionaries from the United States arrived in Hawaii with the aim of Christianizing and “civilizing” its inhabitants, an idea related to that of manifest destiny. It was in 1810 that the American Board of Commissioners for Foreign Missions set a plan in motion to “[promote] the spread of the gospel in Heathen lands,” attracting a handful of American Protestant missionaries who began their journey from Boston to Hawaii in 1819. Upon their arrival, they were greeted by what they saw as “children of nature”: Hawaiians who, in their eyes, were in need of Christ and a missionary’s model of Western society. They were eager to evangelize the Sandwich Islands, believed to be “a dark and ruined land,” as many Protestants thought that the Second Coming of Christ was near. It was hoped that Hawaii could be transformed into a purely Protestant nation, ready for the Salvation of the Lord. While the missionaries failed to achieve the total victory of Protestantism they had envisioned at the outset, Hawaiian culture and legislation were profoundly Christianized under their influence, though not without having to overcome several obstacles.

Although the missionaries came to Hawaii with the intention of bringing the Abrahamic faith to the Islanders, they met with opposition from merchants who had settled in Hawaii in the 1790s and who desired an economic focus for Hawaii rather than a religious one. British merchants established trade by exchanging goods like guns, cloth, glass, and rum for Hawaiian sandalwood; they would then trade these goods to the Chinese for silk and furniture. At first, these merchants argued that allowing missionaries into Hawaii would have negative political and social consequences, and that they were “sent by the American government for political purposes.” However, sugar quickly became a major industry fueled by emigrants brought to Hawaii with the help of the missionaries. This wave of emigrants helped power the missionaries’ cause for Christ by establishing a foundation of people to be “saved,” but it did not alleviate the negative opinions that merchants held of missionaries and their work. The two camps clashed so much that by 1823 the Reverend William Ellis was calling the merchants “the enemy,” for their economic motivations hindered the missionaries’ religious cause. Since Hawaii’s population had faced a sharp decline and there weren’t enough people to work the sugarcane fields, Hakka emigrant workers were brought in with the help of the missionaries. One of the reverends, Elias Bond, “operated a sugar plantation... in order to support his mission work.” There is a notable convergence of mission work and economic pursuit in Hawaii at this time, regardless of the tension between missionaries and merchants. The eager missionaries helped handfuls of refugees enter Hawaii safely, managing to show them their point of view and successfully converting them to Christianity. The merchants and men of industry, meanwhile, benefited from the mission work, supplied with plenty of workers—the fuel of Hawaii’s sugar crops. Although the relationship between men of God and men of empire was held in a negative light, the two continued to depend on each other for success in their respective pursuits.

The missionaries began their quest by targeting Hawaiian leaders in the hope that their conversion would influence the masses to follow. Little success was achieved with the King, Liholiho, who demonstrated essentially no interest in converting to Christianity. The missionaries were more successful with Hawaiian chiefs, specifically Kaahumanu and Kalanimoku. These chiefs, under the influence of the missionaries, would make significant cultural and legal changes in Hawaii. While the missionaries agreed not to get involved in politics directly, they had no problem shaping politics and legislation indirectly by advising the chiefs and informing them about the laws and political institutions of Christian countries. These changes to culture and law had become visible by 1824, when the beginnings of a new moral law began to appear: Kaahumanu and Kalanimoku instructed Hawaiians not to work or travel on the Sabbath and to attend school and church. In December 1827, Hawaiian chiefs imposed new laws that prohibited murder, theft and adultery. In 1831, under the influence of the Protestant missionaries, the chiefs declared Catholicism extirpated in Hawaii and forced all Catholic missionaries to leave the islands. Shortly after the expulsion of the Catholics, the Protestant American missionary Titus Coan arrived in Honolulu. Coan demonstrated an amazing ability to convert large numbers of Hawaiians to Protestantism; his period of mass conversion was later deemed the “Great Awakening.” Between 1837 and 1840 approximately 100,000 Hawaiians entered the Protestant church as Protestantism began to reach the masses.

One major technique the missionaries used to drive conversion was literacy, together with the establishment of print media. Teaching natives to read and write was an integral part of the “civilizing” process, increasing Protestant conversion through the spread of Christian teachings as well as colonial ideas such as capitalism rather than subsistence.

With the missionaries and other colonial settlers came European diseases to which the islands had never before been exposed, such as syphilis and leprosy. Because the native peoples lacked the immunity to ward off these illnesses, their population was significantly depleted by epidemics such as the smallpox outbreak that took thousands of lives in 1853. Illness weakened the native population, serving as another way in which missionaries and other settlers could assert dominance. Thus a sense of biological superiority prevailed, creating a line of racial discourse and increasing the missionaries’ motivation to civilize the native population.

The sense of accomplishment that the “Great Awakening” brought to the Protestant American missionaries began to dissolve in 1839 with the arrival of the French captain C. P. T. Laplace. Laplace came with a list of demands that, if not met, would lead to war between France and Hawaii. The Hawaiian King, Kauikeaouli, met the captain’s demands and ordered religious freedom for Catholics, a bond of $20,000 from the chiefs to guarantee compliance, and a salute for the French flag. Before long the American Protestant missionaries were forced to compete against missionaries of Roman Catholicism, Mormonism, and Episcopalianism.

In 1854, the American Board of Commissioners for Foreign Missions created the Hawaiian Evangelical Association to direct and control the Protestant mission in Hawaii from within the islands. In 1870, when the Hawaiian Evangelical Association celebrated the fiftieth anniversary of the coming of the first group of missionaries, there were fifty-eight churches in the association, with a membership of 14,850, approximately one-fourth of the whole population of the Kingdom. Clearly the Protestant missionaries had achieved great success in Hawaii, but they had ultimately failed to win the kind of absolute victory for Protestantism that they had been so sure of fifty years earlier.

    One must not forget an important element of missionary work in the islands: women. In the early 19th century, women did not venture to Hawaii as missionaries themselves, but as the wives of missionaries. Men were highly encouraged to marry before they departed on a call. Missionary wives came from middle class New England lifestyles, where Protestantism reigned and there were clearly defined roles for all members of society.

Missionary wives were, “for the most part, energetic, intelligent, and well-educated women, daughters of farmers or small business men.” These women embodied “a passion to reform the habits, inform the minds, and modify the world views of those whose life-styles differed markedly from the model established by New England Protestantism.” Their spirited involvement in the Hawaiian mission field is an example of women attempting to break into the public sphere of life during the 19th century.

    As the wife of a missionary, a woman’s focus revolved around the domestic sphere. In addition to bearing responsibility for the household concerns within their own homes, missionary wives were mainly involved in the lives of other women. These women from New England saw their Hawaiian counterparts as heathens in desperate need of salvation. Missionary wives saw it as their duty to reform Hawaiian women so that they were “genuinely pious, sexually pure, dutifully submissive and domestically oriented as housewives and mothers.” It was expected that Hawaiian women would then transfer these values onto their children.

    This reformation of Hawaiian women took place in many forms, including but not limited to “bible-reading groups, church meetings, school examinations, Sunday school picnics and tea meetings, as well as formal classroom instruction.” The wives of missionaries completed all these tasks while they were accompanied by their own children, of whom they had many: “with fertile couples, first infants arrived as early as nine or ten months after marriage...second and subsequent births occurred at around two-yearly intervals.”

The division between the work male missionaries were doing and the work female missionaries were undertaking was stark. Male missionaries were quite content with the separation of the sexes: “they were vociferous in criticizing women who stepped outside their appropriate sphere.” In 1834, these women created a ‘Maternal Association’ whereby “they could discuss together those issues affecting their lives that were ignored in the men’s deliberations.” Stating that the work of missionary wives was different from that of their husbands in no way diminishes their belief in their work; these women “believed they had a strong call in their own right to teach the nations.” Both the missionaries’ and their wives’ efforts had a significant influence on the Hawaiian people in the 19th century.


    COLONIALISM AND HAWAIIAN RESISTANCE

Though remote and isolated, Hawaii was recognized by many in the 19th century to be of considerable strategic importance for both trade and naval operations. Russia, France, Britain and the United States of America all staked imperial claims on the islands throughout the 19th century, with the United States finally annexing Hawaii in 1898. The story of Hawaii throughout the 19th century is one of exploitation and mistreatment by nations with colonial aspirations on the islands, of immigration, of missionaries, and of plantations. Though taken advantage of time and time again, the native Hawaiians were not passively colonized. Noenoe Silva asserts that the European and American powers desired to exploit the land and subjugate its people, but that the native Hawaiians resisted in a number of ways.

Foreigners established contact with the native Hawaiians in the 18th century. The first and most notable were the expeditions of Captain Cook, who reached the Hawaiian islands in 1778. On his third expedition Cook was killed in a quarrel with the natives, who showed little fear of the Europeans and their superior weaponry. Resistance to colonization punctuated 19th-century Hawaii, though the mode of resistance was not homogeneous. Noenoe Silva emphasizes the variation of resistance throughout Hawaii, pointing out that the way people resisted in rural areas was vastly different from that of those living in more urban areas like Honolulu. Creating a nation in a form similar and recognizable to European and American governments was itself a strategy of resistance, because it increased Hawaii’s chance of being recognized by a large power like France or England.

In the early 19th century Imperial Russia began to show a serious interest in the colonization of the islands, establishing three forts. Though the attempt was brief and futile, it was the first time in the islands’ history that a government-funded expedition had made a serious effort to settle there. The French and British also made futile attempts to colonize Hawaii, but an agreement between the two countries recognized Hawaii as an independent sovereign nation.

The last and ultimately successful attempt at colonization was perpetrated by the United States in the latter half of the 19th century. Through several trade agreements, the United States invested a great amount in plantations and agriculture throughout Hawaii. Many Americans settled on the islands, bringing Asian immigrants with them as cheap laborers. Most of the islands’ native inhabitants would not work for foreigners on Hawaiian ground; this labor boycott can be seen as a form of economic resistance to colonialism exerted by Hawaii’s native population. In 1893 American interests, backed by the United States government, overthrew Hawaii’s monarchy, ousting Queen Liliʻuokalani in January of that year. In a message to Congress, President Cleveland admitted that “substantial wrong has thus been done” and that the United States “should endeavor to repair the monarchy.” Although many Americans were disturbed by such a blatant act of Manifest Destiny, no action was ever taken to restore Queen Liliʻuokalani to her throne. The Hawaiians stood in opposition to Hawaii’s annexation, as exhibited through the “1897 Petition Against the Annexation of Hawaii,” which was presented to the U.S. Congress and turned the tide of opinion against annexation. This success was short-lived, however, as the Spanish-American War soon drove the United States to annex Hawaii for strategic purposes in 1898.

Hawaii's past is marked by foreign powers with colonial aspirations intervening in Hawaiian affairs. The Hawaiians had successfully established a constitutional monarchy, recognized as sovereign by both France and England, yet it was not taken seriously among world powers. Though the islanders never staged a full-scale rebellion against colonialism, the various strategies they employed to resist it are a mark of their courage and ingenuity.




     
  7. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    6 Famous Places that Never Existed


    The travelers of old often told tales of fabled cities, phantom islands and exotic civilizations located deep in the unexplored reaches of the globe. These fanciful lands were usually disregarded as myths and legends, but a few found their way onto world maps and helped inspire some of history’s most important journeys of discovery. From a fabled Christian empire in Asia to a supposed lost kingdom in Canada, find out more about six of the most influential lands that never were.


    1. The Kingdom of Prester John

    For more than 500 years, Europeans believed a Christian king ruled over a vast empire somewhere in the wilds of Africa, India or the Far East. The myth first gained popularity in 1165, after the Byzantine and Holy Roman emperors received a letter—most likely a European forgery—from a monarch calling himself “Prester John.” The mysterious king claimed to serve as “supreme ruler of the three Indies” and all its 72 kingdoms. He described his realm as a utopia rich in gold, filled with milk and honey and populated by exotic races of giants and horned men. Perhaps most important of all, Prester John and his subjects were Christians—even the name “Prester” meant “Priest.”

    A Papal mission to find Prester John’s court disappeared without a trace, but the myth of his kingdom took hold among Europeans. Crusading Christians rejoiced in the idea that a devout ruler might come to their aid in the struggle against Islam, and when Genghis Khan’s Mongol hordes conquered parts of Persia in the early 1200s, many mistakenly credited Prester John’s forces with the attack. The fantastical kingdom later became a subject of fascination for travelers and explorers. Marco Polo spun a dubious tale about encountering its remnants in Northern China, and Vasco da Gama and other Portuguese mariners quested after it in Africa and India. While explorers did eventually discover a far-flung Christian civilization in Ethiopia, it lacked the grandeur—and the gold—that Europeans had come to associate with Prester John’s realm. By the 17th century, the legend had faded, and the famed empire was dropped from most maps.


    2. Hy-Brasil

Long before Europeans ever set foot in the New World, explorers searched in vain for the island of Hy-Brasil, a spectral atoll said to lurk off the west coast of Ireland. The story of the island most likely comes from Celtic legend—its name means “Isle of the Blest” in Gaelic—but its precise origins are unclear. Hy-Brasil first started appearing on maps in the 14th century, usually in the form of a small, circular island narrowly split in two by a strait. Many mariners accepted it as a real place until as recently as the 1800s, and it became popular fodder for myths and folktales. Some legends described the island as a lost paradise or utopia; others noted that it was perpetually obscured by a dense curtain of mist and fog, and only became visible to the naked eye every seven years.

    Despite its fanciful reputation, Hy-Brasil was widely sought after by Britain-based explorers in the 15th century. The navigator John Cabot launched several expeditions to track it down, and supposedly hoped to encounter it during his famous journey to the coast of Newfoundland in 1497. Documents from Cabot’s time claim that previous explorers had already reached Hy-Brasil, leading some researchers to argue that these mariners may have inadvertently traveled all the way to the Americas before Christopher Columbus.


    3. Thule

    A subject of fascination for ancient explorers, romantic poets and Nazi occultists alike, Thule was an elusive territory supposedly located in the frozen north Atlantic near Scandinavia. Its legend dates back to the 4th century B.C., when the Greek explorer Pytheas claimed to have traveled to an icy island beyond Scotland where the sun rarely set and land, sea and air commingled into a bewildering, jelly-like mass.
    Many of Pytheas’ contemporaries doubted his claims, but “distant Thule” lingered in the European imagination, and it eventually came to represent the northernmost place in the known world. Explorers and researchers variously identified it as Norway, Iceland and the Shetland Islands, and it served as a recurring motif in poetry and myth. The island is perhaps most famous for its connection to the Thule Society, a post-World War I esoteric organization in Germany that considered Thule the ancestral home of the Aryan race. The Munich-based group counted many future Nazis among its guests, including Rudolf Hess, who later served as Deputy Führer of Germany under Adolf Hitler.


    4. El Dorado

    Beginning in the 16th century, European explorers and conquistadors were bewitched by tales of a mythical city of gold located in the unexplored reaches of South America. The city had its origin in accounts of “El Dorado” (“The Gilded One”), a native king who powdered his body with gold dust and tossed jewels and gold into a sacred lake as part of a coronation rite. Stories of the gilded king eventually led to rumors of a golden city of untold wealth and splendor, and adventurers spent many years—and countless lives—in a fruitless search for its riches.

    One of the most famous El Dorado expeditions came in 1617, when the English explorer Sir Walter Raleigh traveled up the Orinoco River on a quest to find it in what is now Venezuela. The mission found no trace of the gilded city, and King James I later executed Raleigh after he disobeyed an order to avoid fighting with the Spanish. El Dorado continued to drive exploration and colonial violence until the early 1800s, when the scientists Alexander von Humboldt and Aimé Bonpland branded the city a myth after undertaking a research expedition to Latin America.


    5. St. Brendan’s Island

    St. Brendan’s Island was a mysterious incarnation of Paradise once thought to be hidden somewhere in the eastern Atlantic Ocean. The myth of the phantom isle dates back to the “Navigatio Brendani,” or “Voyage of Brendan,” a 1,200-year-old Irish legend about the seafaring monk St. Brendan the Navigator. As the story goes, Brendan led a crew of pious sailors on a 6th-century voyage in search of the famed “Promised Land of the Saints.” After a particularly eventful journey on the open sea—the tale describes attacks by fireball-wielding giants and run-ins with talking birds—Brendan and his men landed on a mist-covered island filled with delicious fruit and sparkling gems. The grateful crew are said to have spent 40 days exploring the island before returning to Ireland.

    While there is no historical proof of St. Brendan’s voyage, the legend became so popular during the medieval era that “St. Brendan’s Island” found its way onto many maps of the Atlantic. Early cartographers placed it near Ireland, but in later years it migrated to the coasts of North Africa, the Canary Islands and finally the Azores. Sailors often claimed to have caught fleeting glimpses of the isle during the Age of Discovery, and it’s likely that even Christopher Columbus believed in its existence. Nevertheless, its legend eventually faded after multiple search expeditions failed to track it down. By the 18th century, the famed “Promised Land of the Saints” had been excised from most navigational charts.


    6. The Kingdom of Saguenay

    The story of the mirage-like Kingdom of Saguenay dates to the 1530s, when the French explorer Jacques Cartier made his second journey to Canada in search of gold and a northwest passage to Asia. As his expedition traveled along the St. Lawrence River in modern-day Quebec, Cartier’s Iroquois guides began to whisper tales of “Saguenay,” a vast kingdom that lay to the north. According to a chief named Donnacona, the mysterious realm was rich in spices, furs and precious metals, and was populated by blond, bearded men with pale skin. The stories eventually drifted into the realm of the absurd—the natives claimed the region was also home to races of one-legged people and whole tribes “possessing no anus”—but Cartier became taken with the prospect of finding Saguenay and plundering its riches. He brought Donnacona back to France, where the Iroquois chief continued to spread tales of a lost kingdom.

    Legends about Saguenay would haunt French explorers in North America for several years, but treasure hunters never found any trace of the mythical land of plenty or its white inhabitants. Most historians now dismiss it as a myth or tall tale, but some argue the natives may actually have been referring to copper deposits in the northwest. Still others have suggested that the Indians’ Kingdom of Saguenay could have been inspired by a centuries-old Norse outpost left over from Viking voyages to North America.



    El Dorado wasn’t the only gilded city supposedly tucked away in the New World. European explorers also hunted for the Seven Cities of Cibola, a mythical group of gold-rich settlements said to be located somewhere in what are now Mexico and the American Southwest. The most famous search for the Seven Cities came in the 16th century, when the Spanish conquistador Francisco Vasquez de Coronado scoured the Great Plains in search of a city of riches called Quivira.



     
  11. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Jan 26, 1788:
    First Australian penal colony established


    The first 736 convicts banished from England to Australia land in Botany Bay. Over the next 60 years, approximately 50,000 criminals were transported from Great Britain to the "land down under," in one of the strangest episodes in criminal-justice history.

    The accepted wisdom of the upper and ruling classes in 18th-century England was that criminals were inherently defective. Thus, they could not be rehabilitated and simply required separation from the genetically pure and law-abiding citizens. Accordingly, lawbreakers had to be either killed or exiled, since prisons were too expensive. With the American victory in the Revolutionary War, transgressors could no longer be shipped off across the Atlantic, and the English looked for a colony in the other direction.

    Captain Arthur Phillip, a tough but fair career naval officer, was charged with setting up the first penal colony in Australia. The convicts were chained beneath the deck during the entire hellish six-month voyage. The first voyage claimed the lives of nearly 10 percent of the prisoners, which remarkably proved to be a rather good rate. On later trips, up to a third of the unwilling passengers died on the way. These were not hardened criminals by any measure; only a small minority were transported for violent offenses. Among the first group was a 70-year-old woman who had stolen cheese to eat.

    Although not confined behind bars, most convicts in Australia had an extremely tough life. The guards who volunteered for duty in Australia seemed to be driven by exceptional sadism. Even small violations of the rules could result in a punishment of 100 lashes by the cat-o'-nine-tails. It was said that blood was usually drawn after five lashes and that convicts ended up walking home in boots filled with their own blood--that is, if they were able to walk at all.

    Convicts who attempted to escape were sent to tiny Norfolk Island, 600 miles east of Australia, where the conditions were even more inhumane. The only hope of escape from the horror of Norfolk Island was a "game" in which groups of three prisoners drew straws. The convict who drew the short straw was killed as painlessly as possible, and a judge was then shipped in to put the other two on trial, one playing the role of killer, the other of witness.


    Beyond the Headlines | 26 January 2012
    Australia’s penal colony roots
    In Australia
    By Suemedha Sood

    New South Wales, a state in southeast Australia, was founded by the British as a penal colony in 1788. Over the next 80 years, more than 160,000 convicts were transported to Australia from England, Ireland, Scotland and Wales, in lieu of being given the death penalty.

    Today, about 20% of Australians are descendants of convicts, including plenty of prominent citizens. According to genealogists, former Prime Minister Kevin Rudd’s great-great-great-great-great-grandmother was sentenced to be hanged when she was just 11 years old for committing robbery. When her sentence was reduced, she was sent to Australia on the Second Fleet, on which conditions were so bad that 25% of its convicts died on the voyage. Celebrity chef Maggie Beer discovered on an episode of Who Do You Think You Are?, a family ancestry programme, that her great-great-great-grandmother and great-great-great-grandfather (a thief and a bigamist, respectively) met after being transported to Australia. On another episode of the same show, actor Jack Thompson learned that his great-great-great-grandfather was a convict from Ireland, charged with highway robbery.

    For at least a century after convict transportation ended in 1868, the Australian colonies tried to hide their founding legacy. Historians met with serious hesitation when they wanted to highlight the injustices of transportation, a harsh punishment often imposed on impoverished people whose crimes were extremely minor, wrote Babette Smith in the book Australia’s Birthstain.

    But Australia’s shame has been transformed into pride over the last century. The truth about many working-class convicts has helped remove the stigma, since some were children, some did little more than steal a bag of sugar, some were political prisoners and some were falsely accused. The convict era also produced several famous figures, including Australia’s most infamous outlaw, Ned Kelly; the country’s first novelist, Henry Savery; celebrated architect Francis Greenway; and William Buckley, “the wild white man” who escaped captivity and lived out his days with the Aboriginal Watourong tribe.

    Ned Kelly, a Robin-Hood-like folk hero, represented the struggle between poor rural Irish Australians and the repressive British ruling class during and after transportation. Born in Victoria to an Irish father who had been charged with stealing pigs, Kelly was eventually charged with petty crimes himself; yet he and his family said they were being persecuted by the police. After authorities accused him of shooting a policeman, Kelly escaped into the Australian bush and formed the Kelly Gang with his brother and two friends in April 1878. They ran from the law for two years, robbing banks and killing policemen in shoot-outs along the way. Their time as bushrangers ended in a nine-hour battle with the police; the other gang members were shot to death and Kelly was captured and later hanged.
    Last autumn, Kelly’s remains were identified in a mass prison grave in Melbourne, thanks to a DNA sample from his great-grandnephew.
    The convict era left behind a number of landmarks throughout the country for locals and travellers to explore, including the Tasmania Convict Trail, the 11 Australian Convict Sites (now World Heritage Sites) and Ned Kelly tours.


    Convicts and the British colonies in Australia

    On 18 January 1788 the First Fleet arrived at Botany Bay, which Joseph Banks had declared suitable for a penal colony after he returned from a journey there in 1770.
    Captain Arthur Phillip, the fleet's commander, brought a small party of marines and seamen ashore, but found the location unsuitable because the harbour was unsafe and the area lacked fresh water. (The Oxford Companion to Australian History).
    The fleet then relocated to Port Jackson. On 21 January 1788 Phillip, with a party of officers and marines, landed at an unnamed place, believed to be the beachfront at Camp Cove (known as 'cadi' to the local Cadigal people). This occasion marks the first landing of members of the First Fleet within Port Jackson, and the first known European landing in Sydney Harbour.
    After moving further into the harbour, on 26 January 1788 Phillip raised the British flag at Sydney Cove, where 751 convicts and their children disembarked, along with 252 marines and their families.

    Two more convict fleets arrived in 1790 and 1791, and the first free settlers arrived in 1793. From 1788 to 1823, the Colony of New South Wales was officially a penal colony made up mainly of convicts, soldiers and the wives of soldiers.
    The early convicts were all sent to New South Wales, but by the mid-1800s convicts were also being sent directly to destinations such as Norfolk Island, Van Diemen's Land, Port Macquarie and Moreton Bay.
    Twenty per cent of these first convicts were women. The majority of women convicts, and many free women seeking employment, were sent to the 'female factories' as unassigned women. The female factories were originally profit-making textile factories. The Parramatta Factory grew as an enclave for pregnant women and also served as an orphanage from the 1830s.

    Convict labour

    Governor Phillip (1788-1792) founded a system of labour in which people, whatever their crime, were employed according to their skills - as brick makers, carpenters, nurses, servants, cattlemen, shepherds and farmers.
    Educated convicts were set to the relatively easy work of record-keeping for the convict administration. Women convicts were assumed to be most useful as wives and mothers, and marriage effectively freed a woman convict from her servitude.
    From 1810, convicts were seen as a source of labour to advance and develop the British colony. Convict labour was used to develop the public facilities of the colonies - roads, causeways, bridges, courthouses and hospitals. Convicts also worked for free settlers and small land holders.
    The discipline of rural labour was seen to be the best chance of reform. This view was adopted by Commissioner Bigge in a series of reports for the British Government published in 1822-23. The assignment of convicts to private employers was expanded in the 1820s and 1830s, the period when most convicts were sent to the colonies, and this became the major form of employment.

    Convicts formed the majority of the colony's population for the first few decades, and by 1821 there was a growing number of freed convicts who were appointed to positions of trust and responsibility as well as being granted land.

    The convict experience

    In the mid-1830s only around six per cent of the convict population were 'locked up', the majority working for free settlers and the authorities around the colonies. Even so, convicts were often subject to cruelties such as leg-irons and the lash. Places like Port Arthur or Norfolk Island were well known for this. Convicts sometimes shared deplorable conditions. One convict described the work thus: 'We have to work from 14-18 hours a day, sometimes up to our knees in cold water, 'til we are ready to sink with fatigue... The inhuman driver struck one, John Smith, with a heavy thong.'

    The experience of these convicts is recorded through the first Australian folk songs written by convicts. Convict songs like Jim Jones, Van Diemen's Land, and Moreton Bay were often sad or critical. Convicts such as Francis Macnamara (known as 'Frankie the Poet') were flogged for composing original ballads with lines critical of their captors.

    In addition to the physical demands of convict life, some convicts arrived without sufficient English to communicate easily with others:
    By 1852, about 1,800 of the convicts had been sentenced in Wales. Many of them could speak only Welsh, so as well as being exiled to a strange country they were unable to speak with most of their fellow convicts.
    Martin Shipton, Western Mail, 2006

    Also telling of convicts' experiences were convict love tokens, mainly produced in the 1820s and 1830s by transported convicts as a farewell to their loved ones. Made from coins such as pennies, most of the engraved inscriptions refer to loss of liberty. One token, made from a penny for convict James Godfrey, is dedicated to his love Hannah Jones. The inscription reads: 'When in/Captivity/Time/Goeth/Very slow/But/Free as air/To roam now/Quick the/Time/Doth/Go'.

    End of transportation

    When the last shipment of convicts disembarked in Western Australia in 1868, the total number of transported convicts stood at around 162,000 men and women. They had been transported on 806 ships.
    The transportation of convicts to Australia ended at a time when the colonies' population stood at around one million, compared to 30,000 in 1821. By the mid-1800s there were enough people in the colonies to take on the work, and enough people who needed the work. The colonies could therefore sustain themselves and continue to grow. The convicts had served their purpose.

    Who were the convicts?

    While the vast majority of the convicts to Australia were English and Welsh (70%), Irish (24%) or Scottish (5%), the convict population had a multicultural flavour. Some convicts had been sent from various British outposts such as India and Canada. There were also Maoris from New Zealand, Chinese from Hong Kong and slaves from the Caribbean.
    A large number of soldiers were transported for crimes such as mutiny, desertion and insubordination. Australia's first bushranger, John Caesar, sentenced at Maidstone, Kent, in 1785, was born in the West Indies.

    Most of the convicts were thieves who had been convicted in the great cities of England. Only those sentenced in Ireland were likely to have been convicted of rural crimes. Transportation was an integral part of the English and Irish systems of punishment. It was a way to deal with increased poverty and the severity of the sentences for larceny. Simple larceny - plain theft - could mean transportation for seven years. Compound larceny - stealing goods worth more than a shilling (about $50 in today's money) - meant death by hanging.
    Men had usually been before the courts a few times before being transported, whereas women were more likely to be transported for a first offence. The great majority of convicts were working men and women with a range of skills.

    Good behaviour and 'Ticket of leave' licences

    Good behaviour meant that convicts rarely served their full term and could qualify for a Ticket of Leave, Certificate of Freedom, Conditional Pardon or even an Absolute Pardon. This allowed convicts to earn their own living and live independently. However, for the period of their sentence they were still subject to surveillance and the ticket could be withdrawn for misbehaviour. This sanction was found to work better in securing good behaviour than the threat of flogging.
    Ticket of leave licences were developed first to save money, but they became a central part of the convict system and provided the model for later systems of probation for prisoners.
    Governor King (1800-1804) first issued tickets of leave to any convicts who seemed able to support themselves, in order to save on providing them with food from the government store. The tickets were then used as a reward for good behaviour and special service, such as informing on bushrangers. Gentlemen convicts were issued with tickets on their arrival in the colony although Governor Macquarie (1810-1821) later ordered that a convict had to serve at least three years before being eligible.
    Governor Brisbane (1821-1825) finally set down regulations for eligibility. Convicts sentenced to seven-year terms could normally qualify for a Ticket of Leave after four years, while those serving 14 years could expect to serve between six and eight years. 'Lifers' could qualify for their 'Ticket' after about 10 or 12 years. Those who failed to qualify for a pardon were entitled to a Certificate of Freedom on the completion of their term.

    Transportation to the other British colonies

    Van Diemen's Land

    The colony of Van Diemen's Land was established in its own right in 1825 and officially became known as Tasmania in 1856. In the 50 years from 1803 to 1853 around 75,000 convicts were transported to Tasmania. By 1835 there were over 800 convicts working in chain-gangs at the infamous Port Arthur penal station, which operated between 1830 and 1877.

    Western Australia

    Western Australia was established in 1827 and proclaimed a British penal settlement in 1849, with the first convicts arriving in 1850. Rottnest Island, off the coast of Perth, became the colony's convict settlement in 1838 and was used for local colonial offenders.
    Just under 10,000 British convicts were sent directly to the colony in the 18 years to 1868. They were used by local settlers as labour to develop the region. On January 9, 1868, Australia's last convict ship, the Hougoumont, unloaded the final 269 convicts.

    Victoria

    In 1851 Victoria (Port Phillip District) separated from New South Wales. Apart from the early attempts at settlement, the only convicts sent directly to Victoria from Britain were about 1,750 convicts known as the 'Exiles'. They arrived between 1844 and 1849. They were also referred to as the 'Pentonvillians' because most of them came from Pentonville Probationary Prison in England.

    Queensland

    In 1859 Queensland separated from New South Wales. In 1824, the penal colony at Redcliffe was established by Lieutenant John Oxley. Known as the Moreton Bay Settlement, it later moved to the site now called Brisbane.
    The main inhabitants of 'Brisbane Town', as it was known, were the convicts of the Moreton Bay Penal Station until it was closed in 1839. Around 2,280 convicts were sent to the settlement in those fifteen years.

    The abolition of transportation

    Transportation to the colony of New South Wales was officially abolished on 1 October 1850, and in 1853 the order to abolish transportation to Van Diemen's Land was formally announced.
    South Australia, and the Northern Territory of South Australia, never accepted convicts directly from England, but still had many ex-convicts from the other colonies. After they had been given limited freedom, many convicts were allowed to travel as far as New Zealand to make a fresh start, even though they were not allowed to return home to England.
    At the time, there was also a great deal of pressure to abolish transportation. Given that only a small percentage of the convict population was locked up, many believed that transportation to Australia was an inappropriate punishment - that it did not deliver 'a just measure of pain'. This, combined with the employment needs of Australia's thriving population, ensured the abolition of convict transportation.

    Convicts in film and television

    The novel For the Term of his Natural Life by Marcus Clarke (1846-1881) is a story about a young man who is wrongly accused of murder and transported to Australia as a convict. A film based on the novel was directed and produced in 1927 by Norman Dawn. The novel was also adapted as a television serial in 1983.

    The last confessions of Alexander Pearce (2008) is a one-hour television drama telling the 180-year-old story of the escaped cannibal convict Alexander Pearce. The story is centred on the colony's Catholic priest, who heard Pearce's confession after he was recaptured. The story was co-written by producer Nial Fulton and director Michael James Rowland.



     
  12. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Jan 31, 1990:
    The McMartin Preschool trials

    Los Angeles prosecutors announce that they will retry teacher Raymond Buckey, who was accused of molesting children at the McMartin Preschool in Manhattan Beach, California. The McMartin trials had already taken over six years and cost more than $13.5 million without a single guilty verdict resulting from 208 charges. However, a jury had deadlocked on 13 charges (voting 11-2 for acquittal) against Buckey, and prosecutors, not willing to let the matter drop, decided to retry him on eight of these counts.

    The McMartin prosecutions represented the height of the hysteria over sexual abuse of children in America. Despite a complete lack of reputable evidence against the teachers and workers at McMartin Preschool, and with every indication that the children had been coerced and manipulated into their testimony, the prosecutors nonetheless proceeded against Ray Buckey for more than six years.

    "Believe the children" became the mantra of advocates who insisted that children never lied or were mistaken about abuse. The courts made unprecedented changes to criminal procedure to accommodate this mistaken notion. The California Supreme Court ruled that child witnesses were not required to provide details about the time and place of the alleged molestation to support a conviction. The U.S. Supreme Court held that child witnesses could testify outside the courtroom despite the Sixth Amendment's clear command that a defendant had the right to confront his or her accusers.

    Throughout the nation, parents and day-care workers were jailed after false, and often absurd, allegations about child sexual abuse. As this hysteria swept the country, abuse counseling quickly became a cottage industry, attracting often-unqualified people who seemed to find sexual abuse everywhere.

    Recent research has found that young children are exceptionally easy to manipulate. Even when an idea is only subtly suggested, a child will often respond with the answers he or she believes the questioner wants to hear.

    This was abundantly clear in Ray Buckey's case. In one instance, a girl initially failed to identify Buckey as someone who had harmed her. After an interview with Children's Institute International, the counseling agency that worked with every child in the case, the girl did pick Buckey as her attacker. It later turned out that Buckey wasn't even at the school during the time period that the child attended McMartin.

    Buckey's retrial went much faster. By July, the jury had acquitted him on seven charges and deadlocked (once again with the majority voting for acquittal) on the other six. The district attorney then finally decided to drop the case. The Buckeys successfully sued the parents of one child for slander in 1991, but they were awarded only $1 in damages.



    The McMartin Preschool Abuse Trial: A Commentary
    by Doug Linder (2003)

    The McMartin Preschool Abuse Trial, the longest and most expensive criminal trial in American history, should serve as a cautionary tale. When it was all over, the government had spent seven years and $15 million investigating and prosecuting a case that led to no convictions. More seriously, the McMartin case left in its wake hundreds of emotionally damaged children, as well as ruined careers for members of the McMartin staff. No one paid a bigger price than Ray Buckey, one of the principal defendants in the case, who spent five years in jail awaiting trial for a crime that, as most people recognize today, he never committed. McMartin juror Brenda Williams said that the trial experience taught her to be more cautious: "I now realize how easily something can be said and misinterpreted and blown out of proportion." Another juror, Mark Bassett, singled out "experts" for blame: "I thought some of the expert testimony about the children told you more about the expert than the child. I mean, if the expert says children are always 100% believable and then you have a child who is not believable, either the expert is extremely biased or they've never seen anything like that child before."
    The McMartin trial had its origins in a call placed to police in Manhattan Beach, California by Judy Johnson, the mother of a two-and-a-half-year-old son who attended the McMartin Preschool on about ten occasions in 1983. Johnson told Detective Jane Hoag that a school aide, Ray Buckey, the 25-year-old son of the owner of the preschool, had molested her son. Despite the fact that the young boy was unable to identify Ray from photos and medical investigations of the boy showed no signs of sexual abuse, the police conducted searches of Buckey's home, confiscating such "evidence" as a rubber duck, a graduation robe, a Teddy bear, and Playboy magazines. Detective Hoag arrested Buckey on September 7, 1983.

    The next day, Police Chief Harry Kuhlmeyer sent a letter to 200 McMartin Preschool parents informing them that Ray Buckey was suspected of child abuse and asking them for information. The letter asked parents to "question your child to see if he or she has been a witness to any crime or if he or she has been a victim." The letter listed the possible criminal acts under investigation:

    [The acts include] oral sex, fondling of genitals, buttock or chest area, and sodomy, possibly committed under the pretense of "taking the child's temperature." Also photos may have been taken of children without their clothing. Any information from your child regarding having ever observed Ray Buckey to leave a classroom alone with a child during any nap period, or if they have ever observed Ray Buckey tie up a child, is important.

    Chief Kuhlmeyer's letter ended by asking parents "to please keep this investigation strictly confidential because of the nature of the charges and the highly emotional effect it could have on our community. Please do not discuss this investigation with anyone outside your immediate family." Needless to say, it wasn't long before everyone connected with the McMartin Preschool, and indeed most everyone in the Los Angeles metropolitan area, knew of the ongoing investigation of Ray Buckey.
    Judy Johnson's reports of misbehavior at the McMartin Preschool became increasingly bizarre. She claimed that Peggy Buckey, Ray's mother, was involved in satanic practices: she was said to have taken Johnson's son to a church, where the boy was made to watch a baby being beheaded, and then was forced to drink the blood. She insisted that Ray Buckey had sodomized her son while his head was in the toilet, and had taken him to a car wash and locked him in the trunk. Johnson told police that Ray pranced around the preschool in a cape and a Santa Claus costume, and that other teachers at the school chopped up rabbits and placed "some sort of star" on her son's bottom.

    Eventually most prosecutors would come to recognize Johnson's allegations as the delusions of a paranoid schizophrenic, but the snowball of suspicion had already begun to roll. Chief Kuhlmeyer's letter led to new accusations and demands from parents for a full-scale investigation of doings at the McMartin Preschool. Bowing to this pressure, the District Attorney's office handed a major portion of the continuing investigation over to Kee MacFarlane, a consultant for the Children's Institute International (CII), an agency for the treatment of abused children.

    Parents were encouraged to send their children to CII for two-hour interviews. MacFarlane pressed 400 children, through a series of leading questions and the offer of rewards, to report instances of abuse at McMartin. Children generally denied seeing any evidence of abuse at first, but eventually many gave MacFarlane the stories that she clearly wanted to hear. After the interviews, MacFarlane told parents that their children had been abused, and described the nature of the alleged abuse. By March 1984, 384 former McMartin students had been diagnosed as sexually abused.

    In addition to interviews, 150 children received medical examinations. Dr. Astrid Heger, of CII, concluded that 80% of the children she examined had been molested. For the most part, she based her findings not on physical evidence, but on medical histories and her belief that "any conclusion should validate the child's history."

    On March 22, 1984, a grand jury indicted Ray Buckey, Peggy Buckey (Ray's mother), Peggy Ann Buckey (Ray's sister), Virginia McMartin (founder of the preschool thirty years earlier), and three other McMartin teachers, Mary Ann Jackson, Bette Raidor, and Babette Spitler. The grand jury initially indicted the "McMartin Seven" on 115 counts of child sexual abuse. Two months later, an additional 93 indictment counts were added, as District Attorney Robert Philobosian pursued his strategy of hyping the McMartin case to boost his chances in an upcoming primary election. In June, bail for Peggy Buckey was set at one million dollars. Ray Buckey was held without bail.


    The Preliminary Hearing

    By the time the preliminary hearing began in August 1984, Prosecutor Lael Rubin was telling the media that the seven defendants committed 397 sexual crimes (far more than the number for which they were indicted) and that thirty additional individuals associated with the McMartin Preschool were under investigation.

    Searches of the McMartin Preschool and the homes of defendants failed to produce much incriminating evidence. No nude photographs of children were discovered, despite the insistence of investigators and parents that such photographing was commonplace at McMartin. No evidence was found of the "secret rooms" where massive instances of sexual abuse were said to have taken place. In March 1985, a group of nearly fifty McMartin parents, determined to unearth the fabled secret tunnels, began digging at a lot next to the school. A few days later, the parents were joined in their efforts by an archeological firm hired by the District Attorney's office. Still, no secret rooms were ever discovered.

    The longest--and probably strangest--preliminary hearing in history began before Municipal Court Judge Aviva Bobb in August 1984. The chaotic proceeding featured seven defendants (each with his or her own attorney) and three prosecutors. Unlike the typical preliminary hearing in which the prosecution tries to demonstrate cause for bringing the defendants to trial and the defense passively observes, the defense in the McMartin hearing mounted an "affirmative defense," aggressively cross-examining a parade of prosecution witnesses including allegedly abused children, McMartin parents, therapists, and medical experts. The defense repeatedly tried to raise questions as to how abuse on such a massive scale could have gone undetected for years and suggested that much of the testimony of the prosecution's child witnesses was flatly unbelievable.

    Kee MacFarlane testified at the preliminary hearing that the abuse was able to go on for years because children either suffered from "denial syndrome" or were afraid that revealing McMartin's dark secrets would result in their own deaths, or the deaths of family members. MacFarlane explained that she succeeded in bringing out the secrets with the help of anatomically correct dolls and a set of puppets, through which she asked children questions during her interviews. The puppets included Mr. Alligator, Mr. Snake, Detective Dog, and Mr. Sparky. Videotapes of the interviews also showed that MacFarlane and other therapists relied heavily on leading questions and subtle pressure to persuade children to join the chorus of accusers. The defense played tapes that showed therapist Shawn Connerly telling a child interviewee that 183 kids had already revealed "yucky secrets" and that all the McMartin teachers were "sick in the head" and deserved to be beaten up.

    The testimony of children at the preliminary hearing was shockingly bizarre, and often riddled with inconsistencies and contradictions. Several children reported being photographed while performing nude somersaults as part of the Naked Movie Star Game. One child said that as the game was being played the children sang, "What you see is what you are, you're a naked movie star!" Others testified to playing a nude version of "Cowboys and Indians"--sometimes with the Indians sexually assaulting the cowboys, and sometimes vice versa. Children testified that sexual assaults took place on farms, in circus houses, in the homes of strangers, in car washes, in store rooms, and in a "secret room" at McMartin accessible by a tunnel. One boy told of watching animal sacrifices performed by McMartin teachers wearing robes and masks in a candle-lit ceremony at St. Cross Episcopal Church. In response to a defense question, the boy added that the kids were forced to drink the blood of the sacrificed animals. Perhaps strangest of all was the testimony of one boy who said that the McMartin teachers took students to a cemetery where the kids were forced to use pickaxes and shovels to dig up coffins. Once the coffins were removed from the ground, according to the child, they would be opened and the McMartin teachers would begin hacking the bodies with knives.

    By September 1985, well over a year into the preliminary hearing, some members of the prosecution's own team began to express doubts about the case. One prosecutor was quoted as saying, "Kee MacFarlane could make a six-month-old baby say he was molested." The two co-prosecutors in the case urged dropping all charges against five of the seven defendants, and pushing ahead with prosecution only for Ray Buckey and Peggy Buckey. Chief Prosecutor Lael Rubin, however, argued that all seven deserved prosecution. After a December 1985 meeting involving over a dozen members of the District Attorney's Office, the decision was made to drop charges against all defendants except Ray and Peggy Buckey. So far the case had cost Los Angeles County four million dollars--and the trial had yet to begin.


    The First Trial

    A legal bombshell exploded before the trial was scheduled to begin in the courtroom of Judge William R. Pounders. Independent filmmakers producing a documentary on the McMartin trial turned over copies of a taped interview with McMartin prosecutor Glenn Stevens to both the California Attorney General's office and the defense attorneys. In the interview, Stevens acknowledges that children began "embellishing and embellishing" their stories of sexual abuse and said that, as prosecutors, "we had no business being in court." Stevens also admitted on tape that prosecutors withheld potentially exculpatory information from defense attorneys, including evidence concerning the mental instability of the original complainant in the case, Judy Johnson, as well as evidence that Johnson's son was unable to identify Ray Buckey in a police line-up. Based on the revelations contained in the Stevens interview, defense attorneys moved that charges against Ray and Peggy Buckey be dismissed, but Judge Pounders denied the motion.

    Jury selection took weeks. The twelve finally selected included eight men and four women. Half of the jurors were white; three were African-American, two Asian, and one Hispanic. All but two jurors had at least some college education. Defense attorneys said later they were pleased with the jury.

    In many ways, the trial was a condensed version of the preliminary hearing. While the prosecution attempted to prove widespread sexual abuse of McMartin children, the defense tried to prove that the whole show was driven by the suggestive and overzealous interview techniques of the crusading therapists of CII. In addition to featuring two rather than seven defendants, there were fewer charges, fewer attorneys, and fewer witnesses. Still, by any measure, it was a major trial. Before it was over, the prosecution would present 61 witnesses, including nine child witnesses, a jailhouse informant, parents, medical specialists, therapists, and even a woman who had sexual relations with Ray Buckey.

    Opening statements in the McMartin trial began on July 14, 1987. Deputy District Attorney Lael Rubin characterized the trial as one about the betrayal of trust. Dean Gits, attorney for Peggy Buckey, described a case in which the children, the parents, and the McMartin teachers were all victims of an overzealous prosecutor. He told the jurors to consider that the McMartin Preschool operated for over twenty years without complaints, and that the prosecution--despite moving heaven and earth in a search for secret tunnels, pornographic pictures, semen, and buried animals--had turned up no hard evidence of any sexual molestation. Daniel Davis, attorney for Ray Buckey, said that he would offer a "common sense defense" that would show his client to be the victim of suggestive interviewing techniques and a virtual witch hunt.

    The prosecution produced several parent witnesses to lay a foundation for the accounts of their children that would follow. Typically, a parent would testify that prior to the infamous letter of Chief Kuhlmeyer announcing that Ray Buckey was suspected of child abuse, he or she had no reason to suspect that his or her child had been molested. After taking the children to CII and talking with Kee MacFarlane, however, the parents became convinced that their children had been sexually abused. Parents suggested that bladder infections, nightmares, anatomically correct artwork, or masturbation were confirming evidence of abuse. A couple of parents theorized that the massive abuse might have occurred during nap time, when parents were prohibited from picking up their children.

    The prosecution's child witnesses, ranging in age from eight to fifteen, repeated many of their stories from the preliminary hearing. Jurors heard of the Naked Movie Star Game, Ray Buckey scaring the children into silence by executing a cat with a knife, and numerous graphic accounts of sexual abuse by both Ray and Peggy Buckey. The defense countered with evidence of contradictions between trial testimony and testimony at the preliminary hearing, videotaped interviews in which the children denied that they were molested, and CII interviews revealing MacFarlane coaching children and rewarding "right" answers.

    The defense tried to produce a child witness of its own, the young boy who started the whole investigation rolling: the son of Judy Johnson. With Judy Johnson now deceased, the boy's father flatly told reporters that his son would testify "over my dead body." Judge Pounders agreed with Johnson that trial testimony might prove too stressful for his son and declared the boy legally unavailable as a witness.

    Perhaps the key witness in the trial was CII therapist Kee MacFarlane. In her five weeks on the stand, MacFarlane fought to defend her controversial interview techniques that included naked puppets, anatomically correct dolls, and telling children what other children had previously reported about sexual abuse at the McMartin School. Before MacFarlane finished her lengthy testimony, even Judge Pounders was expressing concern about her techniques. Outside of the presence of the jury, Pounders declared, "In my view, her credibility is becoming more of an issue as she testifies here."

    Defense expert Dr. Michael Maloney, professor of psychiatry at USC, further discredited MacFarlane's interview techniques. Maloney criticized the technique as presenting children with a "script" that discouraged "spontaneous information" and instead encouraged the children to supply expected answers to "please mother and father" and prove themselves "good detectives."

    Another distinct weakness in the prosecution's case was the lack of medical evidence of sexual abuse. Although Dr. Astrid Heger testified that she found numerous scars "consistent with rape," the defense's medical expert, Dr. David Paul, said that his review of the medical evidence turned up virtually no evidence of molestation. In the case of nine of the eleven alleged victims, Paul found the body parts to be "perfectly normal."

    Perhaps the strangest testimony at the trial came from jailhouse informant George Freeman, Ray Buckey's cell mate and a nine-time felon and confessed perjurer. Freeman testified that Buckey had admitted to him that he sexually molested children at the McMartin School and elsewhere, had a long-standing incestuous relationship with his sister, shipped pornographic materials to Denmark, and had buried incriminating photos of himself and children in South Dakota.

    The high point of the trial, from the standpoint of media attention, came with the testimony of the defendants themselves. Peggy McMartin Buckey was the first to testify, telling the jury "never" when asked whether "she ever molested those children." She also told jurors that she never witnessed her son behaving in a sexually inappropriate way at the school. Ray Buckey also denied each and every prosecution charge--as well as the allegations made by jailhouse informant George Freeman. He testified that he was not even teaching at the school during many of the times in which he was accused of abusing children. During cross-examination, prosecutor Lael Rubin kept hammering Buckey with questions about two barely relevant facts uncovered during the investigation: that Buckey sometimes did not wear underwear and that he owned several sexually explicit adult magazines.

    On November 2, 1989, after nearly thirty months of testimony, the case went to the jury. The jury spent another two-and-a-half months deliberating its verdicts. On fifty-two of the sixty-five charges against the two defendants (some charges were dropped during the trial), including all of the charges against Peggy Buckey, the jury returned an acquittal. On the thirteen remaining charges against Ray Buckey, the jury announced that it was hopelessly deadlocked. Jury foreperson Luis Chang explained the vote: "The interview tapes were too biased; too leading. That's the main crux of it." Another juror told reporters, "Whether I believe he did it and whether it was proven are very different." Judge Pounders offered his own appraisal of the verdict: "I was not surprised by the verdicts. I would not have been surprised at any decision the jury made."


    Aftermath and Second Trial

    Child protection groups and parents pressured prosecutors to retry Ray Buckey on the charges on which the first jury deadlocked. Five hundred people, including many McMartin parents, marched through the streets of Manhattan Beach carrying signs such as "We believe the children." One McMartin parent called the verdict in the first trial "a crime...almost equal to the crime that occurred outside the courtroom." A television poll showed 87% of respondents thought the Buckeys guilty.

    District Attorney Ira Reiner signed off on the retrial. Two new prosecutors were assigned to the case, Joe Martinez and Pamela Ferrero. The second trial also saw a new judge, following a successful motion by defense attorney Daniel Davis to have Judge Pounders removed from the case. Pounders expressed relief at the development: "I'm finally free after three years and three months. I was honestly afraid I couldn't live through it." Superior Court Judge Stanley Weisberg was assigned to replace Pounders.

    The second trial was a much more focused proceeding, involving only eight counts of molestation and three children. The prosecution presented its entire case in just thirteen days (compared to fifteen months in the first trial) and offered only eleven witnesses. One of the witnesses was a mother who, on the stand, glared at Ray Buckey and announced, "I'm so angry at you, I could kill you right now." The prosecution chose not to call CII interviewer Kee MacFarlane; instead, MacFarlane was called as a defense witness.

    Jury deliberations after the three-month trial were described by one juror as "excruciating." The jury ended its deliberations deadlocked on all eight counts, leaning toward acquittal on six of them and toward conviction on only one.

    Following the mistrial, District Attorney Reiner chose not to retry Buckey a third time and all charges against him were dismissed.

    The McMartin Preschool Abuse Trial was costly in many ways. In monetary terms, it cost taxpayers over $15 million. For the defendants, the costs of the trial included long terms in jail (Ray Buckey spent five years in jail before being released on bail), loss of homes, loss of jobs, loss of life savings, and a stigma that might never leave. The children too were victims. Ray Buckey said in a CBS interview: "Those poor children went through hell... but I'm not the cause of their hell and neither is my mother... The cause of their hell is the... adults who took this case and made it what it was." Parents, too, suffered. Many felt betrayed by the justice system. The community of Manhattan Beach was another victim, left uneasy and polarized by the long investigation and judicial proceedings.

    The effects of the McMartin trial even extended beyond the state of California. Across the country, day care providers resisted the temptation to hug or touch children--contact almost all child experts say children need--out of a fear that their actions might be interpreted as signs of abuse. Many day care centers were forced to close their doors after insurance companies, fearing molestation lawsuits, dramatically raised liability insurance rates. Early publicity surrounding the McMartin investigation also spawned a rash of charges against day care providers elsewhere, many of which proved to be unsubstantiated.

    There are many lessons to be learned from the McMartin Preschool Trial. There are lessons for police and prosecutors, but there are also lessons for the media. It was "pack journalism"--slanted heavily toward the prosecution, providing sensational headlines day after day, almost never seriously questioning the allegations--that turned the McMartin trial into the expensive and damaging fiasco that it became.



     
  13. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Title: Artist Murdered at the National Museum
    Date: January 16, 1907

    Notes: The crime occurred during the tenure of Secretary Samuel P. Langley; the Smithsonian Annual Report for 1907 made no mention of the incident.

    "Artist Shot to Death," Washington Post, January 17,1907, p. 2
    "Suffered a Breakdown," Washington Post, January 17, 1907 p. 2
    "Artist's Slayer Held," Washington Post, January 18, 1907, p. 2
    "Insanity Will Be His Plea," Washington Post, January 19, 1907, p. 2
    "Indicted for Murder," Washington Post, March 14, 1907, p. 11
    "Insanity Plea for Slayer," Washington Post, March 28, 1907, p. 3
    "Slayer Found Insane," Washington Post, March 29, 1907, p. 13

    Summary


    On January 16, 1907, Otto Seelhorst fatally shot Frederick von Iterson in the United States National Museum. Seelhorst had arrived in Washington, D.C., from Philadelphia, where he worked as an artist. In the early morning, Seelhorst entered the National Museum and obtained directions to the office of Marcus W. Lyon, Assistant Curator in the Division of Mammals, from Edgar W. Hanvey, a carpenter working in the building. Lyon had recently hired von Iterson, an artist at the Wistar Institute of Anatomy, to make scientific drawings for the Division of Mammals. Around 10:30 A.M., Seelhorst approached von Iterson, spoke with him briefly, and opened fire with a .22-caliber Winchester. Seelhorst shot von Iterson four times; three of the wounds were fatal. Seelhorst remained with the body until he was arrested. He had a history of mental illness and had spent a year in an Eaton, Pennsylvania, sanitarium in 1906. However, the incident was not random. Fifteen years earlier, Seelhorst had worked with von Iterson in Philadelphia at the Keterlinus Manufacturing Company. According to Seelhorst, von Iterson had molested him as a young man, which caused his lifelong problems with mental illness. Witnesses to the murder told reporters that Seelhorst claimed he committed the crime to protect other young children from von Iterson and showed no remorse at the scene of the crime. Seelhorst was arrested and put on trial in March of the same year. He was tried by the Criminal Court Circuit No. 2 of the District of Columbia and was acquitted on account of his insanity; the murder trial was the shortest on record to date in the District of Columbia. Justice Job Barnard committed Seelhorst to the Government Hospital for the Insane until he recovered or for the remainder of his life.


    Category: Chronology of Smithsonian History

    Place: Philadelphia (Pa.)
    Full Record: http://siris-sihistory.si.edu/ipac2...sichronology&uri=full=3100001~!11884~!0#focus
     
  14. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Aug 10, 1793:
    Louvre Museum opens


    After more than two centuries as a royal palace, the Louvre is opened as a public museum in Paris by the French revolutionary government. Today, the Louvre's collection is one of the richest in the world, with artwork and artifacts representative of 11,000 years of human civilization and culture.

    The Louvre palace was begun by King Francis I in 1546 on the site of a 12th-century fortress built by King Philip II. Francis was a great art collector, and the Louvre was to serve as his royal residence. The work, which was supervised by the architect Pierre Lescot, continued after Francis' death and into the reigns of kings Henry II and Charles IX. Almost every subsequent French monarch extended the Louvre and its grounds, and major additions were made by Louis XIII and Louis XIV in the 17th century. Both of these kings also greatly expanded the crown's art holdings, and Louis XIV acquired the art collection of Charles I of England after the English king's execution at the end of the English Civil War. In 1682, Louis XIV moved his court to Versailles, and the Louvre ceased to be the main royal residence.

    In the spirit of the Enlightenment, many in France began calling for the public display of the royal collections. Denis Diderot, the French writer and philosopher, was among the first to propose a national art museum for the public. Although King Louis XV temporarily displayed a selection of paintings at the Luxembourg Palace in 1750, it was not until the outbreak of the French Revolution in 1789 that real progress was made in establishing a permanent museum. On August 10, 1793, the revolutionary government opened the Musée Central des Arts in the Grande Galerie of the Louvre.

    The collection at the Louvre grew rapidly, and the French army seized art and archaeological items from territory and nations conquered in the Revolutionary and Napoleonic wars. Much of this plundered art was returned after Napoleon's defeat in 1815, but the Louvre's current Egyptian antiquities collections and other departments owe much to Napoleon's conquests. Two new wings were added in the 19th century, and the multi-building Louvre complex was completed in 1857, during the reign of Napoleon III.

    In the 1980s and 1990s, the Grand Louvre, as the museum is officially known, underwent major remodeling. Modern museum amenities were added and thousands of square meters of new exhibition space were opened. The Chinese-American architect I.M. Pei designed a steel-and-glass pyramid for the center of the Napoleon courtyard; traditionalists called it an outrage. In 1993, on the 200th anniversary of the museum, a rebuilt wing formerly occupied by the French Ministry of Finance was opened to the public. It was the first time that the entire Louvre was devoted to museum purposes.
     
  15. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Throughout Europe, Asia, and the Americas, a variety of Solanum species containing potent tropane alkaloids were used for anesthesia. In 13th-century Italy, Theodoric Borgognoni used similar mixtures along with opiates to induce unconsciousness, and treatment with the combined alkaloids proved a mainstay of anesthesia until the nineteenth century. Local anesthetics were used in Inca civilization, where shamans chewed coca leaves and performed operations on the skull while spitting into the wounds they had inflicted in order to anesthetize them.[25] Cocaine was isolated around 1859 and became the first effective local anesthetic. It was first used by Karl Koller, at the suggestion of Sigmund Freud, in eye surgery in 1884.[26] German surgeon August Bier (1861–1949) was the first to use cocaine for intrathecal anesthesia in 1898.[27] Romanian surgeon Nicolae Racoviceanu-Piteşti (1860–1942) was the first to use opioids for intrathecal analgesia; he presented his experience in Paris in 1901.
    Early Arab writings mention anesthesia by inhalation. This idea was the basis of the "soporific sponge" ("sleep sponge"), introduced by the Salerno school of medicine in the late twelfth century and by Ugo Borgognoni (1180–1258) in the thirteenth century. The sponge was promoted and described by Ugo's son and fellow surgeon, Theodoric Borgognoni (1205–1298). In this anesthetic method, a sponge was soaked in a solution of opium, mandragora, hemlock juice, and other substances. The sponge was then dried and stored; just before surgery the sponge was moistened and then held under the patient's nose. When all went well, the fumes rendered the patient unconscious.

    In 1275, Spanish physician Raymond Lullus, while experimenting with chemicals, made a volatile, flammable liquid he called sweet vitriol. Sweet vitriol, or sweet oil of vitriol, was the first inhalational anesthetic used for surgical anesthesia. It is no longer used often because of its flammability. In the 16th century, a Swiss-born physician commonly known as Paracelsus made chickens breathe sweet vitriol and noted that they not only fell asleep but also felt no pain. Like Lullus before him, he did not experiment on humans. In 1730, German chemist Frobenius gave this liquid its present name, ether, which is Greek for “heavenly”. But 112 more years would pass before ether’s anesthetic powers were fully appreciated.

    Meanwhile, in 1772, English scientist Joseph Priestley discovered the gas nitrous oxide. Initially, people thought this gas to be lethal, even in small doses. However, in 1799, British chemist and inventor Humphry Davy decided to find out by experimenting on himself. To his astonishment he found that nitrous oxide made him laugh, so he nicknamed it laughing gas. Davy wrote about the potential anesthetic properties of nitrous oxide, but nobody at that time pursued the matter any further.

    American physician Crawford W. Long noticed that his friends felt no pain when they injured themselves while staggering around under the influence of ether. He immediately thought of its potential in surgery. Conveniently, a participant in one of those "ether frolics", a student named James Venable, had two small tumors he wanted excised. But fearing the pain of surgery, Venable kept putting the operation off. Hence, Long suggested that he have his operation while under the influence of ether. Venable agreed, and on 30 March 1842 he underwent a painless operation. However, Long did not announce his discovery until 1849.[29]

    In 1846, Boston dentist William Thomas Green Morton conducted the first public demonstration of the inhalational anesthetic.[30] Morton, who was unaware of Long's previous work, was invited to the Massachusetts General Hospital to demonstrate his new technique for painless surgery. After Morton had induced anesthesia, surgeon John Collins Warren removed a tumor from the neck of Edward Gilbert Abbott. This occurred in the surgical amphitheater now called the Ether Dome. The previously skeptical Warren was impressed and stated, "Gentlemen, this is no humbug." In a letter to Morton shortly thereafter, physician and writer Oliver Wendell Holmes, Sr. proposed naming the state produced "anesthesia", and the procedure an "anesthetic".[31]

    Morton at first attempted to hide the actual nature of his anesthetic substance, referring to it as Letheon. He received a US patent for his substance, but news of the successful anesthetic spread quickly by late 1846. Respected surgeons in Europe including Liston, Dieffenbach, Pirogov, and Syme, quickly undertook numerous operations with ether. An American-born physician, Boott, encouraged London dentist James Robinson to perform a dental procedure on a Miss Lonsdale. This was the first case of an operator-anesthetist. On the same day, 19 December 1846, in Dumfries Royal Infirmary, Scotland, a Dr. Scott used ether for a surgical procedure.[32] The first use of anesthesia in the Southern Hemisphere took place in Launceston, Tasmania, that same year. Drawbacks with ether such as excessive vomiting and its flammability led to its replacement in England with chloroform.

    Chloroform was discovered in 1831 by the American physician Samuel Guthrie (1782–1848), and independently a few months later by the Frenchman Eugène Soubeiran (1797–1859) and by Justus von Liebig (1803–1873) in Germany. It was named and chemically characterised in 1834 by Jean-Baptiste Dumas (1800–1884). Its anaesthetic properties were noted early in 1847 by Marie-Jean-Pierre Flourens (1794–1867). The use of chloroform in anesthesia is linked to James Young Simpson, who, in a wide-ranging study of organic compounds, found chloroform's efficacy on 4 November 1847. Its use spread quickly and gained royal approval in 1853 when John Snow gave it to Queen Victoria during the birth of Prince Leopold. Unfortunately, chloroform is not as safe an agent as ether, especially when administered by an untrained practitioner (medical students, nurses, and occasionally members of the public were often pressed into giving anesthetics at this time). This led to many deaths from the use of chloroform that (with hindsight) might have been preventable. The first fatality directly attributed to chloroform anesthesia was that of Hannah Greener, who died on 28 January 1848.

    From May 1848 onwards, John Snow of London published a series of articles, "On Narcotism by the Inhalation of Vapours", in the London Medical Gazette. Snow also involved himself in the production of equipment needed for the administration of inhalational anesthetics.



    Health
    The painful story behind modern anesthesia
    BY Dr. Howard Markel October 16, 2013 at 2:49 PM EDT

    One of the truly great moments in the long history of medicine occurred on a tense fall morning in the surgical amphitheater of Boston’s Massachusetts General Hospital.

    It was there, on Oct. 16, 1846, that a dentist named William T. G. Morton administered an effective anesthetic to a surgical patient. Consenting to what became a most magnificent scientific revolution were John Warren, an apprehensive surgeon, and Gilbert Abbott, an even more nervous young man about to undergo removal of a vascular tumor on the left side of his neck.

    Both Warren and Abbott sailed through the procedure painlessly, although some have noted that Abbott moved a bit near the end. Turning away from the operating table toward the gallery packed with legitimately dumbstruck medical students, Dr. Warren gleefully exclaimed, “Gentlemen, this is no humbug!”

    Morton named his “creation” Letheon, after the Lethe River of Greek mythology. Drinking its waters, the ancients contended, erased painful memories. Hardly such an exotic elixir, Morton’s stuff was actually sulfuric ether.

    Regardless of composition, Letheon inspired a legion of enterprising surgeons to devise and execute an armamentarium of lifesaving, invasive procedures that continue to benefit humankind to this very day.

    Yet while the discovery of anesthesia was a bona fide blessing for humankind, it hardly turned out to be that great for its “discoverer,” William T. G. Morton.

    Morton began his dental studies in Baltimore in 1840. Two years later he set up practice in Hartford, ultimately working with a dentist named Horace Wells. At this time, surgeons could offer patients little beyond opium and alcohol to endure the agonizing pain engendered by scalpels.

    From the late 18th century well into the 1840s, physicians and chemists experimented with agents such as nitrous oxide, ether, carbon dioxide, and other chemicals without success. In an era before the adoption of daily dental hygiene and fluoride treatments, excruciating tooth extractions were an all too common part of the human experience. Consequently, dentists joined physicians and surgeons in the Holy Grail-like search for safe and effective substances to conquer operative pain.

    Around this time, Morton and Wells conducted experiments using nitrous oxide, including a demonstration at Harvard Medical School in 1845 that failed to completely squelch the pain of a student submitting to a tooth-pulling, thus publicly humiliating the dentists. Although Morton and Wells amicably dissolved their partnership, Morton continued his search for anesthetic agents.

    A year earlier, in 1844, during studies at Harvard Medical School (which were cut short by financial difficulties), Morton attended the lectures of chemistry professor Charles Jackson. One session was on how the common organic solvent sulfuric ether could render a person unconscious and even insensate.

    Recalling these lessons during the summer of 1846, Morton purchased bottles of the stuff from his local chemist and began exposing himself and a menagerie of pets to ether fumes. Satisfied with its safety and reliability, he began using ether on his dental patients.

    Soon, mobs of tooth-aching, dollar-waving Bostonians made their way to his office. Morton relished his financial success but quickly perceived that Letheon was good for far more than pulling teeth.

    Morton’s remarkable demonstration at the Massachusetts General Hospital that long ago October morning transmogrified his status from profitable dentist to internationally acclaimed healer.

    But the half-life of his celebrity turned out to be molto presto, followed by an interminable period of infamy and hardship during which he was lambasted for insisting on applying for an exclusive patent on Letheon.

    In the United States of the mid-19th century, it was considered unseemly, if not outright greedy, for members of the medical profession to profit from discoveries that universally benefited humankind, particularly from a patent for what turned out to be the easily acquired sulfuric ether.

    As long as Morton stuck to dentistry, many physicians argued, he could do as he liked; but if he desired acceptance of Letheon by physicians and surgeons, he needed to comply with what they considered their higher-minded ideals and ethics.

    Morton aggressively rejected all such suggestions, much to his detriment. There was also the issue of credit. Horace Wells demanded his share. So did Crawford W. Long, a Georgia practitioner who claimed to have used nitrous oxide and ether as early as 1842 but who was too busy to publish his findings. Morton’s former professor, Charles Jackson, argued that he, too, deserved a piece of the action.

    While many toyed with anesthetic agents, it was Morton who first developed a novel delivery instrument to enable ether inhalation during an operation.

    The device consisted of a glass flask with a wooden mouthpiece that could be opened and closed depending on the patient’s state of consciousness.

    This was critical because other experimenters, including Wells and Long, could not ensure rapid reversibility of the anesthetic state and often overdosed their patients.

    Morton’s genius resided not only in his observations of the power of ether but also in his development of a crude but scientific method of regulating its inhalation, thus creating the field of anesthesiology.

    Not everyone saw it that way. Vigorously combating the whispered and shouted campaigns against him, the dentist spent his remaining days trying to restore his sullied reputation. Morton died broke and embittered in 1868. It would be many decades more before Morton was rightfully returned to the pantheon of medical greats.

    Morton’s search to conquer pain was a remarkable contribution to medicine and human health even if it did not turn out to be the personal and financial success he so badly craved.

    Although Morton was a man of great accomplishment, he was all too human.
    Sadly, like many human beings, Morton aggressively hunted for fame, glory, professional success, and ego gratification at the expense of judiciously contemplating the consequences of his actions. It was a quest that cost him dearly, even as it made life, and surgically correctable illnesses, far better for the rest of us.
     
  16. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Jul 26, 1941:
    United States freezes Japanese assets

    On this day in 1941, President Franklin Roosevelt seizes all Japanese assets in the United States in retaliation for the Japanese occupation of French Indo-China.

    On July 24, Tokyo decided to strengthen its position in its invasion of China by moving into Southeast Asia. Given that France had long occupied parts of the region, and that Germany, a Japanese ally, now controlled most of France through Pétain's puppet government, France "agreed" to the occupation of its Indo-China colonies. Japan followed up by occupying Cam Ranh naval base, 800 miles from the Philippines, where Americans had troops, and from the British base at Singapore.

    President Roosevelt swung into action by freezing all Japanese assets in America. Britain and the Dutch East Indies followed suit. The result: Japan lost access to three-fourths of its overseas trade and 88 percent of its imported oil. Japan's oil reserves were only sufficient to last three years, and only half that time if it went to war and consumed fuel at a more frenzied pace. Japan's immediate response was to occupy Saigon, again with Vichy France's acquiescence. If Japan could gain control of Southeast Asia, including Malaya, it could also control the region's rubber and tin production—a serious blow to the West, which imported such materials from the East. Japan was now faced with a dilemma: back off of its occupation of Southeast Asia and hope the oil embargo would be eased—or seize the oil and further antagonize the West, even into war.


    The United States and the Opening to Japan, 1853

    On July 8, 1853, American Commodore Matthew Perry led his four ships into the harbor at Tokyo Bay, seeking to re-establish, for the first time in over 200 years, regular trade and discourse between Japan and the western world.
    Although he is often credited with opening Japan to the western world, Perry was not the first westerner to visit the islands. Portuguese, Spanish, and Dutch traders engaged in regular trade with Japan in the 16th and 17th centuries. Persistent attempts by the Europeans to convert the Japanese to Catholicism and their tendency to engage in unfair trading practices led Japan to expel most foreigners in 1639. For the two centuries that followed, Japan limited trade access to Dutch and Chinese ships with special charters.

    There were several reasons why the United States became interested in revitalizing contact between Japan and the West in the mid-19th century. First, the combination of the opening of Chinese ports to regular trade and the annexation of California, creating an American port on the Pacific, ensured that there would be a steady stream of maritime traffic between North America and Asia. Then, as American traders in the Pacific replaced sailing ships with steam ships, they needed to secure coaling stations, where they could stop to take on provisions and fuel while making the long trip from the United States to China. The combination of its advantageous geographic position and rumors that Japan held vast deposits of coal increased the appeal of establishing commercial and diplomatic contacts with the Japanese. Additionally, the American whaling industry had pushed into the North Pacific by the mid-19th century, and sought safe harbors, assistance in case of shipwrecks, and reliable supply stations. In the years leading up to the Perry mission, a number of American sailors found themselves shipwrecked and stranded on Japanese shores, and tales of their mistreatment at the hands of the unwelcoming Japanese spread through the merchant community and across the United States.

    The same combination of economic considerations and belief in Manifest Destiny that motivated U.S. expansion across the North American continent also drove American merchants and missionaries to journey across the Pacific. At the time, many Americans believed that they had a special responsibility to modernize and civilize the Chinese and Japanese. In the case of Japan, missionaries felt that Protestant Christianity would be accepted where Catholicism had generally been rejected. Other Americans argued that, even if the Japanese were unreceptive to Western ideals, forcing them to interact and trade with the world was a necessity that would ultimately benefit both nations.

    Commodore Perry’s mission was not the first American overture to the Japanese. In the 1830s, the Far Eastern squadron of the U.S. Navy sent several missions from its regional base in Guangzhou (Canton), China, but in each case, the Japanese did not permit them to land, and they lacked the authority from the U.S. Government to force the issue. In 1851, President Millard Fillmore authorized a formal naval expedition to Japan to return shipwrecked Japanese sailors and request that Americans stranded in Japan be returned to the United States. He sent Commodore John Aulick to accomplish these tasks, but before Aulick left Guangzhou for Japan, he was relieved of his post and replaced by Commodore Matthew Perry. A lifetime naval officer, Perry had distinguished himself in the Mexican-American War and was instrumental in promoting the United States Navy’s conversion to steam power.

    Perry first sailed to the Ryukyus and the Bonin Islands southwest and southeast of the main Japanese islands, claiming territory for the United States, and demanding that the people in both places assist him. He then sailed north to Edo (Tokyo) Bay, carrying a letter from the U.S. President addressed to the Emperor of Japan. By addressing the letter to the Emperor, the United States demonstrated its lack of knowledge about the Japanese government and society. At that time, the Japanese emperor was little more than a figurehead, and the true leadership of Japan was in the hands of the Tokugawa Shogunate.

    Perry arrived in Japanese waters with a small squadron of U.S. Navy ships, because he and others believed the only way to convince the Japanese to accept western trade was to display a willingness to use their advanced firepower. At the same time, Perry brought along a variety of gifts for the Japanese Emperor, including a working model of a steam locomotive, a telescope, a telegraph, and a variety of wines and liquors from the West, all intended to impress upon the Japanese the superiority of Western culture. His mission was to complete an agreement with the Japanese Government for the protection of shipwrecked or stranded Americans and to open one or more ports for supplies and refueling. Displaying his audacity and readiness to use force, Perry’s approach into the forbidden waters around Tokyo convinced the Japanese authorities to accept the letter.

    The following spring, Perry returned with an even larger squadron to receive Japan’s answer. The Japanese grudgingly agreed to Perry’s demands, and the two sides signed the Treaty of Kanagawa on March 31, 1854. According to the terms of the treaty, Japan would protect stranded seamen and open two ports for refueling and provisioning American ships: Shimoda and Hakodate. Japan also gave the United States the right to appoint consuls to live in these port cities, a privilege not previously granted to foreign nations. This treaty was not a commercial treaty, and it did not guarantee the right to trade with Japan. Still, in addition to providing for distressed American ships in Japanese waters, it contained a most-favored-nation clause, so that all future concessions Japan granted to other foreign powers would also be granted to the United States. As a result, Perry’s treaty provided an opening that would allow future American contact and trade with Japan.
    The first U.S. consul assigned to a Japanese port was Townsend Harris. Like many of the early consuls in Asia, Harris was a New York merchant dealing with Chinese imports. He arrived in Shimoda in 1856, but, lacking the navy squadron that had strengthened Perry’s bargaining position, Harris took far longer to convince the Japanese to sign a more extended treaty. Ultimately, Japanese officials learned of how the British used military action to compel the opening to China, and decided that it was better to open Japan's doors willingly than to be forced to do so. The United States and Japan signed their first true commercial treaty, sometimes called the Harris Treaty, in 1858. The European powers soon followed the U.S. example and drew up their own treaties with Japan. Japan sent its first mission to the West in 1860, when Japanese delegates journeyed to the United States to exchange the ratified Harris Treaty.

    Although Japan opened its ports to modern trade only reluctantly, once it did, it took advantage of the new access to modern technological developments. Japan’s opening to the West enabled it to modernize its military, and to rise quickly to the position of the most formidable Asian power in the Pacific. At the same time, the process by which the United States and the Western powers forced Japan into modern commercial intercourse, along with other internal factors, weakened the position of the Tokugawa Shogunate to the point that the shogun fell from power. The Emperor gained formal control of the country in the Meiji Restoration of 1868, with long-term effects for the rule and modernization of Japan.
     
  17. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Japanese-American Relations at the Turn of the Century, 1900–1922

    In the first two decades of the twentieth century, the relationship between the United States and Japan was marked by increasing tension and corresponding attempts to use diplomacy to reduce the threat of conflict. Each side had territory and interests in Asia that it was concerned the other might threaten. U.S. treatment of Japanese immigrants and competition for economic and commercial opportunities in China also heightened tensions. At the same time, each country’s territorial claims in the Pacific formed the basis for several agreements between the two nations, as each government sought to protect its own strategic and economic interests.
    At the turn of the century, U.S. and Japanese interests appeared to be aligned. Both nations supported the idea of an “open door” for commercial expansion in China. After the Russo-Japanese War of 1904–05, U.S. President Theodore Roosevelt acted as a mediator at Japan’s request, and the two sides of the conflict met on neutral territory in Portsmouth, New Hampshire. In the same year, U.S. Secretary of War William Howard Taft met with Prime Minister Katsura Taro in Japan. The two concluded the secret Taft-Katsura Agreement, in which the United States acknowledged Japanese rule over Korea and condoned the Anglo-Japanese alliance of 1902. At the same time, Japan recognized U.S. control of the Philippines.

    In the years that followed, however, tensions rose over Japanese actions in northeast China and immigration to the United States. In 1905, the Japanese started to establish more formal control over South Manchuria by forcing China to give Japan ownership rights to the South Manchurian Railway. The Japanese used this opening to make further inroads into northeast China, causing the Roosevelt Administration concern that this violated the ideals of free enterprise and the preservation of China’s territorial integrity. Simultaneously, leading Japanese officials expressed frustration with the treatment of Japanese immigrants in the United States. A U.S.-Japanese treaty signed in 1894 had guaranteed the Japanese the right to immigrate to the United States, and to enjoy the same rights in the country as U.S. citizens. In 1906, however, the San Francisco Board of Education enacted a measure to send Japanese and Chinese children to segregated schools. The Government of Japan was outraged by this policy, claiming that it violated the 1894 treaty. In a series of notes exchanged between late 1907 and early 1908, known collectively as the Gentlemen’s Agreement, the U.S. Government agreed to pressure the San Francisco authorities to withdraw the measure, and the Japanese Government promised to restrict the immigration of laborers to the United States.
    With the immigration problem temporarily settled, the two countries met to provide mutual reassurances about their territories and interests in East Asia. In 1908, U.S. Secretary of State Elihu Root and Japanese Ambassador Takahira Kogoro formed an agreement in which Japan promised to respect U.S. territorial possessions in the Pacific, its Open Door policy in China, and the limitation of immigration to the United States as outlined in the Gentlemen’s Agreement. The Government of Japan redirected its labor emigrants to its holdings in Manchuria, maintaining that these were not a part of China. For its part, the United States recognized Japanese control of Taiwan and the Pescadores, and the Japanese special interest in Manchuria. By reiterating each country’s position in the region, the Root-Takahira Agreement served to lessen the threat of a misunderstanding or war between the two nations.

    This series of agreements still did not resolve all of the outstanding issues. U.S. treatment of Japanese residents continued to cause tension between the two nations. The Alien Land Act of 1913, for example, barred Japanese from owning or leasing land for longer than three years and adversely affected U.S.-Japanese relations in the years leading up to World War I. Economic competition in China, which the United States feared would result in increasing Japanese control, was another issue that increased tensions between the two nations. In 1915, Japan issued its “Twenty-One Demands” of China, in which it asked that China recognize its territorial claims, prevent other powers from obtaining new concessions along its coast, and take a series of actions designed to benefit the Japanese economically. China turned to the United States for assistance, and U.S. officials responded with a declaration that they would not recognize any agreement that threatened the Open Door. Although this was consistent with past policies, this announcement was of little use to the Chinese. However, President Woodrow Wilson was not willing to take a stronger stand given his need for assistance in protecting U.S. interests in Asia, addressing the growing conflict in Europe, and managing racial issues in California.
    The potential for conflict between the United States and Japan, especially over China, led the two governments to negotiate yet again. In the Ishii-Lansing Agreement of 1917, Secretary of State Robert Lansing acknowledged that Manchuria was under Japanese control, while Japanese Foreign Minister Ishii Kikujiro agreed not to place limitations on U.S. commercial opportunities elsewhere in China. The two powers also agreed not to take advantage of the war in Europe to seek additional rights and privileges. Though the agreement was non-binding, Lansing considered it an important measure in promoting mutual interests in Asia, but it proved short-lived. Ultimately, the two nations agreed to cancel the Ishii-Lansing Agreement after concluding the Nine-Power Treaty, which they signed in 1922 at the Washington Conference.

    Japan and the United States clashed again during the League of Nations negotiations in 1919. The United States refused to accept the Japanese request for a racial equality clause or an admission of the equality of the nations. In addition, the Versailles Treaty granted Japan control over valuable German concessions in Shandong, which led to an outcry in China. This, coupled with the growing fear of a militant Japan, contributed to the defeat of the League Covenant in the U.S. Senate. The persistent issues preventing accommodation continued to be racial equality (especially with regard to the treatment of Japanese immigrants in the United States) and differences in how to address expansion in Asia. In spite of the many efforts to reach agreements on these points, by the early 1920s Japan and the United States were again at odds.
     
  18. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    History of Cats

    The world’s most popular pet is the domestic cat; there are believed to be close to half a billion domestic cats in the world today. The history of cats, however, begins long before the twentieth century, at the very start of recorded history. The ancient history of the domestic cat begins with the ancient Egyptians 4,000 years ago, when cats were regarded as sacred creatures.

    Cats in Ancient Egypt

    The domestic cat is thought to descend from the African Wild Cat, which was domesticated in ancient Egypt to control the vermin that were harming crops and spreading disease. And like our furry friends now, they were very good at catching mice and rats! The cats controlled the rat population, which reduced disease and deaths and also allowed a larger supply of food for the poor. This changed the quality of living for the Egyptians, and cats became sacred creatures representing life. They were associated with the goddesses Bast, Isis and Pasht. By the time the Egyptian empire fell, cats were revered as master hunters and were worshipped like gods by all Egyptians, including the pharaoh. Any Egyptian who killed a cat was immediately given the death penalty, yet fear of the almighty cat itself made this a rare occurrence. Pharaohs were mummified and buried with statues of cats, which represented good luck and safe companionship in the afterlife. Even today archaeologists are finding more and more hieroglyphics, statues and carvings of cats, emphasising their importance in Ancient Egypt. Some cats were even mummified and their bodies left to lie in tombs and shrines. It was illegal to sell a cat outside of Egypt, as cats were such an important asset to Egyptian beliefs and society. The history of domesticated cats thus started in Egypt, where they found their first home; but like all cats, they didn't want to stay in one place too long!

    History of Cats...From Egypt to China

    Towards the end of the Egyptian empire, cats were sold to the Greeks and Persians. In 500 BC a domesticated cat was given to the Emperor of China, and cats were the most popular pet of the rich during the Song Dynasty. These cats were bred with the wildcats of Asia and became a common asset first of the emperors, then the nobility, priests and eventually the peasants. Cats in China were bred with many local breeds, which helped produce some of the breeds we know today, such as the Siamese and the Burmese. The domesticated cat spread to all the countries surrounding China, including India and Japan.

    History of Cats...From Rome to Beyond

    Egyptian traders brought cats to Europe, where they were introduced to the Greeks and then the Romans. The Romans used cats to control the pest population, and as their empire grew, so did the population of cats. Cats became common and valuable assets to anyone who harvested crops and had problems with rats and disease. They were introduced to Britain around 100 AD, and were later protected by law as sacred and valuable animals by Hywel Dda, King of Wales. Killing a cat could again be punishable by death.

    History of Cats...The Fall of the Cat

    During the Middle Ages cats were associated with superstition and witchcraft; they were considered animals of sin and were thought to be in league with Satan. When the Black Death (the Plague) broke out in 1348, rulers ordered the killing of all cats, the initial suspects for spreading the disease (or doing the devil's work). Ironically, because of this mass killing, rodents spread and multiplied across Europe in abundance, which worsened the pandemic. Many believe that the mass cull of cats cost millions of lives, as roughly half the population of Europe was killed off.

    From Europe to America

    Cats were used aboard ships on voyages of discovery during the 15th and 16th centuries to control the rodent population and disease. When a ship wrecked off the Isle of Man in the United Kingdom, the cats on board swam to shore, creating one of the first known pedigree breeds, the Manx. When Christopher Columbus reached America, cats from his ships were left behind and flourished. The breed today known as the American Shorthair is thought to have originated from the British Shorthair, which is believed to have been carried on those ships.

    The Twentieth Century

    Cats flourished in the twentieth century, when they were popularized once again as household pets by Queen Victoria of England, and they have become a key part of modern society. Kings and queens, presidents and prime ministers have all owned pet cats during the twentieth century. New breeds were created, such as the Sphynx, the Bengal and the Himalayan. During the 1990s cats overtook the dog as the world’s favourite and most common pet, and today there are thought to be close to 500,000,000 domestic cats in the world. Films, both animated and science fiction, have been made about cats, and they are a huge part of family life and culture in modern society.



    A Brief History of House Cats
    By David Zax, smithsonian.com, June 30, 2007

    It may be that "nobody owns a cat," but scientists now say the popular pet has lived with people for 12,000 years

    On any of the surprising number of Web sites dedicated entirely to wisdom about cats, one will find quotations like these: "As every cat owner knows, nobody owns a cat" (attributed to Ellen Perry Berkeley); "The phrase 'domestic cat' is an oxymoron" (attributed to George F. Will); and "A dog is a man's best friend. A cat is a cat's best friend" (attributed to Robert J. Vogel). Of course, there is such a thing as the domestic cat, and cats and humans have enjoyed a mostly symbiotic relationship for thousands of years. But the quips do illuminate a very real ambivalence in the long relationship between cats and humans, as this history of the house cat shows.

    The Mystery of the Ancient House Cat

    It has taken a while for scientists to piece together the riddle of just when and where cats first became domesticated. One would think that the archaeological record might answer the question easily, but wild cats and domesticated cats have remarkably similar skeletons, complicating the matter. Some clues first came from the island of Cyprus in 1983, when archaeologists found a cat's jawbone dating back 8,000 years. Since it seemed highly unlikely that humans would have brought wild cats over to the island (a "spitting, scratching, panic-stricken wild feline would have been the last kind of boat companion they would have wanted," writes Desmond Morris in Catworld: A Feline Encyclopedia), the finding suggested that domestication occurred before 8,000 years ago.

    In 2004, the unearthing of an even older site at Cyprus, in which a cat had been deliberately buried with a human, made it even more certain that the island's ancient cats were domesticated, and pushed the domestication date back at least another 1,500 years.

    Just last month, a study published in the research journal Science secured more pieces in the cat-domestication puzzle based on genetic analyses. All domestic cats, the authors declared, descended from a Middle Eastern wildcat, Felis sylvestris, which literally means "cat of the woods." Cats were first domesticated in the Near East, and some of the study authors speculate that the process began up to 12,000 years ago.

    Civilization's Pet

    While 12,000 years ago might seem a bold estimate—nearly 3,000 years before the date of the Cyprus tomb's cat—it actually is a perfectly logical one, since that is precisely when the first agricultural societies began to flourish in the Middle East's Fertile Crescent.

    When humans were predominantly hunters, dogs were of great use, and thus were domesticated long before cats. Cats, on the other hand, only became useful to people when we began to settle down, till the earth and—crucially—store surplus crops. With grain stores came mice, and when the first wild cats wandered into town, the stage was set for what the Science study authors call "one of the more successful 'biological experiments' ever undertaken." The cats were delighted by the abundance of prey in the storehouses; people were delighted by the pest control.

    "We think what happened is that the cats sort of domesticated themselves," Carlos Driscoll, one of the study authors, told the Washington Post. The cats invited themselves in, and over time, as people favored cats with more docile traits, certain cats adapted to this new environment, producing the dozens of breeds of house cats known today. In the United States, cats are the most popular house pet, with 90 million domesticated cats slinking around 34 percent of U.S. homes.


    If cats seem ambivalent towards us, as the quotations from cat fan-sites indicate, then it may be a reflection of the wildly mixed feelings humans, too, have shown cats over the millennia.

    The ancient Egyptian reverence for cats is well-known—and well-documented in the archaeological record: scientists found a cat cemetery in Beni-Hassan brimming with 300,000 cat mummies. Bastet, an Egyptian goddess of love, had the head of a cat, and to be convicted of killing a cat in Egypt often meant a death sentence for the offender.

    Ancient Romans held a similar—albeit tempered and secularized—reverence for cats, which were seen as a symbol of liberty. In the Far East, cats were valued for the protection they offered treasured manuscripts from rodents.

    For some reason, however, cats came to be demonized in Europe during the Middle Ages. They were seen by many as being affiliated with witches and the devil, and many were killed in an effort to ward off evil (an action that scholars think ironically helped to spread the plague, which was carried by rats). Not until the 1600s did the public image of cats begin to rally in the West.

    Nowadays, of course, cats are superstars: the protagonists of comic strips and television shows. By the mid-90s, cat services and products had become a billion-dollar industry. And yet, even in our popular culture, a bit of the age-old ambivalence remains. The cat doesn't seem to be able to entirely shake its association with evil: After all, how often do you see a movie's maniacal arch-villain, as he lounges in a comfy chair and plots the world's destruction, stroke the head of a Golden Retriever?

    David Zax, a writer in Washington, D.C., recently wrote a brief history of Wimbledon.
     
  19. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    China is one of the world's four ancient civilizations; here we give a concise overview of more than 5000 years of Chinese history, including the Great Wall and the four great inventions of ancient China. Do you know what they are?

    The written history of China can be said to date back to the Shang Dynasty (1600–1046 BC), over 3,000 years ago. The first dynasty was founded in the 21st century B.C., and China was first unified in 221 B.C.

    Imperial Era

    First Dynasties

    The founding of China's first dynasty, the Xia Dynasty, in the 21st century B.C. marked a change from a primitive society to a slave society. Slave society developed further during the Shang (16th-11th century B.C.) and the Western Zhou (11th century-770 B.C.) Dynasties.
    This era was followed by the Spring and Autumn and Warring States periods (770-221 B.C.), and the transition from the slave society to feudal society.

    The First Emperor

    In 221 B.C., Ying Zheng, a man of great talent and bold vision, ended the rivalry among the independent principalities in the Warring States Period.
    He established the first centralized, unified, multi-ethnic state in Chinese history, under the Qin Dynasty (221-207 B.C.). He called himself Shi Huang Di (the First Emperor), also known as Qin Shi Huang, or First Emperor of the Qin Dynasty.
    During his reign, Qin Shi Huang standardized the script, currencies, and weights and measures, established the system of prefectures and counties, and began construction of the world-renowned Great Wall. He also built a large palace, a mausoleum (guarded by the Terracotta Army), and temporary regal lodges in Xianyang, Lishan, and other places.
    At the end of the Qin Dynasty, Liu Bang, a peasant leader, overthrew the Qin regime in cooperation with Xiang Yu, an aristocratic general. A few years later, Liu Bang defeated Xiang Yu and established the strong Han Dynasty in 206 B.C.

    Han Dynasty

    During the Han Dynasty (206 B.C.-A.D. 220), agriculture, handicrafts, and commerce were well developed. During the reign of Emperor Wudi (Liu Che, 140-87 B.C.), the Han regime reached the period of its greatest prosperity. The multi-ethnic country became more united during the Han regime, which lasted a total of 426 years.
    The emperor conquered the Xiongnu nomads, sent Zhang Qian as an envoy to the Western Regions (Central Asia), and in the process pioneered the route known as the "Silk Road" from the Han capital Chang'an through Xinjiang to Europe.
    One of the Four Beauties of Ancient China, Wang Zhaojun, was married as a ‘political bride’ to a chieftain of the Xiongnu in 33 B.C. Her life and influence created a famous and inspiring story of marriage between the Han and the Xiongnu.

    Middle Dynasties
    The Han Dynasty was followed by the Three Kingdoms Period (220-265) of Wei, Shu, and Wu, and then by the Jin Dynasty (265-420), the Southern and Northern Dynasties (420-589), and the Sui Dynasty (581-618).
    In 618, Li Yuan founded the Tang Dynasty (618-907). Later, Li Shimin (r. 626-649), son of Li Yuan, ascended the throne as Emperor Taizong, considered one of the greatest emperors in Chinese history.
    After the Tang Dynasty, came the Five Dynasties and Ten Kingdoms (907-960).

    Song and Yuan Dynasties

    In 960, General Zhao Kuangyin of the Later Zhou Dynasty rose in mutiny, and founded the Song Dynasty (960-1279).
    In 1206 Genghis Khan unified all the tribes in Mongolia and founded the Mongol Khanate. In 1271, his grandson, Kublai Khan, conquered the Central Plain, founded the Yuan Dynasty (1271-1368), and made Dadu (today's Beijing) the capital.
    During the Song and Yuan dynasties, handicraft industry and domestic and foreign trade boomed. Many merchants and travelers came from abroad. Marco Polo from Venice traveled extensively in China, later describing the country's prosperity in his book ‘Travels’.
    The "four great inventions" of the Chinese people in ancient times, paper making, printing, the compass and gunpowder, were further developed in the Song and Yuan dynasties, and introduced to foreign countries.

    Ming and Qing Dynasties

    In 1368, Zhu Yuanzhang founded the Ming Dynasty (1368-1644) in Nanjing, and reigned as Emperor Taizu. When his son Zhu Di later ascended the throne, he started to build the palace, temples, city walls, and moat in Beijing. In 1421, he officially made Beijing his capital.
    In the late Ming Dynasty, the Manchus in northeast China grew in strength. Under the leadership of Nurhachi and his successors, the Manchus pressed toward the Central Plain over three generations in succession, and finally founded the Qing Dynasty (1644-1911).
    The two most famous emperors of the Qing Dynasty were Emperor Kangxi (r. 1661-1722) and Emperor Qianlong (r. 1735-1796). The Kangxi and Qianlong reign periods were known as the "times of prosperity".
    China was reduced to a semi-colonial and semi-feudal country after the First Opium War of 1840. The Revolution of 1911, led by Dr. Sun Yat-sen, ended the rule of the Qing Dynasty.

    Modern Era

    The Revolution of 1911 is of great significance in modern Chinese history: the monarchical system was discarded with the founding of the provisional government of the Republic of China.
    The victory was soon compromised by concessions on the part of the Chinese bourgeoisie, and the country entered a period dominated by the Northern Warlords, headed by Yuan Shikai.
    Since the founding of the People's Republic of China in 1949, China has entered a new Communist era of stability, and the Reform and Opening Up policies of 1978 ushered in China's phenomenal economic growth.
     
  20. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Terra Cotta Soldiers on the March

    By Arthur Lubow
    Smithsonian Magazine | Subscribe
    July 2009

    In March 1974, a group of peasants digging a well in drought-parched Shaanxi province in northwest China unearthed fragments of a clay figure—the first evidence of what would turn out to be one of the greatest archaeological discoveries of modern times. Near the unexcavated tomb of Qin Shi Huangdi—who had proclaimed himself first emperor of China in 221 B.C.—lay an extraordinary underground treasure: an entire army of life-size terra cotta soldiers and horses, interred for more than 2,000 years.
    The site, where Qin Shi Huangdi's ancient capital of Xianyang once stood, lies a half-hour drive from traffic-clogged Xi'an (pop. 8.5 million). It is a dry, scrubby land planted in persimmon and pomegranate—bitterly cold in winter and scorching hot in summer—marked by dun-colored hills pocked with caves. But hotels and a roadside souvenir emporium selling five-foot-tall pottery figures suggest that something other than fruit cultivation is going on here.

    Over the past 35 years, archaeologists have located some 600 pits, a complex of underground vaults as yet largely unexcavated, across a 22-square-mile area. Some are hard to get to, but three major pits are easily accessible, enclosed inside the four-acre Museum of the Terracotta Army, constructed around the discovery site and opened in 1979. In one pit, long columns of warriors, reassembled from broken pieces, stand in formation. With their topknots or caps, their tunics or armored vests, their goatees or close-cropped beards, the soldiers exhibit an astonishing individuality. A second pit inside the museum demonstrates how they appeared when they were found: some stand upright, buried to their shoulders in soil, while others lie toppled on their backs, alongside fallen and cracked clay horses. The site ranks with the Great Wall and Beijing's Forbidden City as one of the premier tourist attractions within China.

    For those unable to make the journey to Xi'an, some of the choicest specimens unearthed there form the centerpiece of two successive traveling exhibitions that survey the reign of Qin Shi Huangdi (221 B.C.-210 B.C.). "The First Emperor," organized by the British Museum, debuted in London before moving to the High Museum in Atlanta. A second show, "Terra Cotta Warriors," then opened at the Bowers Museum in Santa Ana, California. It is now at the Houston Museum of Natural Science through October 18, and then moves to the National Geographic Society Museum in Washington, D.C. for display from November 19 to March 31, 2010.

    In addition to showcasing recent finds, the exhibitions feature the largest collection of terra cotta figures ever to leave China. The statuary includes nine soldiers arranged in battle formation (armored officers, infantrymen, and standing and kneeling archers), as well as a terra cotta horse. Another highlight is a pair of intricately detailed, ten-foot-long bronze chariots, each drawn by four bronze horses. (Too fragile to be transported, the chariots are represented by replicas.) The artifacts offer a glimpse of the treasures that attract visitors from around the world to the Xi'an museum site, where 1,900 of an estimated 7,000 warriors have been disinterred so far.

    The stupendous find at first seemed to reinforce conventional thinking—that the first emperor had been a relentless warmonger who cared only for military might. As archaeologists have learned during the past decade, however, that assessment was incomplete. Qin Shi Huangdi may have conquered China with his army, but he held it together with a civil administration system that endured for centuries. Among other accomplishments, the emperor standardized weights and measures and introduced a uniform writing script.

    Recent digs have revealed that in addition to the clay soldiers, Qin Shi Huangdi's underground realm, presumably a facsimile of the court that surrounded him during his lifetime, is also populated by delightfully realistic waterfowl, crafted from bronze and serenaded by terra cotta musicians. The emperor's clay retinue includes terra cotta officials and even troupes of acrobats, slightly smaller than the soldiers but created with the same methods. "We find the underground pits are an imitation of the real organization in the Qin dynasty," says Duan Qingbo, head of the excavation team at the Shaanxi Provincial Research Institute for Archaeology. "People thought when the emperor died, he took just a lot of pottery army soldiers with him. Now they realize he took a whole political system with him."
    Qin Shi Huangdi decreed a mass-production approach; artisans turned out figures almost like cars on an assembly line. Clay, unlike bronze, lends itself to quick and cheap fabrication. Workers built bodies, then customized them with heads, hats, shoes, mustaches, ears and so on, made in small molds. Some of the figures appear so strikingly individual they seem modeled on real people, though that is unlikely. "These probably weren't portraits in the Western sense," says Hiromi Kinoshita, who helped curate the exhibition at the British Museum. Instead, they may have been aggregate portraits: the ceramicists, says Kinoshita, "could have been told that you need to represent all the different types of people who come from different regions of China."

    The first emperor's capital, Xianyang, was a large metropolis, where he reportedly erected more than 270 palaces, of which only a single foundation is known to survive. Each time Qin Shi Huangdi conquered a rival state, he is said to have transported its ruling families to Xianyang, housing the vanquished in replicas of palaces they had left behind. At the same time, the emperor directed construction of his tomb complex; some 720,000 workers reportedly labored on these vast projects.
    Upon the death of his father, Yiren, in 246 B.C., the future Qin Shi Huangdi—then a prince named Ying Zheng who was around age 13—ascended the throne. The kingdom, celebrated for its horsemen, sat on the margin of civilization, regarded by its easterly rivals as a semi-savage wasteland. Its governing philosophy was as harsh as its terrain. Elsewhere in China, Confucianism held that a well-run state should be administered by the same precepts governing a family: mutual obligation and respect. Qin rulers, however, subscribed to a doctrine known as legalism, which rested on the administration of punitive laws.
    In his early 20s, Ying Zheng turned for guidance to a visionary statesman, Li Si, who likely initiated many of his sovereign's accomplishments. Under Li's tutelage, Ying Zheng introduced a uniform script (thereby enabling subjects of vastly different dialects to communicate). Standardization, a hallmark of the Qin state, was applied to weaponry as well: should an arrow shaft snap, or the trigger on a repeating crossbow malfunction, the component could be easily replaced. The young ruler also presided over creation of an advanced agricultural infrastructure that incorporated irrigation canals and storage granaries.

    With methodical zeal, Ying Zheng set about conquering the warring states that surrounded him in the late third century B.C. As his armies advanced, principalities fell. No one could thwart consolidation of an empire that eventually stretched from parts of present-day Sichuan in the west to coastal regions along the East China Sea. Having unified the entire civilized world as he knew it, Ying Zheng in 221 B.C. renamed himself Qin Shi Huangdi, translated as First Emperor of Qin.
    He then invested in infrastructure and built massive fortifications. His road network likely exceeded 4,000 miles, including 40-foot-wide speedways with a central lane reserved for the imperial family. On the northern frontier, the emperor dispatched his most trusted general to reinforce and connect existing border barriers, creating a bulwark against nomadic marauders. Made of rammed earth and rubble, these fortifications became the basis for the Great Wall, most of which would be rebuilt in stone and brick during the 15th century A.D. under the Ming dynasty.

    As the grandeur of his tomb complex suggests, Qin Shi Huangdi kept an eye on posterity. But he also longed to extend his life on earth—perhaps indefinitely. Alchemists informed the emperor that magical herbs were to be found on what they claimed were three Islands of the Immortals in the East China Sea. The emissaries most likely to gain entry to this mystical realm, they asserted, were uncorrupted children; in 219 B.C., Qin Shi Huangdi reportedly dispatched several thousand youngsters to search for the islands. They never returned. Four years later, the emperor sent three alchemists to retrieve the herbs. One of them made it back, recounting a tale of a giant fish guarding the islands. Legend has it that the first emperor resolved to lead the next search party himself; on the expedition, the story goes, he used a repeating crossbow to kill a huge fish. But instead of discovering life-preserving elixirs, the emperor contracted a fatal illness.

    As he lay dying in 210 B.C., 49-year-old Qin Shi Huangdi decreed that his estranged eldest son, Ying Fusu, should inherit the empire. The choice undercut the ambitions of a powerful royal counselor, Zhao Gao, who believed he could govern the country behind the scenes if a more malleable successor were installed. To conceal Qin Shi Huangdi's death—and disguise the stench of a decomposing corpse—until the travelers returned to the capital, Zhao Gao took on a cargo of salted fish. The delaying tactic worked. Once Zhao Gao managed to return to Xianyang, he was able to operate on his home turf. He managed to transfer power to Ying Huhai, a younger, weaker son.
    Ultimately, however, the scheme failed. Zhao Gao could not maintain order and the country descended into civil war. The Qin dynasty outlived Qin Shi Huangdi by only four years. The second emperor committed suicide; Zhao Gao eventually was killed. Various rebel forces coalesced into a new dynasty, the Western Han.

    For archaeologists, one indicator that Qin rule had collapsed suddenly was the extensive damage to the terra cotta army. As order broke down, marauding forces raided the pits where clay soldiers stood guard and plundered their real weapons. Raging fires, possibly set deliberately, followed the ransacking, weakening support pillars for wooden ceilings, which crashed down and smashed the figures. Some 2,000 years later, archaeologists discovered charring on the walls of one pit.

    Throughout recorded Chinese history, the first emperor's Ebang Palace—its site on the Wei River, south of ancient Xianyang, was not investigated until 2003—was synonymous with ostentation. The structure was said to have been the most lavish dwelling ever constructed, with an upper-floor gallery that could seat 10,000 and a network of covered walkways that led to distant mountains to the south.
    "All Chinese people who can read, including middle- school students, believed that the Qin dynasty collapsed because it put so much money into the Ebang Palace," says archaeologist Duan. "According to excavation work from 2003, we found it was actually never built—only the base. Above it was nothing." Duan says that if the palace had been erected and demolished, as historians thought, there would be potsherds and telltale changes in soil color. "But tests found nothing," says Duan. "It is so famous a symbol of Chinese culture for so long a time, showing how cruel and greedy the first emperor was—and archaeologists found it was a lie." Duan also doubts accounts of Qin Shi Huangdi's expedition for life-prolonging herbs. His version is more prosaic: "I believe that the first emperor did not want to die. When he was sick, he sent people to find special medicines."

    The emperor's tomb lies beneath a forested hill, surrounded by cultivated fields about a half-mile from the museum. Out of reverence for an imperial resting place and concerns about preserving what might be unearthed there, the site has not been excavated. According to a description written a century after the emperor's death, the tomb contains a wealth of wonders, including man-made streambeds contoured to resemble the Yellow and Yangtze rivers, flowing with shimmering, quicksilver mercury that mimics coursing water. (Analysis of soil in the mound has indeed revealed a high level of mercury.)
    Yet answers about the tomb are not likely to emerge anytime soon. "I have a dream that one day science can develop so that we can tell what is here without disturbing the emperor, who has slept here for 2,000 years," says Wu Yongqi, director of the Museum of the Terracotta Army. "I don't think we have good scientific techniques to protect what we find in the underground palace. Especially if we find paper, silk or textiles from plants or animals; it would be very bad if they have been kept in a balanced condition for 2,000 years, but suddenly they would vanish in a very short time." He cites another consideration: "For all Chinese people, he is our ancestor, and for what he did for China, we cannot unearth his tomb just because archaeologists or people doing tourism want to know what is buried there."

    Whatever future excavations reveal about Qin Shi Huangdi's enigmatic nature, some things seem unlikely to change. The emperor's importance as a seminal figure of history won't be diminished. And the mysteries that surround his life will likely never be completely resolved.
     
  21. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Ann Landers was a pen name created by Chicago Sun-Times advice columnist Ruth Crowley in 1943 and taken over by Eppie Lederer in 1955.[1] For 56 years, the Ask Ann Landers syndicated advice column was a regular feature in many newspapers across North America. Due to this popularity, 'Ann Landers', though fictional, became something of a national institution and cultural icon.


    Ruth Crowley: the original 'Ann Landers' (1943–1955)

    The creator of the 'Ann Landers' pseudonym was Ruth Crowley, a Chicago nurse who had been writing a child-care column for the Sun since 1941. She chose the pseudonym at random, borrowing the surname 'Landers' from a family friend, in order to prevent confusion between her two columns. Unlike her eventual successor Esther Lederer, Crowley kept her identity as Landers secret, even enjoining her children to help her keep it quiet.[2] Crowley took a three-year break from writing the column from 1948 until 1951. After 1951 she continued the column for the Chicago Sun-Times, and from that year it was syndicated to 26 other newspapers,[2] until her death, aged 48, on July 20, 1955. Crowley spent a total of nine years writing advice as 'Ann Landers'.

    Interim writers (July–October 1955)
    In the three-month period after Crowley’s death, various writers, including Connie Chancellor, took over the column.[2]

    The Esther Lederer years (1955–2002)

    By including expert advice from authorities, which none of her competitors did, Lederer won a contest to become the new writer of the column, debuting on October 16, 1955.[3] The column opened with a letter from a "Non-Eligible Bachelor," who despaired of getting married. Her advice was, "You're a big boy now...don't let spite ruin your life."[4] Lederer went on to advise thousands of other readers over the next several decades. Eventually, she became owner of the copyright. She chose not to have a different writer continue the column after her death, so the "Ann Landers" column ceased after publication of the few weeks' worth of material which she had written before her death.

    Sometimes she expressed unpopular opinions. She repeatedly favored legalization of prostitution and was pro-choice, yet denounced atheist Madalyn Murray O'Hair.[6] In 1973, she wrote in support of the legalization of homosexual acts, saying that she had been "pleading for compassion and understanding and equal rights for homosexuals" for 18 years,[7] and in 1976 she wrote that she had "fought for the civil rights of homosexuals 20 years ago and argued that they should be regarded as full and equal citizens." Nevertheless, for years she described homosexuality as "unnatural," a "sickness," and a "dysfunction."[9][10][11] Not until 1992 did she reverse her opinion, and even then only after reviewing research and receiving nearly 75,000 letters from gays and lesbians saying that they were happy being gay. She wrote that "it is my firm conviction that homosexuality is not learned behavior," adding that while being gay could be suppressed, it could not be altered.
    Even so, in 1996, she wrote regarding gay marriage, "Before you gay-rights folks land on me with both feet ... I cannot support same-sex marriage, however, because it flies in the face of cultural and traditional family life as we have known it for centuries."

    In 1995, Eppie commented thus in The New Yorker about Pope John Paul II: "He has a sweet sense of humor. Of course, he's a Polack. They're very anti-women." Polish-Americans responded with outrage. She issued a formal apology, but refused to comment further. The Milwaukee Journal Sentinel canceled her column after that incident. In that same article, she had noted that President John F. Kennedy's father, Joseph P. Kennedy, Sr., was anti-Semitic.

    A 1995 "Ann Landers" column said, "In recent years, there have been reports of people with twisted minds putting razor blades and poison in taffy apples and Halloween candy. It is no longer safe to let your child eat treats that come from strangers." The vague warning was criticized for causing fear dishonestly, as there have been no documented cases of children receiving poisoned candy during door-to-door Halloween trick-or-treating.

    In her March 28, 1965, column, regarding ownership of wedding gifts, Lederer wrote that "the wedding gifts belong to the bride." She went on to state that the bride should "consult a lawyer about the checks. In some states this could be considered community property." The advice was mistaken, because only gifts given after the marriage would be considered community property in some states (or, alternatively, because wedding gifts, if so designated, can be considered back-dated gifts to the bride). The column has provided teaching material for law professors and law students.

    In a 1996 column, she "informed" her readers that they should avoid throwing rice at weddings, lest birds eat it and explode. Such advice was erroneous, as milled rice is not harmful to birds.


    Annie's Mailbox
    After Lederer died in June 2002, her last column ran July 27. Lederer's daughter Margo Howard (who wrote Dear Prudence) said the column would end according to Lederer's wishes. Creators Syndicate President Rick Newcombe said Lederer's editors, Kathy Mitchell and Marcy Sugar, should start a column of their own. Though Mitchell and Sugar were reluctant, many readers wanted the column to continue. Thus began the "Annie's Mailbox" column in approximately 800 newspapers. Newspapers were given three possible choices: classic Ann Landers, Annie's Mailbox, and Dear Prudence. "Annie's Mailbox" is still syndicated in numerous newspapers throughout the US.

    Dear Abby
    A few months after Eppie Lederer took over as Ann Landers, her twin sister Pauline Phillips introduced a similar column, Dear Abby, which produced a lengthy estrangement between the two sisters. Pauline Phillips wrote her column until her retirement in 2002 when her daughter, Jeanne Phillips, took over.

    Further reading
    Howard, Margo. Eppie: The Story of Ann Landers. New York: Putnam, 1982. ISBN 0-399-12688-0.
    Pottker, Janice, and Bob Speziale. Dear Ann, Dear Abby: The Unauthorized Biography of Ann Landers and Abigail Van Buren. New York: Dodd, Mead, 1987. ISBN 0-396-08906-2.
    Aronson, Virginia. Ann Landers and Abigail Van Buren. Women of achievement. Philadelphia: Chelsea House Publishers, 2000. ISBN 0-7910-5297-4. (children's book).
    Landers, Ann, and Margo Howard. A Life in Letters: Ann Landers' Letters to Her Only Child. New York, NY: Warner Books, 2003. ISBN 0-446-53271-1.
    Gudelunas, David. Confidential to America: Newspaper Advice Columns and Sexual Education. Edison, NJ: Transaction, 2007. ISBN 1-4128-0688-7.
    Rochman, Sue. "Dear Ann Landers." CR Magazine, Fall 2010. (magazine profile)
     
  22. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Al Feldstein: 1925–2014 by Jerry Whitworth

    On April 29, 2014, the United States lost a pop culture icon. Al Feldstein passed away in his home in Livingston, Montana, and while his may not be a household name, his contributions to our culture are significant. Discovering a talent for art at a young age, Feldstein worked as a teenager for Will Eisner and Jerry Iger as part of their Art Syndication Company, which provided comics for various publishers including Editors Press Service, Fox Comics, and Quality Comics. Some of Feldstein's earliest published art was background work for Sheena, Queen of the Jungle. When he was old enough, however, he enlisted in the Air Force to serve in World War II. As with Stan Lee and Will Eisner, Feldstein's talent as an artist brought him to Special Services, where he produced comic strips and helped paint and decorate planes. He returned from the war as a freelancer working mostly for Fox. When that publisher seemed to be on the way out, Feldstein approached Bill Gaines for work at EC Comics.
    At the time, EC Comics was in a transitional period after its founder, Max "M.C." Gaines, died tragically in a boating accident. Gaines' son William (better known as Bill), a military veteran going to school to become a chemistry teacher, inherited his father's company Educational Comics, which specialized in adapting Bible stories into comic books. Bill revised the publisher into Entertaining Comics, producing work in various genres like horror, crime fiction, science fiction, and so on. Instrumental in this effort were Al Feldstein and Harvey Kurtzman, who came on as editors, the former taking on the lion's share of the work editing and writing for seven of these titles. The result was dynamite. The company nurtured new writing talent (including Harlan Ellison's first published work) unshackled by the restraints of other publishers, and the artists were encouraged to develop their own distinctive styles. This environment allowed the editors to take assignments and hand them out to those they felt best complemented the concept. Fans ate the books up, with EC Comics blowing virtually every other publisher out of the water. It wouldn't be long, though, before people in power found the books to be dangerous to young people. Simply put, EC tackled the persisting problems within the country's society that no other medium at the time dared mention, be it drugs, racism, rape, domestic abuse, police brutality, and on and on. What did in the company, however, was the graphic nature of its art and its portrayal of gore and eroticism. Psychiatrist Fredric Wertham published a book connecting comic books to juvenile delinquency, which led to a public and political outcry against comics (including book burnings). EC Comics was all but dismantled in the aftermath.
    Following the backlash against the comic book industry, new self-regulated guidelines were put in place to prevent the collapse of the companies that survived. EC Comics tried to abide by the new guidelines but soon learned that those in power exploited the system toward their own ends. The final straw was an EC story presenting an allegory on racism, whose reveal of an African-American protagonist the Comics Code Authority wanted to suppress. That story, "Judgment Day" by Al Feldstein, helped motivate the publisher to produce magazines instead (which were exempt from the CCA). Mad, a humor comic largely produced by Harvey Kurtzman, was chosen to be turned into a magazine, a decision motivated both by the CCA and by Kurtzman's desire to move toward magazine work. As EC moved in this direction, Feldstein was released, but after a year Kurtzman left Mad and Feldstein returned to become the magazine's editor. Since Kurtzman had provided the overwhelming majority of the work on Mad, his departure left a virtually clean slate for Feldstein, who brought on Don Martin and Frank Jacobs to produce the title. Feldstein led Mad to become the most popular humor magazine on the planet. At its zenith, the magazine circulated millions of copies per issue. It's been said Mad helped shape the youth of our nation toward viewing the government (and the establishment) critically, be it by questioning wars, calling out political corruption, or realizing that big media was controlling the perceptions of its audience. Through the use of humor and parody of pop culture, Mad was an entertaining product that also made you think (something seemingly becoming rarer every day). Feldstein guided the ship at Mad for nearly three decades before retiring in 1984.

    Having worked for the legendary Will Eisner and been a guiding force for two of the biggest publication ventures in American history, Al Feldstein took up oil painting and moved to the West. Settling down in Montana with his wife Michelle, the couple ran a guest house at a horse and llama ranch just north of Yellowstone National Park. Feldstein spent his final decades doing what he loved: painting scenes of nature, animals, Western lore (of cowboys and Native Americans), and fantasy. His art was featured in dozens of galleries and is actively sought by collectors. In 2003, he was inducted into the Will Eisner Award Hall of Fame, and in 2011 he earned the Bram Stoker Lifetime Achievement Award from the Horror Writers Association. As can be inferred, you can't properly quantify the contributions Feldstein and his work at EC Comics made to our culture. From kicking the hornet's nest of what was considered acceptable content for the youth of America to expanding the minds and perceptions of the public, Feldstein's work fired some of the earliest shots in the war on ignorance (purposeful or otherwise) in our society. Feldstein, an admitted liberal Democrat, took both pleasure and joy in his contributions toward waking the public up to what was going on around them while taking no sides politically (lampooning both sides of the aisle equally). As a creative force and a scholar in the study of man, Al Feldstein will be missed, and his work is a gift that the world can enjoy for the rest of eternity.
     
  23. StrangerInAStrangeLand SubQuantum Mechanic Valued Senior Member

    Messages:
    15,396
    Ronald Ebens, the man who killed Vincent Chin, apologizes 30 years later
    June 22, 2012 10:56 PM


    After 30 years, the killer of Asian American icon Vincent Chin told me in an exclusive interview that the murder, known as a hate crime, wasn't about race, and that he doesn't even remember hitting Chin with a baseball bat.

    Incredible as that sounds, there is one thing Ronald Ebens is clear about.

    Ebens, who pleaded guilty to manslaughter but spent no time in prison for the act, is sorry for the beating death of Vincent Chin on June 19, 1982, in Detroit, even though for many Asian Americans he can't say sorry enough.

    For years, Ebens has been allowed to live his life quietly as a free man.

    With the arrival of the 30th anniversary this month--and after writing about the case for years--I felt the need to hear Ebens express his sorrow with my own ears, so that I could put the case behind me.

    So I called him up. And he talked to me.

    On the phone, Ebens, a retired auto worker, said killing Chin was "the only wrong thing I ever done in my life."

    Though he received probation and a fine, and never served any time for the murder, Ebens says he's prayed many times for forgiveness over the years. His contrition sounded genuine over the phone.

    "It's absolutely true, I'm sorry it happened and if there's any way to undo it, I'd do it," said Ebens, 72. "Nobody feels good about somebody's life being taken, okay? You just never get over it. . .Anybody who hurts somebody else, if you're a human being, you're sorry, you know."

    Ebens said he'd take back that night if he could "a thousand times," and that after all these years, he can't put the memory out of his mind. "Are you kidding? It changed my whole life," said Ebens. "It's something you never get rid of. When something like that happens, if you're any kind of a person at all, you never get over it. Never."

    Ebens' life has indeed changed. As a consequence of the Chin murder, Ebens said, he lost his job and his family, and he scraped by from one low-wage job to the next to make ends meet. Ultimately, he remarried and sought refuge in Nevada, where he has been retired for eight years, owns a home, and lives paycheck to paycheck on Social Security. His current living situation makes recovery of any part of the millions of dollars awarded to Chin's heirs in civil proceedings highly unlikely.

    The civil award, with interest, has grown to around $8 million.

    "It was ridiculous then, it's ridiculous now," Ebens said with defiance.

    His life hasn't been easy the last 30 years. But at least he's alive. He watches a lot of TV, he said, like "America's Got Talent."

    "They've got good judges," he said.

    Sort of like the judges he got in his case? Like Judge Charles Kaufman, the Michigan judge who sentenced him to probation without notifying Chin's attorneys, virtually assuring Ebens would never serve time for the murder?

    Ebens didn't want to comment on that.

    For all the time he spends in front of the television, Ebens said he has never seen either of the two documentaries that have been made on the case, and said he made a mistake speaking to one of the filmmakers. Even for this column, Ebens showed his reluctance to be interviewed.

    But he finally consented to let me use all his statements because I told him I would be fair. I'm not interested in further demonizing Ronald Ebens. I just wanted to hear how he deals with being the killer of Vincent Chin.

    For three decades, the Chin case has been a driving force that has informed the passion among activists for Asian American civil rights. Some still feel there was no justice even after the long legal ordeal, which included: 1) the state murder prosecution, in which Ebens and his stepson, Michael Nitz, were allowed to plea bargain second-degree murder charges down to manslaughter, given three years' probation, and fined $3,720; 2) the first federal prosecution on civil rights charges, which ended in a 25-year sentence for Ebens; 3) the subsequent appeal by Ebens to the Sixth Circuit, which was granted; 4) the second federal trial, which was moved from Detroit to Cincinnati and ended in Ebens' acquittal.

    Add it all up, and it seems a far cry from justice. One man dead. Perps go free. I thought that maybe Ebens could help me understand how he got justice and not Vincent Chin.

    I asked him about his side of the story, which was a key dispute in the court testimony about how it all started at the Fancy Pants strip club.

    "It should never have happened," said Ebens. "[And] it had nothing to do with the auto industry or Asians or anything else. Never did, never will. I could have cared less about that. That's the biggest fallacy of the whole thing."

    That night at the club, after some harsh words were exchanged, Ebens said Chin stood up and came around to the other side of the stage. "He sucker-punched me and knocked me off my chair. That's how it started. I didn't even know he was coming," Ebens said.

    Chin's friends testified that Ebens made racial remarks, mistaking Chin to be Japanese. And then when Chin got into a shoving match, Ebens threw a chair at him but struck Nitz instead.

    But Ebens' version, that there was no racial animosity or epithets, is actually supported by testimony from Chin's friend Jimmy Choi, who apologized to Ebens for Chin's behavior, which included throwing a chair and injuring Nitz.

    What about the baseball bat and how Ebens and Nitz followed Chin to a nearby McDonald's?

    Ebens said that when all parties were asked to leave the strip club, they ended up out in the street. It's undisputed that Chin egged Ebens on to keep fighting.

    "The first thing he said to me is 'You want to fight some more?'" Ebens recalled. "Five against two is not good odds," said Ebens, who declined to fight.

    Then later, when Chin and his friends left, Ebens' stepson went to get a baseball bat from his car. (Ironically, it was a Jackie Robinson model.) Ebens said he took it away from Nitz because he didn't want anyone taking it from him and using it on them.

    But then Ebens said his anger got the best of him and he drove with Nitz to find Chin, finally spotting him at the nearby McDonald's.

    "That's how it went down," Ebens said. "If he hadn't sucker punched me in the bar...nothing would have ever happened. They forced the issue. And from there after the anger built up, that's where things went to hell."

    Ebens calls it "the gospel truth."

    But he says he's cautious speaking now because he doesn't want to be seen as shifting the blame. "I'm as much to blame," he sadly admitted. "I should've been smart enough to just call it a day. After they started to disperse, [it was time to] get in the car and go home."

    At the McDonald's where the blow that led to Chin's death actually occurred, Ebens' memory is more selective. To this day, he even wonders whether he hit Chin with the bat. "I went over that a hundred, maybe 1,000 times in my mind the last 30 years. It doesn't make sense of any kind that I would swing a bat at his head when my stepson is right behind him. That makes no sense at all."

    And then he quickly added, almost wistfully, "I don't know what happened."

    Another time in the interview, he admitted his memory may be deficient. "That was really a traumatic thing," he told me about his testimony. "I hardly remember even being on the stand."

    He admitted that everyone had too much to drink that night. But he's not claiming innocence.

    "No," Ebens said. "I took my shot in court. I pleaded guilty to what I did, regardless of how it occurred or whatever. A kid died, OK. And I feel bad about it. I still do."

    Ebens told me he has Asian friends where he lives, though he didn't indicate if he shares his past with them. When he thinks about Chin, he said no images come to mind.

    "It just makes me sick to my stomach, that's all," he said, thinking about all the lives that were wrecked, both Chin's and his own.

    By the end of our conversation, Ebens still wasn't sure he wanted me to tell his story. "It will only alienate people," he said. "Why bother? I just want to be left alone and live my life."

    But I told him I wouldn't judge. I would just listen, and use his words. I told him it was important for the Asian American community's healing process to hear a little more from him than a one-line "I'm sorry."

    He ultimately agreed. One line doesn't adequately explain another human being's feelings and actions. I told him I would paint a fuller picture.

    So now we've heard what Ebens has to say 30 years later. I don't know from a phone conversation whether he's telling me the truth. Nor do I know if I'm ready to forgive him. But I heard from him. And now that I have, I can deal with how the justice system failed Vincent Chin, and continue to help in the fight to ensure it never happens again.

    ***

    For more information, read the pivotal 6th Circuit federal appeals court decision, which sent the case back for a new trial. See also Remembering Vincent Chin.



    ***
    (This blog post was revised on June 27, 2012, to include additional information from the Ebens interview. It was originally published on June 22, 2012. You can also read Emil's blog post on the 29th anniversary of Vincent Chin's death here, when he first contacted Ebens for an interview.)



    ***
    Updates at www.amok.com. Follow Emil on Twitter, @emilamok.
     
