Ramblings



          

In 1594, royal physician Roderigo Lopez was implicated in a plot to assassinate Queen Elizabeth. He was subsequently convicted, tortured, and executed. During his interrogation, it was discovered that Lopez was Jewish. A Jew in England? Officially this was impossible--the Jews had been expelled in 1290 and were not readmitted until 1655. The Lopez case raised questions about religious tolerance, and the general interest in the Jewish presence in England was reflected on the stages of its theaters, most notably in William Shakespeare's Merchant of Venice.

Days after Lopez's execution, The Jew of Malta, a play by Shakespeare's colleague and rival Christopher Marlowe, was back on the London stage. Barabas, Marlowe's cartoonish villain, is a caricature of Jewish inhumanity and the epitome of the non-Christian outsider. In a fit of anger over his daughter's betrayal, Barabas poisons a convent's water supply. Like Marlowe, Shakespeare created a character who embodies anti-Semitic stereotypes. Two years after the Lopez case, Merchant of Venice premiered with Shylock as its central character, a moneylender who adheres to Jewish dietary restrictions and refuses to socialize with the Christians who ridicule and deride him. Most damning, he embodies the stereotype of Jewish avarice when he wanders through the streets confusing the loss of his daughter with that of his ducats (Venetian gold coins): "O my ducats! O my daughter!"

Despite playing to English prejudices against Jews as greedy outsiders, Shakespeare's portrayal is far more nuanced than Marlowe's. Shylock both sets himself apart from the Christians and attempts to establish a shared humanity with them. "Hath not a Jew eyes?" he asks. "If you prick us, do we not bleed?" While his attempts to find common ground between the religions fail, they succeed in making him a character who becomes unexpectedly human. By creating a Jewish character who, against the odds, gains our sympathy, Shakespeare raises his play beyond mere parody as he asks fundamental questions about religion, identity, and humanity.

Hannah Arendt (1906-1975), a German-born Jewish-American political philosopher, was hired by The New Yorker magazine in 1961 to cover the trial of Adolf Eichmann--head of the Gestapo's department for Jewish affairs and the man in charge of transporting Jews to Nazi death camps. Her reports on the trial, later expanded into a book, Eichmann in Jerusalem: A Report on the Banality of Evil, make the controversial claim that Eichmann was not radically evil but an ordinary man. She argues that evil emerges from the banality of ordinary men who are unwilling to think about what they are doing.

While Arendt never doubts Eichmann's complicity in and responsibility for Hitler's Final Solution, she is struck by the fact that he seemed so "terribly and terrifyingly normal." Far from being a fanatic or ideologue, he was someone who worked to provide for his family. Ambition, rather than hatred, drove Eichmann to join the S.S. How could such a typical bureaucrat organize the transportation of millions to death camps? The law, Arendt argues, was an essential condition of the Holocaust. In sending victims to death camps, Eichmann was not simply obeying orders, as were the Nazi generals tried at Nuremberg, but obeying laws. In the Nazi regime, where the will of the Führer was law, Eichmann could justify his actions as those of a law-abiding citizen. The "banality of evil" is Arendt's attempt to characterize how normal citizens can do great evil simply by obeying the law.
It was Eichmann's thoughtlessness--his faith that the legality of an action obviates the need to think about its rightness--that allowed his complicity in monstrous deeds. Similarly, it is thoughtless obedience to immoral laws, and the acceptance of our powerlessness to oppose them, that is the greatest threat to human freedom. The only protection against tyranny, Arendt argues, is thinking: the continual assertion of our freedom, or at least partial freedom, to judge for ourselves.

From his earliest years, Marvin Minsky, often called the father of artificial intelligence, loved to build machines. Frustrated by the limits of his toys, the young engineer began dismantling his father's optical equipment until the eye surgeon was forced to buy his son a truckload of junkyard parts. Minsky's talent for tinkering and his brief exposure to medical machinery helped shape his lifelong ambition: to create a device capable of thought.

While a graduate student at Princeton in 1951, Minsky built the first primitive neural network learning machine from surplus World War II army equipment, including 400 vacuum tubes and 40 magnetic clutches. Unlike traditional computers, which must follow a specific sequence of programmed instructions to complete a task, Minsky's machine was designed to attempt several approaches to solving a problem. When the network performed a task successfully, the connections responsible were reinforced so they could be recalled instantly, teaching the network to recognize patterns of effective behavior. In effect, the machine was teaching itself, and it was smart enough to simulate a rat learning its way through a maze.

Minsky's network laid the foundation for much of the later work on neural nets, including his own influential 1969 analysis of the simple learning machines known as perceptrons. Today, neural networks used in fraud detection, credit approval, and target marketing trace their origins to Minsky's pioneering work. Long affiliated with the Massachusetts Institute of Technology, where he is now a professor, Minsky has watched his machines advance fields as diverse as robotics, computer vision, speech recognition, and parallel processing. While he still hasn't achieved his most elusive goal--a machine that thinks like a human--computers and other devices are closer to human reasoning than ever, thanks to his efforts.

In the 1830s, British mathematician and inventor Charles Babbage (1792-1871) designed a machine that functioned, for all intents and purposes, like a modern computer. He called it the analytical engine, and it was theoretically capable of performing any mathematical calculation. Babbage never managed to build the device, however, and his work was largely forgotten until the twentieth century, when pioneers of modern computing, among them mathematician Alan Turing, revisited the ideas Babbage had anticipated. Today, Babbage's analytical engine is considered the computer's earliest direct ancestor.

Babbage began working on the analytical engine in 1834. By 1836, he had designed its key mechanical features: a processing unit, which he called the mill, where calculations were performed; a storage unit, called the store, which held the numbers being worked on and the intermediate results; and a punch-card system for feeding in instructions and data and for printing results--much like the punch cards used in early mainframe computers.
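To make the mill/store/card division concrete, here is a minimal sketch in Python. The instruction format is invented for this illustration and is not Babbage's actual card notation; it simply treats the store as numbered columns of values, the mill as one arithmetic operation per card, and each card as naming an operation, its two input columns, and the column for the result.

    # Toy "analytical engine": cards carry the procedure, the store carries
    # the numbers, and the mill does the arithmetic.
    def run_engine(cards, store):
        mill = {
            "ADD": lambda a, b: a + b,
            "SUB": lambda a, b: a - b,
            "MUL": lambda a, b: a * b,
        }
        for op, in1, in2, out in cards:
            store[out] = mill[op](store[in1], store[in2])
        return store

    # Example "program": compute (v0 + v1) * v2, leaving the answer in column 4.
    store = {0: 5, 1: 7, 2: 3, 3: 0, 4: 0}
    cards = [("ADD", 0, 1, 3),   # v3 = v0 + v1
             ("MUL", 3, 2, 4)]   # v4 = v3 * v2
    print(run_engine(cards, store)[4])   # prints 36

The division of labor is the point: the cards hold the procedure, the store holds the data, and the mill performs the operations, anticipating the later split among program, memory, and processor.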
Lady Ada Lovelace--a self-taught mathematician, daughter of the poet Lord Byron, and longtime friend of Babbage--actually wrote a punch-card program for the analytical engine, prompting many to consider her the first computer programmer. Had Babbage succeeded in building his engine, Lovelace's program would have instructed it to calculate Bernoulli numbers, a sequence of numbers defined by a complex formula and named for the Swiss mathematician Jakob Bernoulli.

Just why Babbage never built his engine remains a matter of debate. For one thing, he revised his plans compulsively and was always going back to the drawing board. Some researchers have argued that the engine's design eventually grew so complex that it outstripped the engineering capabilities of the mid-1800s, but others disagree. Babbage certainly had trouble getting government funding, in part because British politicians couldn't understand how his machine might be used. Whatever the case, Babbage died in 1871, frustrated and ignored.

On November 22, 1963, Secret Service agent Clint Hill was riding on the running board of a limousine behind that of President John F. Kennedy in a motorcade through downtown Dallas. Hearing what sounded like gunfire, Hill sprinted toward the President and Mrs. Kennedy and--in 3.7 seconds--flung himself over them. Though a fraction of a second too late, Hill's remarkable feat became a model for how agents should react in a crisis. (Ironically, Hill blamed himself for not being faster and suffered a nervous breakdown that forced him to retire.) During the investigation of the assassination, other agents tried to duplicate Hill's astonishing performance. None came close. The simple reason: adrenaline.

Adrenaline is called the fight-or-flight hormone. Under normal conditions, only minuscule amounts of adrenaline (also called epinephrine) circulate in the bloodstream. But sudden fear or stress triggers a flood of adrenaline, bringing out the Superman inside. The brain, registering an emergency, signals the adrenal glands, situated on top of the kidneys, to release a surge of their powerful product. In a fraction of a second, the heart starts pumping faster. Extra blood is funneled to the brain to increase alertness, permitting lightning-fast decisions. Blood is also rerouted to the muscles, strengthening the limbs. Blood sugar levels soar, energizing cells. Breathing quickens to take in much-needed oxygen. Sweat pours from the body to cool it. The pupils dilate, sharpening vision. Whatever the threat, the body is now primed to fight it off or flee from it. After the crisis has passed, adrenaline production switches off and the body's systems gradually return to normal. For the next few days, however, some jitteriness persists: the brain remains on high alert--just in case.

In 1946, Percy Spencer, a self-taught engineer who never graduated from high school, accidentally melted a candy bar in his pocket while testing a new radar system's magnetron, a power tube that drives a radar set using microwaves. Intrigued, Spencer brought some unpopped popcorn kernels to the magnetron and watched them pop all over the laboratory. The idea for the microwave oven was born. By late 1946, Spencer's employer, Raytheon, had filed a patent proposing that microwaves be used to cook food.
The patent was approved, and a year later the first primitive commercial microwave oven, the 1161 Radarange (named for its roots in radar technology), hit the market. It stood five and a half feet tall, weighed over 750 pounds, and cost $5,000. The unit even had its own plumbing, because water was needed to cool its magnetron tubes.

The oven wasn't well received. Price and size were big deterrents, and the unfamiliar radiating process scared people. A microwave oven cooks food by means of high-frequency electromagnetic waves that are absorbed by water, fats, sugars, and other molecules, causing them to vibrate against one another and safely creating heat. Even so, rumors about food becoming radioactive persisted for years.

Because microwave heating occurs inside the food, rather than by warming the surrounding air, cooking time is much shorter, and the food industry soon realized the microwave's potential for speeding up food service. In the mid-1950s, Tappan shipped the first home microwave ovens, which cost $1,295, and Amana later introduced the first countertop model for about $500. By 1975, sales of microwave ovens exceeded those of gas ranges, and by 1976 the microwave had become a more commonly owned kitchen appliance than the dishwasher, installed in nearly 52 million US households. Today, Percy Spencer's creation melts chocolate and pops popcorn in kitchens worldwide.
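As a rough illustration of why depositing energy directly into the food makes cooking so much faster, the back-of-envelope estimate below (in Python) works out the time needed to heat a cup of water. The 1,000-watt rating and 70 percent absorption efficiency are assumed values chosen for the example, not specifications of any particular oven.

    # Rough estimate: time for a microwave to heat a cup of water.
    # Assumed values (not from the article): 1000 W oven, about 70% of the
    # power absorbed by the water, 250 g of water raised from 20 C to 90 C.
    SPECIFIC_HEAT_WATER = 4186          # joules per kilogram per degree C
    mass_kg = 0.25                      # roughly one cup of water
    temperature_rise = 90 - 20          # degrees C
    power_absorbed = 1000 * 0.70        # watts actually reaching the water

    energy_needed = mass_kg * SPECIFIC_HEAT_WATER * temperature_rise   # joules
    seconds = energy_needed / power_absorbed
    print(f"about {seconds:.0f} seconds")    # roughly 105 seconds

Because nearly all of the energy goes into the water itself rather than into heating air and cookware, the estimate comes out to well under two minutes, consistent with the speed advantage described above.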
Posted on: Tue, 28 Oct 2014 12:56:18 +0000
