Curio Cabinet / Person, Place, or Thing
-
US History
It inspired Abraham Lincoln’s most famous speech, countless documentaries, and even quite a few ghost stories. The Battle of Gettysburg, which ended on this day in 1863, is one of the most famous battles in American history and by far the bloodiest of the American Civil War with more than 50,000 estimated casualties. It was also a military campaign that didn’t go the way anyone was expecting it to. In fact, the Battle of Gettysburg became the war’s crucial turning point, unforeseen by both the Union and the Confederacy.
On May 6, 1863, Confederate General Robert E. Lee won his greatest military victory in the Battle of Chancellorsville, in Spotsylvania County, Virginia. Although Lee’s forces had been badly outnumbered, he triumphed thanks to his sharp tactical skills, leading many to believe that the Confederacy would win the war. In high spirits, Lee decided to lead his army northward into Pennsylvania to launch an invasion of Union territory, north of the Mason-Dixon line. Lee believed that a successful invasion of the North would cause panic in cities like New York, demoralize the Union, and pave the way for the Confederacy to take control of Washington, D.C., bringing the war to a quick end. Meanwhile, Union Major General Joseph Hooker was relieved of his command after his defeat at Chancellorsville, and Major General George Gordon Meade was placed in charge of around 90,000 Union troops. Under Meade’s leadership, they began moving to block Lee’s route to Washington, drawing very close to Lee’s troops in the process. Though both sides knew that the other could be nearby, they had no idea just how close they were as they both neared the town of Gettysburg, Pennsylvania. The situation was a powder keg.
On the morning of July 1st, Confederate Major General Henry Heth disobeyed orders from Lee and led a division into Gettysburg on a supply run. To his surprise, he ended up bumping right into a group of Union cavalry, and a skirmish broke out immediately. If not for Heth’s blunder, the Battle of Gettysburg might never have taken place, or might have ended differently. When later confronted about his ill-advised trip into town, Heth claimed that he and his troops desperately needed shoes, and he had no choice but to disobey Lee in order to get them. In any case, the relatively small fight took a grave turn when Union reinforcements under the command of Major General John F. Reynolds arrived in town. Reynolds was killed in the fighting, and both sides began calling for more reinforcements. Soon, tens of thousands of troops had taken to the battlefield. By nightfall, the Confederates appeared victorious. Outnumbered Union troops withdrew to an area south of town to wait for daybreak.
The Confederacy gained more ground on the battle’s second day by attempting to encircle Union troops who were defending hills and ridges south of Gettysburg. However, despite heavy losses, the Union successfully defended most of their positions, including a hill called Culp’s Hill and a ridge called Cemetery Ridge. These positions gave them advantageous high ground, with Culp’s Hill functioning as the Union’s right defensive flank. On the morning of July 3, Lee was determined to drive the Union from their high ground. This culminated in what became known as Pickett’s Charge. Under orders from Lieutenant General James Longstreet, Lee’s second-in-command, around 12,500 Confederate troops charged up the center of Cemetery Ridge, attempting to take it from the Union through sheer force. About half of them were under the command of Brigadier General George Pickett. Though one Confederate brigade did make it to the top of the ridge, Lee’s hopes of a successful, all-out assault were quickly dashed. The charge was a disaster for the Confederacy. From their position atop the ridge, Union troops picked off a critical number of Confederates. Having lost around 60 percent of the men who made the charge, Lee was forced to retreat from Gettysburg, through Maryland, and then back to Virginia with wagons full of gravely injured soldiers. In fact, the ambulance wagon train stretched for around 15 to 20 miles.
Although the Civil War continued for almost two more years, the Battle of Gettysburg gave the Union a much-needed morale boost and ended the Confederacy’s long-held plan to invade the North. Just think, the U.S. might look very different today if General Heth hadn’t been so desperate for new shoes.
[Image description: Monument to Battery B, a stone monument and two cannons near Gettysburg, Pennsylvania] Credit & copyright: indiesilver, Pixabay
-
Political Science
Happy birthday, United Nations! On this day in 1945, the United Nations Charter, the organization’s founding document, was signed in the hopes that an international body could help nations resolve disputes and promote world peace. Since then, the U.N. has nearly quadrupled in membership and grown in influence. In fact, the U.N.’s top court recently ordered Russia to end its invasion of Ukraine. But just how much power does this international organization have? The answer is complicated.
The story of the United Nations begins with the outbreak of World War I, in 1914. Upset by the massive loss of life caused by the war, influential world leaders, most notably President Woodrow Wilson, began calling for an international body to promote peace between nations. Wilson believed that such an organization could prevent another world war. By 1918, the idea had taken off, and the League of Nations was formally established in 1920, its covenant having been drafted at the Paris Peace Conference, which set the terms of peace after World War I. The League had 42 founding members. Over the following decades, some nations were admitted by vote, while others withdrew or were expelled. In fact, the Soviet Union was expelled in 1939 for invading Finland. Strangely, even though a U.S. President was influential in establishing the League, the United States never joined due to strong isolationist sentiments among American lawmakers at the time.
The League wasn’t as functional as Woodrow Wilson might have hoped. From the outset, it was hampered by issues ranging from infighting to the United States’ refusal to join, and a general unwillingness among members to actually enforce the League’s resolutions. The final nail in the coffin came in 1939, with the outbreak of World War II—the very thing the League had been created to prevent. In 1943, the Allied powers agreed to dissolve the League once the war ended and replace it with a more active, effective international organization. President Franklin Roosevelt coined the name “United Nations” in 1941, and many of the new organization’s rules and goals were outlined, debated, and revised in various meetings throughout the remainder of the war.
On June 26, 1945, the United Nations was formed with the signing of the Charter of the United Nations, which had taken two months to fully draft. Fifty countries signed, and the document was ratified on October 24 of that year. U.N. members had no time to celebrate and take it easy, though. In 1947, the U.N. approved a resolution to create the state of Israel by partitioning Palestine, a move that remains controversial to this day, but that effectively demonstrated the new organization’s power. Over the next decade, the escalation of the Cold War between the U.S. and the Soviet Union caused tensions in the U.N. Some resolutions had to be passed without the U.S.S.R. present, including a 1950 resolution authorizing U.S.-led forces to repel the Soviet-backed North Korean invasion of South Korea. Still, the U.N. immediately proved more effective than the defunct League of Nations. Unlike the League, the U.N. took a very active role in global peacekeeping, approving resolutions for certain nations, like the U.S., to send humanitarian aid, military troops, or both into war-torn areas in an attempt to calm tensions. So far, the U.N. has conducted over 70 peacekeeping missions, at least 12 of which are still ongoing.
The modern U.N. includes 193 member states and functions via six principal organs: the General Assembly, the Security Council, the Trusteeship Council, the Economic and Social Council, the Secretariat, and the International Court of Justice (ICJ). The ICJ recently made headlines by ordering Russia to suspend its invasion of Ukraine. Of course, since no international government exists, the U.N. can’t actually force Russia to comply. But that doesn’t make the ICJ’s ruling useless. In fact, the ruling makes it easier for other nations to justify harsh sanctions against Russia, and makes it much more difficult for Russia to deny the illegality of its invasion. It also serves as a subtle reminder of the U.N.’s principle of collective security: an attack on one member state is treated as a threat to international peace that the entire organization can respond to. Certainly something for any country to keep in mind before tangling with the U.N.
[Image description: The blue-and-white United Nations flag] Credit & copyright: padrinan, Pixabay
-
US History
It's Flashback Friday! In honor of World UFO Day, enjoy these curios all about mankind’s fascination with the mysteries of space. Take us to your leaders! If extraterrestrials ever did land on Earth, the government probably wouldn't host them at Area 51. The location in Lincoln County, Nevada, was the source of much speculation during the second half of the 20th century, and for good reasons—though not the ones we would expect.
In April of 1955, the CIA scouted locations near Edwards Air Force Base seeking a testing ground for new weapons and aircraft. President Eisenhower had recently approved a secret counterstrategy to address the Soviet Union's aggressive air force expansion, and it needed a venue. Groom Lake was chosen because of its flat, plane-accessible dry lakebed and Emigrant Valley's sight-obstructing mountain ranges. Once Project OXCART began in 1959, so did decades of classified testing. Pilots were trained to fly U-2 jets and search-and-rescue helicopters; anti-radar stealth materials were tested; and captured Soviet fighter planes were analyzed for structural weaknesses. All of the military secrecy, combined with expanded flight restrictions around Groom Lake, led locals to suspect something fishy (or alien-y) was going on.
Some of the earliest UFO sightings near Area 51 came from commercial pilots who had never seen an aircraft fly as high as the Air Force's new prototypes. Air Force officials knew about the reports, but their need to protect classified projects led to cover stories of "high-altitude weather research" or "natural phenomena" as the source of all the "nothings" to see at Groom Lake. Conspiracy theories were stoked in 1989 when Bob Lazar claimed to have reverse-engineered alien spacecraft in a clandestine, underground lair at Area 51. He also claimed that the then-undiscovered element 115 (moscovium) was the main fuel source for the vehicle. Lazar's allegations, like his claimed MIT degree, have yet to be verified by outside sources.
Area 51's shroud of secrecy was finally lifted in 2013, when a Freedom of Information Act request was granted, forcing the CIA to publicly acknowledge the existence of the classified base. Detailed records of the base's history and purpose have also been released, but conspiracy theorists remain skeptical. These days, the Extraterrestrial Highway that runs near the base is a hotbed of kitschy UFO tourism, where visitors of the terrestrial kind can see the Alien Research Center or grab a room at the Little A'Le'Inn. Just don't expect to wake up to spooky theremin music and a neon gravity beam pulling the inn up to the mothership!
Image credit & copyright: MartinStr, Pixabay
-
Writing, Music
Here’s a composer who’s scored as many prestigious awards as he has movies. In addition to his five Academy Awards and four Golden Globes, John Williams has the honor of being a household name—a rare feat among composers. Then again, how could he not be well-known, when he’s responsible for the music of mega-blockbusters like Star Wars, Jaws, Indiana Jones, Harry Potter, and E.T. the Extra-Terrestrial, which was released this month in 1982? Williams’ career had an interesting beginning, though. Few would guess that the famed composer was playing on military bases and in jazz clubs long before he was scoring movies.
Born in New York City in 1932, Williams grew up surrounded by music. His father, Johnny Williams, was a jazz drummer who played with the Raymond Scott Quintet and later the CBS radio orchestra. By the time he was a teenager, Williams played several instruments, including piano, trumpet, trombone, and clarinet. He also began orchestrating music, following his father’s lead. In 1948, Williams’ family moved to L.A. Greatly influenced by his father’s work in jazz, Williams attended the University of California, Los Angeles, but transferred to Los Angeles City College because he was so eager to play in its studio jazz band. Williams didn’t get the chance to graduate on time, though. In 1951, he was drafted into the U.S. Air Force, but he didn’t let that stop his musical pursuits.
While in the Air Force, Williams worked with military bands, arranging, conducting, and playing music. His talents eventually led to him working with the U.S. Air Force Band as a piano and brass player. After leaving the military in 1955, Williams moved back to New York City and began attending the school to which he still has the closest ties: The Juilliard School, a prestigious, private performing arts conservatory in New York City. However, at the same time that Williams was studying piano with famed Ukrainian pianist Rosina Lhévinne at Juilliard, he was spending his nights working as a jazz pianist in New York clubs. Williams loved performing, and dreamed of becoming a concert pianist before finally deciding that his true strength lay in composing.
After Juilliard, Williams returned to L.A. By 1959, he was working steadily as a film studio orchestrator, someone who decides which instruments in the orchestra will play which parts of a score. He played piano for various movies and T.V. shows, including 1961’s West Side Story. He won an Academy Award for his adaptation of the score for the 1971 film version of Fiddler on the Roof. Stepping outside the comfort zone of many classically-trained composers, Williams made a name for himself lending dramatic, swelling music to disaster films in the early 1970s. It was Williams’ penchant for thrilling scores that made Steven Spielberg approach him about scoring Spielberg’s feature directorial debut, The Sugarland Express, in 1974. Their partnership was solid, and the following year Williams created the score for a movie that made him a household name outside of Hollywood: Jaws. Famous for its two-note ostinato, or repeated musical phrase, Williams’ score plays each time the film’s monstrous shark approaches. Spielberg and Williams went on to work together on 25 more films over the next 43 years, as Williams racked up a total of 50 Academy Award nominations.
Williams’ 60-year career has impacted the sound of Hollywood in innumerable ways. Before him, only “serious” movies, such as period dramas, had “serious-sounding” scores. But, perhaps because Williams always had one foot in the classical world and one foot in the jazz world, he didn’t adhere to that rule. Now, movies of any genre can have elevated, orchestral music, be they family films (like E.T.), thrillers (like Jaws), or sci-fi flicks (like Star Wars). In 2016, the American Film Institute honored Williams with its Life Achievement Award. At the award ceremony, Spielberg summed up his friend’s contributions this way: “Without John Williams, bikes don’t really fly, nor do brooms in Quidditch matches, nor do men in red capes. There is no Force, dinosaurs do not walk the Earth, we do not wonder, we do not weep, we do not believe.” How’s that for a glowing endorsement?
[Image description: John Williams conducting an orchestra.] Credit & copyright: Chris Devers, image cropped for size, image is hereby distributed under the same license linked here.
-
Political Science
It's Flashback Friday, and the anniversary of the Watergate scandal. As such, enjoy these curios all about government: its procedures and some of the shenanigans it gets up to.
When a government official must publicly declare that they aren't a criminal, it's usually a bad sign. But Richard Milhous Nixon, the 37th President of the United States, did just that on November 17, 1973, during a now-infamous speech. The address came on the heels of a government wiretapping scandal which had cemented Nixon's place as an extremely controversial President. On August 9, 1974, Nixon became the first and only U.S. President to ever resign from office, leaving behind a strange governmental legacy.
Born January 9, 1913, Nixon started making waves early in his career. In 1946, after some time in the Navy, he won a seat in the U.S. House of Representatives at the age of 33. But it was during the "Red Scare" of the late 1940s and early 1950s, in which everyone from government officials to Hollywood actors was accused of being a communist, that Nixon truly made a name for himself. He became known as a fervent anti-communist when he sat on the rather dystopian-sounding (and later disgraced) "House Un-American Activities Committee" and aggressively questioned former State Department official Alger Hiss, who was accused of being a communist spy.
Nixon gained a seat in the U.S. Senate in 1950. In 1952, he won the nomination to run as Vice President to Dwight D. Eisenhower, but soon the New York Post reported that Nixon was improperly using political contributions, which he kept in a secret "slush fund." In response, Nixon gave a now-famous address—the "Checkers" speech—in which he insisted that the only political gift he had ever received was his daughter's cocker spaniel, Checkers. Nixon was allowed to remain on the ticket with Eisenhower, and was elected Vice President later that year.
Nixon first ran for President himself in 1960, but lost to Democratic opponent John F. Kennedy. He didn't launch another Presidential campaign until 1968. His biggest campaign promise was to end the Vietnam War and the draft, which were hugely unpopular. This helped propel him to a narrow victory. During his first term, Nixon focused on easing tensions with China and Russia. Ending the Vietnam War proved difficult for a number of logistical reasons, and a treaty wasn't signed until 1973, during Nixon's second term. It was around this time that the scandal for which Nixon is best remembered came to light, thanks to a secret informant.
According to the source, Nixon's administration had burglarized and wiretapped the Democratic Party National Headquarters at the Watergate complex in Washington, D.C., during his re-election campaign. Government investigations revealed that the five men connected with the burglary had been hired by the Republican Party's Committee to Re-elect the President, and that Nixon had tried to hide his involvement by paying the burglars hush money. He was even caught on tape discussing how to block the FBI's investigation.
In 1973, at the height of national outrage over the Watergate scandal, Nixon gave his now-infamous speech, saying "...people have got to know whether or not their president is a crook. Well, I am not a crook. I have earned everything I have got." The majority of the U.S. government didn't agree, however. By 1974, it became obvious that Nixon would be removed from office. He resigned from the Presidency on August 9, 1974.
When Gerald Ford assumed the Presidency in Nixon's place, he pardoned Nixon of all crimes he had committed while in office. Nixon largely remained out of the public eye for the remainder of his life. He passed away from complications of a stroke in 1994. No one can deny that the legacy "Tricky Dick" left behind is unique...though surely not in the way he had hoped for!
Image credit & copyright: White House Photo Office, General Services Administration, National Archives and Records Service, Office of Presidential Libraries, Office of Presidential Papers, Wikimedia Commons, Public Domain
-
Travel
Care to take a walk through the clouds? Pamukkale, Turkey, is a natural site whose name fittingly translates to “cotton castle.” Located in Denizli Province in southwestern Turkey, the site is famous for its terraced thermal pools surrounded by white travertine, a type of limestone which gives the area a cottony, cloudlike appearance. The otherworldly formation was created by calcium carbonate-rich waters flowing down the slopes of a plateau overlooking the plain of Çürüksu. Over time, the mineral deposits created stunning, white terraces and separated the thermal water into individual pools. It’s easy to see why the site has been famous for centuries, but the pools aren’t the only thing that Pamukkale has to offer. Ancient ruins also dot the area, as Pamukkale has a rich and fascinating history.
Ancient Pamukkale was known as Hierapolis, or “Holy City” in Greek. Some historians believe that the city’s name was a reference to Hiera, the mythological wife of Telephus, who was a son of Hercules. It’s likely that the city’s original builders were drawn to the area by the thermal terraces, since they built their city atop the same plateau. In fact, ancient writings and artwork suggest that the city’s residents considered the terraces and thermal pools sacred. Hierapolis' early history remains murky, but some historians believe it was founded by Eumenes II, an ancient ruler of Pergamon, a rich and powerful city in Mysia on the coast of the Aegean Sea. Eumenes II reigned from around 197 BCE to 159 BCE, and in that time greatly expanded his kingdom. However, Roman power over the Greek world grew throughout the second century BCE, culminating in Greece’s defeat at the Battle of Corinth in 146 BCE. Hierapolis itself passed to Rome in 133 BCE, when the last king of Pergamon bequeathed his kingdom to the Romans.
Disaster struck Hierapolis in 17 CE, during the reign of Tiberius. An earthquake destroyed many of the structures that had been built by the Greeks. This ended up giving the Romans a chance to completely rebuild the city, and since the natural thermal pools and white terraces weren’t substantially damaged by the quake, they had incentive to do so. In fact, they re-made Hierapolis into an ancient resort town, where high-ranking officials and politicians could come to vacation. In the center of the city was a long street called the Plateia, lined with shops. The southern and northern entrances to the city featured large, limestone archways. The city’s public baths were popular with tourists, and athletes trained in Hierapolis’ large gymnasium. Possibly the most famous building in the ancient city is the Theater of Hierapolis. It features 45 semi-circular rows of seats with eight staircases running between them, and could hold up to 15,000 people. The theater’s existence hasn’t been an easy one. In 60 CE, the city suffered another earthquake, and the theater had to be rebuilt afterward, with work continuing into the reign of Septimius Severus. The theater has an elaborate skene, a stage building common to ancient Greek and Roman theaters which was used as a changing area for actors and as a backdrop in front of which plays were performed. Hierapolis’ skene is decorated with friezes, horizontal bands of sculpted decoration, which feature scenes of Apollo and Artemis. Apollo was the god of healing, music, and sunlight, among other things. His twin sister, Artemis, was the goddess of hunting, moonlight, and chastity. Apollo was the main deity of Hierapolis, and the city’s ruins also include a temple dedicated to him.
Today, cotton is one of the major crops harvested near Pamukkale. This helped give the city its modern name, since legend holds that the white terraces were formed by giants leaving cotton out to dry. Pamukkale was declared a UNESCO World Heritage Site in 1988, and around 1.5 million tourists visit it each year. Some tours even offer the chance to swim in its famous thermal pools. Just watch out for wandering giants if you decide to take the plunge!
[Image description: Visitors file past travertine terraces in Pamukkale, Turkey] Credit & copyright: LoggaWiggler, Pixabay
-
Literature
It's Flashback Friday! In honor of National Marriage Day, enjoy these curios all about weddings.
A literary great gets some seriously cold feet.
If anyone objects to this union, write a novel! Acclaimed author Jane Austen famously critiqued 19th-century gentry and lamented the pressures society put upon women to marry for wealth. Born on this day in 1775 in Hampshire, England, she herself almost accepted a loveless marriage, one that would have wrecked her mental health and doomed her uncompleted novels.
Austen began writing at age 12 with fiction, plays, and poems. By 23, she had written the first drafts of her novels Sense and Sensibility, Pride and Prejudice, and Northanger Abbey. But her career faced a looming threat: though her father's position as parish rector had provided her with the comforts necessary to write so assiduously, she was now of age to marry. For Austen, marriage would've meant an end to her dream of writing and a monotonous life tending to a household.
The aspiring writer had a lot to fret over, and not just her freedom or her career. Her father was growing older, and who would take care of the family after he passed? An answer came one day when Austen and her sister visited old friends in nearby Manydown Park. A longtime friend of Austen's family, a Mr. Harris Bigg-Wither, spontaneously proposed to her in 1802.
Austen said yes. She couldn't overlook the fact that Bigg-Wither had just graduated from Oxford and had professional prospects: he could provide for her family. But by morning, Austen realized she'd made a horrible mistake. Despite all his perks, Bigg-Wither wholly lacked tact and was both plain-looking and awkward. And there was just no way she was going to marry a lummox while three uncompleted, ground-breaking novels sat idle on her writing desk. That morning she retracted her acceptance, writing later, "nothing can be compared to the misery of being bound without love." The situation was so cringeworthy that Austen immediately fled Manydown.
From then on, she focused on her writing, publishing Sense and Sensibility in 1811 at age 35. She ultimately published all six of her books anonymously to avoid sexist critique. When she died in 1817, likely from Addison's disease, her brother posthumously revealed her authorship; by the 20th century, Austen had finally achieved recognition for her unflinching depictions of 19th-century England and the phenomenal mechanics of her prose. In particular, Austen's use of free indirect discourse in third-person narratives and her dabbling in realism predated both Leo Tolstoy and Charles Dickens by decades. Talk about an Ausome legacy!
Image credit & copyright: James Andrews, Wikimedia Commons, Public Domain
-
Literature
Aliens, monsters, totalitarian governments, oh my! American author Ray Bradbury, who passed away on this day in 2012, was known for his speculative fiction—stories that imagine what various futures might be like. His most famous work, Fahrenheit 451, is considered one of the best speculative works about government censorship…though Bradbury claimed it wasn’t actually about censorship. Some fans might be surprised just how quirky Bradbury’s opinions about his own writings and the world at large were.
Born August 22, 1920, in Waukegan, Illinois, to a Swedish immigrant mother and an American father, Bradbury was interested in writing, theater, and movies from a young age. He was greatly inspired by stage magic and acting performances as a child. Once, when he was 12, a circus performer called Mr. Electrico touched Bradbury’s nose with an electrified sword, shouting, “Live forever!” Bradbury took the showman’s command to heart and was inspired to write every day. He later said that he had written every day of his life since Mr. Electrico’s performance. When he was 14, Bradbury’s family moved to Los Angeles, where he made a hobby of roller skating through Hollywood to spot celebrities. This paid off when Bradbury managed to meet George Burns, of the popular Burns and Allen radio show. Bradbury convinced Burns to let him be an audience member for the show, and then to use one of his scripts for a performance. Just two years later, another influential but far less pleasant event took place for Bradbury: he witnessed a fatal car crash in which six people died. The crash was so horrifying to Bradbury that he later said, in an interview with Playboy, “I walked home holding on to walls and trees. It took me months to begin to function again.” As a result, Bradbury never learned to drive, and developed a lifelong fear of cars. In that same Playboy interview, Bradbury said, “The automobile is the most dangerous weapon in our society—cars kill more than wars do.”
Since Bradbury came of age during the Great Depression, he couldn’t afford to attend college. Instead, he spent much of his time at the library. Bradbury considered libraries to be equalizers between the rich and poor, and he stressed their importance throughout his career. Bradbury began publishing science fiction short stories, mostly in limited-run fanzines, in 1938. His first officially published story was Hollerbochen's Dilemma, about a man named Hollerbochen who can stop time to escape danger, but who explodes when he encounters too much danger at once. Years later, Bradbury admitted that he didn’t like the story, even though it was the first he managed to publish. World War II broke out when Bradbury was a young man, but he was rejected from military service due to his bad eyesight. In the 1940s, he became a regular contributor to several literary, science fiction, and film magazines, including Rob Wagner’s magazine, Script.
By the 1950s, Bradbury was a respected writer with two successful short story collections, 1950’s The Martian Chronicles and 1951’s The Illustrated Man. 1953 saw the publication of Bradbury’s most celebrated work, the centerpiece of a career that would later earn him a Pulitzer Prize Special Citation: Fahrenheit 451, a speculative fiction novel about a future in which books are banned and burned by the government. In this world, most people are only interested in shallow conversations and watching T.V. The story follows Guy Montag, a “fireman” who burns books for a living. His life is turned upside down when a teenage girl shows him the value of books and intellectualism. After his new friend is killed, Montag rebels against his work and ends up in exile with other intellectuals, who together survive a nuclear war and are tasked with rebuilding society. The novel rocketed Bradbury to stardom, but the author had an unusual take on his own work. Though most people consider government censorship to be one of the main themes of Fahrenheit 451, Bradbury firmly stated that it was actually about the dangers of television, and his fear that television would kill books as a medium. He felt that the apathy of common people in Fahrenheit 451 was the true tragedy—not their government. Bradbury felt so strongly about his position that he once walked out of a class at UCLA when students insisted that the book was actually about censorship.
Bradbury was undoubtedly quirky, but his works have inspired generations of writers. He spent much of his later life talking and teaching about writing. In honor of his many stories about Mars, NASA named the Martian landing site of their Curiosity rover Bradbury Landing. A fitting tribute for a lover of space, science, and magic.
[Image description: A black-and-white photo of Ray Bradbury raising his hands as he speaks into a microphone.] Credit & copyright: MDCarchives, Wikimedia Commons, image cropped for size, image is hereby distributed under the same license linked here.
-
FREEActing PP&T CurioFree1 CQ
It's Flashback Friday! On this day in 2001, the musical The Producers won a record-breaking 12 Tony Awards. In honor of this achievement, enjoy these curios all about theater. A dying school of puppetry seeks to reinvent itself.
Are those dolls breathing? The old Japanese art of Bunraku puppet theater combines stage performance, narrative, and music to create Shakespearean experiences. A week from now, 78-year-old Japanese national treasure Komanosuke Takemoto will narrate one of these Bunraku productions. The tradition symbolizes a vestige of old Japanese culture but is unpopular among youth. Bunraku could die out with Takemoto and her dwindling generation.
Bunraku is the common name for ningyo joruri, or chanted puppet narration, which became popular during the Edo Period (1600-1868). In Bunraku, a musician plays a three-stringed instrument called a shamisen, while a narrator like Takemoto chants the narration and dialogue. After 60 years of studying Bunraku narration, Takemoto says that only in the last 20 have her performances become stage-worthy.
Takemoto's extreme modesty and intense work ethic are all part of the job. Puppeteers typically train for 30 years to achieve Bunraku mastery. Bunraku puppets are so complex, each one requires three handlers: one for the head and right arm, one for the left arm, and one for the legs. The puppets must become human on stage, dance in step to music, and mouth the narrator's words. Some puppets even transform from maidens into demons with a flip of a switch.
Bunraku plays customarily focus on the tragic interplay between societal obligations and human emotions. Chikamatsu Monzaemon (1653-1724), without any contact with the West, wrote plays remarkably similar to Shakespeare's. His masterpiece Love Suicides at Sonezaki parallels Romeo and Juliet despite premiering roughly a century later. Authorities eventually outlawed love suicides because Chikamatsu's play made the grisly act so popular.
Given Bunraku's outdated feel and extreme demands, Japan's teens find little interest in continuing the custom. Recent attempts at humor have drawn small crowds, but humor alone can't save the tradition; the extreme demands of Bunraku isolate it from the influence of contemporary entertainment. We at Curious remain hopeful of the possibilities: imagine Bunraku stand-up, or Bunraku nightly news—these talented puppets could take any stage!
Below: excerpts from Bunraku productions.
Credit & copyright: Pilar Aymerich i Puig, Wikimedia Commons, this image is hereby distributed under the same license linked here.
-
FREEBiology PP&T CurioFree1 CQ
With spring comes gardening, but these “gardeners’ best friends” have been hiding a dastardly secret. Believe it or not, practically all earthworms in North America are invasive. Many worm species, originally brought to the continent from Europe and Asia, are doing active harm to forests in the U.S. and Canada. Oddly, some of the same traits that make worms so helpful to gardens make them a devastating force in wild forests.
At first glance, worms may seem like simple creatures with little destructive potential. They have no eyes. Their brains contain a mere 300 or so neurons, compared with a dog’s 530 million. Worms have a simple body layout, sometimes described as a “tube in tube” system: a digestive tube inside a muscular tube, with five heart-like organs called aortic arches to pump their blood. Once upon a time, there were many native species of earthworms in North America. Their demise came not by man-made means, but by unprecedented natural events. Around 15,000 years ago, during the ice age of the Pleistocene epoch, glaciers formed over much of North America. This included Canada and the northern United States, stretching as far south as midwestern states like Illinois and Ohio. The glaciers eroded soil, severely impacting native earthworm populations. Around 3,700 years later, as the ice age ended, the glaciers retreated. This left behind a lot of barren land that gradually grew into worm-free ecosystems. Trees and other plants in these areas evolved to live without worms, meaning that they were adapted for a particular kind of soil, with particular levels of nutrients like nitrogen, phosphorus, and potassium. Things were going great until explorers, traders, and settlers from Europe and Asia brought new worms to North America with them. The wriggly invaders likely hid out in the roots of imported crops and other organic materials. These included several species in the Acanthodrilidae family and the Lumbricidae family, which today make up the majority of earthworms in North America.
In the confines of a garden, non-native worms help aerate soil with their tunneling. Their droppings, called castings, are water-soluble, so plants that live in soil with a lot of castings can more easily absorb water. In a forest, however, trees and native plants have adapted to live in drier soil with fewer nutrients. When worms move in, the soil changes, and invasive plants often reap the benefits while trees and other native plants suffer. Worms also love to eat leaf litter—dead leaves on forest floors which would otherwise compost into soil. This causes many species of insects, which rely on the leaf litter for shelter and food, to die out. Even some bigger species, like salamanders, can’t survive without adequate leaf litter. With fewer insects, many kinds of birds can’t survive. From top to bottom, invasive worms affect every part of an ecosystem they touch. In an interview with The Atlantic, Anise Dobson, a forest ecologist at Yale University, said of the worm invasion, “If you were to think about the soil food web as the African savanna, it’s like taking out all the animals and just putting in elephants—a ton of elephants.”
So, can anything be done to stop these tunneling troublemakers? There are no easy answers, since pesticides can’t be used in forests, and removing invasive species is notoriously difficult once they’re well-established. For now, some ecologists suggest that people hiking or traveling through forests check their shoes before entering and after leaving, to prevent further spread of worms. Education and outreach can also help ensure that people don’t release worms used for fishing bait into wild areas. Here’s hoping that we can avoid a worm-pocalypse.
[Image description: An earthworm crawls on top of soil.] Credit & copyright: PortalJardin, Pixabay
-
FREEWorld History PP&T CurioFree1 CQ
It's Flashback Friday! This week we’re throwing things waaay back with these curios all about ancient times and practices.
Nothing to see here but a lighthearted sibling rivalry. According to ancient Roman tradition, mythological twins Romulus and Remus founded Rome on today's date in 753 B.C.E. as the culmination of an epic family feud started by their twisted great-uncle. Their legend, which originated in the third or fourth century B.C.E., was once taken as fact, and with so many theatrical ups and downs, it's easy to see why the story resonated with the masses. Even if it stretches the truth… just a smidge.
The legend of Romulus and Remus begins with their grandfather, King Numitor, who ruled the ancient Latin city of Alba Longa until he was usurped by his brother, Amulius. Amulius forced Numitor's daughter Rhea Silvia to take a sacred vow of chastity, to ensure that none of Numitor's progeny could seek revenge. She was sealed within a convent, but the wily god of war Mars impregnated her.
When Amulius caught wind that Rhea had given birth to twins, he ordered their death, and the two babies were left to perish on the shores of the Tiber River. But a generous she-wolf came across the twins and nursed them. Eventually, a shepherd named Faustulus encountered the feral children, too, and raised them as his own. Romulus and Remus matured with no knowledge of their origins, though their godly blood made them just, charismatic leaders in their village community.
By this time, King Amulius was loathed by his citizens. When Romulus and Remus involved themselves in a local dispute between rival supporters of Numitor and Amulius, Remus was imprisoned and whisked away to Alba Longa. There, both Numitor and Amulius suspected Remus' true identity. Romulus, meanwhile, gathered a mob of supporters to rescue his brother. During the ensuing quarrel, fiendish uncle Amulius died. Numitor revealed to the twins that he was their grandfather, and was then reinstated as king of Alba Longa.
The boys were delighted to have all that nasty business behind them. They set out from Alba Longa to establish a new kingdom. Only, they couldn't agree on a location for it. Romulus argued that Palatine Hill was the perfect site. No way, scoffed Remus; it was definitely Aventine Hill. To settle their dispute, they consulted augury, or the practice of interpreting omens from the flight of birds. When Romulus saw more auspicious birds than Remus, the latter flew into a rage. In an act of anger and mockery, Remus leaped over a meager wall that Romulus had been building on his hill of choice. Fed-up Romulus, in turn, slew Remus for his contempt. So Palatine Hill, it was.
After gloating over his senseless act of fratricide, Romulus went on to found Rome. Well, not really. According to modern dating of ancient Roman walls and pottery, some of the artifacts and structures of Rome actually predate the legend of Romulus and Remus by 100 years. Sure, the tale's happenings could have transpired further back in time, but we're not betting on it. After all, who in their right mind would choose to live under the king of a hill, no less one who gutted his brother?
Image credit & copyright: Benutzer:Wolpertinger, Wikimedia Commons, Public Domain
-
FREEGames PP&T CurioFree1 CQ
Eat the dots, get the fruit…avoid the ghosts! There’s no doubt that Pac-Man is a wacky game, yet it caused a sensation upon its release on this day in 1980. Created by Japanese company Namco, Pac-Man was designed to lift the reputation of “seedy” arcades and provide an alternative to the violence found in other popular video games of the time. Pac-Man managed to do all of that and more.
The company which would become Namco began taking shape in 1955, when out-of-work shipbuilder Masaya Nakamura bought two hand-cranked rocking horses that children could pay to ride and put them on the roof of a busy department store in Yokohama. Over the next decade, Nakamura turned his attention to electro-mechanical games such as shooting galleries and pinball machines. This put him in touch with the emerging world of gaming arcades, and he began using the name Namco to brand some of his games. Namco’s foray into video games came in 1974, with the help of American gaming giant Atari, inventors of the smash hit game Pong. Atari was expanding into Japan and asked Namco to be their official Japanese distributor. Under Nakamura’s leadership, Atari Japan officially became Namco in 1977. Namco released its first original video game in 1978. The game’s designer, Toru Iwatani, was an out-of-the-box thinker who, in 1979, designed Namco’s first true hit: Galaxian, a game similar to the popular Space Invaders by Taito. But Iwatani had another idea for a game in mind, and it looked nothing like anything else on the market.
By the early 1980s, video games and arcades were hugely popular with boys, but not so much with girls. Games tended to have violent themes, and shooting was usually a key element of gameplay. Adults in the early 1980s weren’t exactly crazy about arcades. They saw them as seedy places where teenage boys caused trouble. Iwatani saw girls as an untapped gaming market, and he knew that arcades needed a reputation boost if his industry was going to continue its success. As a solution to both problems, he began working on a game with friendly-looking characters and very little violence. When exploring themes that might appeal to female gamers, Iwatani thought of young girls eating at cafes with friends, and concluded that young women loved to eat. He thus decided to base the game around food. Galaxian had revolutionized video game displays by employing an RGB color display, making it so that game sprites could be more than one solid color. This technology allowed Iwatani to create bright, colorful sprites, all of which he designed with cute, rounded shapes. Rather than shooting, the game’s main character would traverse a maze, eating dots and fruits, while avoiding blob-like ghosts. To fit the game’s food theme, Pac-Man’s shape was based on a pizza with one slice cut out to form a mouth. Inspired by Popeye the Sailor, a cartoon character who got stronger by eating spinach, Iwatani included power-ups in the game, which gave the player advantages when eaten. Even the game's sound effects and music, created by composer Toshio Kai, were meant to be cute and appealing. Since the game focused on eating, and “paku paku taberu” is the Japanese onomatopoeia for chewing, Iwatani called the game Pakkuman, rendered on Japanese cabinets as Puck Man. Although it didn’t take off during its initial Japanese release in May of 1980, the game soared to success in the American market, where it was released as Pac-Man later that year.
Just two years later, coin-operated Pac-Man arcade games were making around $8 million per quarter. Pac-Man decorations and merchandise, like figurines and stuffed animals, began popping up in arcades, giving them a friendlier appearance. Pac-Man did, indeed, attract female gamers, so much so that in 1981, Ms. Pac-Man, featuring a female version of Pac-Man’s main character, was released to great acclaim. By the late 1990s, Pac-Man had become the highest-grossing video game in history up to that time, with total sales of over $2.5 billion. And all thanks to a designer who appreciated girls’ love of eating.
[Image description: A digital illustration of gameplay from Pac-Man, showing a yellow character approaching four white dots against a black background.] Credit & copyright: Perlinator, Pixabay, image expanded for size
-
FREERunning PP&T CurioFree1 CQ
Sports weren’t always “anyone’s game”, but this innovation was equalizing! As warm weather ushers in the return of fun outdoor activities like jogging, it’s hard to believe that, not so long ago, women were discouraged from participating in them. Before 1972’s Title IX, which banned sex-based discrimination in federally funded education programs, including sports, the prevailing attitude in the U.S. was that sports were unladylike or even dangerous for girls and women. After Title IX’s passage, sports-minded women found that there wasn’t much on offer when it came to athletic gear that fit their needs. Enter Lisa Lindahl, Polly Palmer Smith, and Hinda Miller, inventors of a modern athletic staple: the sports bra.
By the late 1970s, Title IX and other breakthroughs had made women’s sports suddenly popular. Jogging was particularly in vogue, likely due to the influence of figures like Kathrine Switzer, who had run in the 1967 Boston Marathon even though women were banned. In 1977, 28-year-old University of Vermont graduate student Lisa Lindahl was jogging around 30 miles per week. There was just one problem: jogging was extremely uncomfortable for her chest due to a lack of proper support. Lindahl tried wearing different bras while jogging, even going so far as to wear one a size too small, but nothing seemed to work. Her sister, who often jogged with her, also complained of chest and back pain from jogging, and joked that there should be a jockstrap for women. Realizing that many women were likely suffering from the same problem, Lindahl turned to her costume-designer friend, Polly Palmer Smith. Smith brought fellow costume designer Hinda Miller in on the project as well. Together, the three women began working on garments that might solve their jogging support problem. After many failed attempts, it was Lindahl’s inside joke with her sister that offered a solution. Having heard the joke about a jockstrap for women, Lindahl’s husband strolled into her shared workshop wearing two jockstraps on his chest. The three women immediately saw design potential in the way that the straps crossed over his back, and they began sewing jockstraps together in different configurations. Their prototype was appropriately dubbed the “JockBra”, which they eventually changed to the more appealing “JogBra.” Designed to provide chest, shoulder, and back support, the JogBra featured crisscrossing straps and seams on the outside to minimize chafing and blistering. Unlike normal bras, JogBras came in three standard sizes: small, medium, and large.
Yet the male-dominated athletic industry took some time to warm up to the innovation. After Smith left the operation to pursue other design work, Lindahl and Miller took on the task of explaining to male store owners that the JogBra wasn’t underwear or lingerie. “Almost every time they'd say ‘we don't sell bras in our store,’” Lindahl told the BBC. The inventors got around the problem by giving free JogBra samples to female assistant managers. That’s when the JogBra really got moving. By 1978, JogBras were selling like hotcakes at $16 each, and in 1979 their design was officially patented. After making around $500,000 in its first year, JogBra Inc. grew around 25 percent per year until the late 1980s. In 1990, Lindahl and Miller decided to sell their company to Playtex. By then, larger companies like Reebok were making their own versions of the sports bra, eating into JogBra Inc.’s profits.
Today, the sports bra is considered one of the most important inventions in sports history. In fact, in 2018, Runner’s World magazine called it “The Greatest Invention in Running—EVER.” This year, Lindahl, Miller, and Smith were inducted into the National Inventors Hall of Fame, which is part of the U.S. Patent and Trademark Office. A JogBra prototype can be found in the archives of the Smithsonian's National Museum of American History. The stores that refused to carry sports bras must be feeling a little sheepish these days.
[Image description: Colorful sports bras hang in a store.] Credit & copyright: Rusty Clark, Wikimedia Commons, image cropped for size, image is hereby distributed under the same license linked here.
-
FREELiterature PP&T CurioFree1 CQ
It's Flashback Friday! Enjoy these creepy curios in honor of Friday the 13th!
Reader beware—you're in for a scare!
Read if you dare…. Before Harry Potter dominated young adult fiction, a much darker series held America's young bookworms within its death grip: Goosebumps. Written by quirky author R.L. Stine, Goosebumps brought evil cuckoo clocks, try-on werewolf skin, and phantom dogs to the young masses—and they loved it.
But at first, Stine was opposed to writing horror for children. He worried it would ruin sales for his teenage-geared Fear Street series. Regardless, Parachute Press co-owner Joan Waricha felt the children's horror genre had untapped promise and convinced Stine to give it a chance in July 1992.
Stine came up with the name, wrote the first six Goosebumps novels—and waited. The books sat on the shelves for roughly three or four months, and sales were stagnant. Stine feared that the first book in the series, Welcome to Dead House, was too scary for children. "I didn't have the formula then, to combine funny and scary," said Stine, who admits he would soften the book if given a chance to rewrite it.
Then, out of nowhere, Goosebumps was a household name. Books started flying off the shelves, and soon Stine was given a deal to write a book a month. To keep pace, he devised a routine: first, he chose the title and built the plot around it. For example, Stine thought up Say Cheese and Die and knew it had to be about a sinister camera of some sort. Once Stine conceived the basics of a plot, he sent a short summary to illustrator Tim Jacobus, so Jacobus could paint the covers as he wrote. Typically, it took Stine anywhere from six days to three-and-a-half weeks to finish a Goosebumps novella. Between the years of 1992 and 1997, he pumped out 62 books!
Today, over 350 million Goosebumps books have been sold worldwide in 32 languages. Goosebumps boasts the highest-ever sales for a children's book series after Harry Potter, having spawned a television show, six other book series, and a movie in 2015. Recently, Goosebumps has experienced a bit of a renaissance. Nevertheless, Stine's son Matt has never read a single one of the books. "He does it just to make me crazy," says Stine. We'd be interested to see how Goosebumps written by a nowadays "crazy" Stine would turn out!
Image credit & copyright: William Tung, Wikimedia Commons, image cropped for size, this image is hereby distributed under the same license linked here.
-
FREEUS History PP&T CurioFree1 CQ
Here’s an activist whose work spans the better part of a century. In celebration of Asian American and Pacific Islander Heritage Month, we’re honoring Grace Lee Boggs, a fierce fighter in the battle for American racial and class equality. Though Boggs passed away in 2015 at the age of 100, she was involved in racial, feminist, labor, and environmental activism up until the time of her death. In fact, the last of her five books, 2011’s The Next American Revolution: Sustainable Activism for the Twenty-First Century, was published when Boggs was 95.
Born in 1915 in Providence, Rhode Island, Boggs was the daughter of immigrants from Taishan, Guangdong, China, and her Chinese given name was Yu Ping. Boggs’ father, Chin Lee, was a restaurateur, while her mother, Yin Lan, was a feminist role model to Boggs. Born poor, Lan had been sold into slavery by her own uncle in China, but escaped. Lan taught her six children, including her daughters, the importance of independent thinking and education. At a time when it was common for U.S. universities to be segregated by race and gender, Boggs earned a scholarship to Barnard College of Columbia University, and eventually received a Ph.D. in philosophy from Bryn Mawr College.
Unfortunately, Boggs wasn’t immune from the prejudices of 1940s America. Despite her education, she struggled to find work. According to NPR, she recalled to a group of students, “Even department stores would say, 'We don't hire Orientals.'” Eventually, she took a low-paying job at the University of Chicago’s philosophy library. Due to her low wages, her living conditions in the city were poor, and Boggs soon found herself connecting with local activists who led protests against such conditions. Many of these activists were Black, and Boggs identified with their struggle against racial injustice. Boggs’ education and mastery of language served her well as an activist, as her pamphlets helped rally more people to causes of labor and race. She began translating Karl Marx’s early letters into English and joined the far-left Workers Party, which had formed from the Socialist Workers Party. Both groups believed that progress could only come from a well-read and organized working class. She then joined the Johnsonites, a group that believed that the working class had the power to emancipate itself.
In 1953, after making a name for herself publishing articles for the Johnsonite publication Correspondence, Boggs moved to Detroit. The city was the epicenter of Johnsonite philosophy, and Boggs made fast friends with other working-class organizers there, including a Black Johnsonite autoworker from Alabama named James Boggs. Soon, the two fell in love and managed to get married, despite inter-racial marriage being illegal in much of the U.S. By this time, the F.B.I. had begun monitoring Boggs for her links to communism and involvement in the Black Power movement. Her official F.B.I. file incorrectly described her ethnicity as “probably Afro Chinese” simply because of her marriage to James Boggs and her work fighting for Black Americans’ civil rights. Grace Lee and James formed a powerful activist alliance, organizing protests, and helping to found the National Organization for an American Revolution (NOAR), which published literature about inequality related to class, race, and gender. Both Grace Lee and James believed that labor rights were inextricably tied to other civil rights for Asian Americans, Black Americans, and women of all races.
In 1970, Boggs helped found the Detroit Asian Political Alliance, which aimed to help Asian Americans in their struggles against racism. In 1992, she founded Detroit Summer, a youth program that aimed to get young people actively involved in improving their communities. Grace Lee and James Boggs remained married until James’ death in 1993. Beginning in 1998, Boggs published five books, including her autobiography. In her final book, The Next American Revolution: Sustainable Activism for the Twenty-First Century, Boggs wrote about the need for social change in the face of modern problems, from internet misinformation to climate change. Clearly, she was not an activist who believed in falling behind the times…or taking time off from work, for that matter.
[Image description: Wearing a black sweatshirt, activist Grace Lee Boggs gestures while speaking in 2012.] Credit & copyright: Kyle McDonald, Wikimedia Commons, image cropped for size, image is hereby distributed under the same license linked here.
-
FREEMusic Appreciation PP&T CurioFree1 CQ
It's Flashback Friday! In honor of International Tuba Day, enjoy these curios all about music and instruments.
Every instrument requires a certain form of stimulus to be played, whether it be plucking, picking—or in the theremin's case—waving. Born on this day in 1896, Soviet physicist and spy Léon Theremin revolutionized electronic music in 1919 with his invention of the theremin: an instrument played without any tactile contact, and wholly lacking any tangible scale to assist with pitch.
At 23, Theremin inadvertently created the groundwork for his eponymous device while building a density meter for gases. After he plugged his gizmo in, he noticed the prototype let loose an eerie squeal as he approached, with the pitch and volume dropping as he raised his hands, and rising as he lowered them. An amateur cellist, Theremin couldn't resist the urge to play a few melodies, quickly catching on and wowing his lab assistants with an impromptu performance.
The physicist developed a more streamlined version of the strange instrument, bringing it before Bolshevik leader Vladimir Lenin. He eventually achieved fame stateside with live performances at iconic American venues in the '20s, like Carnegie Hall and the Metropolitan Opera House. One such show attracted the Radio Corporation of America, who offered Theremin $100,000 for the rights to manufacture his invention. Theremin shook on the offer and began making rounds with tech leaders, surreptitiously spilling their secrets to the Soviet Union's KGB before mysteriously returning to the U.S.S.R. in 1938.
Nowadays, the theremin features a standard layout. At its base lies a box full of control oscillators that transmit electronic signals. Two antennae sense the location of the thereminist's hands, with the upright antenna controlling pitch and the loop antenna controlling volume. Fretless, stringless, and valveless, the theremin must be an easy instrument to master, right? In actuality, the lack of physical reference points means musicians must gauge pitch by ear alone, which often deters amateurs. Among other tricky variables, one must factor in fluctuations in the player's weight, which drastically affect how close the thereminist must hold their hands to achieve a particular sound.
For these reasons, the theremin never quite caught on like the electric guitar, but its alien tunes did result in some of the classic '50s science fiction soundtracks we've come to know and love. And along this theme, we hope Léon Theremin's far-out instruments aren't transmitting our thoughts back in time to the KGB: beware the Red Menace! Ooo-eee-ooo.
Below: a video of Léon Theremin shredding his theremin.
[Image description: A black-and-white photo of English singer-songwriter Bruce Woolley playing a theremin.] Credit & copyright: Soundsweep, Wikimedia Commons, image cropped for size, this image is hereby distributed under the same license linked here. -
FREEPP&T CurioFree1 CQ
From starkly-lit, black-and-white films to terrifying alien invasions, something dramatic likely comes to mind when you think of Orson Welles. The prolific producer, director, screenwriter and actor achieved one of his greatest triumphs on this day in 1941, when his cinematic masterpiece, Citizen Kane, premiered at the Palace Theatre in New York City. Yet Welles’ illustrious life had plenty of ups and downs.
Born on May 6, 1915, in Kenosha, Wisconsin, George Orson Welles had a tumultuous childhood. His father, Richard Head Welles, was an inventor who got rich by creating a new kind of bicycle lamp. Unfortunately, he was also an alcoholic. He and Welles’ mother, Beatrice Ives Welles, split up when Welles was four. Welles and his siblings lived in Chicago with their mother, who played piano to support the family until her death from hepatitis in 1924. Since Welles was just nine years old, his father soon took custody of him. Unable to hold down a job due to his alcoholism, Welles’ father used his riches to travel the world, bringing his son along. Young Welles lived on the road for two years before his father finally settled in Woodstock, Illinois, where Welles entered the Todd Seminary for Boys. There, Welles finally met an adult willing to nurture his talents. Roger Hill, a teacher, encouraged Welles to concentrate on subjects that interested him. Immediately, Welles turned his attention to creating theatrical and radio productions. Unfortunately, not everything was looking up for Welles. By the time he was 15, his father’s alcoholism had gotten so severe that Welles told him he would no longer speak to him. Welles hoped that this would encourage his father to stop drinking. Instead, on December 28, 1930, his father died from heart and kidney failure. Welles’ complicated relationship with his father influenced some of his later works, including Citizen Kane.
Welles took his first few professional acting jobs at the Gate Theatre in Dublin. Over the next few years, he began acting in the U.S. and got his first radio job at CBS’s The American School of the Air. By 1934, Welles was making a full-time living in theater and radio. In 1938, he took a job starring in a 13-week radio series called The Mercury Theatre on the Air. On October 30, 1938, the show ran a play based on H.G. Wells’ sci-fi classic, The War of the Worlds. Welles played a news broadcaster reporting on a fictional alien invasion. His performance was so realistic that some listeners supposedly believed the invasion was real. However, modern evidence suggests that reports of “mass panic” were exaggerated—mostly by newspapers that may have been trying to discredit radio as a trustworthy source of news. In any case, the broadcast rocketed Welles to fame. In 1939, he signed a contract with Hollywood’s RKO Pictures that gave him unprecedented freedom to produce, direct, and cast two movies, the first of which was 1941’s Citizen Kane.
Welles co-wrote Citizen Kane alongside Herman J. Mankiewicz, a screenwriter Welles knew from his days in radio. The movie recounts the life of fictional newspaper magnate Charles Foster Kane (played by Welles), a powerful yet mysterious and lonely man. The story is framed by a reporter’s mission to decipher the meaning of Kane’s last word, “rosebud.” Although the reporter never uncovers its meaning, the audience is shown how the word symbolizes Kane’s longing for his childhood and his love for his long-lost mother. While the story was loosely based on the life of real-life newspaper magnate William Randolph Hearst, an ex-friend of Mankiewicz, Kane’s longing for his lost parents echoes Welles’ own life. Citizen Kane was a financial flop, but it received outstanding reviews and was ultimately nominated for nine Academy Awards, winning only Best Original Screenplay. Today, it is widely considered one of the best films of all time. Not everyone was pleased, though. William Randolph Hearst barred all of his newspapers from printing anything about the movie.
Welles went on to direct 12 more feature films, as well as various radio plays and programs. He starred in commercials and made appearances in popular T.V. shows like I Love Lucy. He was known to take some very out-of-the-box roles from time to time, including one in 1986’s The Transformers: The Movie as the world-eating villain Unicron. Unfortunately, Welles didn’t live to see his Transformers debut, as he passed away not long after recording his lines, in 1985. He left behind more than 20 unfinished radio and film projects. Trust a prolific visionary to never, ever stop working.
[Image description: A 1945 candid black-and-white photograph of Orson Welles gesturing with his hand.] Credit & copyright: Associated Press, Wikimedia Commons, Public Domain -
FREEUS History PP&T CurioFree1 CQ
No wonder it’s been called “the worst kept secret in U.S. history.” On April 21, 1961, President John F. Kennedy accepted full responsibility for the Bay of Pigs Invasion, a failed attempt to overthrow Cuba’s government. To understand why the U.S. decided to secretly (not-so-secretly) organize such a risky operation, a little historical context about U.S.-Cuba relations in the 1960s is needed.
Fidel Castro came to power as Prime Minister of Cuba in January, 1959. He didn’t win an election; rather, he and a loyal army of guerilla fighters overthrew former Cuban leader Fulgencio Batista. Batista, a military dictator who had himself come to power through a coup, was known for widespread human rights abuses. As such, he had grown extremely unpopular with many Cubans. The then-33-year-old Castro, on the other hand, was seen as a passionate revolutionary with Cubans’ best interests in mind. The U.S. government much preferred Batista, though, due to his pro-American and anti-communist economic policies. Under his government, American companies owned a large number of profitable sugar plantations and ranches in Cuba. Castro, however, wanted Cubans to have more control of their economy. Soon, “Cuba Sí, Yanquis No”, meaning “Cuba Yes, Yankees No,” was one of Castro’s most popular slogans, and many Cubans agreed with him. As U.S.-Cuba relations broke down, Castro began establishing stronger ties with the Soviet Union, with whom the U.S. was still locked in a cold war. Fearing an alliance between the two nations, U.S. President Dwight D. Eisenhower began working with the CIA to attempt to remove Castro from power. This included several covert assassination attempts. Famously, the CIA even spiked a pack of cigars with botulinum toxin and attempted to have them delivered to Castro, though they never reached him.
Eisenhower had a backup plan, though. Castro’s rise to power had caused some Cubans still loyal to Batista’s government to flee the country. Many of them had ended up in Miami, Florida. Under Eisenhower’s direction, the CIA recruited around 1,400 of these Cuban exiles to form Brigade 2506, a military unit that would be trained by the U.S. military and eventually sent to Cuba to wrest it back from Castro’s control. At least, that was the hope. The U.S. government began secretly training the Cuban troops in Guatemala. At the same time, John F. Kennedy won the 1960 U.S. Presidential election and was sworn in on January 20, 1961. After being briefed on the Cuban takeover mission, Kennedy carried on what Eisenhower had started, continuing the funding and training of Brigade 2506. Although Kennedy worried that the Soviet Union could retaliate if they learned about the mission, he was assured that the U.S.’s involvement would remain a secret.
On April 15, 1961, The U.S. military sent B-26 bombers, disguised to look like stolen Cuban planes, to attack Cuban airfields. However, Castro had already moved most of the country’s planes out of harm's way. On April 17, most of Brigade 2506 was loaded onto boats which launched from Guatemala and Nicaragua. That night, despite rough coral that sank a few of their ships, they made landfall in Cuba at Playa Girón in the Bay of Pigs. For a brief time, things seemed to go alright for the U.S.-backed troops. They took down a pro-Castro militia and began advancing over land. Unfortunately, they had failed to spot a radio station on the beach, which immediately began broadcasting about the invasion. Soon, Cuba and the entire international community knew about the operation. With the element of surprise gone, Kennedy withdrew vital air support from the mission, dooming it to failure. Castro’s troops quickly overwhelmed Brigade 2506, killing 114 of its soldiers. Within three days, the invaders were forced to surrender, and 1,100 of them were taken prisoner.
Although the Bay of Pigs invasion had been planned under Eisenhower, it was Kennedy who was left trying to explain the U.S.’s actions to the international community. In a State Department press conference on April 21, Kennedy accepted full responsibility for the embarrassing failure. The repercussions of the Bay of Pigs Invasion were swift and very nearly catastrophic. Cuba and the Soviet Union formed a strong alliance. The Cuban Missile Crisis, a tense confrontation between the U.S. and Cuba over Soviet-supplied nuclear missiles, happened the very next year. At least World War III was avoided, if narrowly.
[Image description: Troops and tanks engage in fighting near Playa Girón during the Bay of Pigs invasion, in 1961.] Credit & copyright: Rumlin, Wikimedia Commons, image cropped for size, this image is hereby distributed under the same license linked here. -
FREEHumanities PP&T CurioFree1 CQ
It's Flashback Friday, and also Earth Day! Enjoy these curios about people and innovations that are helping the environment.
College major changers, take heart. The mother of the modern environmental movement, born May 27, 1907, was once just like you. Halfway into her studies at the Pennsylvania College for Women, Rachel Carson switched from English to biology—shocking friends and family who assumed she was destined for a career in letters.
In retrospect, Carson had the right idea. She went on to graduate studies in zoology at Johns Hopkins University. From there, she secured a job at the U.S. Bureau of Fisheries (now known as the Fish and Wildlife Service) as a junior aquatic biologist. She was one of only two women with a full-time job in the entire Bureau.
Carson's peers quickly recognized her talent for acting as an intermediary between trade workers, scientists, and the lay public. (Her unique track of studies may have helped!) She started out producing radio segments and went on to manage all publications for the Bureau—sometimes submitting her best work to periodicals like The Atlantic Monthly and The New Yorker.
Her style was sometimes poetic, sometimes journalistic, always with a strong sense of perspective. While she already had some literary renown thanks to her earlier book Under the Sea-Wind, Carson became a household name with the publication of 1962's Silent Spring, which explored the dangers of pesticides.
Despite massive campaigns to discredit her work, Carson stayed strong. She won out eventually; President John F. Kennedy cited Silent Spring as the impetus for a review of DDT use in America. The substance was banned in 1972 by the newly-formed Environmental Protection Agency (EPA), a federal organization that once called itself "the extended shadow of Rachel Carson."
Carson did not survive to see the new legislation enacted, having passed away only two years after the publication of Silent Spring. But we're guessing she was at peace, regardless: death was just another change serving "the balance of nature." In a letter to a friend written in her final days, Carson described watching a kaleidoscope of butterflies: "We had felt no sadness when we spoke of the fact that there would be no return [for the butterflies]. And rightly—for when any living thing has come to the end of its life cycle we accept that end as natural." We're wondering what she would have to say about climate change!
-
FREEWorld History PP&T CurioFree1 CQ
Bunnies, baskets, eggs, chicks, candles, and a name of disputed origin; Easter has a lot of symbols associated with it. While we’ve already written about the origin of the Easter Bunny, and his roots in German folklore, the origins of Easter’s other symbols are just as interesting…and sometimes confusing.
First off, there’s Easter’s name. The word “Easter” doesn’t have any obvious attachment to any of the biblical characters associated with this Christian holiday, so where did it come from? In recent years, rumors began circulating online, sometimes in meme form, that Easter’s name originated from Ishtar, the name of a Mesopotamian goddess of war and love. Historians disagree. They point instead to the ancient goddess Eostre, who was first described by the 8th-century English monk Bede. Also known as Saint Bede or the Venerable Bede, he described Eostre as a goddess of spring worshiped by some English Anglo-Saxons. These people referred to April as “Eostremonath,” or Eostre’s Month. Still, Bede was a secondary source. No writings about Eostre have been found from her supposed worshipers themselves. Jacob Grimm, one of the two famous Brothers Grimm, claimed that he heard stories about Eostre in Germany, but he also found no original writings about her, and she isn’t mentioned in any known German folklore. In her paper titled The Goddess Eostre: Bede’s Text and Contemporary Pagan Tradition(s), Carole Cusack of the University of Sydney explains, “It has been established that within medieval studies there is no one authoritative interpretation of Bede’s mention of Eostre in De Temporum Ratione. It is not possible to say, as it is of Woden, for example, that the Anglo-Saxons definitely worshipped a goddess called Eostre, who was probably concerned with the spring or the dawn.” This means that, even though Easter’s name likely came about because of Bede’s writings about Eostre, we’ll never know for certain whether some Anglo-Saxons actually worshiped her.
Dyed eggs are another Easter symbol that, at first glance, seems to have no relation to Christianity. Yet it does. In addition to being a symbol of fertility and springtime for obvious reasons, eggs became associated with Easter in medieval Europe due to Lent, a 40-day fast observed by many Christians. On the Saturday before fasting began, children would go from home to home and church to church begging for eggs. For extra fun, eggs were sometimes boiled with flowers, dyeing them springtime colors. Even after some German Protestants stopped fasting for Lent, they continued the tradition of dyeing eggs during Easter—usually red, as a symbol of Jesus’s blood. Later, the idea of children receiving colored eggs at Easter carried over into folk tales about the Easter Bunny, or Osterhase, as he was called by Protestant German immigrants in the U.S. Children dutifully made nests for the Easter Bunny (these nests were the precursor to modern Easter baskets) where he was said to place colored eggs and candy if they had been good. Baby chickens also came to be associated with Easter because of Easter eggs, and because baby chicks symbolized new life.
In much of the world, candles are an important Easter symbol. This is due, at least in part, to the tradition of the Paschal candle, a large white candle lit every Easter in some Christian churches. The candle’s name relates to the Paschal Mystery, a Catholic concept regarding Jesus’s life, death, resurrection, and ascension to heaven as a divine mystery. Traditionally, Paschal candles are taller than all other candles in a church space. Today they normally stand around four feet tall, but in medieval times they were positively huge, with some supposedly reaching heights of over 30 feet. Paschal candles are white, sometimes decorated with symbols such as crosses to represent biblical events. Paschal candles are often lit during services on the Saturday before Easter, also known as Holy Saturday. The candle’s flame is meant to be a symbol of new life and of the Bible’s resurrection story. In Europe, worshipers sometimes bring their own candles to light from the flame of the Paschal candle, which is considered sacred. Such a solemn practice may seem a far cry from bunnies, chocolates, and eggs, but then again, just about all of Easter’s symbols are linked to centuries-old traditions. Don’t forget to make a nest for the Easter Bunny!
[Image description: Egg-shaped candles, wooden bunnies, and pink tulips against a light green background.] Credit & copyright: stux, Pixabay