Curio Cabinet / Person, Place, or Thing
-
US History
You may remember the Alamo, but how much do you actually know about it? On this day in 1836, the Battle of the Alamo began. Many Americans (especially Texans) think of the battle as a heroic last stand fought by brave patriots. As with many violent historical conflicts, though, things weren’t quite as simple or morally black-and-white as movies and folk songs might lead us to believe.
The Battle of the Alamo took place during the Texas Revolution, a war for Texan independence from Mexico. While Mexican officials obviously took issue with Texas attempting to break away, they were also upset by Texans’ use of slaves, since Mexico was cracking down on slavery. Texans argued that their economy depended on slavery, since many of their wealthiest settlers were cotton farmers. At first, as the staunchly abolitionist Mexican government tried to outlaw slavery in its territory, some of the colonists left. By 1835, however, tensions had grown into armed conflict. By the time of the battle, Texans were emboldened by their previous victories against the Mexican army, especially after they’d managed to drive the Mexicans south of the Rio Grande. However, when Mexico retaliated by sending General Antonio López de Santa Anna north with thousands of soldiers, many of the Texan rebels quickly abandoned the ground they had gained. One of the few remaining garrisons was located in a former Spanish mission called the Alamo. The adobe structure was not suitable for defending against an attack, and few rebels remained. Nevertheless, the commanders of the fort, William Travis and James Bowie, stayed behind, hoping that reinforcements would be sent their way. In the weeks leading up to General Santa Anna’s arrival at the fort, the two commanders sent impassioned letters to the Texas legislature asking for reinforcements, to no avail. The Texas government was brand new and not organized enough to mobilize a large fighting force. Even if it had been, it would have been extremely difficult to get enough troops to the Alamo before the Mexican army’s arrival.
Travis and Bowie were both well aware of the approaching army, but refused to flee. Neither of them had much military experience, and with the fort neither well equipped nor well laid out for a defense, the odds were truly against them. On February 23, 1836, General Santa Anna arrived with his troops, and a 13-day siege began. Among those inside the fort were former Tennessee congressman Davy Crockett and the families of the soldiers garrisoned there. Some popular modern imaginings of the battle, such as 1960’s The Alamo starring John Wayne, depict it as a desperate last stand. Historical accounts paint a different picture, though. Rather than a fight to the last man, about half of the Texan rebels fled before the battle’s end, and most of them were skewered by Mexican cavalry. Crockett himself surrendered and was executed, while the families of the soldiers were allowed to leave unharmed. The battle also did little to delay General Santa Anna and his troops in their larger mission: retaking San Antonio. He had promised the Mexican government that he would take the city by March 2, and it was captured on the 6th.
Today, the Battle of the Alamo and the purported courage of the men who died there are an integral part of Texas state history, but that history is often not told in full. Most accounts of the Alamo make little mention of slavery, even though Texans’ desire to keep slaves was one of their main reasons for wanting to break with Mexico. Most modern adaptations of the story also leave out the Tejanos, the settlers of Mexican descent who fought alongside the white Texans. History might seem like a thing of the past, but it’s always relevant to the present.
[Image description: A watercolor drawing of the ruins of the Alamo.] Credit & copyright: Ruins of the Church of the Alamo, San Antonio de Béxar, Edward Everett (1818-1903). Amon Carter Museum of American Art, Fort Worth, Texas, Gift of Anne Burnett Tandy in memory of her father Thomas Lloyd Burnett, 1870-1938. Public Domain.
-
World History
Did Aphrodite smile on you this Valentine’s Day? As the most romantic weekend of the year draws to a close, it seems only right to learn a bit about Aphrodite, the Greek goddess of love. Also known by her Roman name, Venus, Aphrodite is remembered today as a beautiful, romantic figure…yet she wasn’t actually the goddess of romantic love. Rather, her dominion was over physical desire and lust. This made her a surprisingly dangerous figure in Greek mythology, as she was a character driven by jealousy and prone to driving mortals mad. Even the story of her birth is surprisingly violent.
Like all Greek gods and goddesses, Aphrodite has two origins: the historical and the mythical. The worship of Aphrodite might have been introduced to ancient Greece from the Middle East, and, indeed, there are similar figures in other pantheons of antiquity. In particular, she bears many similarities to Ishtar of Mesopotamia and Hathor of ancient Egypt. Regardless of how she came to be introduced to the Greeks, her mythological origins are a little more fantastical, to say the least. Unlike most of the other Greek gods, Aphrodite wasn’t descended from the king of the gods, Zeus, or any of his siblings. Instead, her birth came about as a result of divine patricide. When the titan Cronus overthrew his father, Uranus, he castrated him and threw the severed flesh into the sea. There, from the blood and sea foam, Aphrodite was born. Due to the circumstances of her birth, she was strongly associated with water and was sometimes worshipped as a sea goddess.
Aphrodite was most commonly depicted as a goddess of beauty, fertility, sexuality, and, of course, love. But the Greeks strongly distinguished between erotic love and romantic love, and Aphrodite was the goddess of the former. Romantic love was seen in a positive light, while erotic love was seen as a sort of madness. With this in mind, Aphrodite’s temperament in mythology makes much more sense. She was frequently depicted as fickle, jealous, and short-tempered. In fact, despite being the goddess of love, she wasn’t particularly loyal to her own husband, Hephaestus. Various stories feature her affairs with Ares, Adonis, and other figures, often ending in humiliation or tragedy. She was also jealous when it came to her son, Eros, who shared his mother’s affinity for love and sexuality. When he fell in love with the mortal Psyche, Aphrodite conspired to break the couple apart by forcing Psyche to complete a set of seemingly impossible trials. Aphrodite’s meddling once resulted in the ruin of an entire kingdom. The goddess, along with Hera and Athena, forced Paris of Troy, a mortal man, to judge which of them was the most beautiful. After Paris chose Aphrodite, she blessed him by having Eros strike Helen, one of the most beautiful mortal women in the world, with one of his enchanted arrows, making her fall in love with Paris. Unfortunately, Helen was already married to Menelaus, the king of Sparta. After Paris and Helen eloped to Paris’s home city of Troy, Menelaus rallied the rest of the Greeks to wage war against the city. Following a 10-year conflict, Troy fell to the Greeks, resulting in Paris’s death.
Today, Aphrodite is often depicted as much more benevolent, which makes sense given the lessened distinction between erotic and romantic love, and a more positive view of sexuality. Depictions of Aphrodite also change with evolving beauty standards, reflecting the ideal female form at various points in history. Perhaps the most famous portrayal of the goddess is in the painting The Birth of Venus by Sandro Botticelli, which takes inspiration from the story of her birth at sea. If you’ve never seen the painting, don’t worry; it’s much less graphic than the myth it’s based on!
[Image description: The Birth of Venus by Sandro Botticelli (1445–1510), a painting depicting Venus rising naked out of the ocean on a giant shell while flying winged figures blow wind toward her and a woman on shore approaches with a blanket.] Credit & copyright: Sandro Botticelli (1445–1510), Uffizi Gallery. Public Domain.
-
US History
In the face of tyranny, sometimes it pays to be a Paine in the neck. British-American political writer and propagandist Thomas Paine was born on this day in 1737. Paine is known for being one of the most influential voices during the American Revolution, but he was also a strong supporter of the French Revolution.
Paine was born in Norfolk, England, to a Quaker father and an Anglican mother, and had limited access to education in his early life. While his ability to read, write, and perform basic arithmetic allowed him to work several jobs, he had few opportunities for economic advancement. Paine also seemed to struggle with every trade he attempted. One of his earliest jobs was as an excise officer, chasing smugglers and collecting excise taxes on tobacco and alcohol. The job paid little, and Paine was dismissed from the position after he published a pamphlet arguing that higher pay for excise officers would reduce corruption. Paine’s fortunes changed in 1774, when he had a chance meeting with Benjamin Franklin. Franklin urged Paine to move to America, and he arrived in Philadelphia, Pennsylvania, on November 30, 1774. There, he cut his teeth working at the Pennsylvania Magazine, owned by Franklin’s son-in-law, Robert Aitkin. During his tenure with Aitkin, Paine published a number of articles under his own name and under pseudonyms. A steadfast abolitionist, Paine published African Slavery in America, an article that criticized and condemned the African slave trade.
Paine really began to make a name for himself when anti-British sentiment began to grow in the American colonies, along with calls for independence. In January 1776, Paine anonymously published his most famous pamphlet, Common Sense, largely aimed at American colonists who were still undecided on the matter of independence. More than simply appealing for sympathy, the pamphlet encouraged colonists to revolt against the British and to sever ties with the empire completely. Paine’s limited educational background might have actually contributed to his success. While his arguments were coherent and compelling, they appealed to a wider audience because of his tendency to speak and write in a plain, straightforward manner, forgoing the Latin terms and phrases and the philosophical allusions popular with more educated writers of the time. Upon publication, Common Sense sold some 500,000 copies in a matter of months, and it was often read aloud at public gatherings. Another popular pamphlet by Paine, The American Crisis, was published the same year, and it was promoted amongst American officers by George Washington himself. This pamphlet was aimed at bolstering the morale of the colonists as conflicts began to escalate in the American Revolution. It contained the now-famous words, “These are the times that try men's souls: The summer soldier and the sunshine patriot will, in this crisis, shrink from the service of his country; but he that stands it now, deserves the love and thanks of man and woman.”
Despite his staunch support for the American Revolution, Paine had few friends left by the end of it. When he returned to Britain in 1787 and wrote The Rights of Man in favor of the French Revolution, he was tried for treason and forced to flee to France, where he was imprisoned, ironically, for opposing the execution of King Louis XVI. Paine was released thanks to the American ambassador to France, but he later openly criticized George Washington for failing to come to his aid, even though Paine had claimed American citizenship in an attempt to avoid imprisonment. He eventually returned to the former American colonies, now called the United States of America, at the invitation of President Thomas Jefferson. Paine died in 1809, and only six mourners attended his funeral. For much of the following century, Paine was poorly regarded and remembered mainly as an instigator. Today, though, he’s seen as a leading thinker and writer who helped embolden everyday Americans. No Paine, no American Revolution!
[Image description: A painting of Thomas Paine wearing a black suit and white neckcloth sitting in a green chair.] Credit & copyright: National Portrait Gallery, Smithsonian Institution. Laurent Dabos, 1761 - 1835. Public Domain, CC0.
-
World History
Some historical events are like explosions—they happen in an instant. But others, even some of the most impactful, happen like a series of falling dominos. With events like these, it’s easier to see their true impact in hindsight. This month in 1933, one such event happened in Germany, when Adolf Hitler was appointed Chancellor by President Paul von Hindenburg. The political occurrences leading up to Hitler’s appointment, as well as those directly following it, fell perfectly in place to allow the Nazis to seize control of the country.
The end of WWI saw the end of the German Empire, and its successor, the Weimar Republic, was forced to sign the Treaty of Versailles in June of 1919. The fledgling nation was made to accept responsibility and pay reparations to the parties involved in the war. Between the reparation payments and the debt accrued during the war, the Weimar Republic was in dire economic straits by the early 1920s. There was also plenty of social unrest in the wake of WWI, which gave rise to political extremism on both ends of the spectrum. On one side was the German Communist Party, which quickly gained popularity, while another emergent group was the National Socialist German Workers’ Party, soon to be referred to as the “Nazis.” At first, the Nazis struggled to gain political ground or capture the public’s attention, though they had the support of their own paramilitary group, the Sturmabteilung, or SA (storm troopers), consisting mostly of WWI veterans. Then, in 1923, Adolf Hitler, a rising figure in the Nazi Party, along with WWI general Erich Ludendorff, made a failed attempt to overthrow the government in what would come to be known as the Beer Hall Putsch. Ludendorff was already a renowned war hero popular with many Germans, but the failed coup was Hitler’s first step toward political fame. Using the sudden burst of notoriety as a springboard, Hitler wrote his autobiography, Mein Kampf (My Struggle), by dictation while spending a year in prison.
When the Great Depression hit global markets at the end of the 1920s, the Nazi Party capitalized on the severe economic hardships facing everyday Germans. They blamed an ineffectual government, the communist movement, Jewish financiers, and modernist cultural movements for the decline of Germany. The party promoted the idea that minority groups, including Jews, immigrants, disabled people, and LGBTQ+ people, were draining the nation’s wealth. By 1933, the Nazi Party had the largest single share of votes in parliamentary elections, and they began to throw sand into the government’s gears, stymieing any efforts by parliament to pass meaningful legislation. At the same time, they decried the passivity of parliament and the inefficiencies of democracy.
Then, in 1933, in a vain attempt to court Nazi support, President Paul von Hindenburg appointed Hitler chancellor of Germany. Hindenburg hoped that it would lead to the Nazi party’s cooperation in governance. Unfortunately, no such cooperation emerged. When Hindenburg died the following year, Hitler declared himself Führer (leader) of Germany and began the systematic dismantling of the country’s democratic apparatus. He then cemented his power by attacking or imprisoning his critics and rivals, including the leadership of the SA, which he had begun to consider a liability due to its violent activities in the streets. The systematic purge of Hitler’s enemies, including his former supporters, carried out from June 30 to July 2, 1934, came to be known as the Nacht der langen Messer (Night of the Long Knives). It was met with widespread support from the German populace. Throughout the rest of the 1930s, Hitler and the Nazi Party expanded their military in direct opposition to the Treaty of Versailles and began claiming neighboring territories based on the supposed populations of ethnic Germans living there. To avoid conflict, European leaders opted for a policy of appeasement in 1938, allowing Germany to claim Czechoslovakia’s Sudetenland in exchange for a pledge not to seek further territory.
Of course, Nazi Germany didn’t stop at Czechoslovakia, which they invaded the following year. Soon came Poland and before long Nazi crosshairs were aimed at the rest of Europe. It took the Nazi Party almost 20 years, but they eventually came to hold absolute power by undermining the principles of democracy and eroding the safeguards that held it in place. While most see the Nazis’ rise to power as a cautionary tale, some modern dictators have used it as a playbook to be copied, making Nazi ideology a threat to this day.
-
Games
You can’t try to tilt things in your favor when it comes to pinball! Once a popular arcade mainstay, pinball is seeing a resurgence in popularity. But while pinball machines are largely seen as harmless, wholesome fun nowadays, there was a time when they were public enemy number one. Anti-pinball sentiment was so high, in fact, that this month in 1942, New York City banned the game outright.
With their colorful designs and sound effects, pinball machines are hard to imagine as symbols of the seedy underground. Yet for much of pinball’s history, that’s exactly how many people saw them. The first coin-operated pinball machine was made in 1931, and such machines grew in popularity throughout the Great Depression. Early pinball machines were similar to modern ones, minus one crucial detail: the flippers. Without flippers to fling the ball back up, pinball was almost entirely a game of chance. Proprietors of bars, bowling alleys, and candy shops set up machines in hopes that eager players would sink countless nickels and dimes into them. If the ball landed in a specific hole, players could win a prize, ranging from a piece of candy to expensive jewelry. Adding to the game’s less-than-favorable reputation, the pinball manufacturing industry had ties to organized crime in Chicago, and pinball machines were seen by many parents as a way for gangsters to make money off of kids. In New York City, Mayor Fiorello LaGuardia went on a crusade against the arcade icon, and his campaign reached a fever pitch after the Japanese attack on Pearl Harbor.
After the U.S. joined WWII following the attack, pinball machines were seen as a waste of precious resources, like metal and springs, that could go toward the war effort. Suddenly, anti-pinball sentiment wasn’t just about morality; it was about patriotism. On January 21, 1942, LaGuardia got his wish when the city council voted to make pinball machines illegal in New York City. Several other cities soon followed suit. Passing the law proved much easier than actually enforcing it, though. As enthusiastic as they were, LaGuardia and the police never quite stamped out New York’s pinball scourge. Sure, many business owners were arrested for having machines on their premises, and confiscated machines were smashed with sledgehammers in front of the press, but the industry continued to thrive. Even after flippers were introduced in 1947 to make pinball a game of skill, many people still opposed it.
It wasn’t until 1974, when the California Supreme Court ruled against a similar ban in that state, that the crusade started to lose steam. Afterward, a financially struggling New York City began to see pinball as a potential source of revenue: operators would be required to pay for a license, raising money for the city. However, proponents of pinball still had to prove that it wasn’t gambling. To that end, the Amusement and Music Operators Association hired Roger Sharpe, one of the top players in the country, to demonstrate to the city council that pinball was a game of skill, not chance. To do this, he stood in front of them and called a shot, pulling the plunger back just enough to get the pinball to land exactly where he said it would. Satisfied with the demonstration, the city lifted the ban in 1976.
Though pinball is considered a bit retro today, there are still hundreds of tournaments in the U.S. alone, some with cash prizes of up to $1 million. Pinball’s reputation has also had a complete turnaround. Once a sign of rebellious youth and the criminal underworld, pinball is now more likely to be found at a family-friendly arcade than a seedy bar on the wrong side of town. No need to watch your back—just keep your eyes on the ball.
[Image description: A close-up photo of dials and knobs in a pinball machine.] Credit & copyright: Cottonbro studio, Pexels
-
Biology
You really shouldn’t spray paint at church—especially not on the grave of the world’s most famous biologist. Two climate activists recently made headlines for spray-painting a message on Charles Darwin’s grave in London’s Westminster Abbey. They hoped to draw attention to the fact that Earth’s average global temperature was more than 1.5 degrees Celsius (2.7 degrees Fahrenheit) higher than pre-industrial levels for the first time in 2024. While there’s no way to know how Darwin would feel about our modern climate crisis, during his lifetime he wasn’t focused on global temperatures. Rather, he wanted to learn how living things adapted to their environments. His theory of natural selection was groundbreaking…though, contrary to popular belief, Darwin was far from the first scientist to notice that organisms changed over time.
Born on February 12, 1809, in Shrewsbury, England, Charles Darwin was already interested in nature and an avid collector of plants and insects by the time he was a teenager. Still, he didn’t set out to study the natural world at first. Instead, he apprenticed with his father, a doctor, then enrolled at the University of Edinburgh’s medical school in 1825. Alas, Darwin wasn’t cut out to be a doctor. Not only was he bored by medical lectures, he was deeply (and understandably) upset by medical practices of the time. This was especially true of a surgery he witnessed in which doctors operated on a child without anesthetics—because they hadn’t been invented yet. After leaving medical school, Darwin didn’t have a clear direction in life. He studied taxidermy for a time and later enrolled at Cambridge University to study theology. Yet again, Darwin found himself drawn away from his schooling, finally spurning theology to join the five-year voyage of the HMS Beagle as its naturalist. The Beagle was set to circumnavigate the globe and survey the coastline of South America, among other things, allowing Darwin to travel to remote locations rarely visited by anyone.
During the voyage, Darwin did just what he’d done as a child, collecting specimens of insects, plants, animals, and fossils. He didn’t quite have the same “leave only footprints” mantra as modern scientists, though. In fact, Darwin not only documented the various lifeforms he encountered on his journey, he dined on them, too. This was actually a habit dating back to his days at Cambridge, where he was a founding member of the Gourmet Club (also known as the Glutton Club). The goal of the club had been to feast on “birds and beasts which were before unknown to human palate,” and Darwin certainly made good on that motto during his time aboard the Beagle. According to his notes, Darwin ate iguanas, giant tortoises, armadillos, and even a puma, which he said was "remarkably like veal in taste." His most important contribution as a naturalist, though, was his theory of natural selection.
Darwin came up with his most famous idea after observing 13 different species of finches on the Galápagos Islands. Examining their behavior in the wild and studying their anatomy from captured specimens, Darwin found that the finches all had differently shaped beaks for different purposes. Some were better suited for eating seeds, while others ate insects. Despite these differences, Darwin concluded that they were all descended from a common ancestor, sharing many characteristics, with specializations arising over time. Darwin wasn’t the first person to posit the possibility of evolution, though. 18th-century naturalist Jean-Baptiste Lamarck believed that animals changed their bodies throughout their lives based on their environment, while Darwin’s contemporary Alfred Russel Wallace independently arrived at the same theory of natural selection. In fact, the two published a joint statement and gave a presentation at the Linnean Society in London in 1858. Darwin didn’t actually coin the phrase “survival of the fittest,” either. English philosopher Herbert Spencer came up with it in 1864 while comparing his economic and sociological theories to Darwin’s theory of evolution.
Despite Darwin’s confidence in his theory and praise from his peers in the scientific world, he actually waited 20 years to publish his findings. He was fearful of how his theory would be received by the religious community in England, since it contradicted much of what was written in the Bible. However, despite some public criticism, Darwin was mostly celebrated upon his theory’s publication. When he died in 1882, he was laid to rest in London’s Westminster Abbey, alongside England’s greatest heroes. It seems he didn’t have much to fear if his countrymen were willing to bury him in a church!
[Image description: A black-and-white photograph of Charles Darwin with a white beard.] Credit & copyright: Library of Congress, Prints & Photographs Division, LC-DIG-ggbain-03485, George Grantham Bain Collection. No known restrictions on publication.
-
US History
For most people today, winter is either a time for fun activities like sledding, ice skating, and skiing, or a time of inconvenience, when streets are slippery, commutes are longer, and windshields need scraping. But not so long ago, winter was a truly dangerous time for average people, especially if they were traveling. No story illustrates that point quite as well as the tragic tale of the Donner Party, a group of pioneers migrating from the Midwest to California in 1846. Their attempt to survive a brutal winter in the Sierra Nevada is considered one of the darkest chapters from the time of westward expansion in America.
Before the completion of the first Transcontinental Railroad in 1869, traversing the U.S. was a dangerous, harrowing task. Journeys were made largely on foot with provisions and other supplies carried on wagons. There weren’t always well-established roads or reliable maps, making long-distance travel a particularly haphazard endeavor. Nevertheless, the allure of fertile farmland drew thousands to the West Coast, including brothers George and Jacob Donner, as well as James Reed, a successful businessman from Springfield, Illinois. The Donner brothers and Reed formed a party of around 31 people and set off for Independence, Missouri, on April 14, 1846. On May 12, they joined a wagon train (a group of individual parties that traveled together for mutual protection) and headed west toward Fort Laramie, 650 miles away. For the first portion of the trip, they stayed on the Oregon Trail, which ended near present-day Portland, Oregon. The Donners and Reeds, however, were traveling to California, and intended to take the California Trail, which diverged from the Oregon Trail at two points between Fort Bridger and Fort Hall. However, instead of taking either of those better-established routes, the Donners and Reed opted for what they believed was a shortcut on the advice of Lansford Hastings, a guide traveling with them. This supposed shortcut, called Hastings Cutoff, was purported to cut 300 miles from the trip and get the travelers to their destination months earlier than anticipated. Hastings Cutoff was heavily promoted by Hastings in his book, The Emigrants' Guide to Oregon and California, which contained advice and trail maps to the West Coast.
What the Donners and Reed didn’t know was that Hastings had never actually traveled his namesake shortcut himself. Contrary to his assertions, the shortcut actually added 125 miles to the trip. Hastings also didn’t join the Donners and Reeds, who parted ways from him at Fort Bridger. Electing George Donner as their leader, the Donners, the Reeds, and dozens more joined together to tackle Hastings Cutoff. The Donner party reached it on July 31, and initially made good time. But since Hastings Cutoff took them through largely untraveled wilderness, they faced severe delays, preventing them from crossing the Sierra Nevada before winter. On October 31, the party established a camp to survive the winter in the area now known as Donner Pass. By then, Reed and his family had set off on their own after he killed another man in the party. As winter set in, the Donner party built cabins for shelter, but they had little in the way of supplies, having lost most of their food during their previous delays. By December, they were trapped by heavy snow, and on the 16th of that month, 15 members of the party set out to find help. Most of the remaining survivors at camp were children.
The aftermath of the disastrous venture made headlines around the country. Only seven of the party members who set out for help survived, and of the original 89 members of the Donner party, 42 starved or froze to death. Sensational claims of cannibalism became the focus of the story after it was discovered that about half of the survivors had consumed the flesh of the dead after depleting their meager supply of food, livestock, dogs, and whatever leather they could boil. Among the dead were the Donner brothers and most of their immediate family. Today, the doomed expedition is memorialized through museum exhibits and through the name of Donner Pass, the area where the party spent that harrowing winter. The next time you curse yourself for taking the wrong exit on a road trip, thank your lucky stars for GPS.
[Image description: Snow falling against a black background.] Credit & copyright: Dillon Kydd, Pexels
[Image description: Snow falling against a black background.] Credit & copyright: Dillon Kydd, Pexels -
FREEPlay PP&T CurioFree1 CQ
This family business really took off around the globe…by making globes! Snow globes are popular souvenirs and holiday decorations the world over. While these whimsical decorations seem like a simple concept—a diorama inside a glass globe with some water and fake snow thrown in—they have a surprisingly scientific origin.
Erwin Perzy I, an Austrian tradesman and tinkerer, didn’t set out to invent the snow globe. Rather, he was in the business of selling medical instruments to local surgeons. In 1900, many physicians were looking to improve the lighting in their operating rooms, which at the time were often small, dim, and hard to work in. So, Perzy went to work, experimenting with a lightbulb placed near a water-filled glass globe. In order to amplify the brightness, Perzy tried adding different materials to the water to reflect the light. His invention never caught on with surgeons, but it did give Perzy an idea. He was already making miniature pewter replicas of the nearby Mariazell Basilica to sell to tourists and pilgrims who visited the site in droves. The souvenir was already popular, so he decided to bump it up a notch by placing some of the tiny buildings inside the globes. Filled with water and a proprietary blend of wax to mimic snow, the souvenir was sold as a diorama of the Mariazell Basilica in winter, and it was an instant success.
Some historians have pointed out that snow globes may have existed, at least in some form, before Perzy's invention. At the 1878 Exposition Universelle in Paris, a French glassware company sold domed paperweights containing a model of a man holding an umbrella. The dome was also filled with water and imitation snow, but this version never caught on. Either way, Perzy’s patent for his snow globe was the first of its kind, and by 1905, business was booming.
At first, snow globes remained a regional craze. In 1908, Emperor Franz Joseph of Austria honored Perzy for his novel contributions to toymaking, helping to boost the snow globe’s popularity. For the first decades of the 20th century, snow globes spread steadily across Europe, but sales fell during World War I, World War II, and the intervening period of economic depression. After World War II, business took off again and began to spread to the U.S. By then, Perzy’s son, Erwin Perzy II, was in charge of the family business and made the decision to market snow globes as a Christmas item. The first Christmas snow globe featured a Christmas tree inside, and proved to be a great success. With the post-war baby boom and a rising economy, snow globe sales skyrocketed. Beginning in the 1970s, Erwin Perzy III took over the family business and started selling snow globes to Japan, but by the end of the 1980s, there was a problem. The patent filed by the first Perzy expired, forcing the family to pivot and market their products as the real deal, naming themselves the Original Viennese Snow Globes.
Today, the company is still owned and operated by the Perzy family, and while plenty of other companies sell snow globes, they’re still recognized as the original. In the years since their rebranding, they’ve been commissioned to make custom snow globes for a number of U.S. presidents, and in 2020, they even made one with a model toilet paper roll inside to poke fun at the shortages during the COVID pandemic. In addition to being the original, the company still uses a proprietary blend of wax and plastic for their snow, which they claim floats longer than their competitors’. That’s one way to keep shaking up the industry after all these years.
[Image description: A snowglobe with two figures inside.] Credit & copyright: Merve Sultan, Pexels -
FREEPP&T CurioFree1 CQ
This is one dispute between neighbors that got way out of hand. On this day in 1845, the U.S. Congress approved the annexation of the Republic of Texas, leading to the Mexican-American War. The conflict lasted for two brutal years and claimed the lives of nearly 40,000 soldiers.
Contrary to popular belief, Texas was not actually part of Mexico at the time of its annexation. Rather, it was a breakaway state—a republic of its own that had gained independence from Mexico during the fittingly named Texas Revolution. When the U.S. decided to annex it, the Republic had existed for around 10 years. For most of its existence, the U.S. recognized the Republic of Texas as an independent nation, while Mexico did not. Mexico considered it a rebellious state, and was eager to quash the Republic’s independent economic dealings with other nations. At the same time, they threatened war if the U.S. ever tried to annex the Republic of Texas.
Mexico had plenty of reasons to worry, since the Republic of Texas itself was in favor of being annexed. In 1836, the Republic voted to become part of the U.S., eager to procure the protection of the U.S. military and gain a stronger economic standing. However, it wasn’t until 1845 that President John Tyler, with the help of President-elect James K. Polk, pushed a joint resolution through both houses of Congress that officially made Texas part of the United States. This increase in U.S. territory followed a trend of westward expansion at the time.
Mexico wasn’t happy, but they didn’t make good on their threat to declare war over the annexation. Rather, they took issue with Texas’ new borders. Mexico believed that the border should only extend as far as the Nueces River, but Texas claimed that their border extended all the way to the Rio Grande and included portions of modern-day New Mexico and Colorado. In November 1845, the U.S. sent Congressman John Slidell to negotiate a purchase agreement with Mexico for the disputed areas of land. At the same time, the U.S. Army began to take up stations within the disputed territory, infuriating Mexican military leaders and leading to open skirmishes between Mexican and U.S. troops. President Polk had run on a platform of westward U.S. expansion, so he wasn’t about to cede any land to Mexico, and Mexico wouldn’t allow it to be purchased. So, Polk urged Congress to declare war on Mexico, which they did on May 13, 1846.
From the start, Mexico faced serious disadvantages. Their armaments were outdated compared to those of U.S. troops, as most Mexican soldiers used surplus British muskets while U.S. soldiers had access to rifles and revolvers. Most difficult for Mexico to overcome were its own severe political divisions. Centralistas, who supported a centralized Mexican government, were bitter rivals of federalists, who wanted a decentralized government structure. These two groups often failed to work together within military ranks, and sometimes even turned their weapons on one another. Even General Antonio López de Santa Anna, Mexico’s most famous military leader, struggled to get his nation’s divided political factions to fight together.
These obstacles quickly proved insurmountable for the Mexican military. After a three-day battle, the U.S. handily captured the major city of Monterrey, Mexico, on September 24, 1846. Not long after, the U.S. pushed deeper into northern Mexico, where the bloody Battle of Buena Vista ended ambiguously, with both sides claiming victory. However, Mexico never decisively won a single battle in the war, and on September 14, 1847, the U.S. Army captured Mexico City, ending the fighting.
It wasn’t exactly smooth sailing from that point on. The Mexican government had to reform enough to be able to negotiate the war’s ending. This took time, since most of the Mexican government had fled Mexico City in advance of its downfall. It wasn’t until February 2, 1848, that the Treaty of Guadalupe Hidalgo was signed, and the war officially ended. The treaty granted the U.S. all of the formerly contested territory, which eventually became the states of New Mexico, Utah, Arizona, Nevada, Colorado, California, and, of course, Texas. In return, Mexico got $15 million—far less than the U.S. originally offered to purchase the territory for. It might not have been a great deal to begin with—but Mexico likely ended up wishing they'd taken it.
[Image description: An illustration of soldiers in blue uniforms on horseback, one holding a sword aloft. Other soldiers are on the ground in disarray as others march up a distant hill amid clouds of smoke.] Credit & copyright: Storming of Independence Hill at the Battle of Monterey Kelloggs & Thayer, c. 1850-1900. Library of Congress Prints and Photographs Division Washington, D.C. 20540 USA. Control number: 93507890. Public Domain. -
FREEWorld History PP&T CurioFree1 CQ
Guys, I don’t think that’s Santa! In recent years, a monster-like figure known as Krampus has taken the modern world by storm, popping up in memes and even starring in his own movie. But this folkloric figure is far from a modern invention. In fact, his fame as a Christmas figure began in the 17th century (though his origins stretch back even further, to the 12th century) and he was actually portrayed as Santa’s helper.
The name Krampus is thought to come from the German word for claw, “Krampen.” Krampus certainly does have fearsome claws, along with exaggerated, goat-like features (horns, legs, hooves, and a tail) on a mostly humanoid body with a long tongue and shaggy, black fur. Krampus is also associated with Norse mythology, and one of his earliest iterations was thought to be as the son of Hel, the goddess of the underworld. Regardless of exactly where he came from, Krampus came to have just one job during Christmas, according to the folklore of many European countries: punish children who misbehaved during the year. Unlike Santa, who merely rewards good children, the Krampus takes punitive measures like beating children with sticks and sometimes even kidnapping them. Santa isn’t unaware of Krampus’s deeds, either. According to folklore, since Santa is a saint, he can’t punish children…which is why Krampus does it for him. Both St. Nicholas and Krampus are said to arrive on Krampusnacht, or Krampus Night (December 5), to dole out each child’s reward or punishment, respectively. The next morning, children are supposed to be either basking in their presents or crying over their injuries from the night before. Compared to that, some coal in the stocking might be preferable.
This bizarre goat-monster probably came to be associated with Christmas because he was already associated with the Winter Solstice and the pagan traditions surrounding it. As Christianity spread into once-pagan regions, the two traditions became mingled, creating an unlikely crossover of a Turkish saint and a Norse demon. However, Krampusnacht might have taken more from the pagans than the Christians. Krampusnacht usually involves revelers handing out alcohol and a parade where people dressed like the Krampus run around chasing children. No surprise, then, that once the Krampus became intertwined with Christmas, the Catholic Church attempted several times to abolish the figure, to no avail. One particularly large, long-running festival takes place in Lienz, Austria, with a parade called Perchtenlauf, where cowbells ring to signal the arrival of Krampus.
Krampus’s popularity really began to take off in the early 20th century, when the figure was featured on holiday cards that ranged from comical to spooky. At first, Krampus cards were contained mostly to Germany and Austria, but the figure’s popularity began to spread around Europe and even across the Atlantic. In the U.S., the Krampus has become the go-to figure for those who wish to forego the typical Christmas sentimentality and embrace a more horror-centric and ironic approach to the holidays.
Today, many of the older traditions around the Krampus are still practiced, but the figure is also something of a pop-culture icon. 2015 saw the debut of Krampus, a horror movie that casts the monster as the main antagonist. Other films have followed suit, often incorporating elements from real folklore. Krampus might have also gained traction in the U.S. partly as a novel way to protest the increasing commercialization of Christmas. But that might have been in vain, since merchandise featuring Krampus is becoming ever more popular. How long until we get a Christmas carol about the guy?
[Image description: Krampus, a furry, black monster with horns and a long tongue, puts a child in a sack while another child kneels by a bowl of fruit.] Credit & copyright: c. 1900, Wikimedia Commons. This work is in the public domain in its country of origin and other countries and areas where the copyright term is the author's life plus 100 years or fewer. -
FREEWorld History PP&T CurioFree1 CQ
It’s really not as scary as it sounds. The Black Forest region of Germany is known for its picturesque landscape and traditional crafts. During the holiday season, German Christmas markets (or Christkindlmarkts) around the world are filled with hand-carved wooden toys and figurines from the region, and Black Forest ham is a beloved culinary delight throughout the year. However, there’s more to this historic, wooded area than just toys and food. The people living there have proudly retained distinct cultural practices that make the region unique.
Located in the southwestern state of Baden-Württemberg, the Black Forest is called Schwarzwald in German, though it went by other names in the past. The ancient Romans once associated the area with Abnoba Mons, a mountain range named after a Celtic deity. The earliest written record of the Black Forest also comes from the Romans, in the form of the Tabula Peutingeriana, a medieval copy of a Roman map that detailed the empire’s public road system. In it, the Black Forest is called Silva Marciana, which means “border forest,” in reference to the Marcomanni ("border people") who lived near Roman settlements in the area. The Black Forest today consists of 2,320 square miles of heavily forested land that stretches around 100 miles long and up to 25 miles wide. It contains the sources of both the Danube and Neckar rivers, and the area was historically known for its rich pastureland. Of course, the true stars of the Black Forest are the trees that define the region. The forests of Schwarzwald are mainly known for their oak, beech, and fir trees, the latter of which gives the region its name. Unsurprisingly, lumber production was historically a large part of the Black Forest’s economy, along with mining.
The Black Forest’s history of woodworking and woodcraft goes back centuries. Arguably the most famous craft to come out of the forest is the cuckoo clock, which was invented sometime in the 17th century. As their name implies, cuckoo clocks typically feature a small, carved bird that emerges from above the clock face to mark the arrival of each hour with a call or song. More elaborate clocks sometimes have a set of dancers that circle in and out of a balcony in time to the sound. Most cuckoo clocks are carved out of wood to resemble houses, cabins, beer halls, or other traditional structures, with a scene of domestic or village life around them. While many modern cuckoo clocks use an electronic movement to keep time, mechanical versions using weights and pendulums are still being made. The weights that power the movement are often made to resemble pine cones, and users need only pull down on them periodically to keep the clock ticking. There is a limitless variety of cuckoo clock designs, and there are still traditional craftsmen making them by hand. The Black Forest is also known for carved wooden figurines and sculptures, many of which served as children’s toys. Wood carving as an industry first gained traction in the 19th century, when drought and famine forced locals to seek alternative sources of income, but it is now a cherished part of the region’s culture.
Today, the Black Forest is still home to many woodworkers. The region is also a popular destination for outdoor enthusiasts, thanks to its many hiking trails and immense natural beauty. Towns in and around the Black Forest feature traditional, pastoral architecture and growing art scenes, where artists take inspiration from local traditions and landscapes. All those clocks, and they still manage to stay timeless.
[Image description: A section of the northern Black Forest with thin pine trees.] Credit & copyright: Leonhard Lenz, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication. -
FREEAlgebra PP&T CurioFree1 CQ
Math and logic are an inseparable pair, right? Well, they weren't always. Mathematics and logic existed separately for thousands of years before the two disciplines ever merged together, but their eventual marriage was possible thanks in part to a man named George Boole. Boole, who died on this day in 1864, is known as the father of binary logic, and by extension, a key figure in the field of modern computing.
Despite his later career as a revolutionary academic and educator, Boole never received much formal education. Instead, his early life was enriched by his father’s personal interest in math and science. Born on November 2, 1815, in Lincoln, Lincolnshire, England, Boole was largely educated by his father, a shoemaker. As a child, he also attended local schools, but most of his knowledge of mathematics was self-taught. When his father’s business began to slow down, Boole started teaching at the young age of 16. By 20, he had opened his own school, and he remained a dedicated educator throughout his life. He worked as the headmaster of his school for 15 years, during which time he took it upon himself to continue his own education. In the 1840s, Boole began to publish papers in the Cambridge Mathematical Journal. In 1849, he began his tenure as a professor of mathematics at Queen’s College in Cork, Ireland.
Before Boole, logic was considered part of philosophy. He published a pamphlet in 1847 titled The Mathematical Analysis of Logic, being an Essay towards a Calculus of Deductive Reasoning in which he argued that logic was not a matter of philosophy, but shared a domain with mathematics. He expounded on this idea in An Investigation into the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities, which he released in 1854.
With these two works, Boolean algebra was established, wherein math and algebraic symbols could be used to express a binary system of logic. Essentially, Boolean algebra is the mathematical representation of logic using boolean values: the values of true or false, often represented today as 1 and 0 in computer science. Boolean algebra also plays an important role in the theory of probabilities, information theory, and circuit design in digital computers. Boole’s integration of math and logic was a revolution millennia in the making, with much of his work based on Aristotle’s system of logic. Even Boole’s book, The Laws of Thought, was titled after existing fundamental laws of logic used by ancient philosophers.
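To make that binary system concrete, here is a brief, purely illustrative sketch in modern Python (a convenience for today’s readers, not anything Boole himself wrote). It treats 1 and 0 as the Boolean values true and false, defines AND, OR, and NOT as small functions on those values, and checks one of Boolean algebra’s laws against every possible input; the function names are simply illustrative labels.

```python
# Illustrative sketch: Boolean values as 1 (true) and 0 (false),
# with logical operations defined as simple functions on those values.

def AND(a, b):
    return a & b   # 1 only when both inputs are 1

def OR(a, b):
    return a | b   # 1 when at least one input is 1

def NOT(a):
    return 1 - a   # flips 1 to 0 and 0 to 1

# Check De Morgan's law, NOT(a AND b) == NOT(a) OR NOT(b), for all inputs.
for a in (0, 1):
    for b in (0, 1):
        assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))

print("De Morgan's law holds for every pair of Boolean inputs.")
```

The same handful of operations, implemented in hardware rather than software, is the basis of the logic circuits mentioned above.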
Thanks to the recognition and acclaim he earned from his works, Boole was given an honorary membership to the Cambridge Philosophical Society and an honorary degree from Oxford University in 1858 and 1859, respectively. Sadly, Boole’s extreme dedication to his profession ultimately led to his death. One day in November of 1864, Boole walked through a cold and torrential downpour to reach his class at Queen’s College. Once in his classroom, he conducted an entire lecture in drenched clothes. In the following days, Boole contracted pneumonia and passed away at the age of 49, survived by his wife and children.
Even if Boole had lived a long and healthy life, he wouldn’t have lived to see the advent of the digital computing that relies on his principles. While many programming languages exist today, digital computing is fundamentally based on circuits formed using the boolean values of true and false. Boole left his mark on everything from algebra textbooks to the entire field of digital computing. All that, despite spending much of his life with little formal education. Who said a shoemaker’s son couldn’t accomplish great feats?
[Image description: Rows of white 1s and 0s against a black background.] Credit & copyright: Author’s own photo. The author releases this image into the Public Domain. -
FREEWorld History PP&T CurioFree1 CQ
It’s not exactly floating on cloud nine, but it might feel pretty close. Since they first took to the skies, airships have held the popular imagination captive. Some of the world’s first airships (a term that includes blimps and dirigibles) used hydrogen to become lighter than air. Hydrogen was eventually replaced by helium, which was much less explosive. The very first airship to use helium took its maiden voyage on this day in 1921, and things seemed to be looking up for the future of airships. To the disappointment of many enthusiasts, however, they never really took off as a popular form of transportation.
Airships were, unsurprisingly, inspired by hot air balloons, which were invented in 1783. French engineer Jean Baptiste Meusnier was the first to build on the concept of a lighter-than-air vessel with a design that included steering by way of three propellers and a fully sealed balloon filled with gas, not hot air. Unfortunately for Meusnier, his design was never built, though it went on to inspire others. In 1785, French inventor Jean-Pierre Blanchard and American doctor John Jeffries made history by crossing the English Channel in a hydrogen-filled balloon. Their success helped launch a new lighter-than-air industry in which improvements and innovations developed fast. One major improvement was steam power, first used in 1852 by yet another French engineer, Henri Giffard. The most famous contribution to airship technology came in 1895, from German inventor Count Ferdinand von Zeppelin. The count designed an entirely new type of airship, named after himself: the Zeppelin, which was much more rigid than its predecessors. The first Zeppelin was built by Hungarian inventor David Schwarz, and was shaped like a long cigar that was wider at the front, with fins at the rear. Its rigid frame made the Zeppelin faster than other airships of the time, capable of reaching speeds of up to 25 miles per hour. Zeppelins were also more resilient to adverse weather conditions.
Other airships soon adopted more rigid frames. While they were largely used for scenic passenger flights, Zeppelins were also used as military aircraft to bomb Britain during WWI due to their impressive cargo capacity. The U.S. military also adopted the use of airships, though they mostly used non-rigid dirigibles. The most prevalent among them were the Goodyear Pilgrims, invented in 1925. Though these were only capable of carrying two passengers and two crew members, and were originally made for scenic passenger flights, during World War II they were utilized for surveillance by the U.S. Army and Navy. In fact, the first helium airship was the U.S. Navy’s C-7 blimp, which could carry a crew of four. Goodyear also made other non-rigid airships, or blimps, and they were a common sight during large events, where they served as advertisements. Some of these even remain in service today. With varied uses and designs, airships seemed to be on the rise during the early 20th century. One tragic event, however, changed the course of the airship industry forever: the Hindenburg disaster. The Hindenburg was the first airship to provide regularly scheduled service between Europe and North America, carrying passengers across the Atlantic faster than any ship of the time. But in 1937, the Hindenburg crashed during its landing approach in Lakehurst, New Jersey. After a hydrogen leak caught fire from a static discharge, flames consumed the fabric covering containing the gas. In almost no time at all, the Hindenburg fell to the ground in a smoky blaze. Of the 97 passengers and crew on board, 35 lost their lives. Once the terrifying images of the conflagration spread around the world, the golden age of airships was essentially over.
With modern airplanes that can ferry hundreds of passengers across continents in hours, it might seem like airships are irrelevant today. Yet, these unusual aircraft do manage to find a place in modern times. Airships are still used to deliver relief supplies to remote, undeveloped areas with no landing strips, since airships can safely drop cargo without having to land. They’re also widely used in scientific research and military surveillance, though in a reversal of past trends, there is a growing interest in airships for scenic flights. Then there are the enthusiasts who still fly dirigibles just for the fun of it. Don’t worry though; airships nowadays are filled with helium, making tragedies like the Hindenburg much less likely to occur. Who’s up for a leisurely blimp ride?
[Image description: A black-and-white image of the airship Captain Ferber in its hangar with people in uniform standing about.] Credit & copyright: Epinal Municipal Library, Limedia galleries. Etalab Open License, Public Domain. -
FREELiterature PP&T CurioFree1 CQ
If you’re only going to write one book, make it count. That’s exactly what 19th century British author Anna Sewell did with her one and only novel, Black Beauty. Published on this day in 1877, the book was a critical and commercial success. Written from the perspective of a horse, the story follows the titular character as he experiences increasing hardship under different owners. The book features vivid descriptions of inhumane treatment of horses, which was sadly common at the time of its publication. However, the novel actually helped bring an end to at least one cruel practice in addition to changing children’s literature forever.
Anna Sewell was born on March 30, 1820, in Norfolk, England, and her early life was difficult. The family lived in poverty and moved frequently, and the Sewell children (Anna and her brother) sometimes stayed with relatives. When she was 12 (or possibly 14), Anna broke both of her ankles after slipping and falling. Her medical treatment was inadequate, leaving her with lifelong mobility issues. Anna’s mother was a prolific author of religious children’s books, as well as books on social issues like abolition and temperance. In her adolescence, Sewell began helping her mother edit her manuscripts. However, it wasn’t until her fifties that Anna began work on a book of her own. The story was inspired by the very animals that her injury forced her to rely upon: horses. Unable to walk without pain and with her condition worsening over her lifetime, she was more dependent on horses than most people. Perhaps owing to her own injury and chronic pain, she developed a deep empathy for the animals. By the time Sewell published her book, she was 57 and in failing health. Just five months after Black Beauty was released, Sewell passed away from what was likely tuberculosis.
Sewell’s novel follows Black Beauty—a highbred male horse—throughout his life from his perspective. As a foal, he lives on a farm owned by kind masters who treat him well. He lives with his mother, Duchess, and half-brother, Rob Roy. After he is trained to be ridden and pull carts, Black Beauty is sold to another master, who also treats him well. During his time with his second masters, Black Beauty makes friends with his master’s other horses. However, his circumstances change for the worse when his owner’s family moves out of England and he is sold yet again. Black Beauty is separated from his friends, and his new owner is not as kind to him. One day, the new owner rides him while drunk, injuring him in the process. The injury leaves a disfiguring scar that renders him unfashionable to ride, and he is sold once again, this time as a work horse in industrialized London. In the city, Black Beauty experiences increasing hardship as he is forced to perform grueling labor. Eventually, he is purchased by a kindly cabdriver, but is sold again after three years. During that time, he encounters one of his old friends, whose health and body have been ruined by years of hard labor and neglect. Later, Black Beauty himself collapses while attempting to pull a crowded cab. He is then purchased by a farmer who restores him to health and later sells him to a couple of old ladies who treat him well. After a long and difficult life, Black Beauty is able to live in peace and quiet once more.
Sewell’s novel was not only a hit; it also contributed greatly to the banning of bearing-reins, a piece of horse harness that forced the animal’s neck back to create a more upright posture. The use of bearing-reins (also called checkreins or overchecks) was common before the book was published, and it often caused debilitating injuries to horses. Black Beauty was heavily promoted by the Royal Society for the Prevention of Cruelty to Animals for its sympathetic portrayal of horses, and their combined efforts helped end the use of bearing-reins in England. In the literary world, Black Beauty ushered in a new type of novel, in which animals tell their own stories. Children’s classics like Charlotte's Web might not exist if not for Black Beauty. Young readers (and horses) would do well to thank Anna Sewell!
[Image description: The cover of the 1877 first edition of Black Beauty. The cover is green with gold flowers and the black head and neck of a horse.] Credit & copyright: London: Jarrold and Sons, Wikimedia Commons. This image (or other media file) is in the public domain because its copyright has expired. This applies to the European Union and those countries with a copyright term of 70 years after the work was made available to the public. -
FREEEngineering PP&T CurioFree1 CQ
Where there’s a will, there’s a way…even if it takes a lot of digging. Connecting the Red Sea and the Mediterranean Sea seems like an impossible feat, but it actually happened several times throughout history. From the ancient Egyptians to the Byzantines, various rulers attempted and failed to maintain a maritime passage between the two seas. The latest—and possibly the greatest—iteration yet is the Suez Canal. Located on the Isthmus of Suez, the canal opened on this day in 1869, and it continues to be crucial to global commerce as it connects Asia and Europe without the need to navigate around the southern tip of Africa.
Historians believe that the notion of connecting the Red and Mediterranean seas was first conceived by Pharaoh Senausert III of the Twelfth Dynasty in the 19th century B.C.E. The pharaoh envisioned a canal that would lead ships to the Nile River and through the Bitter Lakes, creating a lucrative trade route to Asia. A canal was created, but it became impassable by 610 B.C.E. due to sand deposition. Later attempts to connect the seas were limited in scope, capacity, and permanence. Various canal systems connecting the seas through the Nile and the Bitter Lakes came and went, and in at least one instance, the destruction of the passage was deliberate. Abu Jafar El-Mansur of the Abbasid Caliphate ordered the canal to be filled with sand in 760 C.E. to quell a rebellion in Mecca and Medina, and after that, no passage between the seas existed for over a thousand years. It wasn’t until the 19th century that anyone made earnest efforts to reconnect the seas. Instead of a system of small canals that made use of the Nile for the majority of its length, this new passage was designed to run straight through the Isthmus of Suez, making it the longest sea-level canal in the world at the time.
The Suez Canal was commissioned in 1854 by Mohamed Sa'id Pasha, the Ottoman governor of Egypt. That year, he tasked French diplomat Ferdinand de Lesseps with constructing the canal, and in 1856, the Suez Canal Company was given the right to manage it for 99 years starting from the date of completion. Construction was initially expected to take around six years, but was delayed by various setbacks. At first, construction was performed by forced laborers equipped with only hand tools and baskets. Many of the laborers died in 1865 when a cholera epidemic swept through the area, and the project eventually switched over to dredgers and steam shovels, which greatly accelerated the pace of construction. Finally, the Suez Canal opened on November 17, 1869, to great fanfare, with the inaugural voyage attended by the wife of Napoleon III, Empress Eugénie. The canal was originally only 25 feet deep, 72 feet wide at the bottom, and up to 300 feet wide at the surface, but was expanded in 1876 to accommodate larger ships.
During its first full year of operation, the canal saw an average of just two ships a day pass through it. Today, an average of 58 ships a day, carrying 437,000 tons of cargo, sail its waters. The canal remains significant to global commerce, and when a cargo ship got stuck and caused a blockage in 2021, it held up 369 ships at a cost of $9.6 billion in trade a day. Lesseps, however, didn’t fare as well as his creation. Following the success of the Suez, he was hired to construct the Panama Canal. Unfortunately, Lesseps wasn’t an engineer; his earlier feat had largely been one of organizing financing and creating political will. His attempt to dig another sea-level canal through the isthmus nation proved disastrous. Between disease and the much more difficult terrain, Lesseps failed to make meaningful progress using the same techniques he had employed before. The Panama Canal was eventually completed by the U.S., which opted for a system of canal locks that allowed ships to change elevations, eliminating the need to dig straight through the entire length. Lesseps was one man who really should have rested on his laurels.
[Image description: A photo of a navy ship on the Suez Canal, from above.] Credit & copyright: W. M. Welch/US Navy, Wikimedia Commons. This file is a work of a sailor or employee of the U.S. Navy, taken or made as part of that person's official duties. As a work of the U.S. federal government, it is in the public domain in the United States. -
FREEEngineering PP&T CurioFree1 CQ
What’s a little rain while you’re driving? Terrifying. At least, it was at the beginning of the 20th century. American inventor Mary Anderson filed the first-ever patent for a windshield wiper on this day in 1903. Before then, people just had to make do with wet or muddy windshields. However, Anderson never got to reap the rewards for her world-changing invention.
Born in Alabama in 1866, Anderson wasn’t a career inventor. Little is known about her early life, but as an adult, she was a winemaker, rancher, and real estate developer. By all available accounts, her invention of the first windshield wiper was her one and only foray into the world of engineering or design. But her varied job titles imply that she had a keen eye for spotting opportunities, and the inspiration for her invention was no exception. The story goes that Anderson was visiting New York City during the winter and boarded a streetcar on one particularly wet and blustery day. Because of the inclement weather, the windshield of the streetcar kept getting splattered with water and debris, forcing the driver to open a window to manually wipe the windshield clean. Every time he did so, cold wind would blast through the opening, and this didn’t sit well with Anderson, who was used to the balmy Southern weather of her home state. Streetcar drivers weren’t the only ones who had to contend with this problem, of course. As automobiles became more common, their drivers resorted to similar measures or simply drove with their heads sticking out of car windows. Inspired by the streetcar driver’s struggle, and perhaps frustrated by the cold ride, Anderson set out to come up with a better solution. In 1903, Anderson was awarded U.S. Patent No. 743,801 for her “Window-Cleaning Device.”
Anderson’s invention, though groundbreaking for its time, doesn’t much resemble the modern iteration. Her version was still operated by hand (albeit from the inside) and consisted of a single rubber blade to clear the windshield. The device also included a counterweight to keep the blade firmly in contact with the glass, and though it was relatively primitive, it was still pretty effective. Unfortunately for Anderson, automakers were hesitant to embrace her invention early on. Despite several attempts, Anderson was never able to attract investors or have her device manufactured for sale due to lack of interest. She may simply have been too far ahead of her time. Automakers didn’t start making windshield wipers standard equipment in their vehicles until 1916. By then, Anderson’s patent had expired, keeping her from making any profit from her invention through licensing. Then again, automakers may have deliberately avoided adopting her windshield wipers so as not to pay her any fees, though the actual reason is unclear.
Though her invention may not have earned her any money, Anderson has since been recognized for her contribution. In 2011, over 60 years after her death, she was inducted into the National Inventors Hall of Fame. These days, many improvements have been made to her original windshield wiper. In 1917, Charlotte Bridgewood invented the Electric Storm Windshield Cleaner (U.S. Patent No. 1,274,983), the first to be powered by electricity. A few years later, in 1922, brothers William M. and Fred Folberth invented the simply named Windshield Cleaner (U.S. Patent No. 1,420,538), which was powered by redirected engine exhaust. However, the version that most windshield wipers are based on today was invented by Robert Kearns in the 1960s. Called the Windshield Wiper System With Intermittent Operation (U.S. Patent No. 3,351,836), it was motorized and could operate intermittently at adjustable intervals. Who knew there were so many ways to clean a windshield?
[Image description: Raindrops on a windshield that has been partially wiped clean.] Credit & copyright: Valeriia Miller, Pexels -
FREEArt Appreciation PP&T CurioFree1 CQ
There are movements that shape artists, and there are artists that shape movements. Henri Matisse was decidedly the latter of the two. The multidisciplinary French artist passed away on this day in 1954, and during his illustrious career, he became one of the most prolific and influential artists of all time, engaging in friendships and rivalries with other masters of modern art, most notably Pablo Picasso.
Henri Émile Benoît Matisse was born on December 31, 1869 in Le Cateau-Cambrésis, Nord, France. Unlike many of his artistic contemporaries, Matisse wasn’t trained in the discipline, nor did he show any significant interest in it until he was already a young man. Before picking up his first paintbrush, Matisse moved to Paris in 1887 to study law and went on to find work as a court administrator in northern France. It wasn’t until 1889, when he became ill with appendicitis, that he began painting, after his mother gifted him some art supplies to stave off boredom during his recovery. The young Matisse quickly became completely enamored with painting, later describing it as “a kind of paradise.” Much to the chagrin of his father, Matisse abandoned his legal ambitions and moved back to Paris to learn art, studying under the likes of William-Adolphe Bouguereau and Gustave Moreau. However, the work he produced in his early years, mostly still lifes in earth-toned palettes, was quite unlike the work that would eventually make him famous. His true artistic awakening didn’t occur until 1896, when he met Australian painter John Russell. A friend of Vincent van Gogh, Russell showed the struggling artist a collection of Van Gogh’s paintings, introducing Matisse to Impressionism.
In the following years, Matisse began collecting and studying the work of his contemporaries, particularly the Neo-Impressionists. Inspired by their bright colors and bold brushstrokes, his own vision of the world began coalescing along with that of other, like-minded artists into a relatively short-lived but influential movement called Fauvism. The works of the “Fauves” (“wild beasts” in French) like Matisse were defined by unconventional and intense color palettes laid down with striking brushstrokes. Despite being a founding member of a movement, Matisse was never one to settle for just one style or medium. Throughout his life, he dabbled in pointillism, printmaking, sculpting, and paper cutting. At times, he even returned to and was praised for his more traditional works, which he pursued in the post-WWI period. Among his contemporaries, there was only one who seemed to match him: Pablo Picasso. Matisse’s rivalry with this fellow master of modern art is well documented, and the two seemed to study each other’s works carefully. Matisse and Picasso often painted the same scenes and subjects, including the same models. At times, they even titled their pieces the same, not for lack of creativity, but to serve as a riposte on canvas. Matisse once likened their rivalry to a boxing match, and though the two didn’t initially care for each other’s work, they eventually developed a mutual admiration.
Today, the name Matisse is practically synonymous with modern art, and his influence goes beyond the canvas. In his later years, Matisse’s failing health forced him to rely on assistants for much of his work. During the 1940s, Matisse worked with paper, creating colorful collages called gouaches découpées that he described as “painting with scissors.” His final masterpiece, however, was his design for a stained-glass window for the Union Church of Pocantico Hills in New York. No matter what medium he touched, Matisse always left an impression, leaving behind a body of work that is wildly eclectic yet always recognizably his. Surely his father had to admit that Matisse did the right thing by leaving law school.
[Image description: A fanned-out group of paint brushes smattered with paint.] Credit & copyright: Steve Johnson, Pexels -
FREEUS History PP&T CurioFree1 CQ
New York is full of engineering wonders, from skyscrapers to suspension bridges, but one of the most impressive isn’t even visible above ground. The New York City subway system transports over a billion riders through the urban jungle every year. The city’s first subway system opened on this day in 1904, and since then it has continued to expand and serve an exponentially growing population.
By the late 1800s, New York City was already the most populated city in the United States. Known as a center of commerce and culture, the city was growing quickly…and quickly running out of room. Roads were congested with horse-drawn carriages, and the island borough of Manhattan was serviced by elevated railways that took up precious real estate. City planners needed a solution that would address the transportation needs of residents without taking up what little room was left. A subway system seemed like a logical answer. After all, the world’s first underground transit system was already a proven success, having operated in London since 1863. In nearby Boston, America’s first subway was finished in 1897, though it was more limited in scope and used streetcars. There had even been a limited subway line in New York City between 1870 and 1873. During those short few years, a pneumatically powered, 18-passenger car traveled through a tunnel under Broadway, pushed along by a 100-horsepower fan. There had been talk of expanding the line, but the technology was made obsolete by improvements in electric traction motors, and the line was soon abandoned. Indeed, the future of transit in New York City was electric, and after much lobbying from the city’s Board of Rapid Transit and financing from financier August Belmont, Jr., construction on the permanent subway system began in 1900.
As construction crews dug underground, they built temporary wooden bridges over the subway tunnels to allow traffic to continue unimpeded. Not everything went so smoothly, though. Because the tunnel was close to the surface in many places, construction often involved moving existing infrastructure like gas and water lines. Some things weren’t so easy to move out of the way, such as the Columbus Monument in Central Park. One section of the tunnels had to pass through the east side of the 700-ton monument’s foundation, and simply digging through could have led to its collapse. To avoid damaging it, workers had to build a new support under the monument, slowing progress on the subway. Another major obstacle was the New York Times building, which had a pressroom below where the tunnel was to be built. So, the subway was simply built through the building with steel channels to reinforce its structure. Despite these and other engineering challenges, construction was completed just four years after it started, and the inaugural run of the city’s new transit system took place on October 27, 1904, at 2:35 PM, with Mayor George McClellan at the controls. The subway system was operated by the Interborough Rapid Transit Company (IRT) and consisted of just 9.1 miles of tracks passing through 28 stations. That may seem limited compared to today, but it was an astounding leap for commuters at the time, with IRT claiming to take passengers from “City Hall to Harlem in 15 minutes.” At 7 PM, just hours after the inaugural run, the subway was opened to the public for just a nickel per passenger. On opening day, around 100,000 passengers tried out the newly-minted subway, and that number has only grown since.
Today, New York City’s subway system has 472 stations and 665 miles of track. It’s operated by the Metropolitan Transportation Authority (MTA) and serves over three million riders a day. The city’s subway system wasn’t the first, nor is it currently the largest, but it remains one of the very few in the world to operate 24 hours a day, 7 days a week—a feature that many New Yorkers have come to rely on. The extensive and convenient transit system allowed the city to grow throughout the 20th century, and the Big Apple might have ended up as Small Potatoes without it.
[Image description: A subway train near a sign reading “W 8 Street.”] Credit & copyright: Tim Gouw, Pexels -
FREELiterature PP&T CurioFree1 CQ
Halloween approaches, and with it a host of familiar, spooky tales, many of which have their basis in classic novels. Oscar Wilde’s The Picture of Dorian Gray isn’t quite as famous as Dracula or Frankenstein, but this novel is just as spooky, and it’s had its fair share of pop culture appearances and film adaptations too. It’s not exactly a story about a monster… but about the monstrous faults that lurk in all of us.
The Picture of Dorian Gray was first published in 1890 in Lippincott’s Monthly Magazine as a novella, which was common for new stories at the time. It follows the titular character through his descent into moral decay. Dorian Gray is a handsome, rich young man who enjoys a relatively carefree life. Gray’s friend, Basil Hallward, paints his portrait and discusses Gray’s extraordinary beauty with Lord Henry Wotton, a hedonistic socialite. When Gray arrives to see the finished piece, Wotton describes his personal life philosophy: that one should live to indulge their impulses and appetites. He goes on to say to Gray, “…you have the most marvelous youth, and youth is the one thing worth having.” As Hallward places the finishing touches on the painting, Gray declares, “But this picture will remain always young. It will never be older than this particular day of June…If it were only the other way! If it were I who was to be always young, and the picture that was to grow old! For that—for that—I would give everything! Yes, there is nothing in the whole world I would not give! I would give my soul for that!” From that point on, Gray begins to commit cruel and even violent transgressions, the first of which leads to the death of his lover, Sibyl Vane. Yet, he remains ageless and beautiful while his portrait warps into an increasingly grotesque reflection of his inner self. Ultimately, even his attempt to redeem himself by a kind act is revealed to be self-serving, as the portrait changes to reflect his cunning. Eventually, Gray murders the portrait's creator after Hallward discovers how hideous it has become. When a crazed Gray stabs the portrait in frustration, a servant hears him scream and comes to his aid, only to find the body of an ugly, old man with a knife in his chest. The portrait, meanwhile, has reverted to its original, beautiful form.
Wilde’s novel didn’t have quite the reception he’d hoped for. When it was unleashed upon the Victorian readership, it set off a storm of controversy with Wilde at the center. This was despite the fact that Lippincott’s editor, J. M. Stoddart, had heavily edited the novella to censor portions that he believed were too obscene for Victorian sensibilities. The cuts to the text were made without Wilde’s input or consent, and largely targeted the homosexual undertones present in the interactions between some of the male characters. In particular, Hallward was originally characterized as having much more overt homosexual inclinations toward Gray. Stoddart also removed some of the more salacious details surrounding the novel’s heterosexual relationships. When the book was engulfed in scandal, Wilde himself made further edits of his own accord, but to no avail. When Lord Alfred Douglas’s father accused Wilde of having engaged in a homosexual relationship with his son, the author sued the father for libel. The suit fell apart in court after the homosexual themes in The Picture of Dorian Gray were used as evidence against Wilde, and the failure of the suit left him open to criminal prosecution for homosexuality under British law. After two trials, Wilde was sentenced to two years of hard labor in 1895. After his release, he was plagued by poor health while commercial success eluded him. Wilde passed away in Paris, France, in 1900 of acute meningitis.
Today, The Picture of Dorian Gray is seen in a much different light. The work is considered one of the best examples of Wilde’s wit and eye for characterization. It’s also the most representative of Wilde’s Aestheticism, a worldview espoused by several characters in the novella. Nowadays, a version true to the author’s original intent is available as The Picture of Dorian Gray: An Annotated, Uncensored Edition (2011), which restores material cut from the text by Stoddart and Wilde. It may not be so controversial for modern sensibilities, but just in case, make sure you’re wearing some pearls so you have something to clutch if you buy a copy.
[Image description: A 1908 illustration from Oscar Wilde's The Picture of Dorian Gray.] Credit & copyright: Eugène Dété (1848–1922) after Paul Thiriat (1868–1943), 1908. Mississippi State University, College of Architecture Art and Design, Wikimedia Commons. This work is in the public domain in its source country and the United States because it was published (or registered with the U.S. Copyright Office) before January 1, 1929. -
FREEPolitical Science PP&T CurioFree1 CQ
For better or worse, modern American politics are a bombastic affair involving celebrity endorsements and plenty of talking heads. Former President Jimmy Carter, who recently became the first U.S. President to celebrate his 100th birthday, has lived a different sort of life than many modern politicians. His first home lacked electricity and indoor plumbing, and his career involved more quiet service than political bravado.
Born on October 1, 1924 in Plains, Georgia, James Earl “Jimmy” Carter Jr. was the first U.S. President to be born in a hospital, as home births were more common at the time. His early childhood was fairly humble. His father, Earl, was a peanut farmer and businessman who enlisted young Jimmy’s help in packing goods to be sold in town, while his mother was a trained nurse who provided healthcare services to impoverished Black families. As a student, Carter excelled at school, encouraged by his parents to be hardworking and enterprising. Aside from helping his father, he also sought work with the Sumter County Library Board, where he helped set up the bookmobile, a traveling library to service the rural areas of the county. After graduating high school in 1941, Carter attended the Georgia Institute of Technology for a year before entering the U.S. Naval Academy. He met his future wife, Rosalynn Smith, during his last year at the Academy, and the two were married in 1946. After graduating from the Academy that same year, Carter joined the U.S. Navy’s submarine service, a dangerous assignment at the time. He even worked with Captain Hyman Rickover, the “father of the nuclear Navy,” and studied nuclear engineering as part of the Navy’s efforts to build its first nuclear submarines. Carter would have served aboard the U.S.S. Seawolf, one of the first two such vessels, but the death of his father in 1953 prompted him to resign so that he could return to Georgia and take over the struggling family farm.
On returning to his home state, Carter and his family moved into a public housing project in Plains due to a post-war housing shortage. This experience inspired him to work with Habitat for Humanity decades later, and it also made him the first president to have lived in public housing. While turning around the fortunes of the family’s peanut farm, Carter became involved in politics, earning a seat on the Sumter County Board of Education in 1955. In 1962, he ran for a seat in the Georgia State Senate, where he made a name for himself by targeting wasteful spending and laws meant to disenfranchise Black voters. Although he lost the 1966 Democratic gubernatorial primary (largely due to his support of the civil rights movement), he refocused his efforts on the 1970 gubernatorial election. After a successful campaign, he surprised many in Georgia by advocating for integration and appointing more Black staff members than previous administrations. Though his idealism attracted criticism, Carter was largely popular in the state for his work in reducing government bureaucracy and increasing funding for schools.
Jimmy Carter’s political ambitions eventually led him to the White House when he took office in 1977. His Presidency took place during a chaotic time, in which the Iranian hostage crisis, a war in Afghanistan, and economic worries were just some of the problems he was tasked with helping to solve. After losing the 1980 Presidential race to Ronald Reagan, Carter and his wife moved back into their modest, ranch-style home in Georgia, where they lived for more than 60 years, making him one of just a few presidents to return to their pre-presidential residences. Today, Carter is almost as well-known for his work after his presidency as for his time in office, since he has dedicated much of his life to charity work, especially building homes with Habitat for Humanity. He also wrote over 30 books, including three that he recorded as audiobooks, which won him three Grammy Awards in the Spoken Word Album category. Not too shabby for a humble peanut farmer.
[Image description: Jimmy Carter’s official Presidential portrait; he wears a dark blue suit with a light blue shirt and striped tie.] Credit & copyright: Department of Defense. Department of the Navy. Naval Photographic Center. Wikimedia Commons. This work is in the public domain in the United States because it is a work prepared by an officer or employee of the United States Government as part of that person’s official duties under the terms of Title 17, Chapter 1, Section 105 of the US Code.