Curio Cabinet / Person, Place, or Thing
-
World History / PP&T Curio
Happy Saint Patrick’s Day! Just who was this Saint Patrick guy, anyway? Like all saints who went on to become holiday mascots (think Saint Valentine and Saint Nicholas), the real Saint Patrick’s life is steeped in legend. In fact, almost everything we know about his life comes from two works that Patrick wrote himself: his autobiography, Confessio, and a letter condemning the mistreatment of Irish Christians at the hands of British soldiers. While some of Patrick’s stories might best be taken with a grain of salt, there’s no doubt that he became an extremely successful priest and missionary in his lifetime, and that he faced plenty of tribulations along the way.
The story of Saint Patrick gets strange right off the bat since, despite his fame as the patron saint of Ireland, he wasn’t actually Irish. Rather, he was born in Britain, most likely in the late fourth or early fifth century C.E., to a family of Roman descent. His father was a wealthy deacon and local politician, but even his status wasn’t enough to protect a 16-year-old Patrick from being kidnapped by Irish raiders who broke into his family’s estate. The teen was carried off into slavery in Ireland, where he was forced to spend six years herding sheep. During his time in captivity, Patrick sought solace in his religion and became more devout as a result. According to Patrick’s own writings, he had a dream one night in which God told him that it was time to leave, so he fled his captors and returned to his family in Britain. After his return, another dream told him that he would one day return to Ireland as a missionary. Whatever his true motivation, Patrick did begin 15 years of religious training, at the end of which he was ordained a priest. Amazingly, he did indeed choose to return to the land where he had been enslaved to do the bulk of his religious work.
Although some legends claim that Saint Patrick introduced Christianity to Ireland, that’s almost certainly not true, since part of his job as a missionary and priest was ministering to Ireland’s already-Christian population. Unlike most foreign priests, Patrick was familiar with Irish traditions and rituals from his years of captivity there, which endeared him to Irish Christians. It also allowed him to better relate to the non-Christians he was trying to convert. Patrick put a Christian spin on Irish pagan rituals; bonfires once lit to honor the Celtic gods, for instance, were lit to celebrate Easter instead. He is also credited with redesigning the typical Christian cross by adding a circle representing the sun—a prominent Celtic symbol—to make reverence of the new symbol feel more familiar. This design came to be known as the Celtic cross, and it’s still in use today in regions with Celtic heritage. His influence and reputation in Ireland only grew after his death, and he was hailed as a saint by popular acclaim alone, before the Catholic Church had a formal canonization process.
As with any Catholic saint, Patrick was credited with performing a number of epic feats and miracles. The most famous of these is his banishment of snakes from the island, though this seems unlikely, since scientific evidence suggests that snakes never inhabited post-glacial Ireland in the first place. Patrick is also credited with using a three-leafed clover, or shamrock, to explain the concept of the Holy Trinity to the Irish, though this was never mentioned in his own writings. Another story tells of Patrick fasting on a mountain for 40 days, until an angel came down to speak with him on behalf of God. The story goes that Patrick then made several demands of God, such as allowing him to save more damned souls than any other saint, preventing the English from ever ruling over the Irish, and giving him the privilege of judging Irish souls during the Last Judgment.
While St. Patrick is still heavily associated with Irish culture, his feast day on March 17 is celebrated in many countries today. For many, St. Patrick’s Day is a fairly secular holiday in which revelers don green clothes and drink plenty of beer. This is particularly true in the U.S., where the holiday was first promoted by Irish immigrants in Boston in the 18th century. One of the earliest recorded St. Patrick’s Day celebrations in the colonies was held in Boston in 1737, and the parade tradition has since spread to cities across the country. No need to be green with envy for the Emerald Isle—everyone has the luck of the Irish on St. Patrick’s Day.
[Image description: A black-and-white engraving of Saint Patrick reading a Bible and holding a staff while wearing a robe and tall hat.] Credit & copyright: Mattheus Borrekens, 1625–1670. Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication. -
Fitness / PP&T Curio
We’re three months into 2024! Have you stuck to the fitness goals you set back in January? If so, you’re probably intimately familiar with one of the world’s most popular fitness machines: the treadmill. While they’re touted for their health benefits today, treadmills have a surprisingly dark history. In fact, they weren’t invented for fitness at all, but for punishment.
Treadmill-like machines may have been invented in ancient Asia, though historians aren’t entirely sure, and the earliest iterations in the West were used to pump water or grind grain. The version that would become infamous, called a treadwheel, was the brainchild of William Cubitt, an English civil engineer from a family of millwrights. The design was fairly simple, with two wheels connected by cogs. Users would climb on top and walk, as if ascending a never-ending flight of stairs, while holding onto a bar for support. The machine could do useful industrial work, but when Cubitt introduced it in 1818, prisons quickly recognized its potential as an instrument of punishment.
Thus, treadwheels began to pop up at large correctional facilities, where they were given a new, dystopian name: atonement machines. The grueling labor was touted by prison officials as a way for prisoners to “work off their sins.” Of course, in reality the devices were less about atonement and more about keeping prisoners too occupied and exhausted to stir up trouble. These correctional contraptions were modified for prison use as well, with partitions separating inmates so that they couldn’t pass the time by socializing. This would be considered torture by modern standards, as inmates were sometimes made to work on atonement machines for up to 10 hours a day. Unlike the first treadwheels, most atonement machines weren’t even made to do anything useful, like pumping water or grinding grain. Thankfully, the Sisyphean punishment fell out of favor in the late 1800s, as its efficacy as a rehabilitation tool was shown to be questionable at best and lethal at worst. By the turn of the century, little more than a dozen functioning atonement machines were left in English prisons.
The treadmill saw a similar rise and decline in popularity as a correctional tool in the U.S., but some enterprising Americans also thought to repurpose the torture device as a fitness machine. In 1913, Claude Lauraine Hagen filed a U.S. patent for a “training-machine,” an early treadmill meant for exercise amid growing medical concern that sedentary living contributed to heart disease. In a similar vein, a cardiologist named Robert Bruce came up with the “Bruce Protocol” in the early 1960s, evaluating patients’ cardiac health by having them walk on a treadmill while connected to an electrocardiograph. It wasn’t until later that decade, when William Staub invented the “PaceMaster 600,” that the treadmill really caught on as a machine for fitness and recreation. Staub’s iteration of the treadmill came at a time when Americans were becoming more health-conscious and concerned with maintaining their physiques. With the PaceMaster 600, the average person could run in any weather and sweat off extra pounds. Staub was seemingly on to something, as he reportedly used a treadmill every day until he died at the ripe old age of 96. Nowadays, treadmills are a ubiquitous fixture in home gyms and fitness centers around the world…though some may still consider them a bit torturous.
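For the curious, here is a minimal Python sketch of how a staged test like the Bruce Protocol is structured, using the commonly published stage values: each stage lasts three minutes, with both speed and incline increasing so that cardiac stress ramps up in a controlled, repeatable way. Treat the numbers as illustrative rather than clinical guidance.
```python
# A minimal sketch of the Bruce Protocol's staged ramp-up, using the
# commonly published stage values (illustrative, not clinical guidance).
BRUCE_STAGES = [
    # (stage, speed in mph, incline grade in percent); each lasts 3 minutes
    (1, 1.7, 10),
    (2, 2.5, 12),
    (3, 3.4, 14),
    (4, 4.2, 16),
    (5, 5.0, 18),
    (6, 5.5, 20),
    (7, 6.0, 22),
]

def stage_at(elapsed_minutes: float):
    """Return the stage in effect at a given elapsed time on the test."""
    index = min(int(elapsed_minutes // 3), len(BRUCE_STAGES) - 1)
    return BRUCE_STAGES[index]

# Minute 7 falls in the third 3-minute window: stage 3, 3.4 mph at 14% grade.
print(stage_at(7))  # -> (3, 3.4, 14)
```
The appeal of the design is that how long a patient lasts on the ramp maps directly to cardiac fitness, which is why the treadmill made such a convenient diagnostic platform.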
[Image description: A Victorian-era illustration of prisoners walking on a treadmill while other people, wearing hats and coats, stand near a basket of food in the foreground.] Credit & copyright: British Library, c. 1817. Wikimedia Commons, Public Domain. -
U.S. History / PP&T Curio
There are jailbreaks, and then there are jailbreaks. On this day in 1934, one of the most notorious criminals in U.S. history broke out of jail using only a self-made, fake gun. John Dillinger was a troublemaker long before this infamous feat, but he became something of a legend afterward. In his time, he was even regarded as a folk hero, albeit a violent one.
John Herbert Dillinger was born in Indianapolis, Indiana, in 1903. He had a difficult family life that resulted in a tumultuous childhood. His mother died when he was only three, and he did not get along with his stepmother. In school, he frequently got into trouble and eventually dropped out at the age of 16. Hoping to reform their son’s ways by moving to a more rural area, his family relocated to a farm outside of Indianapolis, but to no avail. Despite his father’s efforts to distance him from city life, Dillinger still ventured into Indianapolis to work at a machine shop during the day, and drink in bars until all hours of the night. At age 20, Dillinger did take one serious crack at the “straight and narrow” life by joining the Navy…but after just a few months at his first station, he went AWOL and married 16-year-old Beryl Hovious.
Dillinger began his criminal career not long after getting married, though it got off to a rough start. In 1924, Dillinger and his friend, Edgar Singleton, assaulted and robbed a local grocer. Dillinger used an iron bolt wrapped in cloth as his main weapon, but the grocer suffered only minor injuries. Dillinger was promptly identified, arrested, and sentenced to a whopping 10 to 20 years in prison. His wife divorced him while he was serving his sentence, and it seemed that Dillinger was a man whose violent actions had cost him everything. Far from learning his lesson, however, Dillinger made friends with other criminals while incarcerated and learned more about pulling successful heists. After making parole in 1933, Dillinger picked up right where he’d left off. Using what he’d learned from ex-military inmates who robbed banks with tactical precision, he began a crime spree for the ages. His first step was to break out some of his incarcerated friends: Harry Pierpont, Charles Makley, John Hamilton, Walter Dietrich, and Russell Clark. They, along with Homer Van Meter, formed the Dillinger Gang and stole weapons by breaking into police stations. Their modus operandi was unusually meticulous; members of the gang performed reconnaissance before each robbery. They also posed as government officials to get an inside look at the daily operations of their targets, and even “rehearsed” by driving over their escape routes several times in advance.
During one of the gang’s robberies, Dillinger killed a police officer named O’Malley. The additional attention from the killing eventually caught up with him, and Dillinger was captured in January of 1934. While awaiting trial in Crown Point, Indiana, Dillinger carved a fake gun out of a wooden washboard and colored it with bootblack. Using the “gun,” he successfully escaped on March 3 of that year by taking a guard hostage (although his lawyer supposedly bribed some guards to aid in the escape) and fled to Chicago, where he underwent plastic surgery to hide his identity. In June, Dillinger was declared Public Enemy Number One by the federal government. About a month later, he met a fittingly violent end when he died in a hail of gunfire in a shootout with the FBI.
Oddly, in life Dillinger was considered a folk hero by some Americans, who celebrated him for holding up the “crooked banks” that they blamed for the country’s economic woes. Today, Dillinger is among the most famous of the bank robbers who plagued the U.S. in the early 1900s. Whether he was just a violent criminal or a champion of the people, one thing’s for sure: Dillinger knew how to whittle a convincing gun.
[Image description: Black-and-white mugshots of John Dillinger wearing a suit.] Credit & copyright: John Dillinger’s 1924 mugshot from the Indiana State Penitentiary. Indiana State Penitentiary photographic records. Wikimedia Commons. This work is in the public domain in the United States because it was published (or registered with the U.S. Copyright Office) before January 1, 1929. -
World History / PP&T Curio
It may look tame, but trust us—you don’t want to take the plunge. The Bolton Strid, a narrow portion of the River Wharfe running through the Bolton Woods in Yorkshire, England, is considered by many to be the world’s deadliest stream. Despite looking relatively calm and easy to jump over, the Strid has claimed many lives, giving it an almost otherworldly reputation.
The Bolton Strid gets the first part of its name from the nearby 12th-century monastery, Bolton Priory. “Strid,” on the other hand, means “turmoil” in Old English. The stream’s waters are not as gentle as they appear at first glance, and the Bolton Strid’s dangerous reputation has been known to locals for centuries. In the early 1800s, British poet William Wordsworth even wrote a poem called The Force of Prayer about the Bolton Strid and an unlucky boy who fell in while trying to jump across it. The poem reads, in part, “The Boy is in the arms of Wharf, / And strangled by a merciless force; / For never more was young Romilly seen / Till he rose a lifeless corpse.” Over the centuries, countless people have similarly fallen victim to the Bolton Strid, many while trying to jump across it. More recently, in 1998, a couple drowned there on the second day of their honeymoon after heavy rains caused the water level to rise by five feet in less than a minute.
So, what exactly makes the Bolton Strid so dangerous? It’s a combination of physics and psychology. The River Wharfe, which is around 30 feet across for most of its length, narrows significantly at the Bolton Strid, until it’s only a few feet across. Below the seemingly calm surface, the sudden narrowness forces the water along much faster than it would normally flow. Over centuries, the turbulent water has allowed the Bolton Strid to gouge deep into the surrounding stone—perhaps as deep as 30 feet, though the exact depth is still unknown. Some researchers have described the Bolton Strid as a full-sized river that is simply “turned on its side,” so that all of its danger is hidden below the surface. These dangers include not only the rushing current but also sharp rocks that aren’t visible from above. There are only a few small clues that the stream isn’t what it seems: swirls of water on its surface (from hidden undercurrents), bubbles rising from the bottom (a sign of aerated, low-density water that makes it easier for objects to sink), and an inability to see the bottom of the stream, even on a clear, sunny day. Ultimately, the fact that most people miss these subtle signs is what truly makes the Bolton Strid so dangerous. People are simply more willing to approach a calm-looking stream than an obviously raging river. Hence the many stories of people trying to jump across the waterway. In fact, people still jump across all the time, due in part to the stream’s now-famous reputation. For teens, leaping over the Bolton Strid is a surefire, albeit dangerous, way to “prove their love” to a romantic partner.
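To see why that “squeezing” matters, here is a minimal sketch of the continuity equation, which says that in steady flow the same volume of water must pass every cross-section each second, so a smaller cross-section means faster water. The figures below are purely hypothetical, since the Strid’s true depth and discharge have never been reliably measured.
```python
# A minimal sketch of the continuity equation for steady channel flow:
# discharge Q = area * velocity is conserved, so v2 = v1 * (A1 / A2).
def speed_after_narrowing(v1, area1, area2):
    """Estimate flow speed after a river's cross-section shrinks,
    assuming the same volume of water passes through per second."""
    return v1 * (area1 / area2)

# Hypothetical figures for illustration only -- the Strid's real geometry
# is unknown: a 30 ft x 4 ft channel squeezing into a 6 ft x 5 ft gap.
v2 = speed_after_narrowing(v1=2.0, area1=30 * 4, area2=6 * 5)
print(f"A 2 mph current would accelerate to about {v2:.0f} mph")  # ~8 mph
```
In reality, the Strid is also far deeper than it is wide, which is exactly why so much of that accelerated, turbulent flow stays hidden beneath a deceptively placid surface.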
It is often claimed that no one who has fallen into the Bolton Strid has made it out alive, and many victims never make it out at all. No one is sure where their bodies go when they don’t emerge, but some experts believe that objects in the Bolton Strid are sucked into underwater caverns in the stony riverbed. Sure, you should never judge a book by its cover, but it also pays not to judge a river by its width.
[Image description: A black-and-white photo of The Strid taken in 1898.] Credit & copyright: Bolton Woods; The Strid. 1898. Rijksmuseum, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication. -
World History / PP&T Curio
If you were paying attention this past Valentine’s Day, chances are good that you saw at least one depiction of Cupid. He’s everywhere this time of year—on boxes of chocolates, bouquet wrappings, and of course cards. But Cupid wasn’t always the friendly little winged baby that he’s often depicted as today; there was a time when his heart-tipped arrows inspired terror and were part of dark tales involving madness and death. That’s because Cupid was originally Eros, the Greek god of carnal love.
Cupid’s mythological backstory went through plenty of changes over the centuries. One of the earliest mentions of Eros describes him as a primordial cosmological entity who emerged from the world egg. Another version of his birth describes him as the son of Nyx (the goddess of night) and Erebus (the god of darkness). He is sometimes portrayed as a sibling of Deimos, Phobos, and Harmonia (the gods of fear, panic, and harmony, respectively). In other portrayals, he is the elder brother of Anteros, the god of requited love and avenger of unrequited love. However, the most popular tales of Eros portray Aphrodite, the goddess of sexual love and beauty, as his mother. In these stories, Eros is usually portrayed as a powerful, handsome young man with wings. His enchanted bow and arrows could impart an everlasting, irresistible carnal desire for another being, and they could affect both gods and mortals. His power wasn’t always used for good. Aphrodite, being a jealous goddess, often ordered her son to inflict his arrows on mortal women she envied, sometimes making them fall in love with animals. Yet, for a god who could so easily make others fall in love, Eros himself had quite the time trying to form a romantic connection of his own.
The tale of Eros and Psyche tells the story of Eros’s own quest for love. Most versions of the tale state that Aphrodite grew jealous of Psyche, a beautiful, mortal princess, and sent Eros to punish her with his arrows. Meanwhile, concerned that suitors never came to call on his daughter, Psyche’s father sought advice from the Oracle of Delphi, who told him to leave her at the top of a mountain, where she would meet her husband. This husband was said to be a monster, but Psyche went anyway to make her father happy. When she reached the summit, she was taken to a grand palace where she lived alone, only meeting her husband under the dark cover of night. When her sisters came to visit her, they—in their jealousy—convinced her to shine a light on her husband when he came to visit. One night, Psyche did just that, and revealed a most beautiful face—that of Eros himself. Eros, as it happened, had nicked himself with his own arrow while taking aim at Psyche. But the god was angered by the betrayal and flew away, leaving Psyche alone once more. Some versions say that Psyche remained alone forever, but in other versions she and Eros reconciled and Psyche went to live with Eros on Olympus.
Things changed for Eros around 146 BCE, when the Romans took over Greece’s city-states. Although they kept much of Greece’s mythology intact, they renamed the Greek gods and, in some cases, altered their stories and appearances. The Romans changed Eros’s name to Cupid, a word that resembles the Latin verb “cupere,” meaning “to desire.” Unlike Eros, Cupid was portrayed as a mischievous young boy. This is probably because many of Eros’s stories saw him carrying out the commands of his mother, which the Romans viewed as childlike. As Christian art grew popular in Italy, Cupid came to be depicted in an even younger form, taking on the baby-like appearance of the winged cherubs that were popular in Christian art. Eventually, the only things that set Cupid apart from other cherubs were his quiver and arrows. Don’t be fooled by their modern, cartoonish appearance, though. They pack a wallop.
[Image description: A round plate with Eros depicted on it, standing nude with wings and arms outstretched.] Credit & copyright: Louvre Museum, Wikimedia Commons. 470–450 BCE. This picture was shot by Marie-Lan Nguyen (user:Jastrow) and placed in the Public Domain. -
U.S. History / PP&T Curio
In honor of Black History Month, we’re celebrating one of the greatest American writers of all time: Langston Hughes. A central figure of the Harlem Renaissance, Hughes is best remembered for his succinct, gripping poetry, but he also wrote novels, plays, and essays. His unique ability to combine themes of beauty and hope with truths about poverty and prejudice made his work an inspiration to millions.
James Mercer Langston Hughes was born in Joplin, Missouri, on February 1, 1901. His childhood was complicated and impacted by racism from the start. His father, seeking to escape the violent, anti-Black racism of the U.S., divorced Hughes’ mother and left for Mexico when Hughes was an infant. His mother, unable to find steady work due to persistent prejudice, left him to be raised by his maternal grandmother until he was 13. An activist in her youth, Hughes’ grandmother instilled in him a sense of racial pride and responsibility toward the Black community. By the time he was a teenager, Hughes had already developed a passion for literature and had begun writing poetry, thanks in large part to his grandmother.
After high school, he briefly went to live with his father in Mexico and tried, unsuccessfully, to convince him to fund his education at Columbia University, where he wanted to study writing. It was during this period that the young Hughes wrote one of his most famous poems: The Negro Speaks of Rivers. A part of the poem reads, “I’ve known rivers ancient as the world and older than the flow of human blood in human veins/My soul has grown deep like the rivers.” The free-verse poem, focused on the strength and beauty of Black heritage, was published in the NAACP’s magazine, The Crisis, and caught the attention of critics. Although Hughes did attend Columbia University for a year, he paused his studies to travel around Europe and Africa to escape the racism he encountered at school and in the U.S. in general. After returning to the U.S., he finished his education at Lincoln University in Chester County, Pennsylvania, before making what was possibly the most impactful move of his life: he settled down in New York, in the historically Black neighborhood of Harlem.
Despite his first poem’s success, most of Hughes’ early work was poorly received by Black critics and almost universally ignored by white critics. Black critics disliked Hughes because he wrote about the lives of Black people in Harlem in a stark, forthright light, which they found unflattering. For example, the title of his second book of poetry, Fine Clothes to the Jew, referred to the then-common practice of Black people pawning their expensive clothes to mostly Jewish-owned businesses during times of financial hardship. As Hughes once wrote of other Black writers in his autobiography, “In anything that white people were likely to read, they wanted to put their best foot forward, their politely polished and cultural foot—and only that foot.” Hughes’ lack of critical acclaim didn’t seem to bother him, though, as he remained most interested in the lives of everyday Black people. Hughes was well-regarded by the working class, who found his writing relatable. Within a few years, he grew popular enough to become the first Black American to make a living from writing alone. However, this success also made him a target of the U.S. government. His interest in communism, open criticism of capitalism, and unwavering support of civil rights for Black Americans earned him the ire of J. Edgar Hoover, and the government kept an extensive file on Hughes for many years.
Hughes did eventually win over the literary critics. In fact, he became one of the most significant figures in the Harlem Renaissance, an explosion of Black art and literature based in Harlem, which began in the 1920s. For his part, Hughes relentlessly portrayed the struggles of Black Americans and relayed their experiences to the world with dignity and solemnity. He became friends with other leading Black literary figures, like author Gwendolyn Brooks. Throughout the rest of his life, Hughes continued to travel the world as much as he could, and was as beloved on the international stage as he was scorned by the establishment in his home country. He may have lived in Harlem, but all the world was his neighborhood.
[Image description: A black-and-white photo of Langston Hughes smiling while wearing a suit.] Credit & copyright: Jack Delano (1914–1997), Wikimedia Commons. This image is a work of an employee of the United States Farm Security Administration or Office of War Information domestic photographic units, taken as part of that person’s official duties. As a work of the U.S. federal government, the image is in the public domain in the United States. -
U.S. History / PP&T Curio
Revolutions don’t come cheap, you know! On this day in 1787, a little-known American rebellion was defeated less than a year after it began. Its cause? Economic uncertainty following the Revolutionary War. With credit hard to come by and creditors making difficult demands, many people struggled to pay their bills once the war was over. Some lost their land, while others were thrown behind bars, and all the while, social unrest began to spread. When discontent turned to rebellion, the dissidents joined forces and were led by one man: Daniel Shays. Shays’ Rebellion, though brief, tested the stability of a newly-independent America.
During the war, businesses in America and Europe lent massive sums of money to the Continental Army. Once the war ended, they were hesitant to lend more money when they were still owed so much, and most demanded cash up front for goods and services rendered. Meanwhile, many farmers who had fought in the war struggled to make ends meet, as they had borrowed large amounts to start their farms and provide for their families. Ironically, the reason many people were in debt was that they themselves were owed money for their service in state militias and the Continental Army, which had often issued IOUs in lieu of pay. To help alleviate the situation, state governments began to pass pro-debtor laws that forgave loans or printed more money. One exception was Massachusetts, whose governor, James Bowdoin, refused to pass such laws. This allowed businesses in Boston to seize land from farmers and throw debtors into prison. With so many people unable to pay their debts, the situation quickly became untenable. Beginning in 1786, disgruntled farmers and other debtors began to hold special meetings, creating an atmosphere of rebellion akin to that of the early days of the American Revolution. Among the dissidents was Daniel Shays, a farmer who had served as a captain in the Continental Army.
Shays had fought at Bunker Hill and had a decorated military career, but in the rebellion that would eventually bear his name, he was a reluctant leader. He feared that an uprising could threaten the new democracy he’d recently fought to form. Still, he was an active participant in debtors’ meetings and protests from the rebellion’s start. Shays was present during a protest in Northampton when rebels blocked judges from entering a courthouse where debtors’ trials were to be held. Not long after, he led a group of 600 men to a courthouse in Springfield for a similar purpose. There, Shays calmly negotiated with General William Shepard, agreeing to let the trials proceed as long as he and his followers were allowed to protest outside; his composure helped make him the rebellion’s de facto leader. The arrangement worked in Shays’ favor, because even though judges could freely enter the courthouse, the court couldn’t find enough people willing to serve as jurors due to the public’s growing sympathies. With Shays seemingly at the helm, the dissidents began to be referred to as “Shaysites,” though their growing rebellion wouldn’t last long.
The Shaysites had divided a fledgling nation. Some viewed their cause as a continuation of the American Revolution, while others, like Sam Adams, considered them traitors and called for their executions. Perhaps taking note of the parallels, George Washington once wrote to a friend, “commotions of this sort, like snow-balls, gather strength as they roll, if there is no opposition in the way to divide and crumble them.” The snowball met its match in January of 1787 at the Springfield Arsenal, a federal armory where Shays and his men were headed to procure weapons. Waiting for them was General Shepard, who fired a volley of warning shots, followed by artillery. Four Shaysites died and another twenty were wounded. The rest scattered. On February 4, the rebellion officially ended after troops led by former Continental Army General Benjamin Lincoln ambushed Shays and his remaining rebels as they made camp. Shays, along with most of the rebellion’s other leaders, fled to Vermont.
Luckily for Shays, John Hancock soon took office as Governor of Massachusetts, and issued pardons for him and his rebels. Still, Shays’ actions proved to many that a more decisive, centralized government was needed in the U.S. Following the rebellion, George Washington, who had led the nation through the war, was elected its first president. The U.S. also abandoned the Articles of Confederation and adopted the U.S. Constitution, written by Federalists like Alexander Hamilton, who favored a stronger federal government and weaker state governments. The ups and downs of the American Revolution are enough to make your head spin.
[Image description: A stone monument in a field with two small American flags, commemorating the last day of Shays’ Rebellion.] Credit & copyright: John Bessa, Wikimedia Commons. The author of this work has released it into the public domain. -
FREEAerobics PP&T CurioFree1 CQ
The world’s full of fitness gurus, but most are known for abs of steel, not hearts of gold! That’s not the case for Richard Simmons, who made a career out of encouraging people to exercise in a compassionate way. The retired aerobics instructor, who is the subject of an upcoming biopic starring Pauly Shore, was one of the first well-known fitness personalities to encourage playful—rather than painful—at-home workouts. He also made a point of including plus-sized people in his dozens of straight-to-video workout tapes and DVDs.
Richard Simmons was born on July 12, 1948, in New Orleans, Louisiana, where he spent his early life. He struggled with compulsive eating from a young age, and was overweight by the time he was just four years old. At school, he was bullied by other students for his weight, which led him to comfort himself with food, furthering the problem. By the time he graduated high school, he weighed 268 pounds. A few years later, while studying and working in Italy as a fashion illustrator, someone left an anonymous note on Simmons’ car expressing concern for his health. It was this note that prompted him to move to California, determined to lose weight.
Unfortunately, weight loss can bring on problems of its own. After settling down in Los Angeles, Simmons developed an eating disorder, turning to unhealthy methods to lose weight. He lost 137 pounds within a short span of time and ended up in the hospital. Determined to find a more sustainable, healthy way to maintain his weight, Simmons sought help from local fitness experts. However, he found that most trainers were too tough on their clients and focused on helping people who were already in shape. With the goal of creating a space for the “average” person, Simmons saved up $25,000 over two years and opened his own fitness studio, the Anatomy Asylum, which he eventually renamed Slimmons.
Simmons quickly found success, first as a fitness guru giving seminars and then as a TV personality. The Richard Simmons Show, which ran from 1980 to 1984, featured fitness advice and healthy cooking segments, but it wasn’t until he started releasing videotapes of aerobic exercise routines that he became a household name. Sweatin’ to the Oldies, released in 1988, was an instant success. It featured Simmons accompanied by plus-sized participants, and was designed to be non-judgmental and fun. Over the years, he released 65 videos that sold 20 million copies, forming the foundation of his multimillion-dollar fitness empire. But fitness and weight loss weren’t his only concerns. To address the kind of trauma and confidence issues he had struggled with himself, Simmons also released videos like Love Yourself and Win, which focused on building self-confidence and finding motivation.
Simmons never shied away from the spotlight throughout his career, which is why many people were surprised when he disappeared from public life in 2014. His sudden withdrawal has been the subject of much speculation, some of it bordering on conspiracy theory, but Simmons maintains that he’s not in hiding, just retired. His retirement may be related to a botched knee replacement that made it difficult for him to exercise vigorously. Regardless, he remains engaged with his fans on social media, even commenting on the fact that he never authorized Pauly Shore’s recent biopic. Simmons is still beloved for being one of the first fitness gurus to lead with positivity and inclusiveness. He may have been “sweatin’ to the oldies,” but he was ahead of his time.
[Image description: Colorful aerobics equipment, including yellow yoga balls and green exercise mats.] Credit & copyright: ArsAdAstra, Pixabay -
FREEWorld History PP&T CurioFree1 CQ
Don’t lose your head! During the French Revolution, that advice was particularly prudent for France’s royalty and aristocracy, many of whom were executed by guillotine during the upheaval. Even the French king, Louis XVI, was killed on this day in 1793. The revolution is still remembered as one of the most violent in history, yet at its heart it was a matter of economics: specifically, economic inequality between social classes.
During the latter half of the 18th century, France was struggling financially for several reasons. Chief among these was the country’s involvement in another revolution—the American Revolution—which had proven more expensive than the crown had bargained for. Compounding the issue were years of bad weather and failed harvests that caused food prices to rise. Even bread was unaffordable for many people. Yet French royals and aristocrats continued to publicly flaunt their wealth even as the lower classes starved.
The King of France, Louis XVI, wasn’t completely deaf to growing discontent among the masses, though. In 1787, he tasked his Controller-General, Charles Alexandre de Calonne, with carrying out financial reforms that would remove land tax exemptions from the aristocrats. The hope was that, if the wealthy paid their fair share, more wealth would be available to everyone. When the Estates General (an assembly of three bodies representing the clergy, the nobility, and the commoners) assembled at Versailles, the proposal was met with universal support from the commoners, who made up 98 percent of the country’s population, but most nobles and clergy were against it. Thus, the representatives of the commoners and sympathetic nobles held a protest until, on July 9, 1789, the assembly was reconstituted as the National Constituent Assembly, a single legislative body with more proportional representation.
However, rumors of an impending military coup had already begun to spread, and the resulting panic led people to storm the Bastille fortress to acquire gunpowder and other supplies on July 14, 1789. This marked the beginning of the French Revolution in earnest. What followed were years of political instability and upheaval. The king’s authority was greatly lessened by the National Constituent Assembly, and he became a de facto prisoner of the revolutionaries. Louis XVI even tried to flee the country in 1791, hoping to secure allies abroad, but he was caught and brought back to Paris. Meanwhile, the feudal system was abolished and the Catholic Church’s landholdings were nationalized. The land was distributed to farmers and the middle class, creating tensions across the board.
To restore his authority, Louis XVI declared war on Austria, which was soon joined by Prussia. Together, the two armies crushed French forces and began marching toward Paris. Many French people believed the approaching armies had come to crush the revolution on behalf of counterrevolutionaries. French forces were eventually able to turn back the invasion, but the damage was done. Believing they had been betrayed by the aristocrats and their own King, the people began executing anyone they deemed to be an aristocrat or a member of the royal family. Louis XVI was tried and found guilty of high treason, then sentenced to death. The Reign of Terror had begun.
The Reign of Terror, which lasted 10 months, saw a number of violent changes to the regime. Maximilien de Robespierre, the radical revolutionary who led the Committee of Public Safety, ordered nearly 17,000 executions before being executed himself in 1794. That, in turn, marked the beginning of the Thermidorian Reaction, which sought to counter the violence of the preceding year with more moderate measures. But public discontent over political corruption and inefficiency eventually brought that era to an end as well, when Napoleon Bonaparte staged a bloodless coup against the unpopular government. In the end, the French had revolted against a king, started a representative government, then ended up under the rule of Napoleon, who would crown himself emperor.
It wasn’t all for nothing, though. The French Revolution gave birth to radical ideas of governance and led to a rise in French nationalism. Once divided by regional identities and cultures, the French people began to identify as one unified group, a trend that would be followed by other European nations. France is, of course, a democracy today, and the storming of the Bastille is celebrated as a national holiday. It’s nice to live in a time when political reforms don’t tend to lead to decapitation.
[Image description: An engraving portraying Louis XVI in white clothing standing on an execution stage surrounded by mounted soldiers. Executioners and a priest, wearing a black robe, stand near him.] Credit & copyright: 19th century engraving, Wikimedia Commons. This work is in the public domain in its country of origin and other countries and areas where the copyright term is the author's life plus 100 years or fewer. -
FREEUS History PP&T CurioFree1 CQ
Who said that small towns weren’t interesting? New Harmony, a town of fewer than 700 residents in southern Indiana, might seem like a low-profile place. Yet, it has one of the strangest histories of any town in the U.S. That’s because it was founded by the Harmony Society, an unusual religious group that sought to make the town into an egalitarian utopia. They even built the country’s first public library and two labyrinths in the process.
The story of New Harmony began in faraway Germany, where Johann Georg Rapp and his followers broke away from the German Lutheran Church in the late 1700s to form the Harmony Society. Rapp, who publicly declared, “I am a prophet, and I am called to be one,” was forbidden from gathering with his followers, called Rappites. After convincing his congregation to emigrate to the U.S., Rapp took a handful of his 10,000 or so followers and settled down in Butler County, Pennsylvania, to fulfill their own vision of utopia.
They named their town Harmony and generated income through farming and manufacturing, but decided to relocate to a climate more favorable for growing grapes for wine. So, in 1814, they left Pennsylvania for Indiana and founded a new town they named Harmonie (sometimes called Neu Harmony) along the banks of the Wabash River. By this point, there were around 700 members in the Harmony Society, but 120 of them died of malaria while settling into their new home. Nevertheless, this second settlement named Harmony developed a thriving economy, producing dry goods, wine, whiskey, and beer to trade with surrounding communities. At their peak, the Harmonists had 150 log homes, 20,000 acres of land, and a variety of retail buildings. However, the Harmony Society decided to relocate once again in 1824—this time in search of land more suitable for manufacturing and commercial purposes. So they sold the settlement to a pair of business partners, Robert Owen and William Maclure, for $150,000.
Owen and Maclure renamed the town New Harmony, and they planned to start a utopia of their own. Despite their own immense wealth, they envisioned a society without social classes. They supported a variety of social causes and promoted free education, which led to the creation of America’s first public library as well as a public education system that accepted both men and women. In just a few years, Owen and Maclure attracted some of the most highly regarded academics, feminists, and naturalists, turning the small town into a haven of progressive thought and scientific research.
Unfortunately for them, the dream was not to last. By 1827, just three years after the purchase, the town had become economically unsustainable under Owen and Maclure’s leadership, and the utopian society dissolved into a more conventional town. Meanwhile, the Harmony Society would face problems of their own. Returning east to Pennsylvania, they founded the town of Economy. They prospered for decades, but their community had one critical flaw: they were celibate. Without a new generation to carry on their beliefs or new converts, the Society began to dwindle. Eventually, infighting and schisms broke apart what little was left of them, and the community officially dissolved in 1905.
While the idealistic visions that shaped New Harmony’s origins may have faded, the small town is still an outlier of sorts. Despite its tiny population, New Harmony still attracts artists from around the country, and the town is dotted with sculptures and unique Harmonist architecture. Points of interest include the Roofless Church (a non-denominational, open-air church), the Atheneum (a visitors’ center designed by architect Richard Meier), and not one but two mysterious hedge labyrinths built by the original Rappites. Appropriate, considering the town’s meandering history.
[Image description: A painting of a Harmonist imagining of what New Harmony could look like. It shows a walled, gated city near a river, with a family in the foreground.] Credit & copyright: F. Bate, London 1838. Wikimedia Commons. This work is in the public domain in the United States because it was published (or registered with the U.S. Copyright Office) before January 1, 1929. -
FREEBiology PP&T CurioFree1 CQ
Sometimes, you just have to dig around a little to find what you’re looking for. That couldn’t be truer in the case of De Winton’s golden mole, which was thought to be extinct for almost 90 years. It took a dog with a keen nose, DNA testing, and three years of searching, but researchers from the Endangered Wildlife Trust (EWT) managed to locate the long-lost subterranean mammal last year. Golden moles are some of the strangest animals on the planet…and technically they’re not actually moles at all.
One of 21 species of golden moles that make up the Chrysochloridae family, De Winton’s golden moles are found only in South Africa. Other species of golden moles are commonly found throughout Sub-Saharan Africa, and despite their common name, they come in a variety of colors. In fact, “Chrysochloridae” means “green-gold,” after the color of their fur in the light. An oil secreted by golden moles gives their fur an iridescent copper sheen and lubricates it as they move around underground. These little animals are only about the size of a mouse, but they’re tough as nails. With their powerful claws, they can swim through sand dunes, protected by their thick skin. Since they live underground, they have little use for their eyes, which are vestigial and covered in skin. Oddly enough, though, Chrysochloridae aren’t true moles, which belong to the Talpidae family. The similarities between the two families are an example of convergent evolution, a phenomenon in which unrelated species evolve to fill the same environmental niche (like digging underground) and thus develop similar adaptations. Despite these incredible adaptations, though, De Winton’s golden moles were nearly wiped out. Prior to their rediscovery, the last time anyone saw one was in 1936, and they were thought to have gone extinct.
Their decades-long disappearing act was caused by alluvial diamond mining in South Africa, which destroyed much of their already limited habitat. However, not everyone was convinced that the creatures were gone, though it was difficult to verify their status. The different species of golden moles look very similar to one another, so researchers couldn’t rely on sightings alone. To search for the elusive digger, EWT researchers used a two-pronged approach, with the goal of acquiring photographic and DNA evidence of De Winton’s golden moles. First, they trained a border collie named Jessie to recognize the smells of the 20 other members of Chrysochloridae and to lie down if she smelled one. Then they took her to different sites where golden moles were known to live. When researchers found golden mole tracks and burrows but Jessie didn’t lie down, they took soil samples to test for the second piece of their plan: environmental DNA, or eDNA (skin, feces, mucus, and other genetic material left behind in the soil). The samples were collected in 2021, but it wasn’t until late 2023 that researchers were able to confirm the presence of De Winton’s golden moles in the areas they checked. Since then, two De Winton’s golden moles have been photographed, and eDNA analysis has confirmed the presence of four additional golden mole species previously unknown to science.
Unfortunately, De Winton’s golden moles are still at risk of being wiped out completely. Their habitats are still threatened by diamond mining, and they’re listed as critically endangered. The next step, EWT says, is to create protected areas where the golden moles are found and raise awareness of their plight. A senior field officer of EWT, Esther Matthew, said in a statement, “A lot of the conservation focus is on the more charismatic and big animals that people see often, while the rare ones that probably need more help are the ones that need more publicity.” If moles are good at anything, though, it’s avoiding getting trampled underfoot.
[Image description: A drawing of a brown-furred golden mole with no visible eyes.] Credit & copyright: Cornelis van Noorde (1731–1795), Wikimedia Commons, This work is in the public domain in the United States because it was published (or registered with the U.S. Copyright Office) before January 1, 1928. -
FREEWorld History PP&T CurioFree1 CQ
Happy New Year! While plenty of us will be swigging champagne as the ball drops tonight, in Spain people will be popping a dozen grapes into their mouths as fast as they can. This tradition, called “las doce uvas de la suerte,” or “the twelve grapes of luck,” has been practiced for at least a century. Everything about it, from the type of grapes used to the exact timing of their consumption, has a special meaning and connection to Spanish history.
The tradition itself is pretty simple. On the last day of the year, called “Nochevieja,” revelers watch the clock count down to midnight on the Real Casa de Correos, a famous eighteenth-century building in Madrid, either in person or on TV. When the clock hits midnight, bells start ringing to mark the New Year. That’s when the grapes come in. The bell chimes 12 times in all, and participants must scarf down a grape with each chime. The bell doesn’t wait for them to finish chewing, so they’ve got to be quick in order to eat the full dozen by the final chime. Since the whole thing is so time-sensitive, grapes have to be counted out for each person ahead of time, so they’re ready to go. To that end, people who are watching the countdown usually carry their grapes in individual bowls. If a person manages to eat all 12 grapes before the final chime, they will supposedly enjoy good luck for the rest of the year (each grape represents one month of the upcoming year). If not…well, the grapes are at least a delicious variety of green grapes (also known as white grapes) called Aledo. Aledo grapes mature late in the year (they’re harvested around November and December), making them perfect for New Year’s Eve festivities. They also have a particularly decadent flavor due to the unique way they’re grown. While still on the vine, the grapes are wrapped in paper bags during the summer months to protect them from pests and disease. Since the fruits aren’t exposed to the elements, their skin remains tender, and their slow maturation allows them to develop a rich, sweet flavor and aroma.
Despite being a much-loved and much-practiced tradition, the history of the twelve grapes of luck is somewhat murky. There are many supposed origin stories, though. One of the best-known says that farmers in Alicante, Spain, where most of the Aledo grapes are grown, had a bumper crop one year in the early 1900s. Supposedly, they came up with the twelve grapes tradition to sell off the excess, and the practice stuck around through the years. Another story claims that the tradition started even earlier. In the 1880s, the Spanish middle class supposedly started emulating their French counterparts who were known for welcoming the New Year with champagne and grapes, and that somehow evolved into the Spanish practice as it exists today.
Did we mention that the twelve grapes of luck tradition has one more, slightly more intimate component? Those who are truly serious about securing luck in the new year claim that the tradition works best if you don red undergarments (socks, briefs, panties, etc.). Some people also insist that one must wash down the grapes with a glass of Cava (a Spanish sparkling wine) with a gold ring resting at the bottom. In the weeks leading up to Nochevieja, stores around the country are filled with Aledo grapes and red undergarments. The grapes are available fresh, of course, but they’re also sold seeded and skinless in cans (twelve to a can, of course) for convenience. What a grape way to ring in the New Year!
[Image description: A small bowl of white grapes.] Credit & copyright: manfredrichter, Pixabay -
FREEMusic Appreciation PP&T CurioFree1 CQ
Merry Christmas Eve! We’ve written before about mythical (yet commercialized) Christmas figures like Santa Claus and Rudolph, but what about their chilliest wintertime pal, Frosty the Snowman? Unlike Santa, who is loosely based on a real person, and Rudolph, who was created to sell toys, for Frosty, everything started with a song. And with West Virginia…or New York, depending on who you ask.
The catchy song Frosty the Snowman was written by composer Steve Nelson and lyricist Walter “Jack” Rollins. Rollins originally wrote the lyrics, about a snowman who comes to life and leads a group of children on an adventure, when he was just a child himself. The words were meant to be a poem, and Rollins wrote them while he and his family were living in Keyser, West Virginia. He didn’t even get into songwriting as a career until he was 40, when he met Nelson while working as a baggage handler in New York City. The two teamed up and went on to write Here Comes Peter Cottontail, a song about the Easter bunny, which became a hit upon its release in 1949. They followed up that success in 1950 by setting Rollins’s childhood poem to music, adapting Frosty the Snowman into a song. Their decision to write a Christmas song was largely based on the success of Rudolph the Red-Nosed Reindeer, which also began as a poem and showed that there was a market for secular Christmas songs. The first recording of Frosty the Snowman was performed by Gene Autry and the Cass County Boys, but there have been countless covers in every conceivable genre in the years since.
It may not sound like a controversial song, but there is a pretty passionate debate regarding where the story of Frosty the Snowman takes place. In what city or town did Frosty first put on his magic top hat? While Rollins wrote the original lyrics in West Virginia, the towns of Armonk and White Plains in New York each claim that the story takes place in their respective locales. Armonk seems to have the stronger argument: Nelson lived there for many years and probably tweaked some of the lyrics based on memories of his time there. The town is so proud of their connection to Frosty that they even hold a snowman-themed parade every year.
Regardless of where it was really set, most modern listeners likely associate the song with the 1969 Frosty the Snowman Rankin/Bass television special featuring Jimmy Durante as narrator and singer. In the special, Frosty is a regular, inanimate snowman, but comes to life after the children place a magical hat on his head. When they realize that Frosty will melt as the weather gets warmer, they try to take him further north on a train stocked with ice cream cakes. Despite a sad scene in which Frosty melts while trying to warm up his human friend in a greenhouse, Santa Claus is able to bring him back to life and take him to the North Pole. Durante’s unique rendition of the titular song helped turn the special into a Christmas classic. In fact, the most widely played recording of the song nowadays is Durante’s version from the special.
Fans of Frosty will be happy to know that the titular snowman has so far kept his promise to come “back again someday.” There have been several sequels to the 1969 special, though none of them included Durante and none reached the same level of popularity. It’s been seven decades since the original song was first released, yet Frosty remains a quintessential holiday mascot. That’s a lot of staying power for someone made of snow!
[Image description: An AI-generated digital image of a snowman wearing a scarf and top hat against a blue background.] Credit & copyright: geralt, Pixabay -
FREEUS History PP&T CurioFree1 CQ
She wasn’t trying to start a revolution, but she wasn’t afraid to join one. Deborah Sampson was the first woman in U.S. history to receive a military pension—not as a spouse, but as a veteran. Born on this day in 1760, Sampson disguised herself as a man and adopted a new identity to fight in the Continental Army. Later, she toured the newly formed nation as a lecturer.
Born in Plympton, Massachusetts, Sampson had a difficult childhood. Her father was lost at sea when she was just five years old, and her family struggled financially as a result. From the age of ten, she worked as an indentured servant on a farm until she turned 18. Afterward, she found work as a schoolteacher in the summer and as a weaver in the winter while the American Revolutionary War raged on. As the war continued into the 1780s, Sampson tried to enlist in the Continental Army in disguise. Her first attempt ended in failure, leading to her immediate discovery and a scandal in town. That didn’t deter her, though, and her second attempt in 1782 was successful. Taking on the name Robert Shurtleff, Sampson joined the 4th Massachusetts Regiment. Her fellow soldiers never caught on to her ruse, though she was given the nickname “Molly” due to her lack of facial hair.
For 17 months, “Shurtleff” served in the Continental Army. Just months after joining, Sampson participated in a skirmish against Tory forces that saw her fighting one-on-one against enemy soldiers. She also served as a scout, entering Manhattan and reporting on the British troops that were mobilizing and gathering supplies there. Sampson’s cover was almost blown several times, but she was so determined to keep her secret that she even dug a musket ball out of her own leg after she was shot, just to avoid a doctor’s examination; some of the lead remained in her leg for the rest of her life. Unfortunately, she was found out after she came down with a serious illness. While in Philadelphia, she was sent to a hospital with a severe fever. She fell unconscious after arriving, and medical staff discovered her true gender while treating her. After being discovered, Sampson received an honorable discharge and returned to Massachusetts. In 1785, she married Benjamin Gannett, with whom she had three children. During this time, she did not receive a pension for her service, and she lived a quiet life. However, things changed as stories of her deeds spread with the 1797 publication of Herman Mann’s The Female Review: or, Memoirs of an American Young Lady, a detailed account of Sampson’s time in the army. To promote the book, Sampson herself went on a year-long lecture tour in 1802. She regaled listeners with war stories, often in uniform, though she may have embellished things a bit. For instance, she claimed to have dug trenches and faced cannons during the Battle of Yorktown, but that battle took place a year before she enlisted. Nevertheless, her accomplishments were largely corroborated, and even Paul Revere came to her aid, helping her secure a military pension from the state of Massachusetts.
Today, Sampson is remembered as a folk hero of the Revolutionary War. After she passed away in 1827 in Sharon, Massachusetts, the town erected statues in her honor. There’s even one standing outside the town’s public library. It shows her dressed as a woman, but holding her musket, with her uniform jacket draped over her shoulder. In 1982, Massachusetts declared May 23 “Deborah Sampson Day” and made her the official state heroine. That seems well-deserved, given that she was the first woman to bayonet-charge her way through the gender barrier.
[Image description: An engraving of Deborah Sampson wearing a dress with a frilled collar.] Credit & copyright: Engraving by George Graham. From a drawing by William Beastall, which was based on a painting by Joseph Stone. Wikimedia Commons, Public Domain -
FREELiterature PP&T CurioFree1 CQ
She’s the most famous poet that people in her lifetime had never heard of. Today, Emily Dickinson is considered one of the most important American writers of all time. Yet, she lived in seclusion for most of her adult life, publishing very few of her own poems. Despite her simple lifestyle and general lack of travel, Dickinson’s writing is known for boldly exploring big, abstract concepts, like infinity and truth.
Born on this day in 1830 in Amherst, Massachusetts, Dickinson was a studious child who very much enjoyed school, especially subjects related to nature and animals. As was typical for daughters of upper-class Victorian families, Dickinson learned to perform domestic tasks, entertain guests, and socialize frequently at home and church. Yet, despite this seemingly idyllic lifestyle, she was troubled by an internal gloom that she couldn’t overcome. From an early age, Dickinson became obsessed with death and all things morbid. She had a difficult time accepting the deaths of friends and family members, and she was particularly traumatized by the passing of Sophia Holland, a cousin with whom she’d had a close childhood relationship. Dickinson often visited the local cemetery in Amherst, where she would watch burials; that imagery later featured prominently in her work. Even though a religious revival movement took place in Massachusetts and other parts of the U.S. during her youth, Dickinson never fully embraced religion. Going against the grain, she remained steadfast in her religious apathy even though the rest of her family was devout, once saying to a friend, “I am one of the lingering bad ones.” As she grew into a young adult, she became more focused on her writing, which she rarely published and shared only with close acquaintances.
It was also around this time, in her early 20s, that Dickinson began to isolate herself from society. She occasionally traveled to see relatives and entertained a select few guests, including Susan Huntington Gilbert, her best friend and sister-in-law. Meanwhile, her poetry became increasingly solemn in tone and subject matter. Of her many poems exploring the subjects of death and grief, one of the most well known is I measure every Grief I meet (561), which begins, “I measure every Grief I meet / With narrow, probing, eyes – / I wonder if It weighs like Mine – / Or has an Easier size.” This poem lists examples of different ways people seem to grieve, and in doing so, shows that there is as much hope as there is despair in the matter. In her own way, Dickinson shows both the beauty and suffering inherent in death and loss, before leaving the reader to ponder what her own perspective might have been.
For Dickinson herself, death came when she was just 55, following a lengthy bout of illness that modern experts believe might have been heart failure brought on by severe hypertension. The hypertension itself might have been caused by stress, as in the years leading up to her death she had lost some of her closest friends and family members. When she fell ill, she remained reclusive, only allowing a doctor to examine her through an open door in her home. But in death, Dickinson became immortalized, as decades’ worth of her work was discovered by her surviving sister. Although she had published just a handful of poems in her lifetime, the first volume of her work was published a few years after her death. Since then, Dickinson has become one of the most well known and widely read American poets, celebrated for her idiosyncratic writing style. Almost a century and a half later, she remains one of the most distinguished female voices in literature, giving readers a unique glimpse into Victorian attitudes toward life, grief, and death.
[Image description: A black-and-white daguerreotype of Emily Dickinson with a ribbon around her neck.] Credit & copyright: From the Todd-Bingham Picture Collection and Family Papers, Yale University Manuscripts & Archives Digital Images Database, Yale University, New Haven, Connecticut. Wikimedia Commons, Public Domain -
FREEWorld History PP&T CurioFree1 CQ
They’re glitzy, they’re glittery, and for a long time they were serious fire hazards. Christmas trees are, by far, the holiday’s most famous decorations, to the point that much of Christmas Day literally takes place around them. Yet, for centuries, Christmas was celebrated without this festive staple.
Although Christmas is now the most widely observed Christian holiday, Christians didn’t always celebrate it. Until the 4th century C.E., they were actually averse to celebrating the birthdays of saints or martyrs, and that extended to Jesus Christ himself. Such celebrations were considered too similar to pagan traditions, as were flashy decorations. It wasn’t until the 16th century that Germans embraced the pagan tradition of bringing evergreen boughs and whole trees into their homes during winter. Over time, they naturally began to spruce them up with homemade decorations. Martin Luther, a central figure of the Protestant Reformation, is widely acknowledged as the first person to have placed candles on a Christmas tree, in honor of the starry winter sky. Luther also helped make the trees a Christmas (rather than simply a winter) tradition.
In the following years, the practice of decorating trees for Christmas spread across Europe, making its way to the American colonies in the 18th century. Most early Christmas tree decorations were made from things that were easy to find around the house or in nature, including strings of popcorn, bright pieces of fabric, red pepper swags, and moss. As much as the trees were a source of joy, however, they could also lead to disaster. Since they were lit with actual, flaming candles, they were serious fire hazards. Pine trees aren’t exactly known for being fireproof, and a single errant flame could burn down not just a tree, but the entire house around it. Still, by the 19th century, Christmas trees had become a must-have item for every household after Queen Victoria (whose mother was German) put one up and placed her children's presents under it.
These days, electric lights have replaced candles, and many people prefer artificial trees that can be used year after year (and are far less flammable). The first artificial Christmas trees were created due to pine tree shortages in Europe following World War I. These were made from dyed feathers and were fairly delicate. During World War II, the first mass-produced artificial trees were created by the British company Addis Housewares, using repurposed toilet-brush-making machinery. Sculpted aluminum trees lit by internal color wheels became popular soon after, but their appeal suffered greatly after the 1965 TV special A Charlie Brown Christmas, which portrayed them as symbols of crass commercialism.
The artificial tree found redemption in the hands of Si Spiegel, a Jewish WWII veteran who, after being denied pilot jobs due to antisemitism, took a job as a factory worker at American Brush Machinery in the 1950s. Some of the machines that Spiegel worked on were being sold to companies that used them to make artificial trees. Spiegel encouraged American Brush Machinery to make artificial trees themselves, but the process didn't go smoothly at first; the trees weren’t realistic enough. So, Spiegel studied real trees and tweaked the machine designs to make plastic trees with bendable branches that looked as close to the real thing as possible. With his guidance, American Brush Machinery became the world’s biggest supplier of artificial Christmas trees by the mid-1970s. Spiegel even started his own company, American Tree and Wreath, which at the height of its popularity produced around 800,000 trees per year. Today, Christmas trees are a must-have item for anyone who celebrates the holiday. Real or fake, just leave the candles off of them.
[Image description: The top of a decorated Christmas tree surrounded by golden string lights.] Credit & copyright: Elina Fairytale, Pexels -
FREEWriting PP&T CurioFree1 CQ
There’s nothing wrong with being a little peanutty. American cartoonist Charles Schulz, who was born on this day in 1922, created one of the most iconic weekly comic strips of all time: Peanuts. Featuring adorable child characters and witty dialogue, Peanuts is still considered a classic today. Yet, the comic wasn’t entirely born out of happiness. Much of its melancholy humor was inspired by hardships in Schulz’s own life.
Born on November 26, 1922, in Minneapolis, Minnesota, Charles Schulz was the son of a barber and a homemaker. He was interested in drawing from an early age, and he especially loved to draw his family members, including the family dog, Spike. Spike was always eating unusual things from around the house, which inspired Schulz, at age 15, to draw a picture of the dog and send it to Ripley's Believe It or Not!, where it ran in Robert Ripley's syndicated panel. This wasn’t enough to impress those running the yearbook at Schulz’s high school, though; the drawings he submitted to them were rejected. Nevertheless, with the encouragement of his mother, Schulz enrolled in an art correspondence program after high school and began taking his art more seriously.
1943 was a difficult year for Schulz. Not only was World War II in full swing, but his mother, to whom he was very close, died suddenly of cervical cancer. Not long after, he was drafted into the Army, where he eventually became a squad leader on a machine gun team. Although Schulz was proud of his service during the war, and even earned a Combat Infantryman Badge, the violence left him saddened and gave him a lifelong dislike of fighting.
For a few years after the war, Schulz worked at the same correspondence school he’d graduated from while continuing to sell his art wherever he could. His first real break into the comic world came in 1947, with the publication of a series of one-panel comics titled Li’l Folks in the St. Paul Pioneer Press. This comic is now considered a sort of precursor to Peanuts, since it featured Schulz’s signature child characters and even a dog that resembled Snoopy. Schulz’s comics were so popular that he was able to publish some in The Saturday Evening Post at the same time that they were appearing in the St. Paul Pioneer Press. In 1950, feeling that his comics needed wider reach, Schulz approached United Feature Syndicate and pitched the idea of a weekly four-panel strip. The syndicate loved Schulz’s work and gladly accepted, which meant that newspapers around the country could pay the syndicate to run his work. There was just one slight hiccup: the comic’s name. Since it had already been published in other newspapers under the name Li’l Folks, the syndicate and Schulz had to settle on a new title. They decided to call the strip Peanuts, after Schulz’s nickname for children. Later, Schulz explained, “Peanuts are the grandest people in the world. All children are peanuts. They're delightful, funny, irresistible, and wonderfully unpredictable. I really hate to see them grow out of the peanut stage.”
Within just a few years, Peanuts grew into one of the most successful comics of all time, running in 2,600 newspapers in 75 countries. Charlie Brown, Lucy, Sally, Linus, and the rest of the Peanuts gang were cute, mischievous, and wise beyond their years. Schulz based his melancholic protagonist, Charlie Brown, on himself and his own childhood struggles to fit in. The character’s relatability helped catapult Peanuts from newspapers to television screens with the 1965 animated special, A Charlie Brown Christmas. Despite apprehension that the slow-paced cartoon might fail, the special was instantly popular, winning both a Primetime Emmy and a Peabody Award. Schulz continued to draw Peanuts himself for the rest of his life, refusing to hand over the job to anyone else even after his hands began to shake due to essential tremor. In all, Schulz created 17,897 Peanuts comics before passing away in 2000 at the age of 77. Even among artists, Schulz lived a life remarkably full of creativity. You were a good man, Charlie Brown!
[Image description: A black-and-white photo of Charles Schulz at a desk with one arm extended.] Credit & copyright: Wikimedia Commons, Roger Higgins, World Telegram staff photographer. This photograph is a work for hire created prior to 1968 by a staff photographer at New York World-Telegram & Sun. It is part of a collection donated to the Library of Congress and per the instrument of gift it is in the public domain. -
FREEUS History PP&T CurioFree1 CQ
He didn’t oversee a war or have a particularly controversial political career, yet James Garfield, who was born on this day in 1831, was the second U.S. President to be assassinated. Though his death isn’t nearly as well remembered as Lincoln’s or John F. Kennedy’s (Garfield joins President McKinley as a less-remembered assassinated President), the circumstances that led to it were as strange as they were tragic.
Sometimes referred to as “the last president to be born in a log cabin,” James Garfield straddled two eras in American history. He was born in the Ohio Western Reserve, then a remote region far from the U.S. capital. As a young man, he fought for the Union during the American Civil War, eventually rising to the rank of major general, and the aftermath of the conflict shaped his political career. Elected to the U.S. Congress in 1863, Garfield contributed to the addition of the 14th and 15th Amendments to the Constitution, which were meant to guarantee equal rights for newly-freed slaves after the war. Throughout his career, he was a supporter of civil rights and was well-regarded by his peers across parties. Ironically, he never sought the office that made him an assassin’s target. His presidential nomination in 1880 came about because the Republican Party was contentiously split on whom to choose, and so they chose Garfield, whom everyone liked. Garfield went on to win the election, becoming the 20th president of the U.S., with Chester A. Arthur as his vice president.
Although Garfield's term was cut tragically short after just four months, he did have time to pursue several sweeping reforms while in office. Chief among them was his fight against the system of patronage, in which political offices in the federal civil service were “gifted” based purely on personal favors. Garfield wanted to replace it with a system that rewarded merit. But patronage was a common and well-tolerated practice at the time, and Garfield’s own party had a faction called the Stalwarts who supported it. One such Stalwart was Charles Guiteau, who was infuriated by Garfield’s anti-patronage views. Guiteau was considered unstable not only by his coworkers, but by members of his own family. In D.C., he had a reputation as a political pest, constantly seeking favors. He had made a few political speeches in favor of Garfield during the President’s 1880 campaign, and thus convinced himself that Garfield owed him a high-ranking government job in return. When no such job offer came, Guiteau sought revenge.
The assassination itself was meticulously planned. Guiteau purchased an ivory-handled .44 caliber pistol that he thought would make a fine museum display after the fact. On the day of the assassination, July 2, 1881, he carried a note in his pocket that read, “The president’s tragic death was a sad necessity, but it will unite the Republican Party and save the Republic. Life is a fleeting dream, and it matters little when one goes.” Around 9:20 a.m., Guiteau shot President Garfield at the Baltimore and Potomac Railroad Station while shouting, “I am a Stalwart and Arthur is now president!”
President Garfield passed away on September 19, 1881, after his wounds became septic. Guiteau was executed on June 30, 1882, after a highly publicized trial that focused the public’s eye on him and the Stalwarts. Much of the trial coverage involved ridiculing Guiteau’s sense of entitlement and his alleged insanity. His association with the Stalwarts led to the faction’s ruin, and the 1883 Pendleton Civil Service Act ended patronage on a national level. It’s a strange twist of fate that a man who never sought power was murdered by one who felt entitled to it. At least Garfield’s will was done in the end.
[Image description: A black-and-white illustration from 1881 depicting the assassination of President Garfield. Garfield places a hand on his back where he was shot. He is supported by Secretary of State James G. Blaine while in the background the President’s assassin, Charles Guiteau, is caught by the crowd.] Credit & copyright: A. Berghaus and C. Upham, published in Frank Leslie's Illustrated Newspaper, Wikimedia Commons. This work is in the public domain in its country of origin and other countries and areas where the copyright term is the author's life plus 70 years or fewer. -
FREELiterature PP&T CurioFree1 CQ
Excelsior! You probably know him as the father of Spider-Man and for his various cameos in Marvel movies, but Stan Lee did more than you might think behind the scenes. Indeed, if it weren’t for the comic book pioneer who passed away on this day in 2018, the world of superheroes would be a lot less colorful than it is today.
Stanley Martin Lieber, better known as Stan Lee, was born in New York in 1922. Lee grew up during the Great Depression, watching his parents struggle to provide for him and his brother, though there was no shortage of reading material in their home. As a child, Lee was a big fan of science fiction, mystery, and adventure books by authors like Jules Verne, Sir Arthur Conan Doyle, and Mark Twain. Lee credited his affinity for creativity and literature to these idols as well as his mother, who instilled in him a love of reading. Before going into comics, Lee actually dreamed of being a novelist himself.
Fresh out of high school, Lee started working at Timely Comics in 1939 as an editorial assistant and was promoted to editor soon after. Timely Comics would go on to be renamed Atlas Comics, which would in turn become Marvel Comics, meaning that Stan Lee had been with Marvel since its very earliest days. There, he worked with other comic book legends like Jack Kirby, but the publisher was struggling at the time. As WWII waned, so did interest in superheroes like the Human Torch and Captain America, requiring the publisher to pivot to other genres. As editor, Lee had a gift for recognizing comic book trends, and under his direction, Marvel published everything from horror stories to westerns. In fact, Marvel's early publication history included very few superheroes.
Their focus started to change in the late 1950s and early 1960s, when rival publisher DC began to publish more superhero material. Once again catching on to the trend, Stan Lee enlisted the help of Jack Kirby to create a whole roster of Marvel’s own superheroes. The first major title they came up with was The Fantastic Four in 1961, which combined heroic adventures with plenty of science fiction themes. Whereas DC’s Batman was a costumed detective solving crimes in a more or less realistic city, the Fantastic Four traversed space and time, meeting equally outrageous characters in the process. After the success of The Fantastic Four, Lee helped create other iconic characters and their respective titles, including Spider-Man, Iron Man, Daredevil, and the X-Men. By then, superheroes were wildly popular again, but what set Marvel’s roster apart was Lee’s focus on characters’ personal lives and struggles. For example, Spider-Man’s alter ego, Peter Parker, started off as a nerdy kid who struggled to fit in socially; Iron Man battled alcoholism; and Daredevil fought crime using his disability as a strength. The X-Men, in particular, were intentionally written with parallels to the Civil Rights movement. In their stories, marginalized characters fight against society’s ignorance and prejudice as much as they fight supervillains.
In his later years, Stan Lee became the de facto spokesperson of the comic book industry, leading to his beloved movie cameos (the last being in Avengers: Endgame in 2019). He believed in comic books as legitimate literature, and through the medium he promoted his values of inclusivity and diversity. However, there was a time when even he questioned his career. He once said, "I used to be embarrassed because I was just a comic book writer while other people were building bridges or going on to medical careers. And then I began to realize that entertainment is one of the most important things in people's lives. Without it, they might go off the deep end." 'Nuff Said!
[Image description: Stan Lee’s star on the Hollywood Walk of Fame.] Credit & copyright: Benoît Prieur (1975–), Wikimedia Commons, Public Domain Dedication -
FREEWorld History PP&T CurioFree1 CQ
They get a head start up north! While Americans eagerly await Thanksgiving, Canadians have already enjoyed their turkey dinners. That’s right, Canadians have their own Thanksgiving, and the original version of it predates the first American Thanksgiving by 43 years. Although the modern version of the holiday looks very similar to American Thanksgiving, its history is quite different, and is directly related to Canada’s European roots.
The first Canadian Thanksgiving likely took place aboard a ship. In 1578, British explorer Martin Frobisher and his crew arrived in what is now Nunavut, in northern Canada. Grateful that they’d made it safely all the way from England, the captain and crew anchored in Frobisher Bay and observed communion. They then enjoyed as much of a feast as could be had on a 16th-century explorer’s ship. Instead of turkey or mashed potatoes, the crew had salt beef and biscuits—food that kept well over a long time. The menu did improve over time, however. In 1606, the French settlers of New France, a colony in North America that once spanned much of Newfoundland, Nova Scotia, and the Great Lakes region, began to hold feasts led by Governor Samuel de Champlain. These feasts were centered around a local fruit that settlers had learned about from the Mi’kmaq people—cranberries. The vitamin C in the berries prevented scurvy, so the settlers held feasts of cranberries with other local foods as often as resources allowed. Over time, other perennial Thanksgiving items were added to their menu, like pumpkins and even turkey, although the latter was due to direct American influence—sort of. The birds were introduced by United Empire Loyalists, former American colonists who were still loyal to Great Britain and fled during or just after the American Revolution. Lucky for Canadians that the loyalists still loved America’s native poultry.
Despite the turkey, there are still plenty of differences between how Canadians and Americans celebrate Thanksgiving. The first is the holiday’s date: Canadians get a jump on things by celebrating Thanksgiving in October. Just like the American version, though, the holiday has seen its fair share of reschedulings. For decades, Canadian Thanksgiving was celebrated at different times in different communities, without a set date. It wasn’t until 1879 that it was officially set for November 6. That changed again in 1957, when it was reassigned to the second Monday of October.
The biggest thing that sets Canadian Thanksgiving apart? The holiday is a lot more laid back than the American version. Gatherings and feasts are generally smaller, and traveling from coast to coast to meet up with family isn’t the norm in the Great White North. People also don’t tend to eat themselves into a stupor. That’s because the weather is still nice enough in most places for people to enjoy a Thanksgiving Day hike. Many people also use the extended weekend to go on short vacations. Overall, Canadian Thanksgiving has a smaller cultural footprint than American Thanksgiving, and while there are still parades and football games on the holiday, not as many people tune in. Many Canadian families don’t even have their turkey on Monday. Instead, they choose to start earlier in the week in order to save their days off for resting and munching on leftovers. Canadian Thanksgiving also isn’t followed by an equivalent to American Black Friday; shopping sprees just aren’t Canada’s style. For all of Canadian Thanksgiving’s differences, though, it’s exactly like its American counterpart in one way—it’s a very bad day to be a turkey!
[Image description: A Canadian flag flying over a mountain and pine forest.] Credit & copyright: Daniel Joseph Petty, Pexels