Curio Cabinet
March 16, 2025
-
Science PP&T Curio
If a comet is named after you, does that make you a star among the stars? German astronomer Caroline Herschel, born on this day in 1750, would probably say so. The 35P/Herschel–Rigollet comet bears her name, and she discovered plenty of other comets throughout her long career, which was mostly spent working alongside her brother, William Herschel. That’s not to say that she toiled in her sibling’s shadow, though. Herschel made a name for herself as the first woman in England to hold a government position and the first known woman in the world to receive a salary as a scientist.
Born on March 16, 1750, in Hanover, Germany, Herschel had a childhood that got off to a rough start. She was the eighth child and fourth daughter in her family, but two of her sisters died in childhood, and the eldest married and left home when Herschel was just five years old, leaving her as the family’s main housekeeper. At just 10 years old, Herschel contracted typhus and nearly died. The infection blinded her in her left eye and severely stunted her growth, leaving her with an adult height of four feet, three inches. Her father wanted her to be educated, but her mother insisted that, since she was unlikely to marry due to her disabilities, she should be trained as a housekeeping servant. Though her father taught her as best he could, Herschel ultimately learned little more than basic reading, arithmetic, and some sewing throughout her teenage years.
Things changed in Herschel’s early 20s, when she received an invitation from her two brothers, William and Alexander, to join them in Bath, England. William was becoming a fairly successful musician, and the brothers proposed that Herschel sing with him during some performances. While learning to sing in Bath, Herschel was finally able to become educated in other subjects too. After a few years of running William’s household and participating in his music career, she offered him support when his interests turned from music to astronomy.
Soon, William was building his own telescopes, which proved more powerful than conventional ones. In 1781, William discovered the planet Uranus, though he initially mistook it for a comet. As the siblings worked together, Herschel began scanning the sky each night for interesting objects, meticulously recording their positions in a record book along with any discoveries that she and William made. She also compared their observations with the Messier Catalog, a compilation by French astronomer Charles Messier that was then considered the most comprehensive catalog of astronomical objects. On February 26, 1783, Herschel made her first two independent discoveries when she noticed a nebula that didn’t appear in the Messier Catalog and a small galaxy that later came to be known as Messier 110, a satellite of the Andromeda Galaxy.
In 1798, Herschel presented her astronomical catalog to the Royal Society in England, to be used as an update to English astronomer John Flamsteed’s observations. Her catalog was meticulously detailed, and was organized by north polar distance rather than by constellation. Using telescopes built by William, Herschel went on to discover eight comets. As she and William published papers with the Royal Society, both of them began earning a wage for their work. Herschel was paid £50 a year, making her the first known woman to earn a wage as a scientist.
By 1799, Herschel’s work was so well known that she was independently invited to spend a week with the royal family. Three years later, the Royal Society published the most detailed version of her work yet, though they did so under William’s name. After her brother’s death, Herschel created yet another astronomical catalog, this one for William’s son, John, who had also shown a great interest in astronomy. This catalog eventually formed the basis of the New General Catalogue, which gave us the NGC numbers by which many astronomical objects are still identified.
Despite her childhood hardships and growing up during a time when women weren’t encouraged to practice science, Caroline Herschel made some of the 18th and 19th centuries’ most important contributions to astronomy. Her determination to “mind the heavens,” as she put it, has impacted centuries of astronomical study. Happy Women’s History Month!
[Image description: A black-and-white portrait of Caroline Herschel wearing a bonnet and high-collared, lacy blouse.] Credit & copyright: Portrait by M. F. Tielemann, 1829. From pages 114-115 of Agnes Clerke's The Herschels and Modern Astronomy (1895).
March 15, 2025
-
Football Sporty Curio
It pays to watch yourself in Warwickshire! That’s the English county where, in the town of Atherstone, a fairly violent game of folk football has been played every Shrove Tuesday (the day before Ash Wednesday in Christian tradition) for more than eight centuries. The 826th game took place this year on March 4.
The Atherstone Ball Game, as it’s known, is a folk or "medieval" football game, meaning that it’s nothing like either of the games that we call “football” today: American football or European soccer. The only object of the game is to grab and hold onto a heavy, leather ball as long as possible as it is kicked and thrown down the town’s main street. Whoever is holding the ball at the end of the two-hour game is the winner. This means that the final minutes of the game are usually violent as players swarm around the ball, punching and kicking each other in what resembles a crowd crush mixed with a wrestling match. Luckily, one of the game's few rules states that killing other players isn’t allowed.
This rowdy tradition got started in 1199, when King John oversaw a match between players from Warwickshire and Leicestershire. The king offered a bag of gold to the winners, making the high-stakes game particularly violent. Some say that the bag of gold was actually used in place of a ball, though it’s impossible to know for sure. We do know that Leicestershire won, but it’s Warwickshire that has carried on the game’s tradition. As far as history’s concerned, they’re the real winners!
March 14, 2025
-
Mind + Body Daily Curio
There are so many layers to love. With its meaty sauce and layers of pasta, lasagna is one of the world’s best-known foods, and it’s available at just about every Italian restaurant on Earth. Yet, this famously Italian dish didn’t originate in Italy. Like modern mathematics and philosophy, the first form of lasagna actually came from ancient Greece.
Lasagna is a dish made with large, flat sheets of pasta layered on top of one another, with fillings like chopped tomatoes, meat, cheese, or a combination of the three in between the layers. Usually, lasagna is smothered in tomato sauce or ragù, a type of meat sauce, and topped with cheese (usually mozzarella) before being baked and cut into squares for serving.
The lasagna we know today began as an ancient Greek dish called laganon. Like modern lasagna, laganon utilized large, flat sheets of pasta, but these sheets were cut into strips, sprinkled with toppings like crumbly cheese or chopped vegetables, and eaten with a pointed stick. Things changed around 146 B.C.E., when the Romans conquered Greece and began expanding upon Greek recipes. Over the next century, laganon morphed into a Roman dish called lasagne patina, which was cut into squares but varied greatly from modern lasagna when it came to its ingredients. Some recipes called for fish to fill in the layers between pasta, others for pork belly or mixed vegetables. Sauce was still not standard, though cheese did become one of the most popular Roman fillings and toppings.
Sauce, specifically tomato sauce, didn’t become the gold standard for lasagna until the dish became popular in Naples. By the 1600s, Neapolitans were eating their lasagna with ricotta, ragù, and mozzarella, though the dish still wasn’t served in layers. Then, in 1863, Francesco Zambrini, a scholar of ancient Italian texts from Bologna, Italy, published a long-lost 14th-century cookbook called Libro di Cucina. Inside was a recipe for lasagna that called for layering egg pasta sheets with cheese filling. This recipe, combined with the already-in-vogue practice of serving lasagna with tomatoes and meat sauce, resulted in the beloved dish that’s so popular today. All it took to make it happen was the formation of the Roman Empire, a love of tomatoes, and a long-lost cookbook!
[Image description: Lasagna topped with greens on a plate with silverware.] Credit & copyright: alleksana, Pexels
March 13, 2025
-
Chemistry Nerdy Curio
Have you ever seen berkelocene? Not until now! Researchers led by the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) recently discovered a new organometallic molecule, called berkelocene. Organometallic molecules are made up of a carbon-based framework surrounding a metal ion, but this is the first time such a molecule has contained the element berkelium.
While organometallic molecules often contain metals from earlier in the periodic table, they’re rarely found to contain actinides, the metals with atomic numbers 89 through 103. Berkelium’s atomic number is 97, making the discovery of berkelocene quite unusual. In fact, this is the first time that any chemical bond between carbon and berkelium has been observed. Like 23 other synthetic metals on the periodic table, berkelium is not naturally occurring. It can only be created in labs via nuclear reactions, which makes it all the more unusual that it could bond with a natural element like carbon. Berkelium is highly radioactive, which also makes it difficult to study. It’s fitting, though, that the discovery of berkelocene took place at the Lawrence Berkeley National Laboratory, since berkelium was first discovered in 1949 and named after Berkeley, California. In chemistry, what goes around comes around, but be careful—it’s radioactive, after all.
[Image description: A black-and-white illustration of the periodic table cell for the element Berkelium.] Credit & copyright: Author’s own illustration.
-
Parenting Daily Curio #3046
Grief affects everyone differently, but the one constant is that it’s never easy. Now, at least, British parents who experience a miscarriage will have the right to take bereavement leave thanks to new workers’ rights reforms. The change is part of amendments to the Employment Rights Bill proposed by the Labour Party and extends bereavement leave of up to two weeks to pregnant people who suffer a miscarriage before 24 weeks, as well as to their partners. That’s good news for parents who are trying to have children, especially since most miscarriages happen early in the course of a pregnancy.
As tragic as they are, miscarriages are unfortunately extremely common. Though estimates vary, it’s believed that up to 20 percent of pregnancies end in miscarriage, with around 80 percent of them occurring in the first trimester, or in the first 12 weeks. Miscarriages can happen for a variety of reasons, but the most common cause is an issue with the number of fetal chromosomes. Extra chromosomes or missing chromosomes can lead to a fetus or embryo not developing properly, which, in turn, leads to a miscarriage. Viruses, illnesses, and food poisoning can also lead to miscarriages. Miscarriage symptoms also vary widely. Bleeding, cramping, or rapid heartbeat while pregnant can all be signs of a miscarriage, but sometimes there are no symptoms at all. In such cases, the miscarriage might go completely unnoticed, meaning that the actual miscarriage rate could be much higher than is currently estimated. Since miscarriages can have so many causes, many of them can’t be prevented—much of it is down to simple luck. Still, avoiding alcohol, smoking, and particularly risky sports can give a pregnancy a better chance at viability. At least with Britain’s new law, parents will have some time to breathe if bad luck strikes.
March 12, 2025
-
Biology Nerdy Curio
Turns out, unicorns are real—they’ve been hanging out in the ocean this whole time. Narwhals, sometimes called the “unicorns of the sea,” are some of the most unusual animals on Earth, but they’re also extremely elusive. In fact, until recently, there was little consensus on what narwhals used their long, horn-like tusks for. Now, drones have finally captured footage of narwhals using their tusks for hunting and play. The footage was captured thanks to researchers at Florida Atlantic University’s Harbor Branch Oceanographic Institute and Canada’s Department of Fisheries and Oceans, in partnership with Inuit communities in Nunavut in Canada’s High Arctic. The narwhals used their tusks to “steer” prey fish, like Arctic char, in favorable directions and even to hit and stun the fish. They also used their tusks to prod and shake various things in their environment, behavior that researchers described as “exploratory play.”
Narwhals’ “horns” aren’t horns at all, but tusks. A narwhal’s tusk begins as a canine tooth (usually the upper left) that eventually grows through the animal’s upper lip. Not all narwhals have tusks, though. Some males never grow them, for unknown reasons, and only about 15 percent of females do. Narwhal tusks can reach lengths of up to 10 feet. That’s more than half the length of an adult male’s body, which can reach 15.7 feet and weigh more than 3,500 pounds. Narwhals are sizable whales by nature, as members of the toothed-whale family Monodontidae, which they share with their close relatives, belugas.
Like most whales, narwhals live in pods, or groups, of up to 10 individuals. Females, calves, and young males form pods together, while sexually mature males have pods of their own. Narwhals are also migratory, meaning that they spend different parts of the year in different places. In the summer, they spend their time in Arctic bays and fjords, but as thick sea ice forms in the fall, they migrate to deeper Arctic waters. Most narwhals spend the winter between Canada and Greenland, in areas like Baffin Bay. When narwhals return to shallower, coastal waters in the spring, they also begin searching for mates. While male narwhals have never been observed fighting for mates, they do display behavior called “tusking”, in which two males raise their tusks out of the water and lay them against each other, probably to determine which male is larger. Whichever male “wins” the contest will go on to mate with nearby females. Narwhals give birth to just one calf per year.
Unfortunately, narwhals' low birth rate makes it difficult for their numbers to recover after disasters like ocean storms or oil spills. Luckily, narwhals are not currently considered endangered, but as climate change continues to affect the Arctic waters they call home, they may have difficulty adapting to a warming world. That’s not very cool for these unicorns of the sea.
[Image description: A black-and-white illustration of a narwhal diving through water.] Credit & copyright: Archives of Pearson Scott Foresman, donated to the Wikimedia Foundation. This work has been released into the public domain by its author, Pearson Scott Foresman. This applies worldwide.
-
US History Daily Curio #3045
This was one march that could turn on a dime. March of Dimes recently appointed a new CEO, making this the perfect time to look back on the nonprofit’s impressive 85-year history. Not only did the organization play a major hand in helping to eradicate polio, it has also pivoted and widened its scope several times throughout its history. Initially named the National Foundation for Infantile Paralysis, March of Dimes was founded by President Franklin D. Roosevelt to combat polio. Infantile paralysis was another name for polio at the time, and the president himself relied on a wheelchair for much of his life due to his own bout with polio in 1921. The catchy name for the organization, March of Dimes, came from a campaign asking the public to send money to the White House in pursuit of a cure. While coming up with the idea for the campaign, popular comedian Eddie Cantor suggested that they call it “The March of Dimes,” a play on the name of a newsreel at the time, “The March of Time.” Much of the money sent to the White House was indeed in the form of dimes and was used to fund research headed by Jonas Salk, the developer of one of the first successful polio vaccines. The campaign was heavily promoted on the radio, and its success helped develop the fundraising model used by other medical nonprofits today.
By the 1950s, Salk had successfully developed his polio vaccine, and by the end of the 1970s, polio was all but eradicated in the U.S. Sadly, Roosevelt passed away in 1945, before he could see the creation of the vaccine. In 1979, the organization officially changed its name to the March of Dimes Foundation. After fulfilling its original mission, the March of Dimes diversified its efforts. The organization continues to fund research into various childhood diseases while lobbying for better access to prenatal care. As part of its mission statement, the March of Dimes acknowledges that the U.S. has some of the highest infant and maternal mortality rates among developed nations. Clearly, the march is far from over.
[Image description: A jar of coins with some coins sitting beside it.] Credit & copyright: Miguel Á. Padriñán, Pexels
March 11, 2025
-
Engineering Daily Curio #3044
It’s a good thing your eye comes with a spare. Researchers at the Dana-Farber Cancer Institute, Massachusetts Eye and Ear, and Boston Children’s Hospital have found a way to repair previously irreversible corneal damage in one eye using stem cells from a person’s remaining healthy eye.
Along with the lens, an eye’s cornea plays a critical role in focusing light. Damage to the cornea, whether from injury or disease, can permanently impair vision or even lead to blindness. The cornea also protects the eye behind it by keeping out debris and germs. Since the cornea is literally at the front and center of the eye, however, it is particularly vulnerable to damage from the very things it’s designed to protect against. Unfortunately, damage to the cornea is notoriously difficult to treat.
Now, researchers have managed to extract what they call cultivated autologous limbal epithelial cells (CALEC) from a healthy eye to restore function to a damaged cornea. The process works like this: CALEC is extracted via a biopsy from the healthy eye and then placed in a cellular tissue graft, where it takes up to three weeks to grow. Once ready, the graft is transplanted into the damaged eye to replace the damaged cornea. One of the researchers, Ula Jurkunas, MD, said in a press release, “Now we have this new data supporting that CALEC is more than 90% effective at restoring the cornea’s surface, which makes a meaningful difference in individuals with cornea damage that was considered untreatable.” Still, as they say, an ounce of prevention is worth a pound of cure. Common causes of corneal injury include foreign objects entering the eye during yard work or while working with tools or chemicals; too much exposure to UV rays and physical trauma during sports can also damage the cornea. The best way to prevent these injuries, of course, is to use eye protection. Even if doctors can fix your cornea, having someone poke around inside your one good eye should probably remain a last resort.
[Image description: An illustrated diagram of the human eye from the side, with labels.] Credit & copyright: Archives of Pearson Scott Foresman, donated to the Wikimedia Foundation. This work has been released into the public domain by its author, Pearson Scott Foresman. This applies worldwide.
March 10, 2025
-
Art Appreciation Art Curio
Well, there’s something you don’t see every day. Figure of a Monkey on a Dog is a sculpture that depicts exactly what its title implies: a monkey, dressed in a full outfit of pants, shirt, vest, and hat, riding atop a dog as if the latter animal were a horse. The dog wears two large saddlebags. While one might assume that this sculpture is simply the imaginative work of one whimsical artist, its history actually runs a lot deeper. It was part of a satirical art genre called “singeries,” or “monkey tricks” in French. After French artist Claude III Audran painted a picture of monkeys dressed in human clothes and seated at a table in 1709, other artists took up the same motif. In a movement that lasted through most of the 18th century, French artists painted and sculpted monkeys dressed in finery, engaging in all sorts of human activities, from drinking wine to dancing to playing cards. Too bad for the dog in this sculpture that canines weren’t afforded the same honor as monkeys in singeries. He ended up a beast of burden rather than a dapper dog!
Figure of a Monkey on a Dog, Manufactured by Villeroy Factory, c. 1745, soft-paste porcelain with enamel decoration, 6.25 in. (15.9 cm.), The Cleveland Museum of Art, Cleveland, Ohio
[Image credit & copyright: Manufactured by Villeroy Factory, c. 1745. The Cleveland Museum of Art, Gift of Rosenberg & Stiebel, Inc. 1953.269. Public Domain, Creative Commons Zero (CC0) designation.]
-
Literature Daily Curio #3043
She might be gone, but her work lives on! Pulitzer Prize-winning American author Harper Lee published only two books before passing away in 2016: 1960’s To Kill A Mockingbird and 2015’s Go Set a Watchman. Now, in a great surprise to fans, a collection of short stories that Lee wrote prior to 1960 is set to be published by Harper, an imprint of HarperCollins, this October.
The stories will be part of The Land of Sweet Forever: Stories and Essays, which will also include eight of Lee’s nonfiction pieces printed in various publications throughout her life. Some of the collection’s short stories draw upon themes that are also present in To Kill A Mockingbird and include elements inspired by her own life in Alabama, where she grew up, and New York City, where she moved in 1949 and lived part-time for around 40 years. Ailah Ahmed, publishing director of the new book’s UK publisher, Hutchinson Heinemann, told The Guardian that the stories “...will prove an invaluable resource for anyone interested in Lee’s development as a writer.”
A famously private author, Lee wrote unflinchingly about the racism that plagued the Deep South during the 1930s in To Kill A Mockingbird. The empathetic voice of Scout Finch, the book’s child narrator, offers some hope for a better future throughout an otherwise somber tale. Lee’s willingness to portray Atticus Finch, a white lawyer, fighting for the rights of Tom Robinson, a Black man unjustly accused of rape, showcases the idea that bravery and empathy are the ultimate antidotes to prejudice, even if injustice ultimately wins the day, as it does in the story. No doubt Lee’s fans will relish the chance to glimpse into the author’s past, to a time before To Kill A Mockingbird forever changed America’s literary landscape. Short stories like this just don’t come along every day.
[Image description: A stack of books without titles visible.] Credit & copyright: Jess Bailey Designs, Pexels