Curio Cabinet
March 17, 2025
Art Appreciation: Art Curio
Here’s some springy green for St. Patrick’s Day. Despite its upbeat look, though, Vincent van Gogh’s Wheat Field with Cypresses wasn’t painted under very happy circumstances. The painting, featuring Van Gogh’s famously wide, swirling brushstrokes, shows a yellow wheat field with green bushes and trees under a light blue sky with white clouds. Van Gogh painted it while voluntarily staying at the Saint-Paul-de-Mausole asylum in southern France. The artist’s mental health was deteriorating at the time, following a heated argument with his friend, fellow artist Paul Gauguin, after which Van Gogh cut off part of his left ear. The asylum was less like a modern hospital and more of a quiet retreat where patients could relax away from other people. Seeking comfort in the pastoral landscape surrounding the asylum, Van Gogh often painted fields, trees, hills, and night skies. A little over a year after painting Wheat Field with Cypresses, at the age of 37, Van Gogh died of a self-inflicted gunshot wound. It’s a sad reminder that beautiful artwork doesn’t always reflect inner happiness. Yet Van Gogh’s work inspires happiness in others to this day. Thanks, Vincent.
Wheat Field with Cypresses, Vincent van Gogh (1853–1890), 1889, oil on canvas, 28.87 × 36.75 in. (73.2 × 93.4 cm.), The Metropolitan Museum of Art, New York City, New York
[Image credit & copyright: Vincent van Gogh, The Metropolitan Museum of Art. Purchase, The Annenberg Foundation Gift, 1993. Public Domain.]
Science: Daily Curio #3047
The house might be burning, but at least the roof is intact. As climate change continues to affect Earth’s weather, there’s still some good news about the environment: the ozone layer is doing better and better, according to recent research. Concerns over the state of the ozone layer first emerged in the 1980s, when researchers discovered a hole in the layer over Antarctica. That was bad news considering how crucial the ozone layer is to the health of life on Earth. Consisting of molecules made of three oxygen atoms, ozone reduces the amount of harmful UV radiation that reaches our planet’s surface. Without it, humans and animals would be much more prone to skin cancer and cataracts. Many plants, including some crops, could also die from excess radiation.
In 1986, researchers from the National Oceanic and Atmospheric Administration (NOAA) set out on an expedition to Antarctica and discovered the culprit behind the missing patch of ozone layer: chlorofluorocarbons. Better known as CFCs, these synthetic chemicals were widely used at the time as refrigerants, insulation, and aerosol propellants, showing up in common, everyday items like air conditioners and hair spray. Following the discovery, an international treaty limiting the use of CFCs, known as the Montreal Protocol, was adopted, and its benefits are becoming clearer by the day. A recent study from MIT shows that the ozone layer is recovering, and the data shows that the recovery is a direct result of CFC reduction. Susan Solomon, the author of the study, said in a university statement, “The conclusion is, with 95 percent confidence, it is recovering. Which is awesome. And it shows we can actually solve environmental problems.” So far, it’s 1-0 for ozone.
[Image description: A blue sky with white clouds.] Credit & copyright: Johann Piber, Pexels
March 16, 2025
Science: PP&T Curio
If a comet is named after you, does that make you a star among the stars? German astronomer Caroline Herschel, born on this day in 1750, would probably say so. The 35P/Herschel–Rigollet comet bears her name, and she discovered plenty of other comets throughout her long career, which was mostly spent working alongside her brother, William Herschel. That’s not to say that she toiled in her sibling’s shadow, though. Herschel made a name for herself as the first woman in England to hold a government position and the first known woman in the world to receive a salary as a scientist.
Born on March 16, 1750, in Hanover, Germany, Herschel got off to a rough start in life. She was the eighth child and fourth daughter in her family, but two of her sisters died in childhood, and the eldest married and left home when Herschel was just five years old, leaving her as the family’s main housekeeper. At just 10 years old, Herschel contracted typhus and nearly died. The infection blinded her in her left eye and severely stunted her growth, leaving her with an adult height of four feet, three inches. Though her father wanted her to be educated, her mother insisted that, since she likely wouldn’t marry due to her disabilities, she should be trained as a housekeeping servant. Her father educated her as best he could, but Herschel ultimately learned little more than basic reading, arithmetic, and some sewing throughout her teenage years.
Things changed in Herschel’s early 20s, when she received an invitation from her two brothers, William and Alexander, to join them in Bath, England. William was becoming a fairly successful singer, and the brothers proposed that Herschel sing with him during some performances. While learning to sing in Bath, Herschel was finally able to become educated in other subjects too. After a few years of running William’s household and participating in his music career, she offered him support when his interests turned from music to astronomy.
Soon, William was building his own telescopes, which proved to be more powerful than conventional ones. In 1781, William discovered the planet Uranus, though he at first mistook it for a comet. As the siblings worked together, Herschel began scanning the sky each night for interesting objects, meticulously recording their positions in a record book along with any discoveries that she and William made. She also compared their observations with the Messier Catalog, a catalog of astronomical objects compiled by French astronomer Charles Messier, which at the time was considered the most comprehensive of its kind. On February 26, 1783, Herschel made her first two independent discoveries when she noticed a nebula that didn’t appear in the Messier Catalog and a small galaxy that later came to be known as Messier 110, a satellite of the Andromeda Galaxy.
In 1798, Herschel presented her astronomical catalog to the Royal Society in England, to be used as an update to English astronomer John Flamsteed’s observations. Her catalog was meticulously detailed, and was organized by north polar distance rather than by constellation. Using telescopes built by William, Herschel went on to discover eight comets. As she and William published papers with the Royal Society, both of them began earning a wage for their work. Herschel was paid £50 a year, making her the first known woman to earn a wage as a scientist.
By 1799, Herschel’s work was so well known that she was independently invited to spend a week with the royal family. Three years later, the Royal Society published the most detailed version of her work yet, though they did so under William’s name. After her brother’s death, Herschel created yet another astronomical catalog, this one for William’s son, John, who had also shown a great interest in astronomy. This catalog eventually became the basis for the New General Catalogue, which gave us the NGC numbers by which many astronomical objects are still identified.
Despite her childhood hardships and growing up during a time when women weren’t encouraged to practice science, Caroline Herschel made some of the 18th and 19th centuries’ most important contributions to astronomy. Her determination to “mind the heavens,” as she put it, has impacted centuries of astronomical study. Happy Women’s History Month!
[Image description: A black-and-white portrait of Caroline Herschel wearing a bonnet and high-collared, lacy blouse.] Credit & copyright: Portrait by M. F. Tielemann, 1829. From page 114-115 of Agnes Clerke's The Herschels and Modern Astronomy (1895).
March 15, 2025
Football: Sporty Curio
It pays to watch yourself in Warwickshire! That’s the English county where, in the town of Atherstone, a fairly violent game of folk football has been played every Shrove Tuesday (the day before Ash Wednesday in Christian tradition) for more than eight centuries. The 826th game took place this year on March 4.
The Atherstone Ball Game, as it’s known, is a folk or “medieval” football game, meaning that it’s nothing like either of the games we call “football” today: American football or European soccer. The object of the game is simply to grab and hold onto a heavy leather ball for as long as possible as it is kicked and thrown down the town’s main street. Whoever is holding the ball at the end of the two-hour game is the winner. This means that the final minutes are usually violent, as players swarm around the ball, punching and kicking each other in what resembles a crowd crush mixed with a wrestling match. Luckily, one of the game’s few rules states that killing other players isn’t allowed.
This rowdy tradition got started in 1199, when King John oversaw a match between players from Warwickshire and Leicestershire. The king offered a bag of gold to the winners, making the high-stakes game particularly violent. Some say that the bag of gold was actually used in place of a ball, though it’s impossible to know for sure. We do know that Leicestershire won, but it’s Warwickshire that has carried on the game’s tradition. As far as history’s concerned, they’re the real winners!
March 14, 2025
Mind + Body: Daily Curio
There are so many layers to love. With its meaty sauce and layers of pasta, lasagna is one of the world’s best-known foods, and it’s available at just about every Italian restaurant on Earth. Yet this famously Italian dish didn’t originate in Italy. Like modern mathematics and philosophy, the first form of lasagna actually came from ancient Greece.
Lasagna is a dish made with large, flat sheets of pasta layered on top of one another, with fillings like chopped tomatoes, meat, cheese, or a combination of the three in between the layers. Usually, lasagna is smothered in tomato sauce or ragù, a type of meat sauce, and topped with cheese (usually mozzarella) before being baked and cut into squares for serving.
The lasagna we know today began as an ancient Greek dish called laganon. Like modern lasagna, laganon utilized large, flat sheets of pasta, but these sheets were cut into strips, sprinkled with toppings like crumbly cheese or chopped vegetables, and eaten with a pointed stick. Things changed around 146 B.C.E., when the Romans conquered Greece and began expanding upon Greek recipes. Over the next century, laganon morphed into a Roman dish called lasagne patina, which was cut into squares but varied greatly from modern lasagna when it came to its ingredients. Some recipes called for fish to fill in the layers between pasta, others for pork belly or mixed vegetables. Sauce was still not standard, though cheese did become one of the most popular Roman fillings and toppings.
Sauce, specifically tomato sauce, didn’t become the gold standard for lasagna until the dish grew popular in Naples. By the 1600s, Neapolitans were eating their lasagna with ricotta, ragù, and mozzarella, though the dish still wasn’t served in layers. Then, in 1863, Francesco Zambrini, a scholar of ancient Italian texts from Bologna, Italy, published a long-lost, 14th-century cookbook called Libro di Cucina. Inside was a recipe for lasagna that called for layering egg pasta sheets with cheese filling. This recipe, combined with the already-in-vogue practice of serving lasagna with tomatoes and meat sauce, resulted in the beloved dish that’s so popular today. All it took to make it happen was the formation of the Roman Empire, a love for tomatoes, and a long-lost cookbook!
[Image description: Lasagna topped with greens on a plate with silverware.] Credit & copyright: alleksana, Pexels
March 13, 2025
Chemistry: Nerdy Curio
Have you ever seen berkelocene? Not until now! Researchers led by the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) recently discovered a new organometallic molecule called berkelocene. Organometallic molecules are made up of a carbon-based framework surrounding a metal ion, but this is the first time such a molecule has contained the element berkelium.
While organometallic molecules often contain metals from earlier in the periodic table, they’re rarely found to contain actinides, the metals with atomic numbers 89 through 103. Berkelium’s atomic number is 97, making the discovery of berkelocene quite unusual. In fact, this is the first time that any chemical bond between carbon and berkelium has been observed. Like 23 other synthetic metals on the periodic table, berkelium is not naturally occurring; it can only be created in labs via nuclear reactions, which makes it all the more unusual that it could bond with a natural element like carbon. Berkelium is also highly radioactive, which makes it difficult to study. It’s fitting, though, that the discovery of berkelocene took place at the Lawrence Berkeley National Laboratory, since berkelium itself was discovered in 1949 and named after Berkeley, California. In chemistry, what goes around comes around, but be careful—it’s radioactive, after all.
[Image description: A black-and-white illustration of the periodic table cell for the element Berkelium.] Credit & copyright: Author’s own illustration.
Parenting: Daily Curio #3046
Grief affects everyone differently, but the one constant is that it’s never easy. Now, at least, British parents who experience a miscarriage will have the right to take bereavement leave thanks to new workers’ rights reforms. The new law is part of changes to the employment rights bill proposed by the Labour Party and extends bereavement leave of up to two weeks to pregnant people who suffer a miscarriage before 24 weeks, as well as their partners. That’s good news for parents who are trying to have children, especially since most miscarriages happen early in the course of a pregnancy.
As tragic as they are, miscarriages are unfortunately extremely common. Though estimates vary, it’s believed that up to 20 percent of pregnancies end in miscarriage, with around 80 percent of those occurring in the first trimester, or the first 12 weeks. Miscarriages can happen for a variety of reasons, but the most common cause is an issue with the number of fetal chromosomes. Extra or missing chromosomes can lead to a fetus or embryo not developing properly, which, in turn, leads to a miscarriage. Viruses, illnesses, and food poisoning can also lead to miscarriages. Miscarriage symptoms vary widely, too. Bleeding, cramping, or a rapid heartbeat while pregnant can all be signs of a miscarriage, but sometimes there are no symptoms at all. In such cases, the miscarriage might go completely unnoticed, meaning that the actual miscarriage rate could be much higher than currently estimated. Since miscarriages have so many causes, many of them can’t be prevented—much of it is down to simple luck. Still, avoiding alcohol, smoking, and particularly risky sports can give a pregnancy a better chance at viability. At least with Britain’s new law, parents will have some time to breathe if bad luck strikes.
March 12, 2025
Biology: Nerdy Curio
Turns out, unicorns are real—they’ve been hanging out in the ocean this whole time. Narwhals, sometimes called the “unicorns of the sea,” are some of the most unusual animals on Earth, but they’re also extremely elusive. In fact, until recently, there was little consensus on what narwhals used their long, horn-like tusks for. Now, drones have finally captured footage of narwhals using their tusks for hunting and play. The footage was captured thanks to researchers at Florida Atlantic University’s Harbor Branch Oceanographic Institute and Canada’s Department of Fisheries and Oceans, in partnership with Inuit communities in Nunavut, in Canada’s High Arctic. The narwhals used their tusks to “steer” prey fish, like Arctic char, in favorable directions and even to hit and stun the fish. They also used their tusks to prod and shake various things in their environment, behavior that researchers described as “exploratory play.”
Narwhals’ “horns” aren’t horns at all, but tusks. A narwhal’s tusk begins as a canine tooth (usually the upper left) that eventually grows through the upper lip. Not all narwhals have tusks, however. Some males never grow them for unknown reasons, and only about 15 percent of female narwhals do. Narwhal tusks can reach lengths of up to 10 feet. That’s more than half the length of an adult male’s body, which can reach 15.7 feet and weigh more than 3,500 pounds. Narwhals come by their large size naturally: they’re whales, sharing the family Monodontidae with belugas and counting right whales, sperm whales, and blue whales among their more distant relatives. Blue whales are the largest animals that have ever lived on Earth.
Like most whales, narwhals live in pods, or groups, of up to 10 individuals. Females, calves, and young males form pods together, while sexually mature males have pods of their own. Narwhals are also migratory, meaning that they spend different parts of the year in different places. In the summer, they spend their time in Arctic bays and fjords, but as thick sea ice forms in the fall, they migrate to deeper Arctic waters. Most narwhals spend the winter between Canada and Greenland, in areas like Baffin Bay. When narwhals return to shallower, coastal waters in the spring, they also begin searching for mates. While male narwhals have never been observed fighting for mates, they do display a behavior called “tusking,” in which two males raise their tusks out of the water and lay them against each other, probably to determine which male is larger. Whichever male “wins” the contest will go on to mate with nearby females. Narwhals give birth to just one calf at a time.
Unfortunately, narwhals' low birth rate makes it difficult for their numbers to recover after disasters like ocean storms or oil spills. Luckily, narwhals are not currently considered endangered, but as climate change continues to affect the Arctic waters they call home, they may have difficulty adapting to a warming world. That’s not very cool for these unicorns of the sea.
[Image description: A black-and-white illustration of a narwhal diving through water.] Credit & copyright: Archives of Pearson Scott Foresman, donated to the Wikimedia Foundation. This work has been released into the public domain by its author, Pearson Scott Foresman. This applies worldwide.
US History: Daily Curio #3045
This was one march that could turn on a dime. March of Dimes recently appointed a new CEO, making this the perfect time to look back on the nonprofit’s impressive 85-year history. Not only did the organization play a major role in helping to eradicate polio, it has also pivoted and widened its scope several times throughout its history. Initially named the National Foundation for Infantile Paralysis, March of Dimes was founded by President Franklin D. Roosevelt to combat polio. Infantile paralysis was another name for polio at the time, and the president himself relied on a wheelchair for much of his life due to his own bout with polio in 1921. The organization’s catchy name came from a campaign asking the public to send money to the White House in pursuit of a cure. While coming up with the idea for the campaign, popular comedian Eddie Cantor suggested that they call it “The March of Dimes,” a play on the name of a newsreel at the time, “The March of Time.” Much of the money sent to the White House was indeed in the form of dimes, and it was used to fund research headed by Jonas Salk, the developer of one of the first successful polio vaccines. The campaign was heavily promoted on the radio, and its success helped develop the fundraising model used by other medical nonprofits today.
By the 1950s, Salk had successfully developed his polio vaccine, and by the end of the 1970s, polio was all but eradicated in the U.S. Sadly, Roosevelt passed away in 1945, before he could see the creation of the vaccine. In 1979, the organization officially changed its name to the March of Dimes Foundation. After fulfilling its original mission, the March of Dimes diversified its efforts. The organization continues to fund research into various childhood diseases while lobbying for better access to prenatal care. As part of its mission statement, the March of Dimes acknowledges that the U.S. has some of the highest infant and maternal mortality rates among developed nations. Clearly, the march is far from over.
[Image description: A jar of coins with some coins sitting beside it.] Credit & copyright: Miguel Á. Padriñán, Pexels
March 11, 2025
Engineering: Daily Curio #3044
It’s a good thing your eye comes with a spare. Researchers at the Dana-Farber Cancer Institute, Massachusetts Eye and Ear, and Boston Children’s Hospital have found a way to repair previously irreversible corneal damage in one eye using stem cells from a person’s remaining healthy eye.
Along with the lens, an eye’s cornea plays a critical role in focusing light. Damage to the cornea, whether from injury or disease, can permanently impair vision or even lead to blindness. The cornea also protects the eye behind it by keeping out debris and germs. Since the cornea is literally at the front and center of the eye, however, it is particularly vulnerable to damage from the very things it’s designed to protect against. Unfortunately, damage to the cornea is notoriously difficult to treat.
Now, researchers have managed to extract what they call cultivated autologous limbal epithelial cells (CALEC) from a healthy eye to restore function to a damaged cornea. The process works like this: CALEC is extracted via a biopsy from the healthy eye, then placed in a cellular tissue graft, where it takes up to three weeks to grow. Once ready, the graft is transplanted into the damaged eye to replace the damaged cornea. One of the researchers, Ula Jurkunas, MD, said in a press release, “Now we have this new data supporting that CALEC is more than 90% effective at restoring the cornea’s surface, which makes a meaningful difference in individuals with cornea damage that was considered untreatable.” Still, as they say, an ounce of prevention is worth a pound of cure. Common causes of corneal injury include foreign objects entering the eye during yard work or while working with tools or chemicals. Too much exposure to UV rays can also damage the cornea, as can physical trauma during sports. The best way to prevent these injuries, of course, is to use eye protection. Even if doctors can fix your cornea, having someone poke around inside your one good eye should probably remain a last resort.
[Image description: An illustrated diagram of the human eye from the side, with labels.] Credit & copyright: Archives of Pearson Scott Foresman, donated to the Wikimedia Foundation. This work has been released into the public domain by its author, Pearson Scott Foresman. This applies worldwide.