Curio Cabinet
January 23, 2025
-
Biology Nerdy Curio
A lot of cells can really motivate along, and that’s great—until it’s not. According to a paper published in Nature Structural & Molecular Biology, researchers at Rockefeller University’s Laboratory of Structural Biophysics and Mechanobiology have finally figured out how cells build filopodia, the tiny, finger-like protrusions that some cells use to move through the body. More than just solving a mystery, however, the discovery may lead to better cancer treatments. Until recently, the process by which filopodia form was poorly understood. Filopodia are built from actin filaments, thin, flexible protein fibers that are bundled together by a protein called fascin. On their own, actin filaments aren’t particularly strong, but when stitched together into a hexagonal bundle, they become strong enough to stick out from a cell and move back and forth to propel it forward. The formation process was captured using advanced imaging techniques like cryo-electron microscopy (cryo-EM) and tomography, and understanding it might help treat certain kinds of cancer. That’s because cancerous cells use the same mechanism to form filopodia and move around, allowing them to spread, or metastasize. In some cases, the filopodia-building process goes haywire, producing far more filopodia than a cell needs or creating them where they shouldn’t be, accelerating the cancer’s spread. There are already fascin inhibitors (drugs that block the protein fascin) meant to address this issue, but knowing more about filopodia might lead to better versions in the future. Soon enough, cancer might not have a leg to stand on.
-
Art Appreciation Daily Curio #3018
Some artists live and die by what critics say…others just can’t be bothered to care. Édouard Manet, born on this day in 1832, was definitely in the latter camp. Born in Paris, France, Manet had a typical upbringing and education for the time, but always showed interest in painting, even as a young student. His father had aspirations for him to become a lawyer, but Manet wasn’t interested. After Manet refused to enroll in law school, his father wouldn’t fund his artistic education, so Manet applied to the naval college but was rejected. He then worked aboard a transport vessel before returning to Paris and applying to the naval college again. When he was rejected a second time, his father finally relented and allowed Manet to pursue art.
As a painter, Manet cared very little about what critics thought. He went against the grain and eschewed the biblical and mythological themes that were popular in his time. Manet preferred to paint subjects that he personally related to or was familiar with, painting common people and common scenes. Moreover, his style sought to capture movement and light in their ephemeral states, which angered critics but inspired other artists who would go on to form the growing Impressionist movement. One painting that showcases his style is his portrait of Berthe Morisot, whom Manet painted to convey a sense of motion as she turns to look at the artist. Manet painted the woman’s hair as unkempt and her outfit somewhat abstract, leaving much to the imagination of the viewer when it comes to her posture. Another of Manet’s paintings, Olympia, caused quite a controversy upon its debut. It depicts a nude woman reclining while looking brazenly at the viewer instead of looking away demurely, and it was considered vulgar at the time. Despite being a significant influence on the Impressionists, Manet himself never completely associated with them. Defiant and independent to the end, he painted what he liked, as he liked, staying true to his own vision of art and nothing else. It wasn’t until after his death that Manet was fully appreciated as the influential artist he was, instead of the lightning rod of controversy critics had branded him as. They say “different strokes for different folks,” but some folks clearly had it wrong.
[Image description: A portion of Edouard Manet’s Berthe Morisot painting, showing a woman in an elaborate hat and fur coat.] Credit & copyright: The Cleveland Museum of Art, Bequest of Leonard C. Hanna Jr. 1958.34. Public Domain, Creative Commons Zero (CC0) designation.
January 22, 2025
-
Work Business Curio (8 min)
California officials and insurance representatives are holding workshops starting this weekend to help people deal with their insurance companies amid the fi...
-
Science Nerdy Curio
Gorillas really aren’t supposed to fly. Earlier this month, a five-month-old gorilla was rescued from a plane’s cargo hold after someone tried to illegally import him into Thailand by way of Istanbul, Turkey. The baby primate, now named Zeytin, is recovering at Polonezkoy Zoo, and workers there hope that he may one day be reintroduced to the wild. Zeytin’s plight highlights a growing problem for wild gorilla populations: the illegal pet trade. But this is far from the only threat faced by the world’s largest primates.
Male gorillas can stand up to six feet tall and weigh up to 500 pounds, while females generally grow to around 4.5 feet tall and weigh around 250 pounds. Despite their enormous size and strength, these giants are fairly gentle. Most of their diet is made up of plants, though they also eat insects, like termites. Male gorillas may be famous for pounding their chests and shrieking, but such displays are actually fairly rare and are used to intimidate opponents in order to avoid real fights.
There are two gorilla species: Eastern and Western, each of which has its own subspecies. All four kinds live in central and east African rainforests, and all four are endangered. Like many rainforest animals, their habitat has been rapidly shrinking due to human encroachment and the expansion of the logging industry. However, the biggest and most violent threat to gorillas is illegal poaching. Ape meat is seen as a delicacy in some wealthy areas, and gorillas are prone to being killed for their meat since they do not typically attack or run from people who get close to them.
All gorillas live in groups called families or troops that can have up to 50 members. Troops are composed of a dominant male, called a silverback, several adult females, and their young offspring. Gorillas don’t leave the troop they were born into until they’re between eight and twelve years old, which highlights another challenge they face: slow birth and growth rates. Gorillas live to be between 35 and 40 years old in the wild, but females only have one baby at a time, with gestation taking around 8.5 months. Since each baby takes around a decade to fully mature, gorilla populations struggle to bounce back after poaching attacks or habitat destruction. Luckily, conservationists have implemented captive breeding programs around the world, and some countries have enacted laws to protect gorilla habitats from further destruction. Here’s hoping that brighter times are ahead for these dark-furred wonders.
[Image description: A gorilla sitting in green grass at the Pittsburgh Zoo.] Credit & copyright: Daderot, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
-
US History Daily Curio #3017
Why did they call it a ration when it was so irrational? Pre-sliced bread became popular starting in the late 1920s, and it quickly became so ingrained in consumers’ preferences that when it was banned during WWII, it caused quite an uproar. Bread has been around for millennia, but pre-sliced bread has only been around for about a century and a half. The very first bread-slicing device was invented in 1860 and used parallel blades to cut a loaf of bread all at once. However, it wasn’t until Otto Frederick Rohwedder of Iowa invented an automated version in 1928 that pre-sliced bread really took off. Soon, new machines could slice and wrap bread at the same time, and consumers were glad to buy loaves they could more conveniently eat. There was also an added benefit: because sliced bread came wrapped and consumers only had to take out as much as they needed at a time, it lasted longer than whole loaves, which had to be completely unwrapped to slice at home.
When World War II food rationing began in the U.S., Claude R. Wickard, Secretary of Agriculture and head of the War Food Administration, issued Food Distribution Order 1, which banned sliced bread in order to save the nation’s supply of wax paper. The American public went into an immediate uproar, and Wickard was criticized in the press for the short-sighted measure. Firstly, the lack of sliced bread meant that housewives all over the nation had to vie for the same supply of bread knives, which were made of steel, another rationed resource. Secondly, because the same machines had both sliced and wrapped the bread, wrapping now had to be done by hand, which increased labor costs. Thirdly, since whole loaves went stale faster, more food was wasted during a time when families could only buy as much as their ration books allowed. Fortunately, the government reversed course, and the ban was lifted less than two months after it took effect. Let’s raise a toast to sliced bread.
[Image description: Slices of bread in front of a divided white-and-gray background. Some slices are white bread and some have whole grains on top.] Credit & copyright: Mariana Kurnyk, Pexels
January 21, 2025
-
Work Business Curio (7 min)
From the BBC World Service: As President Donald Trump begins his second term in office, he’s been talking tariffs — but not for China, as many expected. ...
-
Humanities Word Curio (2 min)
Word of the Day with Merriam-Webster: January 21, 2025
gourmand \GOOR-mahnd\ noun
What It Means
A gourmand is a person who loves and appreciates good food and drink. Gourm...
-
Music Appreciation Song Curio
Sometimes you get a hit song and a film title in one fell swoop. On this day in 1978, the soundtrack for Saturday Night Fever, a movie about New York’s disco scene, began a 24-week run at number one on the U.S. album chart. It remains the only disco album to ever win a Grammy for Album of the Year. But its name (and sound) would have been very different if the Bee Gees hadn’t been approached to work on the film. In 1977, the Bee Gees’ manager, Robert Stigwood, told them about a movie he was producing called Saturday Night and asked them to write a song with the same name for it. Instead, they gave Stigwood a song they had already written, called Night Fever, and persuaded him to change the movie’s title to Saturday Night Fever to fit it. The rest is disco history. Night Fever is one of the best-remembered disco songs of all time, featuring a danceable beat, the Bee Gees' signature harmonized falsetto, and lyrics about (what else?) dancing all night. The song helped make the movie and its soundtrack a resounding hit. The Bee Gees really knew how to work smarter, not harder!
-
Biology Daily Curio #3016
Don’t read this if you can’t stand to have your heart broken. Many penguins famously mate for life, a romantic fact that has helped make them some of the world’s best-loved birds. However, a 13-year study into the breeding habits of little penguins (Eudyptula minor) has revealed that the diminutive birds are surprisingly prone to “divorce.” Also known as fairy penguins, little penguins, as their name suggests, only grow to around 14 inches tall and weigh about three pounds. But big drama sometimes comes in small packages. Researchers from Monash University in Australia tracked the breeding habits of around a thousand little penguin pairs on Phillip Island. The island is home to the world’s largest colony of the species, with a population of 37,000 or so. Of all the pairs they studied, around 250 ended up “divorced,” with the pairs splitting up and seeking new breeding partners.
So, what causes penguin divorce? Struggles with infertility, mostly. Penguin couples were much more likely to part ways when they failed to produce offspring. While divorce rates could be as high as 26 percent in some years, rates went down when the colony saw more successful hatchings. Marital bliss isn’t determined by offspring alone, though. According to one of the researchers, Richard Reina, little penguins aren’t exactly known for their faithfulness. In a university press release, he explained, “In good times, they largely stick with their partners, although there’s often a bit of hanky-panky happening on the side.” It might be hard to swallow the idea of adorable penguins divorcing and cheating on each other, but this study into little penguin behavior is important for the future of conservation. Current efforts to protect penguin species are focused on the impact of climate change, but studies like this show that there are complex social dynamics to consider as well when trying to maintain a healthy population. No word yet on whether there are little penguin divorce lawyers.
[Image description: A little penguin standing just underneath some type of wooden structure.] Credit & copyright: Sklmsta (Sklmsta~commonswiki), Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
January 20, 2025
-
Humanities Word Curio (2 min)
Word of the Day with Merriam-Webster: January 20, 2025
inimitable \in-IM-it-uh-bul\ adjective
What It Means
Inimitable describes someone or something that is impossible to c...
-
Art Appreciation Art Curio
"Prayer nut” sounds like a name for someone who loves to go to church, but its meaning is actually a lot more literal! Real prayer nuts are intricate, miniature sculptures contained inside a wooden sphere. The image above shows two round pieces of carved wood connected by a hinge. Both pieces feature detailed, carved scenes. The top scene shows a man about to be beheaded, while the bottom shows a king sitting at a throne before an audience. Also known as paternosters, prayer nuts were often no larger than a golf ball and featured such minutely detailed scenes that magnification was required to truly view them properly. They were popular in the Netherlands in the early 1500s, and were likely prohibitively expensive thanks to the level of detail involved in making them. While they were religious in nature, they were also valued as displays of wealth. Today, only around 150 prayer nuts remain, and their use as devotional items is heavily debated. It’s unclear if they were ever used for prayer at all. This is a tough nut to crack.
Prayer Nut with Scenes from the Life of St. James the Greater, Adam Dircksz (active c. 1500), c. 1500–1530, Boxwood, 2.31 x 1.87 in. (5.8 x 4.8 cm.), The Cleveland Museum of Art, Cleveland, Ohio
[Image credit & copyright: The Cleveland Museum of Art, Purchase from the J. H. Wade Fund 1961.87. Public domain, Creative Commons Zero (CC0) designation.]
-
Mind + Body Daily Curio #3015
You can paint the town red all you want, but you probably shouldn’t eat all the red you want. The FDA recently banned Red No. 3, a ubiquitous food coloring agent. Also called erythrosine, Red No. 3 is a synthetic dye made from petroleum, and it’s been standing out like a red thumb for decades thanks to being a known carcinogen. While its use in cosmetics was banned years ago, the dye is still used in over 9,200 food products. The FDA is giving companies until the beginning of 2027 to remove the dye from their formulas, bringing an end to a decades-long battle by activists to ban the dye from the food supply. Red No. 3 was first approved for use in food in 1907, and since then, it has been the go-to dye for giving sodas, candies, and other sweets a vibrant, cherry-red coloration. The color may make food appealing to the eye, but it’s not exactly kind to the rest of the body. The dye was first identified as a possible carcinogen in the 1980s, when it was shown to cause cancer in male rats exposed to high doses. Since then, groups like the Center for Science in the Public Interest have been pressuring the FDA to ban the dye, while several states did so of their own accord; California, for example, has banned the dye since 2023. Outside the U.S., the dye has already been banned in several European Union countries, as well as Australia and Japan, and the list is growing. However, Red No. 3 isn’t the only dye to cause controversy. Red No. 40 has been linked in recent years to behavioral issues in children, but it’s not facing a ban yet. It seems red is a tough color to dye for.
[Image description: A red rectangle.] Credit & copyright: Author’s own photo. Public Domain.
-
Work Business Curio (8 min)
“We already felt like we’re being priced out,” said Claire Contreras, a teacher who lost her Altadena apartment to a fire. “All of this just kind of puts a b...
January 19, 2025
-
Biology PP&T Curio
You really shouldn’t spray paint at church—especially not on the grave of the world’s most famous biologist. Two climate activists recently made headlines for spray painting a message on Charles Darwin’s grave in London’s Westminster Abbey. They hoped to draw attention to the fact that, in 2024, Earth’s average global temperature was more than 1.5 degrees Celsius (2.7 degrees Fahrenheit) above pre-industrial levels for the first time. While there’s no way to know how Darwin would feel about our modern climate crisis, during his lifetime he wasn’t focused on global temperatures. Rather, he wanted to learn how living things adapted to their environments. His theory of natural selection was groundbreaking…though, contrary to popular belief, Darwin was far from the first scientist to notice that organisms changed over time.
Born on February 12, 1809, in Shrewsbury, England, Charles Darwin was already interested in nature and an avid collector of plants and insects by the time he was a teen. Still, he didn’t set out to study the natural world at first. Instead, he apprenticed with his father, a doctor, then enrolled at the University of Edinburgh’s medical school in 1825. Alas, Darwin wasn’t cut out to be a doctor. Not only was he bored by medical lectures, he was deeply (and understandably) upset by medical practices of the time. This was especially true of a surgery he witnessed in which doctors operated on a child without anesthetics—because they hadn’t been invented yet. After leaving medical school, Darwin didn’t have a clear direction in life. He studied taxidermy for a time and later enrolled at Cambridge University to study theology. Yet again, Darwin found himself drawn away from his schooling, finally spurning theology to join the five-year voyage of the HMS Beagle as its naturalist. The Beagle was set to circumnavigate the globe and survey the coastline of South America, among other things, allowing Darwin to travel to remote locations rarely visited by anyone.
During the voyage, Darwin did just what he’d done as a child, collecting specimens of insects, plants, animals, and fossils. He didn’t quite have the same “leave only footprints” mantra as modern scientists, though. In fact, Darwin not only documented the various lifeforms he encountered on his journey, he dined on them too. This was actually a habit dating back to his days at Cambridge, where he was the founding member of the Gourmet Club (also known as the Glutton Club). The goal of the club had been to feast on “birds and beasts which were before unknown to human palate,” and Darwin certainly made good on that motto during his time aboard the Beagle. According to his notes, Darwin ate iguanas, giant tortoises, armadillos, and even a puma, which he said was "remarkably like veal in taste." His most important contribution as a naturalist, though, was his theory of natural selection.
Darwin came up with his most famous idea after observing 13 different species of finches on the Galápagos Islands. Examining their behavior in the wild and studying their anatomy from captured specimens, Darwin found that the finches all had differently shaped beaks for different purposes. Some were better suited for eating seeds, while others ate insects. Despite these differences, Darwin concluded that they were all descended from the same bird, sharing many common characteristics, with specializations arising over time. Darwin wasn’t the first person to posit the possibility of evolution, though. The French naturalist Jean-Baptiste Lamarck believed that animals changed their bodies throughout their lives based on their environment, while Darwin’s contemporary Alfred Russel Wallace independently arrived at the same theory of natural selection. In fact, the two published a joint statement and gave a presentation at the Linnean Society in London in 1858. Darwin didn’t actually coin the phrase “survival of the fittest,” either. English philosopher Herbert Spencer came up with it in 1864 while comparing his economic and sociological theories to Darwin’s theory of evolution.
Despite Darwin’s confidence in his theory and praise from his peers in the scientific world, he actually waited 20 years to publish his findings. He was fearful of how his theory would be received by the religious community in England, since it contradicted much of what was written in the Bible. However, despite some public criticism, Darwin was mostly celebrated upon his theory’s publication. When he died in 1882, he was laid to rest in London’s Westminster Abbey, alongside England’s greatest heroes. It seems he didn’t have much to fear if his countrymen were willing to bury him in a church!
[Image description: A black-and-white photograph of Charles Darwin with a white beard.] Credit & copyright: Library of Congress, Prints & Photographs Division, LC-DIG-ggbain-03485, George Grantham Bain Collection. No known restrictions on publication.
-
Work Business Curio (9 min)
From a new so-called Department of Government Efficiency to an incoming Republican Congress, deep cuts to the federal government are promised this year. Amon...
January 18, 2025
-
Humanities Word Curio (2 min)
Word of the Day with Merriam-Webster: January 18, 2025
minuscule \MIN-uh-skyool\ adjective
What It Means
Something described as minuscule is very small. Minuscule can also ...
-
Sports Sporty Curio
Forget the polish: they’re starting from scratch. Some Olympians who earned medals during the 2024 Paris Olympics are starting to ask for replacements after their prizes started showing signs of significant deterioration. Designed by Parisian jewelry house Chaumet and manufactured by the Monnaie de Paris, the French mint, 5,084 medals were handed out during the Paris Olympics and Paralympics last year. The medals were made with something extra inside them—a piece of the Eiffel Tower itself. These pieces came from girders and other parts of the tower that were replaced during renovations. With 18,038 iron parts making up the entirety of the tower, renovation is an ongoing process that often involves swapping out old components. But it seems that the Olympic medals that contain pieces of the tower need renovations of their own. Some athletes posted pictures of their medals deteriorating while the games were still ongoing, like American skateboarder Nyjah Huston, whose video went viral on social media. Since then, many more have spoken out about the issue. The affected medals are described as having “crocodile skin” from corrosion. The actual cause of the damage is unknown, but the Monnaie de Paris is set to begin making replacements in the coming weeks. Replacing over 5,000 medals sounds like an Olympic feat of its own.
-
Work Business Curio (7 min)
California officials and insurance representatives are holding workshops starting this weekend to help people deal with their insurance companies amid the fi...
January 17, 2025
-
Work Business Curio (8 min)
From the BBC World Service: China’s economy grew by 5% last year, beating expectations. This growth was driven by the country’s manufacturing sector, with th...
-
Humanities Word Curio (2 min)
Word of the Day with Merriam-Webster: January 17, 2025
apprehension \ap-rih-HEN-shun\ noun
What It Means
Apprehension most often refers to the fear that something bad or unple...
-
Mind + Body Daily Curio
You could call it a kingly dish…too bad it’s been forgotten! Chicken à la king was once one of the U.S.’s most popular dishes. It was a hit at dinner parties in the 1950s and 60s, and could also be found in plenty of fancy restaurants. Today, you’d be hard pressed to find it anywhere. So, what happened?
Despite its royal name, chicken à la king is a fairly simple dish, made from easy-to-source ingredients. It consists of chopped chicken in a cream sauce with veggies like mushrooms, tomatoes, and peas. Sherry is sometimes added to the sauce. The dish is usually served over noodles, rice, or toast, making chicken à la king a sort of sauce itself.
No one knows who invented chicken à la king, though most theories suggest it dates back to the mid to late 1800s. Some claim that it was invented by a chef at the famous New York restaurant Delmonico's, where it was called “Chicken à la Keene.” There are various stories of other New York City chefs creating the dish, though one tale links chicken à la king to Philadelphia. Supposedly, in the 1890s, a cook named William "Bill" King created it while working at the Bellevue Hotel.
Wherever it came from, there’s no doubt that chicken à la king’s popularity began in New York City, where several fancy restaurants began serving it in the early to mid 1900s. Between 1910 and 1960, the dish appeared on more than 300 menus in New York City. Beginning in the 1940s, dinner parties with friends and neighbors became one of the most popular ways for suburbanites to socialize. Chicken à la king, with its short prep time and easy-to-find ingredients, quickly became one of the most commonly found foods at such parties, not to mention at weddings and other large-scale get-togethers.
As for why the dish fell out of fashion…no one’s really sure. As the dish became more common, it’s possible that quicker and cheaper versions of it convinced some people that it didn’t live up to its original hype. Or perhaps its meteoric rise in popularity was also its downfall, and people simply got sick of it being served at every major function. One thing’s for sure: chicken à la king was here for a good time…not for a long time.
[Image description: Two pieces of raw chicken with sprigs of green herbs on a white plate.] Credit & copyright: Leeloo The First, Pexels