Curio Cabinet / Daily Curio
-
Humanities Daily Curio #3090
Be careful calling someone a Neanderthal as an insult—you might actually be complimenting them. A team of Spanish archaeologists has announced the discovery of a fingerprint suggesting that Neanderthals were more artistically inclined than previously thought. At around 43,000 years old, the fingerprint in question was left by a Neanderthal on an unassuming granite pebble. The rock was originally discovered in 2022 at the San Lázaro rock shelter near Segovia, and at first, it wasn't clear just what the small, red dot on it was. After consulting geologists, the team found that the red color on the rock came from a pigment made of iron oxide and clay, while police forensics experts confirmed that the mark itself came from the tip of someone's finger. Although it doesn't look like much at a glance, the fingerprinted rock caught the team's attention for a number of reasons. Firstly, nothing else at the site bore the same red pigment, suggesting the rock was placed there deliberately after being sourced from another location. Secondly, the rock vaguely resembles a human face, and the dot just so happens to be where the nose should be. Thus, the archaeologists believe that whoever marked the rock did so to complete the face. It may sound far-fetched that a Neanderthal could make such a deliberate artistic statement, but more and more evidence suggests that they were capable of more artistic and symbolic expression than they used to be given credit for. As much as it may hurt the pride of their successors (Homo sapiens, also known as human beings), the Neanderthals may have beaten us to the punch when it comes to developing culture. Regardless of whether or not the red dot was an intentional creation, it is now officially the oldest human fingerprint ever found. How about a round of applause for the Paleolithic Picasso?
[Image description: A painting of a Neanderthal family by a cave, with a man holding a spear out front.] Credit & copyright: Neanderthal Flintworkers, Le Moustier Cavern, Dordogne, France, Charles Robert Knight (1874–1953). American Museum of Natural History, Public Domain.
-
US History Daily Curio #3089
What happens when you take the "mutually" out of "mutually assured destruction"? The answer, surprisingly, is a problem. The newly announced missile defense system dubbed the "Golden Dome" is drawing comparisons to President Ronald Reagan's Strategic Defense Initiative (SDI). While SDI was similar to the Golden Dome in many ways, the circumstances of its conception gave rise to a distinctly different set of issues.
As far as most Americans in the 1980s were concerned, the Cold War was a conflict without end. The U.S. and the Soviet Union were locked in a morbid and seemingly inescapable doctrine—mutually assured destruction (MAD). Both sides were armed with thousands of nuclear weapons ready to strike, set to launch in kind should either party decide to use them. In 1983, President Reagan proposed a way for the U.S. to finally gain the elusive upper hand. The plan, the Strategic Defense Initiative (SDI), would have used satellites equipped with laser weaponry to shoot down any intercontinental ballistic missiles (ICBMs) launched by the Soviet Union.
Critics judged the plan to be infeasible and unrealistic, dubbing it "Star Wars" after the movie franchise. Indeed, the technology to build such a defense system didn't exist yet; even today, laser weaponry is mostly experimental. Reagan's plan also had the potential to be a foreign policy disaster. Whereas MAD had made the use of nuclear weapons forbidden by default, by announcing SDI the U.S. was signaling that it was essentially ready to take the "mutually" out of MAD. Thus, the very existence of the plan was seen as a sign of aggression, though the infeasibility of the technology soon eased those concerns. There were also fears that rendering nuclear weapons useless for one side would simply encourage an arms race of another kind. Ultimately, SDI was scrapped by the 1990s, as the end of the Cold War reduced the incentive to develop it. We did end up getting more Star Wars movies, though, so that's something.
[Image description: A blue sky with a single, white cloud.] Credit & copyright: Dinkum, Wikimedia Commons. Creative Commons Zero, Public Domain Dedication.
-
Biology Daily Curio #3088
The birds, they are a-changin’. New research shows that hummingbird feeders are not only helping hummingbirds expand their range, but driving them to evolve as well. Millions of Americans enjoy leaving out feeders full of sugar water for hummingbirds, simply to catch a glimpse of the tiny, colorful creatures. Such feeders became popular after WWII, though they've been around even longer. Homemade feeders and instructions on how to make them existed for decades before a patent was filed for a mass-produced version in 1947. In the western U.S., Anna's hummingbirds (Calypte anna) have been able to greatly expand their range thanks to the charity of their admirers. More specifically, they've been able to go further north, out of their usual Southern California range. Part of their expansion has to do with the eucalyptus trees that were planted throughout California in the 19th century, but the feeders are mostly responsible.
There's also something subtler going on with the SoCal natives thanks to those feeders. Their beaks have been changing over the last few generations, probably to become more efficient at drawing sugar water from feeders as opposed to nectar from flowers. According to researchers, Anna's hummingbirds' beaks have been getting longer and more tapered, showing that the feeders have become more than a supplementary source of sustenance for the birds—they're now central to their diet. The birds are even prioritizing man-made feeders over flowers in some areas. Researchers believe that hummingbirds have come to prefer feeders because they are practically inexhaustible sources of "nectar" compared to flowers. Birds may even compete over who gets to stay at a feeder the longest. Those flitting balls of feathers are ready to throw down for some good sugar water.
[Image description: A blue hummingbird sipping at a red feeder.] Credit & copyright: Someguy1221, Wikimedia Commons. This work has been released into the public domain by its author, Someguy1221. This applies worldwide.
-
Science Daily Curio #3087
Where there's smoke, there's fire; and where there's green, there's bound to be lava. At least, that's what scientists are beginning to believe after looking at satellite images of trees growing near volcanoes. As destructive as volcanic eruptions can be, there's never been a reliable way to predict them. That's a huge problem for the many communities around the world that live around active volcanoes. Sure, not all eruptions are cataclysmic events filled with pyroclastic blasts, but lava is dangerous no matter how you look at it. Until now, scientists have been able to gauge the risk of a volcanic eruption happening by measuring seismic waves and even the rise of the ground level around a volcano, but such data can't show exactly when the eruption will occur. Yet, there may be hope of accurately forecasting eruptions in the future.
For a long time, scientists have noticed that trees near volcanoes get greener before eruptions. Apparently, as magma builds up under the Earth's crust, it creates pressure underground that forces carbon dioxide up through the surface, which in turn feeds the trees and helps them grow. It seems simple enough, then, to measure the increase in carbon dioxide levels, but even the amount that comes up through the ground to jazz up the greenery isn't easily measurable with existing equipment. Compared to the amount of carbon dioxide in the atmosphere, the amount that seeps up is too small. However, using satellite images provided by NASA's Orbiting Carbon Observatory-2, volcanologists are figuring out how to measure these carbon dioxide changes indirectly by tracking the surrounding vegetation instead. The process still requires more data to better establish the correlation between volcanoes and changes in the plants around them, and it won't help with volcanoes located in environments without vegetation, but it might one day help protect the 10 percent of the world's population who live near active volcanoes. Until then, may cooler eruptions prevail.
[Image description: A cluster of oak leaves against a green background.] Credit & copyright: W.carter, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
-
Mind + Body Daily Curio
Sweet, gooey, spicy…nice! Cinnamon rolls are beloved throughout much of the world for their unique softness and interesting shape. Although they're not as heavily associated with their country of origin as macarons are with France or cannoli are with Italy, cinnamon rolls were almost certainly invented in Sweden. The country even celebrates its native pastry with a special day each year.
Cinnamon rolls are made from yeast-leavened, enriched dough. This dough adds butter, sugar, and eggs to the usual flour and milk, which helps make it soft and puffy. The dough is then spread out, buttered, sprinkled with cinnamon, sugar, and sometimes toppings like raisins or nuts, and rolled up. After baking, cinnamon rolls are often drizzled with thick icing.
Ancient Romans began using cinnamon from Sri Lanka centuries before it became common in other European countries. Besides food, the Romans used cinnamon in perfumes, religious incense, and medicines. It was likely the Romans who introduced Sweden to cinnamon. The first record of its use there is a 14th-century recipe for mulled beer, but it wasn't long before the spice made its way into Swedish pastries. By the 17th century, cinnamon was common throughout Europe, and various European desserts called for it, but none were as similar to modern cinnamon rolls as Swedish kanelbullar, or "cinnamon buns." There are some differences, though. Kanelbulle dough usually contains cardamom, for one thing. Kanelbullar are also not usually iced, and are instead topped with pearl sugar.
A Swedish population boom coupled with a difficult Swedish economy caused millions of Swedes to immigrate to the U.S. starting in the early 19th century. They brought their pastries with them, and cinnamon roll hotspots began popping up across the country. The rolls became particularly popular in Philadelphia, where German immigrants made them even sweeter (and gooier) by adding molasses and brown sugar. At some point, probably after World War II, icing became a staple of American cinnamon rolls, taking the soft pastries' sweetness to a new level. Count on the U.S. to find new ways to add even more sugar to its snacks.
[Image description: A plate of cinnamon rolls with white icing.] Credit & copyright: Alcinoe, Wikimedia Commons. The copyright holder of this work has released it into the public domain. This applies worldwide.
-
Engineering Daily Curio #3086
Nothing's worse than having your fate up in the air when you're up in the air. A passenger aircraft belonging to the German airline Lufthansa recently reached its destination safely despite neither pilot being at the controls for a time, and it was all thanks to the plane's autopilot system. Autopilot is a tremendous help to modern-day pilots, but it wasn't always as reliable as it is today.
Last year, a Lufthansa plane carrying 199 passengers from Frankfurt, Germany, to Seville, Spain, encountered a potential disaster. While the captain was away from the flight deck, the first officer lost consciousness. When the captain attempted to return to the cockpit, he found himself locked out with no response from his colleague. Fortunately, the first officer regained consciousness within a matter of minutes, but for a time—however brief—no human pilot was at the controls of the plane. The airline only recently revealed the incident publicly, after an investigation took place.
In the early days of aviation, such a situation would have certainly led to tragedy. Older aircraft required the constant and meticulous attention of their pilots, who had to make minute adjustments to keep their planes aloft. The first autopilot system was invented by Lawrence Sperry, whose gyroscopic automatic pilot (nicknamed "George") automatically kept planes in balance. The first digital autopilot systems were developed in the 1970s in response to data showing that most crashes occurred due to human error. Today, autopilot systems are usually integrated into a plane's flight management system, and most of the small adjustments are taken care of by onboard computers. Contrary to popular belief, autopilot systems can't fully control a plane over the entire course of its journey. Pilots fully control aircraft during takeoff and landing, which are the most difficult parts of most flights. They also maintain communication with ground crews so that they can change course in case of emergencies, stay clear of other aircraft, and let airports know exactly when they’ll be landing. Autopilots mainly maintain a plane’s course and altitude, including in emergencies. It’s a life-saving invention for sure, but most people would probably still prefer that their pilot be conscious.
[Image description: A blue sky with a single, white cloud.] Credit & copyright: Dinkum, Wikimedia Commons. Creative Commons Zero, Public Domain Dedication.
-
Biology Daily Curio #3085
Orange you glad they've solved this kitty mystery? After years of puzzlement, scientists now know what gives ginger cats their distinct orange coloration, and why so many orange cats are male. Owners of orange cats have long posited that there's something special about them. They tend to have sillier, more laid-back personalities than the average cat, and that temperament has earned them devoted followers. Among them is Professor Hiroyuki Sasaki, a geneticist at Kyushu University in Japan who crowdfunded his effort to unravel the secret behind the cats' unique fur. Scientists knew that there had to be a genetic link between the cats' fur color and their sex, since orange cats are overwhelmingly male just as calico cats are overwhelmingly female. Sasaki and his colleagues raised around $70,000 to perform their research, mostly from fellow cat lovers. Their efforts paid off, and the culprit was identified: the ARHGAP36 gene, or rather, a mutation on the X chromosome that deleted a section of it in some cats. The ARHGAP36 gene is responsible for pheomelanin, the type of melanin behind red, orange, and yellow pigments in mammals, as opposed to eumelanin, which controls brown to black pigment. In cats with the mutation, the ARHGAP36 gene goes haywire, producing much more pheomelanin than it normally would. That the mutation only occurs on the X chromosome also explains the skewed sex ratio. Since male cats have only one X chromosome, the mutation goes uncorrected. Meanwhile, female cats require the mutation on both of their X chromosomes to be fully orange. If only one is affected, they end up as calicos. But there might be more to the gene than meets the eye. ARHGAP36 might also play a role in orange cats' personalities. The gene is involved in other functions in the brain and hormonal glands, so it's possible that it produces unique inclinations and behaviors. Now if only we knew why we love cats even when they're indifferent to us.
[Image description: An orange tabby cat lying on gray carpet.] Credit & copyright: Brian Adler, Wikimedia Commons.
-
Physics Daily Curio #3084
When it comes to the end date of the universe, what's a few orders of magnitude? Scientists at Radboud University in Nijmegen, Netherlands, have found that the universe might end much earlier than expected—but it's still a very, very long way away.
Estimating the universe's remaining days might seem like an impossible task, but physicists have come up with a few ways to figure it out. One method involves calculating how long it takes for stars to die. Larger stars collapse in on themselves, cause supernovas, and become black holes. Smaller stars leave behind a nebula when they die, as well as a hot, dense core called a white dwarf. Most stars in the universe will have become white dwarfs in about 17 trillion years, but the story doesn't end there. Both white dwarfs and black holes decay over time, and they do so at an astronomically glacial pace. Their decay releases Hawking radiation, named after the late astrophysicist Stephen Hawking, who first predicted the process.
Hawking only ever posited that black holes would decay in this way, but the scientists at Radboud University believe that white dwarfs can decay similarly. Since white dwarfs were thought to linger on much longer, it was previously estimated that it would take around 10 to the power of 1,100 years for the last remaining stars to die out for good. However, if white dwarfs decay like black holes, that number comes way down. That's not to say that it will happen anytime soon. It will still be another 10 to the power of 78 years, or one quinvigintillion years. By then, there certainly won't be anyone left to say, "Lights out!"
[Image description: A starry sky above a line of dark trees.] Credit & copyright: tommy haugsveen, Pexels.
-
US History Daily Curio #3083
Here’s something from the lost-and-found bin of history. The lost colony of Roanoke is one of the most enduring mysteries in American history, but one self-proclaimed amateur archaeologist now says that he’s solved it. Either way, the story of Roanoke is equal parts intriguing and tragic.
Before Jamestown, the first successful English colony in America, attempts were made to establish a colony on Roanoke Island (located in what is now North Carolina). The colony was meant to serve as England's foothold in the "New World" as the English competed against the Spanish, and it would have served as a base of operations for English privateers. However, the first attempt in 1585, led by Ralph Lane, ended in disaster, especially after relations with the nearby Algonquians soured. The second attempt, which began in 1587, lasted just a few months before one of the colonists, John White, had to return to England to raise supplies and funding. White left behind his wife, his daughter, and his granddaughter, Virginia Dare, the first English child born in America. When he returned three years later, however, White's family was nowhere to be found. Carved into nearby trees was "CROATOAN," referring to the Native American tribe who lived on Hatteras Island. Tragically, dangerous weather kept White from reaching the island, and he was forced to return to an England that had lost interest in the colony.
White died in 1606, never having found his family, but there have been some clues and hoaxes regarding their ultimate fate. Artifacts known as the Dare Stones, inscribed with writing that supposedly tells the story of the survivors, have surfaced over the years, though their authenticity isn't widely accepted. Archaeologists have found traces of nearby settlements that may have belonged to Roanoke colonists who scattered around the area. Now, Scott Dawson, the president of the Croatoan Archaeological Society, claims to have found remnants of hammerscale—flakes of iron left over from the forging process—on nearby Hatteras Island. Dawson claims that the hammerscale proves that the English colonists who once inhabited Roanoke Island must have fled there, since Native Americans at the time didn't have the means to forge iron. His evidence is compelling, but it might be too late to definitively solve a mystery from so long ago. At least by being lost, the settlers of Roanoke will never be forgotten.
[Image description: A map from 1590 showing an area spanning from Cape Fear to Chesapeake Bay, including the area in which the colony of Roanoke stood.] Credit & copyright: Library of Congress, Geography and Map Division. 1590. Public Domain.
-
Mind + Body Daily Curio
The pot of gold at the end of the rainbow might actually be a ramekin of crème brûlée! This beautiful, golden-brown dessert is one of France’s most famous dishes. Yet, England and Spain also claim to have invented it.
Crème brûlée is made from custard that is baked in a water bath. The custard itself is made with heavy cream, egg yolks, sugar, and, usually, vanilla. The dessert is served in the same small ramekins in which it is baked, and it's topped with sugar that is caramelized using a blowtorch or broiler. The top is sometimes doused with liqueur and set on fire during serving to give the crust a more intense flavor.
While crème brûlée is heavily associated with France (the dish's name means "burnt cream" in French), no one knows exactly where it was first made. In England, custard desserts have been eaten since at least the Middle Ages. In the 17th century, Trinity College, Cambridge, began serving a custard dessert with a sugar crust called Trinity cream, with the college's crest burned into the crust. This doesn't necessarily mean that England invented crème brûlée, since recipes for the French version appeared around the same time as recipes for Trinity cream.
Spain also claims to have invented crème brûlée. Since the Middle Ages, a dish called crema catalana, flavored with lemon or orange zest, has been served throughout the country. Milk is usually used instead of cream, and cinnamon is often added to the sugar crust.
Of course, France is best known as the birthplace of crème brûlée, as one of the oldest written recipes for the dessert can be traced to France in 1691. At the time, the dessert was popular at the Palace of Versailles, and thus gained an elegant reputation. As cookbooks became more common, the dessert made its way from the noble classes to everyday people, and today it’s served in French restaurants all over the world. Its recipe is largely unchanged from the 1691 version. If the sugar crust isn’t broken, don’t fix it!
[Image description: A white ramekin of crème brûlée on a white plate with silverware in the background.] Credit & copyright: Romainbehar, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
-
Engineering Daily Curio #3082
It's time for this conductor to face the music—and lead the orchestra! A conductor in Ohio who was diagnosed with Parkinson's disease 11 years ago was recently given a cutting-edge implant that some are calling a "pacemaker for the brain," allowing him to conduct without shaking. While more invasive than previous treatments, the implant might give hope to those with difficult-to-manage symptoms.
For nearly 50 years, Rand Laycock has been living his dream of conducting a symphony orchestra in Parma, Ohio. But for the past decade, he's also been struggling with Parkinson's, which threatened his passion and livelihood. What started as a minor twitch in his thumb developed into worsening tremors in his right hand, and the very medications used to treat him also gave him dyskinesias—erratic, involuntary movements that are, unfortunately, a common side effect.
The symptoms of Parkinson's can vary widely, but they can include changes in speech, rigid muscles, difficulty moving, and balance issues. Less noticeable are psychological symptoms like depression and anxiety. Parkinson's is also degenerative, worsening over time, and it can lead to death even with treatment. Fortunately, Laycock was able to be treated with deep brain stimulation (DBS), which uses implanted leads to deliver therapeutic amounts of electric current that curb the worst of the symptoms. However, Laycock's symptoms were particularly difficult to treat because they tended to fluctuate in severity, meaning they could flare up in the middle of a performance. Now, he's being treated with adaptive deep brain stimulation (aDBS), which can vary the electric current to meet the changing needs of Laycock's condition. Thanks to the aDBS implant, Laycock can continue to perform his duties at concerts reliably. Even Parkinson's is no match for advanced medical engineering.
[Image description: A digital illustration of a gray human brain against a black background.] Credit & copyright: KATRIN BOLOVTSOVA, Pexels.
-
Videography Daily Curio #3081
Who says you can't get smarter by watching TV? British broadcasting legend and naturalist Sir David Attenborough turned 99 on May 8, and released a new feature-length documentary titled Ocean to celebrate. Focusing on life in the ocean's depths and the impact that Earth's oceans have on all life, it's a testament to Attenborough's dedication to conservation and education.
Attenborough was born on May 8, 1926, in London and showed interest in the natural world at a young age. As a child, he collected fossils and loved being outdoors, observing animals. He went on to earn a degree in natural sciences from the University of Cambridge. However, both he and his older brother, Richard, were also drawn to the big screen. Richard became an actor and producer, while David studied broadcasting before joining the BBC as a television producer in 1952. As a producer for the BBC, Attenborough created a number of educational programs, starting with Zoo Quest, which featured footage of animals in the wild and in captivity. During his tenure at the BBC, he was also involved with a number of non-educational programs, like Monty Python’s Flying Circus.
Attenborough left his management role at the BBC in the 1970s and started working on the nature documentaries that would define his career. The first of these was 1979's Life on Earth. In 2001, he released The Blue Planet, an in-depth look at the world's oceans and their diverse habitats, and he went on to win an Emmy for narrating its 2017 sequel, Blue Planet II. Attenborough's voice and distinct cadence have become staples of his work, and he has used his platform to promote the need for conservation and to raise the alarm over climate change. Attenborough doesn't limit himself to lending his voice, either; he appears on camera in many of his documentaries. Funnily enough, he almost didn't step in front of the camera at all after a BBC executive told him that his teeth were too long. Well, octopuses have eight arms, but you never hear anyone complain about them showing up in nature documentaries!
[Image description: The British flag (Union Jack) featuring crisscrossing red and white stripes on a blue background.] Credit & copyright: Public Domain.
-
Mind + Body Daily Curio #3080
Even routine screenings shouldn't be taken for granted. The FDA just approved the first at-home screening kit for cervical cancer, which the manufacturer claims is just as effective as the Papanicolaou test (also known as the Pap test or Pap smear). While the Pap smear may not be anyone's favorite procedure, it's undoubtedly saved countless lives—even if it took decades for it to be accepted by the medical establishment.
Cervical cancer, which is almost always caused by HPV (human papillomavirus), was once the leading cause of cancer death among women in the U.S. At the beginning of the 20th century, it claimed around 40,000 lives each year. Even by the standards of the day, cervical cancer was difficult to treat because it often went undiagnosed until it had already reached an advanced stage. Enter Dr. Georgios Papanicolaou, a laboratory assistant at Cornell University Medical College. Papanicolaou was studying sex chromosomes using guinea pigs as test subjects, and he happened to notice that the reproductive cycles of the female guinea pigs could be tracked by examining their vaginal secretions. Finding that secretions from human subjects could carry similar information, he redirected his efforts. With the help of his wife Mary, who also worked as his lab technician, he began collecting swabs of cervical cells from his wife and other volunteers. By 1928, he had discovered how to distinguish between healthy and malignant cervical cells obtained through swabs. However, the medical community was largely skeptical; they didn't believe that mere cell samples—rather than whole tissue—could be used for reliable diagnoses.
Nevertheless, Papanicolaou continued to publish his findings on the subject, and in 1943, his research became widely accepted. By the 1950s, the Pap test began to significantly reduce cervical cancer rates. Even after many decades, Pap tests don't vary much from how Papanicolaou gathered his samples. A small brush is used to collect cell samples, which are then examined under a microscope for signs of disease. Thanks in large part to Papanicolaou and the simple procedure he invented, cervical cancer rates today are a fraction of what they were just a hundred years ago. Who knew a little cell could tell so much?
-
FREEHumanities Daily Curio #3079Free1 CQ
It's a tough job picking the next pope, but it could be worse. After the passing of Pope Francis in April, devotees eagerly awaited the selection of the new pontiff. The process took just two days, but it used to routinely take much, much longer. When a pope passes away (or resigns, as in the case of Pope Benedict XVI), cardinals must gather from around the world at the Vatican in Rome to select the church's new leader. Cardinals are the second-highest-ranking clergy in the church, but not all of them get to participate in the selection process. Only those under 80 can cast a ballot, and this year, there were just 135, a little over half of all existing cardinals. Eligible cardinals enter the Sistine Chapel, where they're locked in and shut out from the rest of the world in a gathering called the conclave. Conclave comes from a Latin word meaning "a room that can be locked up." Indeed, the cardinals aren't allowed to interact with the outside world in any way until a new pope has been chosen. During the first day of the conclave, the cardinals hold one vote, and if the two-thirds majority needed to select the new pope isn't reached, black smoke emerges from the chimney of the chapel. From the second day onward, the cardinals vote up to four times a day until that two-thirds supermajority is reached. Once the voting is concluded, white smoke emerges from the chimney, signaling to the world that there is a new pontiff. This year, it only took two days, and it hasn't taken much longer for modern popes. In centuries past, though, the process could drag on for weeks or even months. The longest papal election, in the 13th century, lasted nearly three years. That drawn-out selection is actually what led the church to adopt its current method of forcing the cardinals into a conclave. What settles disagreements faster than being locked in a room with coworkers?
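For readers who like to see the tally rule spelled out, here is a minimal sketch in Python of the threshold logic described above: each ballot produces black smoke until some candidate clears two-thirds of the votes cast, at which point white smoke goes up. The candidate names and vote counts are invented for illustration; this is only a toy reading of the voting rule, not a model of the actual conclave procedure.

def run_conclave(ballots, threshold=2 / 3):
    """Report the smoke signal for each ballot in order, stopping once a
    candidate clears the two-thirds threshold. `ballots` is a list of dicts
    mapping candidate name -> vote count. A toy illustration only."""
    signals = []
    for round_number, tally in enumerate(ballots, start=1):
        total_votes = sum(tally.values())
        leader, votes = max(tally.items(), key=lambda item: item[1])
        if votes >= threshold * total_votes:
            signals.append((round_number, "white smoke", leader))
            break
        signals.append((round_number, "black smoke", None))
    return signals

# Hypothetical example with 135 electors: two inconclusive ballots, then a winner.
example_ballots = [
    {"Cardinal A": 60, "Cardinal B": 45, "Cardinal C": 30},  # 60 < 90, black smoke
    {"Cardinal A": 75, "Cardinal B": 40, "Cardinal C": 20},  # 75 < 90, black smoke
    {"Cardinal A": 95, "Cardinal B": 25, "Cardinal C": 15},  # 95 >= 90, white smoke
]
print(run_conclave(example_ballots))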
[Image description: A black-and-white portrait of Pope Leo X with an elaborate border featuring religious figures.] Credit & copyright: The Metropolitan Museum of Art, Portrait of Pope Leo X in a decorative border, Alexander Mair German After (?) Cherubino Alberti (Zaccaria Mattia) Italian 1575–1620. The Elisha Whittelsey Collection, The Elisha Whittelsey Fund, 1951. Public Domain.
-
FREEMind + Body Daily CurioFree1 CQ
This snack is creamy, cheesy, vegetable-y, spicy and portable. Elote, also known as Mexican street corn, really does it all. As the weather warms up and elote makes an appearance at fairs and festivals all over the world, it’s worth taking a look at this street food’s surprisingly long history.
Elote, which in Spanish can refer to either a plain ear of corn or the street food, is made by either boiling ears of corn in their husks or, more commonly, by grilling them. The corn is then slathered with mayo and cotija cheese, and sprinkled with chili powder and other seasonings, like cumin. Lime is sometimes squeezed on top for extra zest. Elote is usually put on a skewer for easy carrying, or shaved from the cob into a cup in a preparation known as esquites.
Corn is native to the lowlands of west-central Mexico and has been cultivated there for more than 7,000 years. It was a staple food for both the Aztec and Maya civilizations, used to make tortillas, tamales, soups, and even drinks. In fact, corn was so important that it was considered holy; the Popol Vuh, a Maya sacred text, states that the first humans were made from corn. Eventually, corn cultivation spread throughout Mexico, then to the Southwestern U.S. as people migrated there. By the time Europeans arrived in what is now the U.S., Native Americans had been growing corn there for at least 1,000 years.
We’ll never know exactly who invented the elote we know today, nor exactly when. We do know that it has been served in various parts of Mexico for centuries, and that its popularity has a lot to do with busy lifestyles in places like Mexico City. Just like New Yorkers love their ultra-portable hot dogs, those in Mexican cities enjoy eating elote on the go. Like hot dogs, elote is also a common food to find at backyard get-togethers and family functions. Don’t forget to grab a cob next time you’re out and about.
[Image description: An ear of corn on a stick, covered in white cheese and red spices, on white paper.] Credit & copyright: Daderot, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
-
FREEAstronomy Daily Curio #3078Free1 CQ
The Greeks had nothing on this ancient astronomer! For centuries, the oldest surviving star catalog, mapping the exact positions of heavenly bodies, was thought to have come from ancient Greece. Created by the Greek astronomer Hipparchus of Nicaea sometime around 130 B.C.E., it gave ancient Greece the distinction of being the first civilization to map stars using coordinates. Now, researchers in China have turned that idea on its head, as they claim to have dated a Chinese star catalog to roughly two centuries before the Greeks'. It was compiled by Chinese astrologer and astronomer Shi Shen sometime around 335 B.C.E. and is being called The Star Manual of Master Shi.
While this new star catalog shows detailed information about 120 stars, including their names and coordinates, it doesn't include a date. To determine when, exactly, it was made, researchers had to get creative. We know that stars' positions change over time relative to earthbound viewers due to a phenomenon called precession, in which the Earth wobbles slightly on its axis in slow, 26,000-year cycles. Researchers first compared The Star Manual of Master Shi to other manuals made in later periods, like the Tang and Yuan dynasties. Then, they used a specially made algorithm to compare the positions in Shi's manual to 10,000 different moments in later periods, factoring in precession. The algorithm found that The Star Manual of Master Shi had to have been created in 335 B.C.E., which makes sense, since that year falls right within Shi's lifetime, at the height of his career. In the process of comparing Shi's work to that of later astronomers, they also found that his coordinates had been meticulously and purposefully updated by another famous ancient Chinese astronomer: Grand Astronomer Zhang Heng of the Han Dynasty. We may have just discovered how important Shi's manual was, but it seems that other astronomers already knew what was up (in the sky).
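The researchers' actual dating method isn't spelled out here, but the general idea can be sketched with a much-simplified model: assume precession shifts each star's ecliptic longitude at a roughly constant rate (about 50.3 arcseconds per year), precess a reference catalog to each candidate year, and keep the year whose predicted positions best match the cataloged ones. Everything below, from the constant-rate assumption to the made-up longitudes, is a hypothetical illustration, not the team's algorithm.

# A toy, back-of-the-envelope version of precession dating. It assumes a
# constant precession rate and tracks only ecliptic longitude; the real
# analysis is far more careful. All numbers below are made up.

PRECESSION_DEG_PER_YEAR = 50.3 / 3600.0  # ~50.3 arcseconds per year along the ecliptic

def precess_longitude(lon_deg, from_year, to_year):
    """Shift an ecliptic longitude (degrees) from one epoch to another,
    treating latitude as unchanged."""
    return (lon_deg + PRECESSION_DEG_PER_YEAR * (to_year - from_year)) % 360.0

def best_fit_epoch(catalog_lons, reference_lons, reference_year, candidate_years):
    """Return the candidate year whose precessed reference longitudes best
    match the cataloged ones (smallest mean squared angular offset)."""
    def mismatch(year):
        total = 0.0
        for observed, reference in zip(catalog_lons, reference_lons):
            predicted = precess_longitude(reference, reference_year, year)
            diff = (observed - predicted + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
            total += diff * diff
        return total / len(catalog_lons)
    return min(candidate_years, key=mismatch)

# Fabricate "ancient" longitudes from modern reference values for an epoch around
# 335 B.C.E. (treated loosely as year -335 here), then recover that epoch by
# scanning a coarse grid of candidate years.
reference = [30.0, 95.5, 210.2, 310.7]  # hypothetical modern-epoch longitudes
ancient = [precess_longitude(lon, 2000, -335) for lon in reference]
candidates = range(-1500, 2001, 5)
print(best_fit_epoch(ancient, reference, 2000, candidates))  # prints -335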
[Image description: A starry sky with some purple visible.] Credit & copyright: Felix Mittermeier, Pexels
-
FREESports Daily Curio #3077Free1 CQ
Aging out? Never heard of it! American gymnast Simone Biles recently announced that she's unsure whether or not she'll compete in the 2028 Summer Olympics, when she'll be 31 years old. If she did choose to participate, she would undoubtedly be one of the oldest gymnasts competing in 2028…but possibly not the oldest! She'd also be far from the oldest gymnast to ever compete at the Olympics.
It's no secret that age counts for a lot in competitive sports, and that's truer of gymnastics than of most other sports. While age can bring experience and even lend athletes a competitive edge in some disciplines, gymnastics is notoriously hard on the body, making it more difficult for aging athletes to compete and recover without pain. Those flips and jumps also require a lot of muscle mass, which tends to decline as people age. That's why Olympic gymnasts tend to be younger on average than, say, swimmers or marathon runners.
Of course, there are some exceptions. Take 49-year-old Uzbek gymnast Oksana Chusovitina, the oldest female gymnast to ever compete at the Olympics. She's aiming to come back yet again in 2028 after missing out on Paris last year. She last competed in the 2020 Summer Olympics in Tokyo at the age of 46. Throughout her long career, she earned a gold medal in the 1992 team all-around competition in Barcelona, and a silver for vault in 2008 in Beijing. Then there's Bulgarian gymnast Yordan Yovchev. He's retired now, but when he last competed in 2012, he was the oldest gymnast participating, at the age of 39. He brought home four Olympic medals, including a silver on rings at the 2004 Athens Games. Yovchev also boasts the most consecutive appearances at the Olympics by any male gymnast, having competed six times between 1992 and 2012. Compared to these legendary athletes, Biles is practically a spring chicken!
-
FREEBiology Daily Curio #3076Free1 CQ
This week, as the weather continues to warm, we're looking back on some of our favorite springtime curios from years past.
Do you have trouble falling asleep? Do you get rocky "half-sleep"? Well, hibernation might be just the cure for you. In two recent, unrelated experiments, researchers isolated the neurons in the brain that "switch" on hibernation in mammals. One study, led by neurobiologist Sinisa Hrvatin of Harvard, set out to find that switch deliberately. Hrvatin's team first hypothesized that they could coax mice into a hibernation-like state, mostly by limiting their diets and exposing them to cold temperatures. They were correct: the combination of variables led some mice to enter a state of torpor within 10 hours, and others within 48 hours. As the mice drifted into torpor, the scientists observed and tagged neurons in the rodents' hypothalami. The hypothalamus is an area of the brain largely concerned with primordial drives like feeding and temperature regulation. Once the scientists had tagged and cataloged the neurons involved in torpor, they could stimulate those neurons on command. In other words, they could instantly thrust mice into a pleasant siesta. The second study, based in Japan, came to largely the same conclusion, but unintentionally. Both teams posit that artificial hibernation could carry over to humans, allowing for the long-sought-after suspended sleep during space flights, metabolic control of body temperature during surgery, and a much safer form of sedation for unruly patients. And of course, it may bring z's to all us purple-eyed, groggy insomniacs. Just remember to set a couple of alarms and place them beside your head before you drift off. Otherwise, you might oversleep until the spring of 2021!
Image credit & copyright: Huntsmanleader, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
-
FREEOutdoors Daily Curio #3075Free1 CQ
This week, as the weather continues to warm, we're looking back on some of our favorite springtime curios from years past.
The fastest-growing sport in the U.S. probably isn't what you'd expect. With another spring comes another wave of outdoor activities, and for many fair-weather athletes, the name of the game is pickleball. Pickleball was invented in 1965 by three dads who wanted to keep their kids entertained during summer vacation. The sport's founding fathers, Joel Pritchard, Bill Bell, and Barney McCallum, took a wiffle ball, lowered a badminton net to the ground, and, borrowing elements from tennis, ping-pong, and badminton, cobbled together a sport that was easy and fun.
Part of the sport's appeal comes from the small court on which it's played, which allows for an exciting game for all ages. The paddles are roughly twice the size of ping-pong paddles, and while the originals were made of scrap plywood, a number of manufacturers now make pickleball-specific paddles and other equipment. According to the USA Pickleball Association, the sport is played on a 20-by-44-foot court with a net that hangs at 36 inches at the sides and 34 inches at the middle. It can be played as singles or doubles, just like tennis. Pickleball has experienced a surge in popularity recently, thanks to its gentle learning curve and the pandemic, which had people looking for easy outdoor activities with a social element. Even before the pandemic, though, the number of players grew by 10.5 percent between 2017 and 2020. As for the name? Some claim the sport was named after the Pritchards' family dog, Pickles, while others claim the dog was named after the sport, and that the name refers to "pickle boats" in rowing, which are crewed by rowers left over from other teams. Either way, grab a paddle!
[Image description: Yellow pickleballs on a blue court.] Credit & copyright: Stephen James Hall, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
-
FREEMind + Body Daily CurioFree1 CQ
This week, as the weather continues to warm, we're looking back on some of our favorite springtime curios from years past.
This sticky topping is more than just a pancake accessory. Maple syrup has a uniquely North American history beginning with the continent's first inhabitants. Over the centuries, it's been used as a medicine, a drink, and a food topping, and it even helped early U.S. colonists avoid hefty import fees.
As its name suggests, maple syrup is made from the sap of maple trees, usually black maples, sugar maples, or red maples. These trees are unique in that they store starch in their trunks and roots; that starch is converted into sugar and carried throughout the tree via sap. In late winter and early spring, when the trees are full of this sugary sap, holes are drilled in their trunks and the sap is collected. It is then heated to get rid of excess water. The result is a runny, brown, sweet-tasting syrup that's used as a topping on many foods, most famously pancakes.
No one knows who, exactly, first discovered that maple sap was sweet and edible, but maple trees grow throughout North America, and native peoples have been making sugar and syrup from their sap for centuries. For the Algonquian people, who lived mainly in what today is New England and Canada, maple syrup held particular cultural significance. They collected maple sap in clay buckets and turned it into syrup by letting it freeze and then throwing out the ice that formed on top, thereby getting rid of excess moisture. The pots were sometimes boiled over large fires, too. The syrup was not only used as a topping but was also mixed into a drink with herbs and spices.
When European settlers made their way to North America, the Algonquians and other peoples showed them how to make maple sugar and maple syrup. This was lucky, since, in the 17th century, sugarcane had to be imported from the West Indies at considerable cost. Using maple syrup and maple sugar as their main sweeteners allowed colonists to save money and enjoy desserts at the same time. By the early 19th century, maple syrup was sold and prized throughout North America, and was even exported to other countries. To this day, Canada is particularly proud of its maple syrup, and it's considered a national staple. Pretty sweet, eh?
[Image description: A stack of three pancakes with whipped cream and berries. Maple syrup is being poured over them.] Credit & copyright: Sydney Troxell, Pexels