Curio Cabinet
July 12, 2025
-
Humanities Word Curio
Word of the Day with Merriam-Webster: July 12, 2025
\im-PYOO-nuh-tee\ noun
What It Means
Impunity, usually used in the phrase "with impunity," refers to exemption...
-
Work Business Curio
Medicaid is run by the states, but about 70% of its funding comes from the federal government. Now, given $1 trillion in cuts from President Donald Trump's t...
-
Running Sporty Curio
In competitive racing, a few seconds can make all the difference in the world. Many female runners have been trying to break through the 14-minute barrier in the 5,000-meter event, and Beatrice Chebet has finally managed it with just under two seconds to spare. During the Prefontaine Classic, held at Hayward Field in Eugene, Oregon, the Kenyan runner made history by coming in first in the 5,000 meters with a time of 13 minutes and 58.06 seconds. Her compatriot, Agnes Jebet Ngetich, finished close behind in second place with a time of 14:01.29, followed by Ethiopian Gudaf Tsegay with 14:04.41. Tsegay was the previous record holder in the 5,000 meters with a time of 14:00.21, also set at Hayward Field in 2023, and while Ngetich didn’t win the race, her time is now the third-fastest ever for the event. This race also isn’t the first time Chebet has made history. Last year, she broke the 29-minute barrier for the 10,000 meters by coming in first with a time of 28:54.14. With other runners coming tantalizingly close to the 14-minute and 29-minute barriers, it may be only a matter of time until Chebet’s records are broken too. But how many people can say they did it first, twice?
July 11, 2025
-
Work Business Curio
Back-to-school shopping already? We’re only about halfway through July, but for retailers — and some very organized parents — the back-to-school shopping sea...
-
Mind + Body Daily Curio
What does a fruit salad have to do with one of the world’s most famous hotels? More than you’d think! Waldorf salad is more than just a great choice for cooling down during summer; it’s an integral part of American culinary history. Developed at New York City’s famous Waldorf-Astoria hotel during the establishment’s golden age, this humble salad is a superstar…albeit a misunderstood one.
Modern Waldorf salad is usually made with chopped apples, mayonnaise, sliced grapes, chopped celery, and walnuts. Raisins are also sometimes added. Juice from the chopped apples melds with the mayonnaise during mixing, giving the salad a tangy, sweet flavor. Often, green apples and grapes are used, though some suggest using pink lady apples for a less pucker-inducing dish. Though Waldorf salad is fairly simple to make, it used to be even more so. The original recipe called for just three ingredients: apples, celery, and mayonnaise.
Unlike many other iconic foods, Waldorf salad’s history is well-documented. It was first served on March 13, 1896, at New York City’s Waldorf-Astoria by famed maître d'hôtel Oscar Tschirky. At the time, the Waldorf-Astoria was known as a hotel of the elite. Diplomats, movie stars, and other international celebrities frequently stayed there, and, as such, the hotel’s menus had to meet high standards and change frequently enough to keep guests interested. Tschirky was a master at coming up with simple yet creative dishes. He first served his three-ingredient Waldorf salad at a charity ball for St. Mary's Hospital, where it was an instant hit. It soon gained a permanent place on the hotel’s menu, and spread beyond its walls when Tschirky published The Cook Book, by "Oscar" of the Waldorf later that same year. Soon, Waldorf salad made its way onto other restaurant menus in New York City, and remained a regional dish for a time before spreading to the rest of the country. Naturally, the further from its birthplace the salad traveled, the more it changed. Regional variations that included grapes and walnuts eventually became the standard, though no one is quite sure how. What’s wrong with teaching an old salad new tricks?
[Image description: A pile of green apples with some red coloring in a cardboard box.] Credit & copyright: Daderot, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
July 10, 2025
-
Work Business Curio
From the BBC World Service: Lesotho has declared a national state of disaster that will last for two years. The government says soaring youth unemployment an...
-
Science Nerdy Curio
These mountains look cool, but they can be real hotheads. Researchers from the University of Wisconsin–Madison have presented a study at the Goldschmidt Conference in Prague suggesting that dormant volcanoes around the world may become more active as a result of melting glaciers. First, some clarification: there are three main volcano classifications based on their level of activity. “Active” means that the volcano has erupted during the Holocene epoch (the last 11,650 years or so) and has the potential to erupt again in the future. “Extinct” means that, as far as anyone can tell, the volcano is unlikely to ever erupt again (though it happens from time to time). “Dormant,” on the other hand, means “potentially active”: it’s an active volcano (the first classification) that’s just not erupting presently, as opposed to “actively erupting,” which means magma is currently coming out of the ground.
A lot of factors contribute to a volcano’s dormancy, and scientists have found that glaciers are one of them. Researchers tracked volcanic activity by measuring the radioactive decay of argon in crystals formed in magmatic rock. They then compared that to the level of ice cover during the peak of the last ice age. What the data seems to suggest is that the ice cover acted as a lid, inhibiting eruptions. As the ice melted, volcanoes became more active. Currently, there are an estimated 245 dormant volcanoes buried under three miles of ice, and many of them are in Antarctica. Once these begin to erupt due to the reduction in ice cover, they may create a feedback loop as the eruptions themselves further melt the ice. It seems there will be an icy reception before things really heat up.
[Image description: A portion of the Andes mountain range between Chile and Argentina, photographed from far above.] Credit & copyright: Jorge Morales Piderit, Wikimedia Commons. The copyright holder of this work has released it into the public domain. This applies worldwide.
-
World History Daily Curio #3114
These are some not-so-fresh kicks. Archaeologists in England have unearthed 2,000-year-old pairs of Roman shoes, and they’re some of the best-preserved footwear from the era. The researchers were working at the Magna Roman Fort in Northumberland, located near another ancient Roman fort called Vindolanda, when they made the discovery. Many famous artifacts have been unearthed at Vindolanda, including wooden writing tablets and around 5,000 pairs of ancient Roman shoes. The Magna site, it seems, is literally following in those footsteps, with 32 shoes found so far, preserved in the fort’s “ankle-breaker” trenches. Originally designed to trip and injure attackers, the trenches ended up being a perfect, anaerobic environment for preserving the shoes.
Roman shoes were made with hand-stitched leather, and many were closed-toed as opposed to the sandals often portrayed in popular media (in fact, sandals were only worn indoors). The ancient Romans were actually expert shoemakers, and their footwear contributed greatly to their military success. Most Roman soldiers wore caligae, leather boots consisting of an outer shell cut into many strips that allowed them to be laced up tightly. Replaceable iron hobnails on the soles helped the boots last longer and provided traction on soft surfaces. These boots were eventually replaced with completely enclosed ones called calcei, but the caligae have left a greater impression on the perception of Roman culture. That’s probably thanks to Caligula, the infamous Roman emperor whose real name was Gaius. When Gaius was a child, he accompanied his father on campaign in a set of kid-sized legionary gear, including the caligae. The soldiers then started calling him “Caligula,” which means “little boots.” Unfortunate, since he had some big shoes to fill as the third emperor of Rome.
[Image description: A detailed, black-and-white illustration of two elaborately-dressed ancient Roman soldiers looking at one another.] Credit & copyright: The Metropolitan Museum of Art, Two Roman Soldiers, Giovanni Francesco Venturini, 17th century. Bequest of Phyllis Massar, 2011. Public Domain.
July 9, 2025
-
Work Business Curio
The high court has cleared the way for the Trump administration to plan out mass layoffs across the government. It's somewhat of a confounding decision, thou...
-
Humanities Word Curio
Word of the Day with Merriam-Webster: July 9, 2025
\sim-yuh-LAK-rum\ noun
What It Means
A simulacrum is a superficial likeness of something, usually as an imitati...
-
Biology Nerdy Curio
These critters are as American as apple pie, but a whole lot bigger! North American bison, also called buffalo, are the largest land animals in North America and some of the most historically significant. Yet, we almost lost them altogether. Overhunted throughout the 19th century, bison numbered fewer than 600 in the U.S. by 1889. Today, their numbers have recovered drastically, but these gentle giants still have a long way to go.
There are two species of bison: the North American bison and the European bison. North American bison are often called buffalo, but they aren’t actually buffalo at all. Real buffalo, like cape buffalo and anoa, live in Africa and Asia. However, bison are closely related to buffalo, since they’re all bovines—members of the family Bovidae’s subfamily, Bovinae. As such, they share many traits with buffalo, including their large size, horns, and hooves, as well as behavioral traits like living in herds. Bison are famous for their fluffy winter coats, which help them survive harsh, blizzardy winters in places like the Northern Great Plains. That’s not to say that bison are sweet and cuddly, though. They are massive, powerful animals; males can stand up to six feet tall and weigh up to 2,000 pounds. Like any wild animal, they can become aggressive if approached, especially during mating and calving season. It’s a fact that tourists sometimes learn the hard way when they don’t obey rules in places like Yellowstone National Park, where the largest bison population in North America roams free.
Bison first appeared in North America during the late Middle Pleistocene epoch, between 195,000 and 135,000 years ago. Before European colonists began settling in North America en masse in the late 15th century, there were around 30 million bison roaming in what is now the United States. Many native tribes relied on bison meat and hides, with some, like the Plains Indians, focusing many parts of their lives around the movements of bison herds. However, as colonist aggression toward native tribes increased and native peoples lost control of more and more land, the bison population dwindled. During the American Indian Wars of the 17th, 18th, and early 19th centuries, bison were deliberately killed by colonists as a means of harming native peoples and to feed colonial soldiers. By the 1880s, there were as few as 300 bison left in what is now the United States. The species was on the brink of extinction.
Luckily, private organizations and ranchers stepped in to save North American buffalo, keeping herds on private land where they couldn’t be hunted. In 1902, 21 bison from private owners were placed in a designated area at Yellowstone National Park. Eventually, they were reintroduced to the wild, and began breeding with Yellowstone’s existing wild population. In 1905, the American Bison Society started a bison breeding program that also helped spread awareness about the importance of wild bison. Theodore Roosevelt aligned himself closely with the organization and even served as its honorary president for a time. Today, thanks to over a century of conservation efforts, there are roughly 31,000 wild bison in the United States. It’s a far cry from the millions that once roamed here, but it’s a whole lot better than extinction, and that’s no bison hockey!
[Image description: An adult and baby bison standing on a shrubby plain.] Credit & copyright: Anna Weyers Blades/USFWS. Public Domain.
-
Mind + Body Daily Curio #3113
It’s not always good to go out with a bang. Heart attacks were once the number one cause of death in the world, but a recent study shows that the tides are changing. In the last half-century or so, the number of heart attacks has been in sharp decline. Consider the following statistic from Stanford Medicine researchers: a person over the age of 65 admitted to a hospital in 1970 had just a 60 percent chance of leaving alive, and the most likely cause of death would have been an acute myocardial infarction, otherwise known as a heart attack. Since then, the numbers have shifted drastically. Heart disease used to account for 41 percent of all deaths in the U.S., but that number is now down to 24 percent. Deaths from heart attacks, specifically, have fallen by an astonishing 90 percent. There are a few reasons for this change, the first being that medical technology has simply advanced, giving doctors better tools with which to help their patients, including better drugs. Another reason is that more people have become health-conscious, eating better, exercising more, and smoking less. Younger Americans are also drinking less alcohol, which might continue to improve the nation’s overall heart health. More people know how to perform CPR now too, and those who don’t can easily look it up within seconds thanks to smartphones. This makes cardiac arrest itself less deadly than it once was. Nowadays, instead of heart attacks, more people are dying from chronic heart conditions. That might not sound like a good thing, but it’s ultimately a positive sign. As the lead author of the study, Sara King, said in a statement, “People now are surviving these acute events, so they have the opportunity to develop these other heart conditions.” Is it really a trade-off if the cost of not dying younger is dying older?
[Image description: A digital illustration of a cartoon heart with a break down the center. The heart is maroon, the background is red.] Credit & copyright: Author-created image. Public domain.
July 8, 2025
-
Work Business Curio
From the BBC World Service: 14 countries received a letter from the White House saying a pause on tariffs due to expire Wednesday will now be extended to Aug...
-
Humanities Word Curio
Word of the Day with Merriam-Webster: July 8, 2025
\ig-ZEM-pluh-ree\ adjective
What It Means
Something described as exemplary is extremely good and deserves to be...
-
Music Appreciation Song Curio
This here’s one rootin’, tootin’, high-falutin’ musical. On this day in 1958, the soundtrack for Rodgers and Hammerstein’s musical Oklahoma! won the Recording Industry Association of America’s (RIAA) first-ever Gold Album award. That’s not the only way in which Oklahoma! was the first of its kind—it was also the first musical that Rodgers and Hammerstein ever worked on together, and their first major hit. The album’s titular track is, fittingly, one of the best-loved songs of the entire show. Sung mainly by the character Curly McLain (played by Gordon MacRae in the film version), the song celebrates not only a wedding, but the impending statehood of Oklahoma and everything about living there. A true classic musical number, “Oklahoma!” is big and showy, with the large ensemble cast joining in to sing some of the show’s most iconic and energetic lines. These include “Oooooklahoma, where the wind comes sweeping down the plain” and “...You’re doin’ fine, Oklahoma! Oklahoma, O.K.!” Oklahoma might not be a top travel destination for most people, but as far as Hollywood and the RIAA are concerned, you should think twice before calling it a flyover state.
-
Biology Daily Curio #3112
The Earth is teeming with life and, apparently, with “not-life” as well. Scientists have discovered a new type of organism that appears to defy the standard definition of “life.” All living things are organisms, but not all organisms are living. Take viruses, for instance. While viruses are capable of reproducing, they can’t do so on their own. They require a host organism to perform the biological functions necessary to reproduce. Viruses also can’t produce energy on their own or grow, unlike even simple living things, like bacteria. Now, there’s the matter of Sukunaarchaeum mirabile. The organism was discovered by accident by a team of Canadian and Japanese researchers who were looking into the DNA of Citharistes regius, a species of plankton. When they noticed a loop of DNA that didn’t belong to the plankton, they took a closer look and found Sukunaarchaeum. In some ways, this new organism resembles a virus. It can’t grow, produce energy, or reproduce on its own, but it has one distinct feature that sets it apart: it can produce its own ribosomes, messenger RNA, and transfer RNA. That latter part makes it more like a bacterium than a virus.
Then there’s the matter of its genetics. Sukunaarchaeum, it seems, is a genetic lightweight with only 238,000 base pairs of DNA. Compare that to a typical virus, which can range from 735,000 to 2.5 million base pairs, and the low number really stands out. Nearly all of Sukunaarchaeum’s genes are made to work toward the singular goal of replicating the organism. In a way, Sukunaarchaeum appears to be somewhere between a virus and a bacterium in terms of how “alive” it is, indicating that life itself exists on a spectrum. In science, nothing is as simple as it first appears.
July 7, 2025
-
Humanities Word Curio
Word of the Day with Merriam-Webster: July 7, 2025
\pruh-KRASS-tuh-nayt\ verb
What It Means
To procrastinate is to be slow or late about doing something that shou...
-
Work Business Curio
The House of Representatives could vote as soon as today on President Donald Trump’s big tax and spending bill. Trump says the legislation gets rid of taxes ...
-
Art Appreciation Art Curio
If you want to impress your Iron Age friends, you need one of these bad boys. Torcs (also spelled torques) are a kind of rigid necklace or neck ring. They were commonly worn by Celts throughout western Europe from around 1200 BCE to 550 BCE, but they weren’t all made from solid gold. The photo above shows a round, golden torc necklace. The body of the torc is twisted into an ornamental design and the ends are rolled to create a rounded point. Torcs were a symbol of wealth and social status amongst the Celts, with their value depending on the materials used and the complexity of the design. Torcs could be made of any metal familiar to Iron Age jewelers, including silver, bronze, and copper, and they often featured etched details depicting mythical beings. They also served as a way to safeguard and keep track of wealth. Some torcs of solid gold could weigh several pounds, and torcs are often found in Celtic burial sites. The dead don’t speak, but they can still torc.
Gold Neck Ring, Celtic, 6th–4th century BCE, Gold, 7.5 x 7.5 x .5 in. (19 x 19.1 x 1.2 cm.), The Metropolitan Museum of Art, New York City, New York
[Image credit & copyright: The Metropolitan Museum of Art, Purchase, 2005 Benefit Fund, Rogers Fund, Audrey Love Charitable Foundation Gift, and Gifts of J. Pierpont Morgan and George Blumenthal and Fletcher Fund, by exchange, 2005. Public Domain.]
-
Astronomy Daily Curio #3111
Don’t hold your breath for moon dust. Long thought to be toxic, moon dust may be relatively harmless compared to what’s already here on Earth, according to new research. While the dusty surface of the moon looks beautiful and its name sounds like a whimsical ingredient in a fairy tale potion, it was a thorn in the side of lunar explorers during the Apollo missions. NASA astronauts who traversed the moon’s dusty surface reported symptoms like nasal congestion and sneezing, which they began calling “lunar hay fever.” They also reported that moon dust smelled like burnt gunpowder, and while an unpleasant smell isn’t necessarily bad for one’s health, it couldn’t have been comforting. These symptoms were likely caused by the abrasive nature of moon dust particles, which are never smoothed out by wind or water the way they would be on Earth. The particles are also small, so they’re very hard to keep out of spacesuits and away from equipment. Then there’s the matter of the moon’s low gravity, which allows moon dust to float around for longer than it would on Earth, making it more likely to penetrate spacesuits’ seals and be inhaled into the lungs. There, like asbestos, the dust can cause tiny cuts that can lead to respiratory problems and even cancer…at least, that’s what everyone thought until recently. Researchers at the University of Technology Sydney (UTS) just published a paper claiming that moon dust might not be so dangerous after all. They believe that the dust will likely cause short-term symptoms without leading to long-term damage. Using simulated moon dust and human lung cells, they found that moon dust was less dangerous than many air pollutants found on Earth. For instance, silica (typically found on construction sites) is much more dangerous, as it can cause silicosis by lingering in the lungs, leading to scarring and lesions. Astronauts headed to the moon in the future can breathe a sigh of relief—but it may be safer to wait until they get there.
[Image description: A moon surrounded by orange-ish hazy clouds against a black sky.] Credit & copyright: Cbaile19, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
July 6, 2025
-
Humanities Word Curio
Word of the Day with Merriam-Webster: July 6, 2025
\AN-tik\ noun
What It Means
Antic refers to an attention-drawing, often wildly playful or funny act or action. ...
-
Work Business Curio
The government reported today that 147,000 more people were on payrolls in June compared to May — a stronger outcome than initially forecasted. This data com...
-
Humanities PP&T Curio
They say that a dog is man’s best friend, but there’s one thing that can get in the way of that friendship like nothing else. For thousands of years, rabies has claimed countless lives, often transmitted to humans via dogs, raccoons, foxes, and other mammals. For most of that time, there was no way to directly prevent the transmission of rabies, until French scientist Louis Pasteur managed to successfully inoculate someone against the disease on this day in 1885.
Rabies has always been a disease without a cure. Even the ancient Sumerians knew about the deadly disease and how it could be transmitted through a bite from an infected animal. It was a common enough problem that the Babylonians had specific regulations on how the owner of a rabid dog was to compensate a victim’s family in the event of a bite. The disease itself is caused by a virus that is shed in saliva and causes the infected animal to behave in an agitated or aggressive manner. Symptoms are similar across species, and when humans are infected, they show signs of agitation, hyperactivity, fever, nausea, confusion, and the same excessive salivation seen in other animals. In advanced stages, victims begin hallucinating and having difficulty swallowing. The latter symptom also leads to a fear of water. Rabies is almost always fatal without intervention. Fortunately, post-exposure prophylaxis against rabies now exists, thanks to the efforts of one scientist.
By the time 9-year-old Joseph Meister was bitten 14 times by a rabid dog in Alsace, French chemist Louis Pasteur was already working with rabid dogs. Pasteur had developed a rabies vaccine, which he was administering to dogs and rabbits. Though it showed promise, it had never been tested on human subjects. It had also never been used on a subject who had already been infected. When Joseph’s mother brought the child to Paris to seek treatment from Pasteur, he and his colleagues didn’t want to administer the vaccine due to its untested nature. That might have been the end for the young Joseph but for Dr. Jacques Joseph Grancher’s intervention. Grancher offered to administer the vaccine to the boy himself, and over the course of 10 days, Joseph received 12 doses. Remarkably, Joseph never developed rabies, proving the vaccine’s efficacy as both a preventative and a treatment. While credit for developing the vaccine goes to Pasteur, Grancher was also recognized for his part in ending the era of rabies as an automatic death sentence. In 1888, Grancher was given the rank of Grand Officer of the Legion of Honor, the highest French honor given at the time to civilians or military personnel.
The rabies vaccine and post-exposure prophylaxis have greatly improved since Pasteur’s time, and they’re no longer as grueling to receive as they once were. Still, rabies remains a dangerous disease. Luckily, cases are few and far between nowadays, with only around ten fatalities a year in North America thanks to decades of wildlife vaccination efforts. Most cases are spread by infected raccoons, foxes, bats, or skunks, as most pet dogs are vaccinated against rabies. In the rare instance that someone is infected and unable to receive post-exposure prophylaxis quickly, the disease is still almost always fatal. Once symptoms start showing, it’s already too late. In a way, rabies still hasn’t been put to Pasteur.
[Image description: A raccoon poking its head out from underneath a large wooden beam.] Credit & copyright: Poivrier, Wikimedia Commons. The copyright holder of this work has released it into the public domain. This applies worldwide.