Curio Cabinet
May 1, 2024
-
FREE · Travel · Nerdy Curio · 1 CQ
If you’ve ever ditched your car to ride the rails in the U.S., chances are you’ve interacted with this corporation. On this day in 1971, Amtrak, a corporation that operates nearly all U.S. passenger trains, began running. Amtrak was actually started by a U.S. President—Richard Nixon to be precise—who in 1970 signed the Rail Passenger Service Act, creating the National Railroad Passenger Corporation, which later became Amtrak. From day one, Amtrak operated 184 trains, with the first one running between New York City and Philadelphia. Today, more than 300 Amtrak trains run throughout the country. It’s important to note, though, that Amtrak isn’t a private company. Rather, it’s a federally chartered corporation, meaning that the federal government is a majority stockholder. The government is heavily involved in Amtrak’s operations; the corporation’s board of directors is appointed by the U.S. president and confirmed by the Senate. It may seem strange, but Amtrak also doesn’t own the tracks that its trains run on. Rather, most U.S. tracks are privately owned and operated, with a few owned by state governments or even the federal government. It’s a confusing system, but it’s meant to ensure that both public and private interests work together to keep American trains running safely. Hey, if it works, it works!
-
FREE · US History · Daily Curio #2865 · 1 CQ
For heads of state, the hardest part of the Cold War was keeping cool. But on this day in 1960, things got a little heated between the East and West when an American U-2 spy plane was shot down over the Soviet Union. The U-2 spy plane was a single-occupant plane developed in the 1950s to perform high-altitude reconnaissance for the CIA and was heavily used during the Cold War. When this particular U-2 was shot down by a surface-to-air missile during a mission originally planned to span 2,900 miles, its sole operator, Francis Gary Powers, found himself alone on foreign soil. After parachuting to safety, Powers was quickly captured and held in Moscow while Soviet premier Nikita Khrushchev demanded answers from President Dwight D. Eisenhower. The incident couldn’t have happened at a worse time; the two nations were planning to meet in Paris to discuss ongoing issues regarding a divided Germany, which had been partitioned after WWII. The idea of American spy planes in their airspace wasn’t well-received by the Soviet Union, and the incident put the upcoming Paris summit under threat. It also dragged several other countries into the mess. According to testimony from Powers, he had been taking off from airfields in Pakistan, Norway, and Turkey, all of which received protest notes from the Soviet Union. Meanwhile, the U.S. denied culpability, claiming that the flights were unauthorized and that they had no knowledge of them. Pakistan, Norway, and Turkey, in turn, sent letters to the U.S. demanding that they cease all flights from their territories. In reality, the spy plane program was a matter of great interest to Eisenhower, who personally authorized each flight. Nevertheless, he sent assurances to his Soviet counterpart that no such flights would be made for the rest of his term, though he also refused to apologize for them. The refusal sank any hopes for the Paris summit, as Khrushchev refused to maintain diplomatic discussions with the U.S.
for the rest of Eisenhower’s term, only resuming once John F. Kennedy took office. As for Powers, he was tried and found guilty of spying. He served two years of his ten-year sentence, until he was exchanged for Soviet spy Rudolf Abel. An eye for an eye and a spy for a spy.
[Image description: A black Lockheed U-2 aircraft flying over clouds.] Credit & copyright: Wikipedia, Picture prepared for Wikipedia by Adrian Pingstone in April 2003. This image or file is a work of a U.S. Air Force Airman or employee, taken or made as part of that person's official duties. As a work of the U.S. federal government, the image or file is in the public domain in the United States.
April 30, 2024
-
FREE · Music Appreciation · Song Curio · 2 CQ
You may not remember The Seekers, but they were on top down under! The 1960s were a competitive time for musicians, and no one proves that more than Australian pop-folk group The Seekers. The four-person group boasted polished instrumentals and harmonies, but so did many other groups from countries with bigger music scenes, like the U.S. and England. To get their music on the radio, The Seekers sought out the help of Tom Springfield, Dusty Springfield’s songwriter brother, who penned them what would go on to be their best-remembered hit: “I’ll Never Find Another You.” The bouncy love song, with its beautiful harmonies, folk instrumentation, and sweet lyrics about lifelong devotion, was just what the 1960s ordered. Not only did the song break The Seekers onto the radio for the first time, it reached number one in Australia and the UK, and number four on the U.S. Billboard Hot 100. They went on to have several more number one hits throughout their career, though, so you could say they did find another tune.
-
FREE · Science · Daily Curio #2864 · 1 CQ
It seems like shrinkflation has affected everything but hurricanes. As climate change leads to more extreme weather events, some scientists are saying that a new tier needs to be added to the scale used for measuring hurricanes. Currently, the Saffir-Simpson Hurricane Wind Scale works as a convenient shorthand for how serious a storm is. The difference between 95 MPH winds and 156 MPH winds might not mean much to the average person—they both sound devastatingly fast. However, those speeds represent the upper ends of Category 1 storms and Category 4 storms respectively, and there is a stark difference in destructive potential. Category 1 hurricanes might cause damage to shrubbery and building exteriors, but Category 4 could cause a roof collapse or completely obliterate small buildings. Most people who live in hurricane-prone areas understand the difference thanks to the simplified category labels. Where Category 5 sits nowadays, however, is a matter of growing concern. Currently, Category 5 is anything above Category 4, and such storms have caused structural failures and power outages lasting months due to extensive damage to the power grid. Yet, some newer Category 5 hurricanes have so much more destructive potential than Category 5s of the past that some scientists are now calling for a Category 6 to be added to the scale. The new category would refer to storms with wind speeds of 192 MPH and above. Storms with such severe winds used to be fairly rare, so a sixth category wasn’t thought to be necessary. Between 1980 and 2021, only five out of 197 hurricanes would have been labeled Category 6, but all five of those took place from 2013 onward. Those in favor of the change have pointed out that labeling a 200-MPH storm Category 5 just doesn’t convey an accurate picture of the storm’s threat.
However, some scientists also believe that adding another category might affect public perception of the lower categories and cause people to take them less seriously—and make no mistake, even a Category 1 storm can be dangerous. Then again, you can at least ride out a Category 1; a Category 6 will throw you like a bull.
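The thresholds described above can be sketched as a simple lookup. This is a minimal illustration, not an official tool: the Category 1–5 cutoffs are the standard Saffir-Simpson wind-speed ranges, while the 192-MPH Category 6 cutoff is only the proposal discussed in the article.

```python
def saffir_simpson_category(wind_mph):
    """Return the Saffir-Simpson category for a sustained wind speed in MPH.

    Categories 1-5 use the standard scale thresholds. "Category 6"
    (192+ MPH) is the *proposed* extension discussed above, not an
    official part of the scale.
    """
    thresholds = [
        (192, 6),  # proposed Category 6
        (157, 5),
        (130, 4),
        (111, 3),
        (96, 2),
        (74, 1),  # below 74 MPH is not hurricane strength
    ]
    for cutoff, category in thresholds:
        if wind_mph >= cutoff:
            return category
    return None  # tropical storm or weaker
```

Under this sketch, the article’s examples line up as expected: 95 MPH falls in Category 1, 156 MPH in Category 4, and a 200-MPH storm would move from Category 5 into the proposed Category 6.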
April 29, 2024
-
FREE · Art Appreciation · Art Curio · 1 CQ
It doesn’t hurt to look good on the battlefield—in fact, it helps. Maximilian armors were as stylish as they were practical, and became popular during the 16th century. The piece above, Equestrian Portrait of the Emperor Maximilian, is a woodcut print depicting a man sitting on a horse. Both are wearing intricately detailed armors, and the man’s helmet has peacock feathers on the top. Maximilian I ruled over the Holy Roman Empire from 1486 to 1519. He was known for greatly expanding the holdings of his family, the Habsburgs, through both military conquests and diplomacy. He had a particular inclination for the former, though, and is even credited with creating the Landsknechte, a highly organized mercenary group that utilized the pike and shot formation. However, he might be remembered more for his association with Maximilian-style armors, known for extensive fluting and intricate details that made them as much fashion statements as military equipment. The fluting was more than cosmetic, though—it made the plate armor more resilient against blows. Sadly for martial fashion mavens, such armor fell out of style by 1530, when artillery became more popular. That’s fashion for you.
Equestrian Portrait of the Emperor Maximilian, Hans Burgkmair (1473–1531), 1508, Woodcut on paper, 12.68 x 8.93 in., The Cleveland Museum of Art, Cleveland, Ohio
[Image credit & copyright: The Cleveland Museum of Art, John L. Severance Fund 1950.72. Public Domain, Creative Commons Zero (CC0) designation.]
-
FREE · World History · Daily Curio #2863 · 1 CQ
Some recipes might be too good to mess with, but this one is actually illegal to alter. Anzac Day was recently held on April 25, and for Aussies and Kiwis, that means eating Anzac Biscuits—a treat of longstanding tradition that isn’t taken lightly. “Anzac” is short for Australian and New Zealand Army Corps, and Anzac Day commemorates April 25, 1915, when the two nations’ militaries embarked on an allied expedition and took part in a grueling campaign. Called the Gallipoli campaign, it took place in the peninsula of the same name, which at the time was a territory of the Ottoman Empire. The campaign lasted until the end of the year, and ultimately led to massive casualties for both the Ottomans and the Allies, with around 12,000 dead for the Anzacs.
It makes sense for a day commemorating such a deadly campaign to include somber ceremonies and traditions. After the prayers, speeches, and moments of silence have concluded, though, observers indulge in Anzac Biscuits, formerly known as Soldiers’ Biscuits. These were originally biscuits sent from family members back home to soldiers on the front lines, and were made with simple ingredients so that they could endure their lengthy journey without spoiling. The recipe varied quite a bit back when civilians were baking them, and even today, every family has their own slight variation, but commercial producers of the biscuits are held to a strict standard. Per the Protection of Word ‘Anzac’ Act 1920, companies that produce the biscuits for sale cannot deviate from the set recipe, lest they receive hefty fines. As of today, the fine can be as high as $40,000 in U.S. dollars, and there may even be jail time of up to 12 months. If a company wants to sell biscuits with additional or alternate ingredients (except to accommodate dietary restrictions like lactose intolerance), they’re not allowed to use the term “Anzac.” As for the official ingredients, they include butter or margarine, golden syrup, baking soda, flour, rolled oats, dried coconut, and brown sugar. It’s not just tradition—it’s the law!
[Image description: The Australian flag flying against a blue sky.] Credit & copyright: Hugo Heimendinger, Pexels
-
8 min · FREE · Work · Business Curio · 5 CQ
The Supreme Court is scheduled to hear oral arguments today on whether the National Labor Relations Board has to meet a higher burden of proof when interveni...
April 28, 2024
-
FREE · Mind + Body · PP&T Curio · 1 CQ
The illnesses just keep coming! First it was COVID-19, then a bird flu scare. Now, people are concerned about another disorder that might be making the leap from animals to humans: chronic wasting disease (CWD). For years, this fatal illness has only affected cervids (members of the deer family), but a recent case involving two hunters has some people (and government agencies) concerned that it could impact people as well… assuming that those people eat contaminated venison.
Unlike COVID-19 or bird flu, CWD isn’t caused by a virus. Rather, it’s a prion disease, like mad cow disease. Prions aren’t alive like bacteria and other microbes, nor do they contain genetic material like viruses. Rather, they’re misfolded proteins that cause other proteins to become similarly misfolded. As a result, prions can cause a cascade effect, bumping into proteins and creating copies of themselves, destroying the ability of infected tissue (usually in the brain) to function properly. In short, a prion is like an immortal bull in a china shop, except that every time it breaks a plate, that plate becomes another bull. Compounding their danger is the fact that prions are resistant to treatments that are effective on most pathogens, and they can last a long time—even years—if left undisturbed. Prions can develop spontaneously in otherwise healthy organisms, but most well-known cases involve transmissions of existing ones.
CWD was first discovered in 1967, but was thought to only impact deer, until recently. Among cervids like white-tailed deer, mule deer, elk, and moose, CWD spreads via saliva, urine, and feces. As its name implies, CWD causes an infected animal to lose a significant amount of weight. Over time, they begin to exhibit cognitive issues, rendering them unable to socialize properly with other deer, and making them lose awareness of their surroundings and their natural fear of humans.
It has recently been reported that, in 2022, two American hunters ate venison infected with CWD and subsequently became ill with Creutzfeldt-Jakob disease (CJD), a rare neurodegenerative disorder with symptoms very similar to Alzheimer’s disease. CJD and CWD are types of spongiform encephalopathies, which means that they cause degradation of brain tissue. Symptoms may include depression, confusion, a change in gait, and hallucinations. Both disorders are fatal, and decline in health can occur rapidly. One of the hunters died less than a month after his symptoms began. Up until now, humans have only been diagnosed with CJD after receiving transplants like cornea tissue from infected donors. But this recent case could end up proving that, just as the prion disorder known as mad cow disease can jump from livestock to humans, CWD can make the same leap from deer.
That’s not to say that there’s likely to be a sudden pandemic of prion infections. Although both hunters contracted the fatal disease after eating infected deer meat, the population they were eating from was known to be infected with CWD. The disease doesn’t affect a high proportion of the American deer population, either, though it can spread rapidly through populations once it takes hold. Human intervention in wild deer life, such as feeding, baiting, or using urine-based lures, can quicken the spread. Limiting or banning such practices is usually step number one when it comes to CWD mitigation. If CWD were ever to get out of hand in American deer populations, hunters might then be required to submit tissue samples from harvested deer, or to report any carcasses found in the wild. In the meantime, wildlife officials advise against eating meat from deer that looked obviously sick or emaciated, just in case. You want venison to be lean, but not that lean.
[Image description: Three white tailed deer graze on grass. A male deer with antlers stands at the front of the group.] Credit & copyright: Wikimedia Commons, Richard Lydekker (1849–1915). This work is in the public domain in the United States because it was published (or registered with the U.S. Copyright Office) before January 1, 1929.
-
7 min · FREE · Work · Business Curio · 4 CQ
New emissions standards for fossil fuel power plants would require them to eliminate 90% of the carbon dioxide they emit, or close. But the Supreme Court cou...
April 27, 2024
-
7 min · FREE · Work · Business Curio · 4 CQ
The U.S. economy grew by just 1.6% last quarter — falling very short of expectations. At the same time, inflation was up, according to the latest PCE. What’s...
-
FREE · Swimming · Sporty Curio · 1 CQ
One would hope that rulemakers for the upcoming Olympics aren’t a bunch of dopes. However, a recent revelation that Chinese swimmers were cleared to compete in the Tokyo Olympics after testing positive for banned substances has competitors worried for the upcoming games in Paris. Just months before the opening ceremony, it came to light that 23 Chinese swimmers had tested positive for trimetazidine (TMZ), but were allowed by the World Anti-Doping Agency (WADA) to compete anyway, going on to win several medals. At the time, WADA accepted an explanation from Chinese officials that the athletes were accidentally exposed to the drug by way of a contaminated kitchen. While the amounts found in the swimmers’ systems were too small to offer any benefit, others have been penalized for similar amounts. In 2019, Australian swimmer Shayna Jack was banned for four years after testing positive for ligandrol. The ban happened despite claims that she was accidentally exposed, likely at a contaminated public pool—claims that even investigators said were credible. Nonetheless, Jack only managed to reduce her ban to two years, and she’ll be competing this year at the upcoming Olympics. Now, the U.S. Anti-Doping Agency and athletes who have been similarly penalized are asking why the Chinese swimmers were seemingly given special treatment when the policy is to ban athletes who test positive, regardless of amount or intent. The issue, then, isn’t actually doping, but that WADA seems to be giving some athletes a pass. The organization’s credibility may be left in the shallow end after this.
April 26, 2024
-
FREEMind + Body PP&T CurioFree1 CQ
If southern hospitality had a flavor, this would probably be it. Chicken and dumplings, a dish famous in the American South, is renowned as a top-tier comfort food. Yet it’s also a source of debate. There are those who claim that the dish’s “dumplings” aren’t really dumplings, and that its Depression-era backstory is dubious at best.
Chicken and dumplings is a simple soup made with simmered chicken meat and a thick broth created via the simmering process. The dish’s dumplings are balls of biscuit dough, usually made from flour, shortening, and milk, though the milk can be replaced with buttermilk, water, or chicken broth. The soup is seasoned sparingly with salt and pepper.
Chicken and dumplings is a simple dish that requires few ingredients and can feed many people at once. Thus, for a time the dish was rumored to have been invented during the Great Depression, when resources were scarce. However, modern food historians have a different theory, one that begins not in the American South but in Germany. German cuisine includes many dishes that are similar to chicken and dumplings, such as potato dumplings in broth. Many German dishes became popular throughout the U.S. due to a wave of German immigrants in the 1820s, and the first written record of chicken and dumplings appears not long after, in the 1879 cookbook Housekeeping in Old Virginia.
Of course, that doesn’t solve the debate about whether the dumplings in chicken and dumplings are really dumplings. Some foodies only consider something a dumpling if the food in question is stuffed with something, such as Japanese gyoza which are stuffed with meat and veggies, or European pierogies filled with potatoes and cheese. However, by that definition even gnocchi, the world’s most famous type of potato dumpling, wouldn’t fit the bill. One thing’s for certain, though: chicken and dumplings is a savory, chewy, comforting dish—no matter where it came from or what you call it.
[Image description: A rooster and several chickens pecking at grass.] Credit & copyright: Helge Klaus Rieder, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication. The person who associated a work with this deed has dedicated the work to the public domain by waiving all of their rights to the work worldwide.
April 25, 2024
-
7 minFREEWork Business CurioFree4 CQ
From the BBC World Service: Venice, Italy, has become the first city in the world to charge day trippers. But is $5.30 enough to keep a lid on tourist number...
-
FREEBiology Nerdy CurioFree1 CQ
It’s the ultimate two-for-one deal. An international team of scientists has managed to catch one of the world’s rarest natural phenomena—primary endosymbiosis—in action after years of observation. As one of the authors of their recent study noted, the last time this happened, over a billion years ago, the first plants appeared on Earth. Primary endosymbiosis occurs when two separate lifeforms join together as one, with the smaller of the two becoming an organelle. The first time the phenomenon occurred, a single-celled organism absorbed a bacterium whole, and that bacterium became the mitochondrion. Without it, complex life requiring more energy than a single-celled organism can produce could not have come into existence. Then came plants, the result of one of these complex organisms swallowing a cyanobacterium. Cyanobacteria are capable of turning sunlight into energy, and inside their new hosts they became chloroplasts, which make photosynthesis possible in plants. The latest example of this phenomenon was only just discovered, but it has actually been happening for around 100 million years. It involves an alga called Braarudosphaera bigelowii that absorbs a cyanobacterium called UCYN-A. Since the process of endosymbiosis first began between these two, UCYN-A has been losing parts of its genome while becoming increasingly reliant on B. bigelowii for necessary nutrients, indicating that it has fully accepted its role as an organelle. In exchange, UCYN-A fixes nitrogen from the air, something that algae and plants can’t do on their own. Even legumes—which are often referred to as nitrogen fixers—rely on bacteria living in their roots for the vital element. With further study, scientists say, it may be possible to use the algae to fix nitrogen in crops, lessening dependence on fertilizers. Now that’s a green solution.
[Image description: A digital diagram of an ovular animal mitochondria with labeled portions.] Credit & copyright: Mariana Ruiz Villarreal LadyofHats, Wikimedia Commons
-
FREEWork Daily Curio #2862Free1 CQ
This is one job where it’s appropriate to be a control freak. Dangerous near-collisions of commercial airplanes have been on the rise lately, and the Federal Aviation Administration (FAA) is stepping in to require that air traffic controllers step away for some rest. While a close call in terrestrial traffic might mean an angry honk or a fender bender followed by an annoyed call to an insurance company, the stakes are higher in the air, where near-collisions can lead to hundreds of delayed flights. Of course, near misses are better than the alternative of actual aviation accidents, which could end in mass tragedy. What keeps such disasters at bay is an army of air traffic controllers—trained professionals who often have the final say on where and when a plane can go. Unfortunately, accidents and near-accidents are becoming more common. In April, an airline pilot at John F. Kennedy International Airport in New York was forced to abort a takeoff at the last second because other jets were entering the runway. Later that same week, a nearly identical incident occurred at Ronald Reagan Washington National Airport, one of the airports serving the Washington, D.C. area and a hub for various airlines.
The problem, the FAA says, is a shortage of air traffic controllers, leading to long shifts without sufficient rest time. To address the issue, the FAA is making changes that are due to take effect in three months, mandating 10 hours of rest time between shifts (up from nine), with the number going up to 12 hours for overnight shifts. As for the cause of the shortage itself, it may not be possible to fix by mandate alone. Part of the issue is systemic: back in 1981, President Ronald Reagan fired 11,000 striking air traffic controllers, ridding the nation’s airports of the most experienced in the field. Today, despite being a relatively high-paying occupation, becoming an air traffic controller is difficult, and the job itself can be (understandably) high-stress. The FAA also maintains stringent requirements for candidates, who cannot be 31 or older and must be willing to relocate to any FAA facility in the U.S. after completing the training program, which is held in Oklahoma City, Oklahoma. But if you’re young, college-educated, and not picky about where you live, it could be “OK.”
[Image description: An air traffic control tower against a blue sky.] Credit & copyright: Eheik, Wikimedia Commons. This work has been released into the public domain by its author, Eheik, at the English Wikipedia project. This applies worldwide.