Curio Cabinet
January 2, 2025
-
8 min | FREE | Work Business Curio | 5 CQ
From the BBC World Service: Authorities in India have removed hundreds of tons of toxic waste from an Indian chemical factory that witnessed one of the world...
-
2 min | FREE | Humanities Word Curio | 2 CQ
Word of the Day with Merriam-Webster: January 2, 2025
\poh-pur-REE\ noun
What It Means
Potpourri is a mixture of dried flower petals, leaves, and spices that is u...
-
FREE | Biology Nerdy Curio | 1 CQ
You know what they say, there’s no “human” in “team.” According to a paper published by researchers at the Weizmann Institute of Science in Proceedings of the National Academy of Sciences, ants are smarter than humans…at least when it comes to teamwork. Working together is certainly something that Homo sapiens can do, but we’re just not as efficient at it as ants. To prove this, researchers pitted a team of humans against a colony of longhorn crazy ants and had them perform the same task, but at different scales. Both humans and ants were placed inside mazes with identical layouts and were made to move a T-shaped object through them. The ants worked in teams of around 80, while humans worked in groups of up to 26, as well as individually. To level the playing field, the human subjects were made to wear face masks and instructed not to speak or communicate (after all, ants can’t exactly talk to each other). Under these conditions, the ants thrived, performing their given task successfully in a cooperative manner. On the other hand, humans easily grew frustrated and often resorted to “greedy” actions, ignoring the contributions of their team members, which prevented them from completing the complex task. The conclusion? Complex intelligence isn’t always better when it comes to solving problems. Lead researcher Prof. Ofer Feinerman wrote, "We've shown that ants acting as a group are smarter, that for them the whole is greater than the sum of its parts. In contrast, forming groups did not expand the cognitive abilities of humans. The famous 'wisdom of the crowd' that's become so popular in the age of social networks didn't come to the fore in our experiments." Two heads might be better than one, but three is still a crowd.
[Image description: A close-up photo of an ant on a green leaf.] Credit & copyright: Egor Kamelev, Pexels
-
FREE | Mind + Body Daily Curio #3006 | 1 CQ
Are you a night owl or a morning person? Do you know someone who’s both? Some people seem to naturally thrive on less sleep than others, and modern research is shedding some light on why. Sleep deprivation is no joke. Sleep was once thought to be just a way for the body to rest, but in recent years scientists have been learning about its various important functions. While sleeping, the body actually performs crucial brain maintenance, replenishing energy stores and flushing away toxins that build up throughout the day. At the same time, the brain consolidates long-term memories. No wonder, then, that extreme sleep deprivation can have devastating—even fatal—consequences.
For most people, it takes around seven to nine hours of sleep each night to feel rested, but a small percentage of people do fine with just four to six. They’re called “natural short sleepers,” and they’re now said to have a condition called Short Sleeper Syndrome (SSS). It sounds like a detrimental disease, but people who have SSS show no ill health effects from their diminished sleep. Not only do they sleep less than most people, they also have an easier time falling asleep, often wake up without the need for an alarm, and have more energy and a better mood during the day. The lack of negative side effects is leading scientists to believe that people with SSS are getting higher-quality sleep than those without the condition, which is why they can reap benefits from sleeping for such a short time. While SSS appears to have a genetic component (scientists have identified seven genes associated with SSS), researchers believe that studying other mechanisms behind SSS could help those who struggle to get quality sleep. Of course, there are some things that anyone can do to get the most out of their sleep: going to bed at the same time each night, getting plenty of sun during the day (especially in the morning), and keeping bedrooms dark and quiet. Eating too much or drinking alcohol before bed can also decrease sleep quality. It’s all a tad more complicated than chamomile at night and coffee in the morning.
[Image description: A black alarm clock sitting on a nightstand on top of a notebook beside a blue-and-white porcelain cup.] Credit & copyright: Aphiwat chuangchoem, Pexels
January 1, 2025
-
7 min | FREE | Work Business Curio | 4 CQ
From the BBC World Service: Norway has suspended its plans for commercial-scale deep sea mining after facing criticism from environmental scientists. In the ...
-
2 min | FREE | Humanities Word Curio | 2 CQ
Word of the Day with Merriam-Webster: January 1, 2025
\rih-JOO-vuh-nayt\ verb
What It Means
To rejuvenate a person, parts of the body, etc., is to make them feel ...
-
FREE | Biology Nerdy Curio | 1 CQ
They’re some of the cutest critters around, and they’ve made quite the comeback. Fur seals, as their name suggests, are covered in visible fur, unlike many seal species, which have sleek skin or very, very short fur. Unfortunately, the very fur that gives these marine mammals their name has also made them a target for hunters at various points in history. Northern fur seals, for example, were nearly hunted to extinction in the early 19th century before being protected by law in 1915. Yet this species has bounced back in unexpected ways. This year, the Farallon Islands near San Francisco, California, where northern fur seals were once hunted for their blubber and pelts, saw record numbers of baby seals born. Of the 2,133 fur seals counted on the islands during a recent population survey, a whopping 1,276 were new pups.
It shouldn’t be too surprising that fur seals are resilient. They originally evolved as land mammals around 15 to 17 million years ago, but quickly took to the water in pursuit of abundant aquatic prey. To this end, they developed flippers and streamlined bodies, and, like other members of the clade Pinnipedia, which includes walruses and sea lions, they have plenty of adaptations to help them survive in conditions that humans would find inhospitable. Northern fur seals’ fur is extremely dense, with around 350,000 hairs per square inch, and it’s made up of two layers—thick, outer guard hairs for protection, and a soft inner layer that keeps heat close to the body. A layer of blubber also protects the seals’ organs from frigid waters, as does their hefty overall size. Females can weigh up to 120 pounds, while males can reach a whopping 600 pounds.
Fur seals are pelagic, meaning that they spend the majority of their lives in the open sea. Northern fur seals are relatively solitary animals, only returning to land and gathering in groups each summer to breed. While at sea, fur seals sleep by floating on the surface with three flippers sticking out of the water to minimize heat loss. While half of their brain sleeps, the other half remains partially conscious–just enough to ensure that the seal doesn’t drown. Unlike most mammals, who need long periods of REM sleep, fur seals only have short bursts of REM sleep while at sea, which has led scientists to study their unusual sleeping patterns. When they’re awake, fur seals are capable hunters, feasting on more than 60 different species of fish. Squid can also make up large portions of their diet, depending on place and time of year. During breeding season, though, male fur seals stop eating entirely, focusing instead on mating and on fighting off aggressive male rivals. Males commonly lose around 20 percent of their body weight each breeding season. Even for a New Year’s resolution, that seems a bit extreme.
[Image description: A close-up photo of a brown Northern Fur Seal’s face.] Credit & copyright: Greg Thompson/USFWS. This image or recording is the work of a U.S. Fish and Wildlife Service employee, taken or made as part of that person's official duties. As a work of the U.S. federal government, the image is in the public domain.
-
FREE | Daily Curio #3005 | 1 CQ
This is the one time we want to soak up microplastics. As recent research has shown, microplastics are everywhere and in everything. They’re notoriously difficult to remove once they’ve taken hold somewhere, and their environmental and health effects are just now beginning to be understood. Fortunately, a group of researchers at Wuhan University in China might have come up with a way to remove microplastics from water using biodegradable materials. Microplastics are defined as any piece of plastic five millimeters or smaller, and there are estimated to be around 15.5 million tons of them just sitting on the ocean floor. That’s not even counting the microplastics that wash up on shore to mingle with sand or the ones in bodies of water farther inland. With plastic production only set to increase in coming years, battling microplastics seems like a hopeless task. But cleaning this mess might just require the right sponge.
Researchers in China have managed to develop a sponge made of cotton cellulose and squid bones—both relatively inexpensive materials—that can simply soak up microplastic particles from water. The sponge acts like a filter, and when the researchers tested it in different bodies of water, they found that it was 99.9 percent effective at removing microplastics while maintaining efficiency for several decontamination cycles. That’s another thing: the sponge can be reused many times in addition to being completely biodegradable. There are some limits, though. Firstly, the sponge is only effective at removing microplastics floating around in water; it can’t do much about removing what’s already mixed in with sediment. Secondly, using the sponge would require a means to safely contain whatever microplastics have been removed, which is a separate problem entirely. Still, it could help prevent further contamination of water if scaled up and deployed widely. Even when the cleaning task is this momentous, it seems you can’t go wrong with a humble sponge.
[Image description: The surface of water under an open sky.] Credit & copyright: Matt Hardy, Pexels
December 31, 2024
-
8 min | FREE | Work Business Curio | 5 CQ
From the BBC World Service: Chinese state-sponsored hackers allegedly gained access to the U.S. Treasury Department’s systems earlier this month, and were ab...
-
1 min | FREE | Humanities Word Curio | 1 CQ
Word of the Day with Merriam-Webster: December 31, 2024
\TSYTE-gyste\ noun
What It Means
Zeitgeist refers to the general beliefs, ideas, and spirit of a time and ...
-
FREE | Song Curio | 2 CQ
It’s the final day of 2024! As such, when the ball drops in Times Square tonight, a familiar song will play alongside the traditional Auld Lang Syne, as it has every New Year’s Eve since 2005: John Lennon’s Imagine. This iconic song asks listeners to imagine a world without politics, war, and religion. But imagine if John Lennon wasn’t entirely responsible for Imagine’s success! The fact is, inspiration for the song’s iconic lyrics came from Lennon’s wife, Yoko Ono, who wrote some of her own “imaginings” in her 1964 book, Grapefruit. When speaking about his famously poignant piano ballad in a 1980 BBC interview, Lennon admitted that the song “...should be credited as a Lennon/Ono song because a lot of the lyrics and the concept came from Yoko. But those days, I was a bit more selfish, a bit more macho, and I sort of omitted to mention her contribution.” That did change officially in 2017, when The National Music Publishers Association credited Ono on the track alongside Lennon. Like the possibilities of a new year, it’s better late than never!
-
FREE | Political Science Daily Curio #3004 | 1 CQ
As 2024 draws to a close, we're taking another look at the life of President Jimmy Carter, who passed away on December 29 at 100 years old.
For better or worse, modern American politics is a bombastic affair involving celebrity endorsements and plenty of talking heads. Former President Jimmy Carter, who recently became the first U.S. President to celebrate his 100th birthday, lived a different sort of life than many modern politicians. His first home lacked electricity and indoor plumbing, and his career involved more quiet service than political bravado.
Born on October 1, 1924 in Plains, Georgia, James Earl “Jimmy” Carter Jr. was the first U.S. President to be born in a hospital, as home births were more common at the time. His early childhood was fairly humble. His father, Earl, was a peanut farmer and businessman who enlisted young Jimmy’s help in packing goods to be sold in town, while his mother was a trained nurse who provided healthcare services to impoverished Black families. As a student, Carter excelled at school, encouraged by his parents to be hardworking and enterprising. Aside from helping his father, he also sought work with the Sumter County Library Board, where he helped set up the bookmobile, a traveling library to service the rural areas of the county. After graduating high school in 1941, Carter attended the Georgia Institute of Technology for a year before entering the U.S. Naval Academy. He met his future wife, Rosalynn Smith, during his last year at the Academy, and the two were married in 1946. After graduating from the Academy the same year, Carter joined the U.S. Navy’s submarine service, although it was a dangerous job. He even worked with Captain Hyman Rickover, the “father of the nuclear Navy,” and studied nuclear engineering as part of the Navy’s efforts to build its first nuclear submarines. Carter would have served aboard the U.S.S. Seawolf, one of the first two such vessels, but the death of his father in 1953 prompted him to resign so that he could return to Georgia and take over the struggling family farm.
On returning to his home state, Carter and his family moved into a public housing project in Plains due to a post-war housing shortage. This experience inspired him to work with Habitat for Humanity decades later, and it also made him the first president to have lived in public housing. While turning around the fortunes of the family’s peanut farm, Carter became involved in politics, earning a seat on the Sumter County Board of Education in 1955. In 1962, he ran for a seat in the Georgia State Senate, where he earned a reputation for himself by targeting wasteful spending and laws meant to disenfranchise Black voters. Although he failed to win the Democratic primary in 1966 for a seat in the U.S. Congress (largely due to his support of the civil rights movement), he refocused his efforts toward the 1970 gubernatorial election. After a successful campaign, he surprised many in Georgia by advocating for integration and appointing more Black staff members than previous administrations. Though his idealism attracted criticism, Carter was largely popular in the state for his work in reducing government bureaucracy and increasing funding for schools.
Jimmy Carter’s political ambitions eventually led him to the White House when he took office in 1977. His Presidency took place during a chaotic time, in which the Iranian hostage crisis, a war in Afghanistan, and economic worries were just some of the problems he was tasked with helping to solve. After losing the 1980 Presidential race to Ronald Reagan, Carter and his wife moved back into their modest, ranch-style home in Georgia where they lived for more than 60 years, making him one of just a few presidents to return to their pre-presidential residences. Today, Carter is almost as well-known for his work after his presidency as during it, since he dedicated much of his life to charity work, especially building homes with Habitat for Humanity. He also wrote over 30 books, including three that he recorded as audio books which won him three Grammy Awards in the Spoken Word Album category. Not too shabby for a humble peanut farmer.
[Image description: Jimmy Carter’s official Presidential portrait; he wears a dark blue suit with a light blue shirt and striped tie.] Credit & copyright: Department of Defense. Department of the Navy. Naval Photographic Center. Wikimedia Commons. This work is in the public domain in the United States because it is a work prepared by an officer or employee of the United States Government as part of that person’s official duties under the terms of Title 17, Chapter 1, Section 105 of the US Code.
December 30, 2024
-
2 min | FREE | Humanities Word Curio | 2 CQ
Word of the Day with Merriam-Webster: December 30, 2024
\er-BAYN\ adjective
What It Means
Someone described as urbane is notably polite, confident, or polished in...
-
FREE | Art Appreciation Art Curio | 1 CQ
Who doesn’t like to dress up, especially when sitting for a portrait? French artist Jean-Marc Nattier painted this mythology-themed portrait in the 18th century, but the identity of the subject remains unknown. Portrait of a Woman as Diana depicts a woman in a white dress. She is wearing flowers in her hair, and a leopard pelt is draped over her arms as she holds a bow in her hands. To the left is a blue quiver full of arrows. Nattier was known for painting portraits that incorporated mythical imagery, and he often portrayed his subjects in the likeness of classical gods. In this case, the woman is depicted as Diana, goddess of the hunt, who is associated with bows, quivers, and pelts like those surrounding Nattier’s mysterious subject. The woman was previously thought to be either Madame de Pompadour, the mistress of King Louis XV, or the king’s daughter, but those theories have been debunked. Whoever she was, she seems to have had a certain divine je ne sais quoi.
Portrait of a Woman as Diana, Jean-Marc Nattier
(French, 1685–1766), 1752, Oil on canvas, 39.5 x 31.31 in. (100.4 x 79.5 cm.), The Cleveland Museum of Art, Cleveland, Ohio
[Image credit & copyright: Jean-Marc Nattier, The Cleveland Museum of Art, Bequest of John L. Severance 1942.643, Public Domain, Creative Commons Zero (CC0) designation.]
-
FREE | Work Daily Curio #3003 | 1 CQ
Home is where the agriculture is. A previously struggling village of just 300 residents in India is bouncing back after it won ownership rights to a nearby bamboo forest. Their success is due to a little-known piece of legislation that might end up helping other communities in similar situations.
For generations, the rural village of Pachgaon in Central India was in decline. Its population was dwindling as its residents went to cities in search of work, and those who stayed struggled to make ends meet thanks to violent seasonal floods that frequently destroyed their crops. But the villagers saw a potential answer in the Panchayat Act of 1996 and the Forest Rights Act of 2006, historic pieces of legislation that were designed to allow panchayats (tribal village councils) to apply for “community forest rights papers.” The papers would in turn allow the villagers to harvest various natural resources from the forests they inhabited, but many communities were unaware of their rights. The people of Pachgaon, however, sought the help of activist Vijay Dethe. Together, they applied for community forest rights with the government in 2009 and finally received them in 2012. Today, the villagers of Pachgaon have the right to work 2,486 acres of forest land and to harvest the plentiful bamboo that grows in the area.
Bamboo is used for everything from scaffolding and setting concrete to paper production, making it an in-demand resource. Different species of bamboo have different properties, so some are better suited for certain purposes than others. The bamboo from Pachgaon, for example, isn’t suitable for being turned into pulp for paper mills, but it has plenty of other uses. In the past ten years, harvesting bamboo has brought in 34 million rupees (around $400,000) to the village, and some residents who had moved away to cities have come back. Thanks to the availability of work in the village, residents can now make a comfortable living. And unlike traditional crops, the bamboo forests aren’t affected by flooding, so there is no seasonal threat to their livelihood. Meanwhile, the operation is managed by the gram sabha (village assembly), and profits are distributed equitably among the workers. Notably, the workers are paid equally regardless of gender, and there is no formal hierarchy in the management of operations. Seems like no one’s getting bamboozled there.
[Image description: Green bamboo against a dark background.] Credit & copyright: Valeriia Miller, Pexels
-
7 min | FREE | Work Business Curio | 4 CQ
From the BBC World Service: The incoming U.S. president claimed his country is paying excessive fees to use the waterway, which has been under Panama’s contr...
December 29, 2024
-
FREE | PP&T Curio | 1 CQ
This is one dispute between neighbors that got way out of hand. On this day in 1845, the U.S. Congress approved the annexation of the Republic of Texas, leading to the Mexican-American War. The conflict lasted for two brutal years and claimed the lives of nearly 40,000 soldiers.
Contrary to popular belief, Texas was not actually part of Mexico at the time of its annexation. Rather, it was a breakaway state—a Republic of its own that had gained independence from Mexico during the fittingly named Texas Revolution. When the U.S. decided to annex it, the Republic had existed for around 10 years. For most of its existence, the U.S. recognized the Republic of Texas as an independent nation, while Mexico did not. Mexico considered it a rebellious state, and was eager to quash the Republic’s independent economic dealings with other nations. At the same time, they threatened war if the U.S. ever tried to annex the Republic of Texas.
Mexico had plenty of reasons to worry since the Republic of Texas itself was in favor of being annexed. In 1836, the Republic voted to become part of the U.S., as they were eager to procure the protection of the U.S. military and gain a stronger economic standing. However, it wasn’t until 1845 that President John Tyler, with the help of President-elect James K. Polk, passed a joint resolution in both houses of Congress and officially made Texas part of the United States. This increase in U.S. territory followed a trend of westward expansion at the time.
Mexico wasn’t happy, but they didn’t make good on their threat to declare war over the annexation. Rather, they took issue with Texas’ new borders. Mexico believed that the border should only extend as far as the Nueces River, but Texas claimed that their border extended all the way to the Rio Grande and included portions of modern-day New Mexico and Colorado. In November 1845, the U.S. sent Congressman John Slidell to negotiate a purchase agreement with Mexico for the disputed areas of land. At the same time, the U.S. Army began to take up stations within the disputed territory, infuriating Mexican military leaders and leading to open skirmishes between Mexican and U.S. troops. President Polk had run on a platform of westward U.S. expansion, so he wasn’t about to cede any land to Mexico, and Mexico wouldn’t allow it to be purchased. So, Polk urged Congress to declare war on Mexico, which they did on May 13, 1846.
From the start, Mexico faced serious disadvantages. Their armaments were outdated compared to those of U.S. troops, as most Mexican soldiers used surplus British muskets while U.S. soldiers had access to rifles and revolvers. Most difficult for Mexico to overcome were its own, severe political divisions. Centralistas, who supported a centralized Mexican government, were bitter rivals with federalists, who wanted a decentralized government structure. These two groups often failed to work together within military ranks, and sometimes even turned their weapons on one another. Even Mexican General Antonio López de Santa Anna, Mexico’s most famous military leader, struggled to get his nation’s divided political factions to fight together.
These obstacles quickly proved insurmountable for the Mexican military. After a three-day battle, the U.S. handily captured the major city of Monterrey, Mexico, on September 24, 1846. Not long after, the U.S. advanced into central Mexico and the bloody Battle of Buena Vista ended ambiguously, with both sides claiming victory. However, Mexico never decisively won a single battle in the war, and on September 14, 1847, the U.S. Army captured Mexico City, ending the fighting.
It wasn’t exactly smooth sailing from that point on. The Mexican government had to reform enough to be able to negotiate the war’s ending. This took time, since most of the Mexican government had fled Mexico City in advance of its downfall. It wasn’t until February 2, 1848, that the Treaty of Guadalupe Hidalgo was signed, and the war officially ended. The treaty granted the U.S. all of the formerly-contested territory, which eventually became the states of New Mexico, Utah, Arizona, Nevada, Colorado, California, and, of course, Texas. In return, Mexico got $15 million—far less than the U.S. originally offered to purchase the territory for. It might not have been a great deal to begin with—but Mexico likely ended up wishing they'd taken it.
[Image description: An illustration of soldiers in blue uniforms on horseback, one holding a sword aloft. Other soldiers are on the ground in disarray as others march up a distant hill amid clouds of smoke.] Credit & copyright: Storming of Independence Hill at the Battle of Monterey Kelloggs & Thayer, c. 1850-1900. Library of Congress Prints and Photographs Division Washington, D.C. 20540 USA. Control number: 93507890. Public Domain.
-
7 min | FREE | Work Business Curio | 4 CQ
New research shows that not only are back-to-the-office mandates unpopular with employees, they lead to more turnover and chase away the employees with the m...
December 28, 2024
-
1 min | FREE | Humanities Word Curio | 1 CQ
Word of the Day with Merriam-Webster: December 28, 2024
\kun-DOHN\ verb
What It Means
To condone something that is considered wrong is to forgive or approve it, o...
-
FREE | Golf Sporty Curio | 1 CQ
When you’re at the golf course, stray balls and runaway carts are the biggest dangers you’re likely to encounter. But one “hazard” that golfers have real reason to fret over is the chance of lightning strikes. Recently, a tragic lightning fatality at a golf course in Atlanta, Georgia, even prompted a wrongful death lawsuit. While plenty of golfers have openly spoken about the dangers posed by lightning, data indicates that injuries and deaths from golf-course lightning strikes are exceedingly rare. According to the National Weather Service, there were 418 lightning fatalities in the U.S. between 2006 and 2019. While two out of three victims were engaged in an outdoor leisure activity when they were struck, only ten individuals were golfing. Fishing actually accounted for four times as many fatalities, with 40 deaths, followed by camping with 20 deaths and boating with 18 deaths. The summer months between June and August accounted for the majority of lightning-related deaths. Of course, the low death count doesn’t mean that it’s safe to play golf during a lightning storm. Rather, the low number of lightning fatalities is likely related to people taking reasonable precautions and paying attention to the weather. When you see dark clouds rolling over the green, it’s still best to put away the golf umbrella and head back to the clubhouse.
[Image description: Lightning in a purple sky.] Credit & copyright: Martinus, Pexels
-
7 min | FREE | Work Business Curio | 4 CQ
Stock and bond markets took unnerving tumbles yesterday when the Federal Reserve Chair suggested there won’t be as many interest rate cuts next year. The S&P...
December 27, 2024
-
8 min | FREE | Work Business Curio | 5 CQ
From the BBC World Service: South Korea has voted to impeach acting President Han Duck-soo, two weeks after parliament voted to impeach President Yoon Suk Ye...
-
2 min | FREE | Humanities Word Curio | 2 CQ
Word of the Day with Merriam-Webster: December 27, 2024
\FYE-stee\ adjective
What It Means
Feisty describes someone who has or shows a lively aggressiveness espec...
-
FREE | Mind + Body Daily Curio | 1 CQ
Here’s one from the holiday archives: A look back at the history of one of the season’s most festive drinks!
Love it or hate it, there’s no doubt that it’s festive. Eggnog is possibly the most divisive of all holiday drinks, but it’s also one of the most enduring. Eggnog has a surprisingly long history, and though it’s associated with homey holiday parties today, it was once considered too fancy for everyday drinkers.
Modern eggnog is an alcoholic cocktail most often made with cream, sugar, egg yolks, whipped egg whites, nutmeg, and either rum, brandy, or bourbon. Cinnamon is sometimes added for an extra festive kick. It’s easy to see why not everyone is eager to down a glass of the frothy concoction—eggnog may be sweet, but plenty of people will pause at the thought of drinking eggs. Yet, at the time of eggnog’s invention, eggs were a fairly normal ingredient. Eggnog is thought to date back to 13th century England, where it was named after two words: “grog”, meaning rum, and “noggins”, meaning wooden mug. The cocktail evolved from posset, a nonalcoholic celebratory drink that included milk, eggs, and figs that was often served as punch at social gatherings. Like posset, early eggnog was served hot. It didn’t even include alcohol until the 17th century, when celebrants added sherry to the mix. Since both sherry and eggs were expensive in Europe at the time, eggnog was considered an upper-class drink, and was mainly enjoyed by the aristocracy.
Things changed when European settlers began making their way to the U.S. The colonies included many farms, so eggs were widely available, and unlike wine and sherry, rum and whiskey weren’t heavily taxed. So, alcoholic American eggnog began making its way into colonial celebrations, including Christmas parties. It’s thought that the drink became associated with winter because it was originally served hot, and since Christmas is the biggest wintertime celebration, the two were naturally conflated.
Eggnog remained warm until the early 1900s, when the addition of ice to many cocktails convinced Americans to try it cold. The chill has stuck since then, and even most Europeans take their eggnog cold today. We’re guessing that anyone hesitant to try an egg-heavy cocktail wouldn’t warm up to the idea if it was served hot!
[Image description: A container labeled “Egg Nog” with holly on the label behind a glass of eggnog with a striped straw, on a table with holiday decorations.] Credit & copyright: Jill Wellington, Pexels