Curio Cabinet
April 24, 2025
- 7 min | FREE | Work Business Curio | 4 CQ
The government has moved to give immigration officials access to IRS records. The Trump administration wants to use tax information to find people under depo...
- 2 min | FREE | Humanities Word Curio | 2 CQ
Word of the Day with Merriam-Webster: April 24, 2025
\ah-STEN-suh-bul\ adjective
What It Means
Ostensible is used to describe something that seems or is said to b...
- FREE | Engineering Nerdy Curio | 1 CQ
Everyone loves a good tutorial video…even robots. Despite humanity’s recent breakthroughs in AI technology, there still aren’t many robots around. That’s because robots have physical bodies, which means that they have to navigate the physical world…which means that they have to be able to learn and adapt. That’s a tall order for a non-living entity with no brain. Typically, robots have to be programmed with very specific instructions, and then re-programmed with new information whenever they inevitably encounter real-world obstacles. Now, though, researchers at Cornell University in New York have developed an AI-powered framework that allows robots to learn new things simply by watching a video. It’s called RHyME (Retrieval for Hybrid Imitation under Mismatched Execution), and it works by equipping robots with a “memory bank” of moving images that they can access when they encounter tasks they don’t understand. While videos of humans performing various tasks have been used to train robots before, the method was never completely successful because humans don’t move like robots, and their movements would therefore confuse their robotic pupils. This is the problem that RHyME set out to fix. When a RHyME-equipped robot watches how-to videos of humans performing tasks, it retains all the images within a memory bank. After watching multiple videos, the robot has a lot of stored information to draw upon, and it can piece together new actions using its memory for “inspiration.” For example, if a robot has seen a video of a person opening a book, it can not only open a book itself, but also draw on other videos it might have seen of humans grasping, lifting, and setting down objects. This would allow it to grab, lift, and set down the book in addition to opening it, even if it hadn’t seen a human perform those exact actions with a book. RHyME could open the door to more adaptive robotic learning, allowing robots to safely perform tasks in all sorts of new environments. Futuristic robotic butlers, here we come!
[Image description: A digital illustration of a robotic hand reaching toward geometric grid-like shapes.] Credit & copyright: Tara Winstead, Pexels
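For readers who want a concrete sense of the “memory bank” retrieval idea described above, here is a minimal, hypothetical sketch in Python. It is not the actual RHyME code: the class name, the toy embedding vectors, and the cosine-similarity scoring are all assumptions made purely to illustrate how a robot might match a new task against remembered demonstration clips.

```python
# Toy illustration of retrieval over a "memory bank" of demonstration clips.
# NOT the real RHyME implementation; embeddings and labels are invented.
import numpy as np

class MemoryBank:
    def __init__(self):
        self.embeddings = []  # one feature vector per stored demonstration clip
        self.actions = []     # the action label paired with each clip

    def add(self, embedding, action):
        """Store a demonstration clip as an (embedding, action label) pair."""
        self.embeddings.append(np.asarray(embedding, dtype=float))
        self.actions.append(action)

    def retrieve(self, query, k=3):
        """Return the k stored actions whose clips most resemble the query."""
        bank = np.stack(self.embeddings)
        q = np.asarray(query, dtype=float)
        # Cosine similarity between the query clip and every remembered clip.
        sims = bank @ q / (np.linalg.norm(bank, axis=1) * np.linalg.norm(q) + 1e-9)
        top = np.argsort(sims)[::-1][:k]
        return [(self.actions[i], float(sims[i])) for i in top]

# Hypothetical usage: in practice the vectors would come from a learned video encoder.
bank = MemoryBank()
bank.add([0.9, 0.1, 0.0], "grasp object")
bank.add([0.1, 0.8, 0.1], "lift object")
bank.add([0.0, 0.2, 0.9], "set object down")

# A new, unseen task is matched against remembered clips, and the closest
# ones are drawn on as "inspiration" for piecing together a new behavior.
for action, score in bank.retrieve([0.7, 0.3, 0.1], k=2):
    print(f"{action}: similarity {score:.2f}")
```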
- FREE | Biology Daily Curio #3070 | 1 CQ
The colors of spring are always a sight to behold, but some of them we can’t actually see. While we’ve known for decades that there are certain colors the human eye can’t detect, new research has uncovered a previously unknown one—and has even helped a few people to see it.
Humans see color because of light-sensitive cells in our eyes called cones. Some cones are sensitive to long wavelengths of light, some to medium wavelengths, and others to short wavelengths. While short-sensitive cones are stimulated by white-blue light and long-sensitive cones by red light, medium-sensitive cones aren’t stimulated by any light independently of other cones. To see what would happen if these medium-sensitive cones were stimulated directly, U.S. researchers first mapped the retinas of five study participants, noting the exact positions of their cones. Then, a laser was used to stimulate only the medium-sensitive cones in each person’s eye.
Participants reported seeing a large patch of color different from any they’d seen before. It was described as an impossibly saturated blue-green. The new color has been dubbed “olo”, a name based on the binary code 010, which indicates that only the medium-sensitive cones were activated. To ensure that participants had actually seen the same color, they each took color-matching tests. When given an adjustable color wheel and asked to match it as closely as possible to olo, all participants selected a teal color.
As amazing as the results seem, some scientists are dubious that olo is actually its own color, claiming that, though it can only be seen via unnatural stimulation, the color itself is just a highly saturated green. As much as we’d love to see whether they’re right, we’re not quite ready to have lasers flashed in our eyes. For now, we’ll stick with regular, springtime green.
[Image description: A digital illustration representing rainbow light shining through a triangular, white prism.] Credit & copyright: Author-created illustration. Public Domain.
April 23, 2025
- 7 min | FREE | Work Business Curio | 4 CQ
From the BBC World Service: The International Monetary Fund has cut its prediction for global economic growth from 3.3% to 2.8%. In its assessment, it descri...
- 2 min | FREE | Humanities Word Curio | 2 CQ
Word of the Day with Merriam-Webster: April 23, 2025
\SLUFF\ verb
What It Means
Slough is a formal verb used for the action of getting rid of something unwanted. ...
- FREE | Biology Nerdy Curio | 1 CQ
Hey there, that’s no bear! Red pandas might not be closely related to actual pandas, but they do still resemble their closest living relatives: raccoons. With their reddish fur, “masked” faces, and long, ringed tails, red pandas are a striking sight. No wonder they’re cultural icons in Japan, and have been the subject of movies, books, and cartoons in recent years. Beloved as they are, though, red pandas are also endangered, with fewer than 10,000 living in the wild.
Red pandas are unique in ways that have nothing to do with their good looks. They’re the only living members of the family Ailuridae, which falls within the superfamily Musteloidea. Other members of this superfamily include raccoons, skunks, and weasels. Like these animals, red pandas are officially considered carnivores because of certain physical characteristics, like their teeth and skull shape. However, unlike most other members of Musteloidea, red pandas almost exclusively eat vegetation: specifically bamboo. In fact, it makes up around 95 percent of their diet. This is the real reason that they share their name with giant pandas, one of the only other animals on Earth that survives on almost nothing but bamboo. Compared to giant pandas, though, red pandas are quite small. They’re only about the size of a domestic cat, reaching lengths of around 43 inches (including their tails) and weighing between eight and 17 pounds.
In their natural habitat of high-altitude, mountainous Asian forests, red pandas spend most of their lives in the treetops. They are excellent climbers, with special wrist bones that act as pseudo-thumbs, allowing them to grip branches. Like squirrels, red pandas use their long tails for balance. They lead solitary lives until mating season, which takes place from January to March. Adult red pandas split up after mating, and females give birth in summer to one to four cubs, which stay with their mother for around a year.
Unfortunately, habitat destruction has caused red panda birth rates to plummet, and they’ve been considered endangered since 2015. Because red pandas are such popular cultural icons, they’re also targeted by the illegal pet trade. Conservation organizations like the Red Panda Network are working in countries like Nepal and Bhutan to increase the red panda population, but stricter protections for red pandas’ habitat will be needed to make any lasting progress. Hopefully their striking looks can strike a chord with government officials.
[Image description: A red panda surrounded by snow, walking with one paw raised.] Credit & copyright: Wikimedia Commons, Dave Pape. This work has been released into the public domain by its author, Davepape. This applies worldwide.
- FREE | Sports Daily Curio #3069 | 1 CQ
When you're at a baseball game, the only sound sweeter than the crack of a bat is the peal of a pipe organ. On April 26, 1941, a pipe organ was played for the first time at a professional baseball game, creating an unexpected musical tradition that has lasted for decades. While the sound of a pipe organ is heavily associated with baseball today, live music was once something of a novelty at large sporting events. The first musician to play a pipe organ at the ballpark was Roy Nelson, who entertained fans at Wrigley Field in Chicago. At the time, the music couldn't be played over the loudspeakers, so Nelson’s performance was a pre-game event. Due to copyright concerns (since the games were being aired on the radio), Nelson was only able to play for two days, but the trend caught on anyway. In 1942, Gladys Goodding, a silent film musician who had experience playing large events at Madison Square Garden, became the first professional organist in baseball history. Her music, which punctuated different parts of the game and encouraged audience participation, made her something of a legendary figure. She even earned the nickname “The Ebbets Field Organ Queen” during her tenure playing for the Brooklyn Dodgers. Her career as a baseball organist lasted until 1957, when the team moved to Los Angeles. Other ballparks wanted musicians of their own, and even other sports were eager to get in on the action. For example, organist John Kiley played for the Celtics basketball team, the Red Sox baseball team, and the Bruins ice hockey team in Boston. While it ultimately didn’t catch on in other sports, today organ music is associated with baseball games almost as much as it’s associated with churches. Of course, dedicated fans would probably tell you that there’s little difference between baseball and religion.
[Image description: A black-and-white photo of a baseball on the ground.] Credit & copyright: Rachel Xiao, Pexels
April 22, 2025
- 7 min | FREE | Work Business Curio | 4 CQ
From the BBC World Service: Sky-high tariffs on Chinese goods arriving at the U.S. border are already having a knock-on effect for many companies. Many manuf...
- 2 min | FREE | Humanities Word Curio | 2 CQ
Word of the Day with Merriam-Webster: April 22, 2025
\lee-AY-zahn\ noun
What It Means
Liaison refers to a person who helps organizations or groups work together a...
- FREE | Guitar Song Curio | 2 CQ
You could always count on Lonnie to get his licks in. This month in 2016, the world said goodbye to American guitarist Lonnie Mack, who helped popularize the electric guitar with his impressive instrumentals. Specializing in rock, blues, and country music, Mack took a liking to electric guitars in the mid-1950s, before they were a popular mainstream instrument. He quickly endeared audiences to the guitars by using them to craft dynamic instrumental tracks, like 1963’s Wham!. The bouncy, danceable song wouldn't have been out of place at a sock hop, yet it featured significant reverb that gave it a rough-around-the-edges feel, courtesy of Mack’s electric guitar. The song reached number 24 on the Billboard pop charts, and piqued both public and industry interest in electric guitars. Today, of course, they're industry staples. No wonder many rock historians believe that Mack helped create our modern “rock guitar” sound. You could say he was instrumental in the process.
- FREE | World History Daily Curio #3068 | 1 CQ
Nobody likes Mondays, but you’ve probably never had one as bad as this. On Easter Monday in 1360, a deadly hailstorm devastated English forces in the Hundred Years' War so badly that they ended up signing a peace treaty. The Hundred Years' War between England and France was already a bloody conflict, but on one fateful day in 1360, death was dealt not by soldiers, but by inclement weather. King Edward III of England had crossed the English Channel with his troops and was making his way through the French countryside, pillaging throughout the winter. In April, Edward III's army was approaching Paris when they stopped to camp outside the town of Chartres. They weren't in any danger from enemy forces, but they would suffer heavy losses regardless. On what would come to be known as "Black Monday," a devastating hailstorm broke out over the area. First, a lightning strike killed several people; then massive hailstones fell from the sky, killing 1,000 English soldiers and 6,000 horses.
It might seem unbelievable, but there are modern records of hailstones as wide as eight inches, weighing nearly two pounds. That’s heavy enough to be lethal. Understandably, the hailstorm was seen as a divine omen, and Edward III went on to negotiate the Treaty of Brétigny. According to the treaty, Edward III was to renounce his claims to the throne of France and was given some territory in the north in exchange. The treaty didn't end the Hundred Years' War for good. The conflict started up again just nine years later, after the King of France accused Edward III of violating the terms of the treaty. The war, which began in 1337, didn’t officially conclude until 1453. Maybe weirder weather could have ended it sooner!
[Image description: Hailstones on ice.] Credit & copyright: Julia Filirovska, Pexels
April 21, 2025
- FREE | Art Appreciation Art Curio | 1 CQ
Holy cow...well, actually, holy bull. Apis was a bull deity in ancient Egypt that was associated with many different concepts, from fertility and grain production to death and the underworld. The piece above is a stone figurine of a bull with a broken disc on its head. The stone is green and the surface contains etched designs. While many Egyptian gods, which began to be worshipped around 5,000 years ago, were believed to take on the forms of animals, Apis only took on the form of a bull. Apis grew to be strongly associated with Ptah, a creator deity worshipped in Memphis. Black bulls with white, triangular markings on their heads were said to be favorites of Apis, and some of these “Apis bulls” were even given servants and lived lives of luxury in Egypt. In a ritual called The Running of Apis, the bulls were let loose in Memphis’s temple precinct, and their running symbolized fertilizing the land. When an Apis bull died, it was given an opulent burial and all of Egypt mourned. Don't try to take this bull by the horns, lest you incur the wrath of the gods.
Apis Bull, 400–100 BCE, Serpentinite, 20.87 x 7.5 x 23.25 in. (53 x 19 x 59 cm.), The Cleveland Museum of Art, Cleveland, Ohio.
[Image credit & copyright: The Cleveland Museum of Art, Leonard C. Hanna Jr. Fund 1969.118, Public Domain Creative Commons Zero (CC0) designation.]
- 7 min | FREE | Work Business Curio | 4 CQ
Yesterday, Federal Reserve Chair Jerome Powell took a wait-and-see posture on interest rates amid market disruptions and increased talk of recession linked t...
- FREE | US History Daily Curio #3067 | 1 CQ
San Francisco is no stranger to earthquakes, but this one was a particular doozy. This month in 1906, the City by the Bay was devastated and permanently reshaped by what would come to be known as the Great 1906 San Francisco Earthquake. On the morning of April 18, 1906, at 5:12 AM, many San Francisco residents were woken up by foreshocks, smaller earthquakes that can occur hours to minutes ahead of a larger one. Just 20 seconds or so later, an earthquake with a magnitude of 7.9 hit the city in earnest, shaking the ground for a full minute. The epicenter of the earthquake was on the San Andreas Fault, where 296 miles of the fault’s northern portion ruptured, sending out a destructive quake that could be felt as far north as Oregon and as far south as Los Angeles. The earthquake was so powerful that buildings toppled and streets were torn apart, but that was only part of the event’s destructive power. There's a reason that it's sometimes called the Great San Francisco Earthquake and Fire. The ensuing flames, caused by burst gas pipes and upended stoves, caused almost as much damage as the earthquake itself. Over the course of four days, 28,000 buildings in 500 blocks were reduced to rubble and ash. It was around $350 million worth of damage, but the loss of property paled in comparison to the loss of life. An estimated 3,000 people died in the earthquake and around 250,000 people were left homeless in its aftermath. The disaster had just one silver lining: geologic observations of the fault and a survey of the devastation proved to be a massive help in understanding how earthquakes cause damage, and the city was quickly rebuilt to be more earthquake- and fire-resistant. No matter what, though, the real fault lies with the fault.
[Image description: A black-and-white photo of San Francisco after the 1906 earthquake, with many ruined buildings.] Credit & copyright: National Archives Catalog. Photographer: Chadwick, H. D. (U.S. Gov War Department. Office of the Chief Signal Officer.) Images Collected by Brigadier General Adolphus W. Greely, Chief Signal Officer (1887-1906), between 1865–1935. Unrestricted Access, Unrestricted Use, Public Domain.
April 20, 2025
- 8 min | FREE | Work Business Curio | 5 CQ
The American Revolutionary War began 250 years ago Saturday. You probably know the political reasons behind the American colonists' fight for independence, b...
- FREE | Political Science PP&T Curio | 1 CQ
Tariffs, duties, customs—no matter what you call them, they can be a volatile tool capable of protecting weak industries or bleeding an economy dry. As much as tariffs have been in the news lately, they can be difficult to understand. Luckily, one can always turn to history to see how they’ve been used in the past. In the early days of the U.S., tariffs helped domestic industries stay competitive. However, they can easily turn harmful if they’re implemented without due consideration.
A tariff is a tax applied to goods that are imported or exported, though the latter is rarely used nowadays. Tariffs on exports were sometimes used to safeguard limited resources from leaving the country, but when the word "tariff" is used, it almost always means a tax applied to imports. Tariffs are paid by the entity importing the goods, and they can be either “ad valorem” or "specific" tariffs. Ad valorem tariffs are based on a percentage of the value of the goods being taxed, while specific tariffs are fixed amounts, regardless of the total value. It's easy to think that tariffs are categorically detrimental, but there’s sometimes a good reason for them. Making certain types of imported goods more expensive might help domestic producers of those goods stay more competitive by allowing them to sell their goods at lower prices. On the other hand, poorly-conceived tariffs can end up raising the prices of goods across the board, putting economic pressure on consumers without helping domestic industries. Tariffs used to be much more common around the world, but as international trade grew throughout the 20th century, they became less and less so. In the U.S., at least, many factors led to tariffs’ decline.
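To make the difference between the two tariff types concrete, here is a small, hypothetical calculation sketch in Python. The 10 percent rate, the $5-per-unit duty, and the shipment figures are invented for illustration and do not reflect any actual tariff schedule.

```python
# Toy comparison of an ad valorem tariff (a percentage of declared value)
# versus a specific tariff (a fixed amount per unit). All numbers are hypothetical.

def ad_valorem_tariff(shipment_value, rate):
    """Tariff owed as a percentage of the shipment's declared value."""
    return shipment_value * rate

def specific_tariff(unit_count, duty_per_unit):
    """Tariff owed as a fixed amount per unit, regardless of value."""
    return unit_count * duty_per_unit

shipment_value = 20_000.00  # declared value of the imported goods, in dollars
unit_count = 1_000          # number of units in the shipment

print(ad_valorem_tariff(shipment_value, rate=0.10))     # 2000.0
print(specific_tariff(unit_count, duty_per_unit=5.00))  # 5000.0

# If the same shipment were declared at $60,000, the ad valorem tariff would
# rise to $6,000, while the specific tariff would stay at $5,000.
```

Note how the ad valorem amount scales with the value of the goods, while the specific tariff depends only on the quantity imported.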
In 1789, the Tariff Act was one of the first major pieces of legislation passed by Congress, and it created a massive source of revenue for the fledgling nation. Tariffs helped domestic industries gain their footing by leveling the playing field against their better-established foreign competitors, particularly in Britain. By the beginning of the Civil War, tariffs accounted for around 90 percent of the U.S. government’s revenue. As Americans took up arms against each other, however, there was a sudden, dire need for other sources of government funding. Other taxes were introduced, leading to tariffs becoming less significant. Still, even immediately after the war, tariffs accounted for around 50 percent of the nation's revenue. During the Great Depression, tariffs caused more problems than they solved. The Smoot-Hawley Tariff Act of 1930 was intended to bolster domestic industries, but it also made it less feasible for those industries to export goods, hindering their overall business. By the start of World War II, the U.S. government simply could not rely on tariffs as a significant source of revenue any longer. Social Security, the New Deal, and exponentially growing military expenditures, among other things, created mountains of expenses far too large for tariffs to cover. Thus, tariffs became less popular and less relevant over the decades.
Today, tariffs are typically used as negotiation tools between countries engaged in trade. Generally, tariffs are applied on specific industries or goods. For example, tariffs on steel have been used a number of times in recent history to aid American producers. However, the tariffs making the news as of late are unusual. Instead of targeting specific industries, tariffs are being applied across the board against entire countries, even on goods from established trade partners like Canada and Mexico. Only time will tell how this will impact U.S. consumers and U.S. industries. It’ll be historic no matter what…but that doesn’t always mean it will be smooth sailing.
[Image description: A U.S. flag with a wooden pole.] Credit & copyright: Crefollet, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
April 19, 2025
- 2 min | FREE | Humanities Word Curio | 2 CQ
Word of the Day with Merriam-Webster: April 19, 2025
\fass-TID-ee-us\ adjective
What It Means
Someone described as fastidious is extremely or overly careful about...
- FREE | Golf Sporty Curio | 1 CQ
Finally, a chance to don green on the green! On April 13th, Northern Irish golfer Rory Daniel McIlroy won the Masters Tournament and was awarded his green jacket, the traditional prize. It was a nail-biting win, as McIlroy faced off against British golfer Justin Rose in a sudden-death playoff. The suspense was intensified by the fact that McIlroy had been famously close to winning the Masters several times before, starting in 2011. That year, he entered the tournament’s final round with a four-stroke lead, but shot a triple-bogey on the tenth hole, followed by more bogeys that left him unable to recover. He made it to the Tournament’s top ten in 2014, finished solo-fourth in 2015, and claimed 10th place in 2016. In 2018, he ended up finishing in a tie for fifth place despite tying his then-personal-best score. In 2022, he came tantalizingly close to winning, but ended up in second place. After failing to make the top ten in 2024, McIlroy came out literally and figuratively swinging at the 2025 tournament. Though he began the tournament’s final day with a five-shot lead, it had disappeared by the 18th hole. That’s when McIlroy hit his ball just four feet from the cup, setting up a sudden-death playoff against Rose that McIlroy ultimately won. Winners are required to pass their jackets on to the next winner after a year...but maybe McIlroy should be allowed to hold on to his just a tad longer? He's earned it!
- 9 min | FREE | Work Business Curio | 5 CQ
It's the law that insurance companies have to cover the costs of certain screenings for cancer, diabetes, infectious diseases and more. Patients could soon h...
April 18, 2025
- 8 min | FREE | Work Business Curio | 5 CQ
From the BBC World Service: Ukraine says it has signed a memorandum of intent on a minerals deal with the United States after negotiations in Washington. Plu...
- 2 min | FREE | Humanities Word Curio | 2 CQ
Word of the Day with Merriam-Webster: April 18, 2025
\kuh-LAB-uh-rayt\ verb
What It Means
To collaborate is to work with another person or group in order to do or...
- FREE | Mind + Body Daily Curio | 1 CQ
Fire up the grill: backyard barbeque season is nearly upon us! In many places in the U.S., no outdoor get-together is complete without a scoop of Boston baked beans. This famous side’s sweet flavor sets it apart from other baked beans. Its origins, though, are anything but sweet.
Like other kinds of baked beans, Boston baked beans are made by boiling beans (usually white common beans or navy beans) and then baking them in sauce. The sauce for Boston baked beans is sweetened with molasses and brown sugar, but also has a savory edge since bacon or salt pork is often added.
Boston baked beans are responsible for giving their titular city the nickname “Beantown.” In the years leading up to and directly following the Revolutionary War, Boston boasted more molasses than any other American city, but Bostonians didn’t produce it themselves. The city’s coastal position made it a major hub of the Triangle Trade between the Americas, Europe, and Africa. In this brutal trade, Europe shipped goods to Africa, which were traded for enslaved people, who were shipped to the Americas to farm and produce goods like cotton and rum, which were then shipped to Europe. Boston’s molasses was produced by enslaved people on sugar plantations in the Caribbean, then used in Boston to produce rum as part of the Triangle Trade. Leftover molasses became a common household item in Boston, and was used to create many New England foods that are still famous today, from molasses cookies to Boston baked beans.
In the late 19th century, large food companies began using new, industrial technology to mass produce and can goods. This included foods that were only famous in specific regions, like Boston baked beans. Once they were shipped across the country, Boston baked beans became instantly popular outside of New England. Today, most baked beans on grocery shelves are sweet and syrupy, even if they don’t call themselves Boston baked beans. If you get popular enough, your name sometimes dissolves into the sauce of the general culture.
[Image description: A white bowl filled with baked beans and sliced hot dogs.] Credit & copyright: Thomson200, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.