Curio Cabinet
September 9, 2025
-
Music Song Curio (Free, 2 CQ)
Indie, folk, rock…there’s no need to choose. That was seemingly the ethos of famed Canadian-American band Buffalo Springfield, which incorporated sounds from all three genres into their music, including their biggest 1960s hits. Bassist Bruce Palmer, born on this day in 1946, played a key role in the band’s unique sound on tracks like 1967’s “Rock & Roll Woman.” The song begins with folksy harmonies before Palmer’s groovy bassline turns it into something almost psychedelic, and jagged guitar riffs finally carry the track into pure rock territory. Though the song only reached number 44 on the Billboard Hot 100 the year it was released, it’s still one of the band’s best-remembered hits, and that’s saying a lot considering that they were inducted into the Rock and Roll Hall of Fame in 1997. “Rock & Roll Woman” was inspirational to many, including Stevie Nicks, who reportedly heard the song when she was 19 and felt almost as if it were describing her future life as a musician, with its lyrics about a beautiful, mysterious rock star. Buffalo Springfield might not have been purposefully prophetic, but that doesn’t mean that Nicks was wrong!
-
Geography Daily Curio #3148 (Free, 1 CQ)
Maps can tell you how to get somewhere, but they’re not always great at telling you where you’re going. This especially applies to common world maps known as Mercator projection maps, which distort the real size of various places. Now, an international campaign in Africa is hoping to change the way people view the world.
The “Correct The Map” campaign, endorsed by the African Union, is using one of the oldest criticisms of the Mercator projection against it in the hopes of changing people’s perceptions of the continent. Though Africa is home to over 1.4 billion people, those looking at a world map tend to underestimate the continent’s size, population, and global significance due to the undersized portrayal that results from the Mercator projection. Developed by Flemish cartographer Gerardus Mercator in the 16th century, the Mercator projection is a cylindrical map projection that represents the spherical world on a flat plane, with meridians spaced evenly and lines of latitude drawn farther and farther apart the closer they get to the poles. The Mercator projection was ideal for nautical navigation in the era before computer-assisted navigation, as it depicts rhumb lines (lines of constant course) as straight lines, making charting easier.
Beyond navigation, however, the Mercator projection has some significant flaws. Because the projection’s scale grows without limit as the lines of latitude approach the poles, landmasses closer to the poles appear increasingly inflated. As a result, Greenland and other far-northern landmasses look much larger than they actually are. On a Mercator map, Greenland appears to rival Africa in size, when it’s actually only a fraction of Africa’s size, roughly one-fourteenth. This might be just a curious shortcoming of an otherwise practical map projection, but supporters of Correct The Map argue that such visual distortions minimize the cultural and economic influence of Africa. They therefore hope to promote an alternative projection that more accurately depicts the relative size of landmasses and encourages a more balanced portrayal. It’s a big, wide world, and it just might call for a big, wide map.
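For the curious, the distortion can be put into numbers. Below is a minimal Python sketch, purely illustrative and not part of the campaign’s materials, of the spherical Mercator formula and the factor by which the projection inflates apparent area at a given latitude; the printout shows why land at Greenland’s latitudes ends up looking several times larger, relative to equatorial Africa, than it really is.

```python
import math

def mercator_y(lat_deg, radius=1.0):
    """Northing of a latitude under the spherical Mercator projection."""
    lat = math.radians(lat_deg)
    return radius * math.log(math.tan(math.pi / 4 + lat / 2))

def area_inflation(lat_deg):
    """How many times too large an area at this latitude appears,
    relative to the equator: the linear scale factor is sec(lat),
    so apparent area scales as sec(lat) squared."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

# Africa straddles the equator; Greenland lies at roughly 60-80 degrees north.
for lat in (0, 15, 45, 60, 72):
    print(f"lat {lat:>2} deg: y = {mercator_y(lat):.2f} R, "
          f"apparent area ~{area_inflation(lat):.1f}x true size")
```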
[Image description: A map of the world with green continents and blue oceans, without words or labels.] Credit & copyright: Noleander, Wikimedia Commons. This file is made available under the Creative Commons CC0 1.0 Universal Public Domain Dedication.
September 8, 2025
-
Art Appreciation Art Curio (Free, 1 CQ)
It was a fashion item that was shawl to please! Though mostly forgotten now, the fichu was once an indispensable part of a woman’s wardrobe throughout much of Europe in the 18th century. The picture above shows a black, triangular example adorned with floral lace on transparent silk. A fichu was a shawl with a specific purpose. It could be either square and folded in half or made in a triangular shape, but either way, it was meant to go around the neck and come together at the front to cover the neckline. This was partly for warmth and partly for modesty, as the dresses of the 18th century and the Regency era had lower necklines than those of the periods directly before or after. Triangular fichus were sometimes made with a rounded curve on the inside edge to better conform to the shape of the neck. Shawls and scarves are still worn today, of course, but what set the fichu apart was the distinct purpose it served. When fashions changed and necklines rose, the fichu fell out of favor. Spare a tissue for the fichu.
Fichu, 19th century, Silk, The Metropolitan Museum of Art, New York City, New York
[Image credit & copyright: The Metropolitan Museum of Art, Bequest of Emma T. Gary, 1934. Public Domain.]
-
Science Daily Curio #3147 (Free, 1 CQ)
If three bodies are a problem, how bad are four? Astronomers recently discovered a quadruple star system with some unusual inhabitants, one of the first systems of its kind ever found. While our own solar system has just one star, out in the vast cosmos there are many systems with multiple stars. Most of them are binary systems, where two stars are locked in orbit with each other, engaged in a cosmic dance that can last eons until one of them explodes into a brilliant supernova. More rarely, the two stars combine into a single, more massive star. Lesser known are triple-star systems, also called ternary or trinary systems. These consist of three stars locked in orbit, usually with two stars orbiting each other and a third orbiting around them both. Just a cosmic stone’s throw away, the Alpha Centauri system is known to contain three stars. To date, multiple-star systems containing up to seven stars have been discovered, but the recently-discovered quadruple system has something a little extra that makes it even rarer. Right here in the Milky Way, of all places, astronomers found a quadruple-star system containing two brown dwarfs. A brown dwarf is a rare object, also known as a “failed star.” As its nickname implies, a brown dwarf is a sub-stellar object that is too massive to be considered a planet, yet not massive enough to start and sustain the hydrogen fusion that powers true stars. Since no fusion is taking place, brown dwarfs are cold and emit very little energy, making them difficult to find, let alone study, compared to their brighter, full-fledged stellar counterparts. Sure, you could argue that having two sub-stellar objects disqualifies it from being a quadruple-star system, but who’s going to complain?
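To make the “too big for a planet, too small for a star” boundary concrete, here is a small illustrative sketch. It is not drawn from the study itself; it simply encodes the commonly cited thresholds of roughly 13 Jupiter masses (below which an object cannot even fuse deuterium and is considered a planet) and roughly 80 Jupiter masses, or about 0.08 solar masses (above which sustained hydrogen fusion begins and the object counts as a true star). The classify function is hypothetical, written only for this example.

```python
# Approximate, commonly cited mass cutoffs, in Jupiter masses.
DEUTERIUM_BURNING_LIMIT = 13.0   # below this: planet (no deuterium fusion)
HYDROGEN_BURNING_LIMIT = 80.0    # above this (~0.08 solar masses): true star

def classify(mass_in_jupiter_masses: float) -> str:
    """Rough object class based only on mass (illustrative, not rigorous)."""
    if mass_in_jupiter_masses < DEUTERIUM_BURNING_LIMIT:
        return "planet"
    if mass_in_jupiter_masses < HYDROGEN_BURNING_LIMIT:
        return "brown dwarf"
    return "star"

for mass in (1.0, 30.0, 95.0):
    print(f"{mass:>5.1f} Jupiter masses -> {classify(mass)}")
```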
[Image description: A digital illustration of stars in a dark blue sky with a large, brown star in the center.] Credit & copyright: Author-created image. Public domain.
September 7, 2025
-
Style PP&T Curio (Free, 1 CQ)
If you think bras are uncomfortable, try wearing corsets in the middle of summer. Brassieres have been around in some form for millennia, but the modern bra has only been popular for around a century. Some call them indispensable, some call them oppressive. Whether you love them or hate them, one thing’s for sure: they’ve got quite the history.
Before the advent of underwires, nylon, and adjustable straps, women used a variety of undergarments to support and cover their breasts. As far back as the 4th and 5th centuries B.C.E., the ancient Greeks and Romans used long pieces of fabric to make bandeau-like garments. The Romans called theirs a strophium or mamillare, and these early predecessors to the bra lifted, separated, and supported the breasts. In the following eras, various other cultures developed breast-supporting garments of their own, though there isn’t much surviving documentation describing them. One medieval-era bra discovered in Austria resembles a modern bra or bikini top, consisting of two “cups” made of fabric meant to provide support and held up by straps made from the same material. During the 14th century, French surgeon Henri de Mondeville described a garment worn by many women of his time, which he likened to two bags of fabric fitted tightly on the breasts and fastened into place with a band. However, starting in the 16th century, corsets were introduced in Europe and became favored by middle- and upper-class women. Corsets were made to measure and wrapped around the wearer’s waist. Sewn into the layers of fabric was boning, often made of actual whalebone or, later, steel. Unlike earlier iterations of the bra, corsets provided support from below by lacing around the torso. They also provided a thinning effect on the waist. Contrary to popular belief, corsets weren’t necessarily uncomfortable, provided they were properly fitted. While tightlacing trends sometimes took the idea of thinning women’s waists to extremes, these were exceptions, not the rule.
Even if corsets were perfectly comfortable, their days were numbered. Their downfall in popularity ultimately came down to their need for boning. The first iteration of the modern bra was invented in 1893 by Marie Tucek, whose patent shows a bra with shoulder straps and a pair of connected cups supporting the breasts from underneath. Tucek’s bra didn’t catch on widely, but two decades later, in 1913, the modern bra as most people know it was invented by Mary Phelps Jacob. Jacob was a New York socialite who was sick of corset boning showing through the thinner fabrics of the fashionable evening dresses of the time. Her version was originally made of two silk handkerchiefs fastened together with ribbon, and she later created an improved version, which she patented in 1914. Jacob also gave the garment its name, with “brassiere” being based on the French word for “bodice” or “arm guard.” Jacob first gave out her bras to friends and family by request, and later sold them commercially under the name “Caresse Crosby.” She eventually sold her patent to the Warner Brothers Corset Company, and bras grew in popularity over the years.
The killing blow to the corset was actually World War I. During the war, steel rationing meant that American women were encouraged to give up corsets, which freed up enough steel in the U.S. to build two battleships. With a better alternative available, there was no reason not to switch to bras.
Over the last century, bras have undergone many changes, usually reflecting the fashions of the time. New materials like Lycra (better known as Spandex) and nylon allowed for more comfortable bras, while “bullet bras” or “cone bras,” with their exaggerated shapes, were designed purely for aesthetic reasons. Today, the bra has evolved from undergarment to essential sportswear for women with the advent of sports bras, which prioritize functionality and comfort, and are often worn without an additional layer of clothing over them. And not a sliver of whale bone in sight!
[Image description: A modern black bra against a white background.] Credit & copyright: FlorenceFlowerRaqs, Wikimedia Commons.
September 6, 2025
-
Sports Sporty Curio (Free, 1 CQ)
Even elbow grease has its limits. The San Francisco Giants have announced that their relief pitcher, Randy Rodriguez, will have to sit out the rest of the season due to an elbow surgery that’s getting more and more common among baseball players. Known as Tommy John surgery, the operation repairs a type of injury that could easily have ended an athlete’s career just decades ago. Named after the first baseball player to undergo the procedure, in 1974, it involves taking a tendon from a donor site (usually the wrist or forearm) and using it to reconstruct the ulnar collateral ligament (UCL). As of 2023, around 35 percent of pitchers in the MLB have received the surgery. More surprising, however, is the number of young baseball players who have needed it. In fact, over half of all people who have had their UCL repaired are athletes between the ages of 15 and 19, mostly baseball players. Increased specialization in the sport, combined with year-round training and increasing competition, has made injuries more common. Those who pitch more often and at faster speeds are more likely to injure their UCL, and young athletes aiming for scholarships are no exception. The trend is so pronounced among young athletes that they are the fastest-growing segment of the population to receive the surgery. They really do grow up so fast.
September 5, 2025
-
Mind + Body Daily Curio (Free, 1 CQ)
It’s small, but there’s no doubt that it’s mighty! Jamaican cuisine is known for its bold, spicy flavors, and nothing embodies that more than one of the island’s most common foods: Jamaican beef patties. These hand pies have been around since the 17th century, and though they’re one of Jamaica’s best loved dishes today, the roots of their flavor stretch to many different places.
Jamaican beef patties, also known simply as Jamaican patties depending on their filling, are a kind of hand pie or turnover with a thick crust on the outside and a spicy, meaty mixture on the inside. The sturdy-yet-flaky crust is usually made with flour, fat, salt, and baking powder, and gets its signature yellow color from either egg yolks or turmeric. The filling is traditionally made with ground beef, aromatic vegetables like onions, and spices like garlic, ginger, cayenne pepper powder, curry powder, thyme, and Scotch bonnet pepper powder. Some patties use pulled chicken in place of beef, and some are vegetarian, utilizing vegetables like carrots, peas, potatoes, and corn.
It’s no coincidence that Jamaican beef patties bear a resemblance to European meat pies. Similar foods first came to the island around 1509, when the Spanish began to colonize Jamaica, bringing turnovers with them. In 1655, the British took control of the island from Spain and brought along their own Cornish pasties. These meat pies, with their hard crust, were usually served with gravy. It didn’t take long, however, for Jamaicans and others living in the Caribbean to make the dish their own. Scotch bonnet peppers, commonly used in Jamaican cuisine, were added to the beef filling, while Indian traders and workers added curry powder, and enslaved Africans made their patties with cayenne pepper. The patties were made smaller and thinner than Cornish pasties and were served without gravy or sauce, making them easier to carry around and eat while working. Today, the patties are eaten throughout the Caribbean, and regional variations are common. From Europe to Asia to the Caribbean, these seemingly simple patties are actually a flavorful international affair!
September 4, 2025
-
Science Nerdy Curio (Free, 1 CQ)
It seems you can never breathe easy these days. According to a new study by the National Institute of Standards and Technology (NIST), published through ASTM International, air purifiers can sometimes release harmful byproducts into the air, and researchers have figured out a way to measure just how much. It seems paradoxical, but air purifiers aren’t always good for the air. They can use a variety of processes that create toxic byproducts. The most common of these is ozone produced by UV lights meant to kill pathogens. Normally harmless in low concentrations, ozone can sometimes accumulate enough to pose a threat to the user’s health. Other byproducts include formaldehyde and ultrafine particles, which are produced by unintended interactions between the components of the air purifier. To test which air purifier models harm more than they help, NIST developed a test to measure the pollutants. During the test, the air purifier is left running in a sealed room for four hours, and samples of the air are then taken. A UV light is then shone through the samples so that researchers can measure the amount of ozone and formaldehyde present, since both substances absorb UV radiation. To measure ultrafine particles, researchers use a method called scanning mobility particle sizing (SMPS), which passes the samples through an x-ray field. The x-rays impart an electric charge to any particles that might be present, making it easier to sort ultrafine particles from larger ones, since they hold different amounts of charge. Once the ultrafine particles are isolated, they’re placed in a cool steam bath, which makes them swell in size. Lasers can then be used to determine how many are present by measuring how much the particles scatter the light. It’s a lot of work to figure out how many impurities these purifiers are leaving behind, but there’s no doubt that it needs doing.
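As a rough illustration of the UV-absorption step, the underlying principle is the Beer-Lambert law, which ties the amount of light a sample absorbs to how much of an absorbing gas it contains. The sketch below uses hypothetical numbers rather than NIST’s actual instrument parameters, and the helper function is invented for this example, but it shows how an ozone concentration could be backed out of an absorbance reading.

```python
# Beer-Lambert law: absorbance A = epsilon * c * l, so c = A / (epsilon * l).
# All numbers below are placeholders for illustration, not NIST's setup.

def concentration_from_absorbance(absorbance, molar_absorptivity, path_length_cm):
    """Molar concentration (mol/L) implied by a measured absorbance."""
    return absorbance / (molar_absorptivity * path_length_cm)

OZONE_MOLAR_ABSORPTIVITY = 3000.0  # L/(mol*cm) near 254 nm, approximate literature value
ABSORBANCE_READING = 0.15          # hypothetical, dimensionless
CELL_LENGTH_CM = 10.0              # hypothetical measurement cell length

c = concentration_from_absorbance(ABSORBANCE_READING, OZONE_MOLAR_ABSORPTIVITY, CELL_LENGTH_CM)
print(f"Implied ozone concentration: {c:.1e} mol/L")
```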
-
US History Daily Curio #3146 (Free, 1 CQ)
Get ready to clutch your pearls—there are people shopping for clothes on Sundays. The city of Paramus, New Jersey, recently filed a lawsuit against a local mall for allowing shoppers to buy garments on Sundays, and while supporters of the suit say it’s to reduce noise and traffic, detractors say it’s an example of blue laws gone wrong.
Blue laws are laws that restrict secular activities on Sundays, though what they cover varies considerably. Some are better known, like restrictions on the sale of alcohol, while more extreme examples restrict entertainment, various types of commerce, sports, and work in general. In the recent controversy, the American Dream mall has been accused of allowing shoppers to purchase “nonessential” goods. These include not just clothes, but furniture and appliances, as opposed to essential goods like groceries or medicine. Like many blue laws in other jurisdictions, Paramus’s has been on the books since the colonial period and has remained largely unchanged. As for how such laws got on the books in the first place, they were developed in England, then promoted by Puritans early in American history.
Due to their religious roots, blue laws have often been challenged in court as a violation of the First Amendment, which forbids the government from favoring one particular religion. However, many blue laws remain in effect, partly from a lack of political will to change them and partly because of a Supreme Court ruling in their favor. In 1961, in McGowan v. Maryland, the court ruled that Maryland’s blue laws forbidding certain types of commerce weren’t in violation of the Establishment Clause of the First Amendment. The justification was that, even if the laws were originally created to encourage church attendance on Sundays, they also served a secular function by making Sunday a universal day of rest. Whether you’re the pious or partying type, it’s hard to argue with a day off the clock.
[Image description: A porcelain sculpture of a man and woman in historical clothing in a clothing shop with goods on the wall behind them.] Credit & copyright: "Venetian Fair" shop with two figures, Ludwigsburg Porcelain Manufactory (German, 1758–1824). The Metropolitan Museum of Art, Gift of R. Thornton Wilson, in memory of Florence Ellsworth Wilson, 1950. Public Domain.
September 3, 2025
-
Engineering Daily Curio #3145 (Free, 1 CQ)
Smell ya later—but not too much later! Millions of people suffer from loss of smell (anosmia) due to a variety of medical causes, but researchers at Hanyang University and Kwangwoon University in South Korea have now discovered a way to restore the lost sense using radio waves.
The sense of smell is more important to daily life than most people think. Just ask anyone who took the sense for granted before losing it due to a sinus infection, brain injury, or COVID-19. The recent pandemic brought the issue into the spotlight, since it caused so many people to either temporarily or permanently lose their sense of smell, along with their sense of taste. Without a sense of smell, it’s difficult to enjoy food at the very least, and at worst, its absence can be dangerous. Imagine, for instance, not being able to detect spoiled food with a sniff before any visual signs appear, or not being able to smell a gas leak. Currently, there is no surefire treatment for anosmia. If the cause is something like polyps or a deviated septum, surgery might help. In other cases, olfactory training can be used, which involves the use of strong, often unpleasant scents to “retrain” the patient’s nose.
Now, researchers claim they have come up with a completely noninvasive, chemical-free method to restore a sense of smell. The treatment makes use of a small radio antenna placed near the patient’s head that sends targeted radio waves at the nerves inside the brain responsible for smell. It sounds almost too good to be true, but the researchers claim that just a week of treatments produced significant improvements. If it really works as they say, then even those who aren’t suffering from anosmia could benefit, as the treatment could potentially sharpen an already normal sense of smell. It could be the mildest superpower ever!
[Image description: A black-and-white illustration of a person in historical clothing smelling a flower next to two flowering plants.] Credit & copyright: Smell, Abraham Bosse, c.1635–38. The Metropolitan Museum of Art, Harris Brisbane Dick Fund, 1930. Public Domain.