Curio Cabinet / Person, Place, or Thing
-
Humanities PP&T Curio
This writer-turned-activist didn’t let anything keep her down. A recent Google Doodle by artist Sienna Gonzales featured activist Barbara May Cameron holding a Progress Pride flag on what would have been her 69th birthday. The commemorative image was made with input from Cameron’s partner of 21 years, Linda Boyd-Durkee, as a tribute to her life and legacy as the first nationally known Native American activist who advocated for LGBTQIA+ rights, Native American rights, and women’s rights.
Born in 1954 in Fort Yates, North Dakota, Cameron was part of the Standing Rock Sioux Tribe. Her name in Lakota, Wia Washte Wi, meant “woman, good woman.” After attending the Institute of American Indian Arts, Cameron moved to San Francisco and became involved in the area’s thriving LGBTQIA+ community. In 1975, Cameron and Randy Burns co-founded Gay American Indians (GAI), the first organization of its kind dedicated specifically to queer Native Americans, and Cameron advocated for greater acceptance of LGBTQIA+ people within Native American communities. As a queer Indigenous writer, she shed light on the unique challenges faced by her people. Her essays and poems were published in several landmark anthologies, including A Gathering of Spirit: A Collection of Writing and Art by North American Indian Women in 1983 and Our Right to Love: A Lesbian Resource Book in 1996. Her works highlighted issues that were rarely touched upon even by other LGBTQIA+ writers of the time, like how Native Americans and other people of color were disproportionately affected by the AIDS crisis.
During the crisis, Cameron was active in the San Francisco AIDS Foundation and the American Indian AIDS Institute, providing help to those in need, especially those who couldn’t afford medical care. She also served as a consultant to the U.S. Department of Health and Human Services and the Centers for Disease Control, which sought to educate the public about the disease. At the time, AIDS was considered a taboo topic and many in the government were unwilling to speak about it. As part of her work as a consultant, Cameron also contributed to childhood immunization programs, working to get kids in rural communities vaccinated.
Between 1980 and 1985, Cameron did some of her best-remembered work, such as helping to organize the Lesbian Gay Freedom Day Parade and Celebration, now known as the San Francisco Lesbian, Gay, Bisexual, and Transgender Pride Celebration, the largest event of its kind in the U.S. A few years later, in 1988, she was appointed by the mayor to the Citizens Committee on Community Development and the San Francisco Human Rights Commission. Later, she was also appointed to the United Nations Commission on the Status of Women.
For her contributions to San Francisco, Cameron was honored with the Harvey Milk Award for Community Service in 1992 and the first Bay Area Career Women Community Service Award the following year. But she also took her activism on behalf of LGBTQIA+ people to a broader stage beyond San Francisco, when she successfully co-led a lawsuit against the Immigration & Naturalization Service. The lawsuit addressed the agency’s discriminatory policy of turning away gay immigrants, who were not yet considered a protected class under federal law. The suit went all the way to the U.S. Supreme Court, which ruled in the plaintiffs’ favor.
Today, Cameron is remembered as one of the most influential figures in the history of LGBTQIA+ activism. In San Francisco, where she was most active, she was one of the loudest voices speaking on behalf of communities who were often overlooked or discriminated against. Her work helped create modern San Francisco’s thriving, no-longer-underground LGBTQIA+ community. That’s certainly something to take pride in.
[Image description: A pride flag blowing in the wind.] Credit & copyright: Markus Spiske, Pexels
-
Political Science PP&T Curio
Americans are hearing a lot about our government’s debt ceiling lately, and a lot of us are worried about default. Exactly what is the debt ceiling, and why is it so important that it be raised? The debt ceiling is pretty much exactly what it sounds like: a limit to the amount of money the U.S. government is allowed to borrow. It was created during World War I by Congress through the Second Liberty Bond Act of 1917 in order to allow the Treasury Department greater discretion in borrowing money for the war effort.
The U.S. has been in debt for nearly its entire existence because it runs a deficit almost every year, meaning that the government spends more than it earns through taxes, customs duties, and other sources of revenue. The most recent exception was in 2001, but even so, the debt itself has never been an issue because the U.S. has never gone into default on it. While it may be something of a precarious legislative dance, raising the debt ceiling to prevent the U.S. government from defaulting has become a fairly routine matter. Since 1960, Congress has raised the debt ceiling 49 times under Republican presidents and 29 times under Democratic presidents. The last time it was raised was in 2021, to $31.4 trillion, a limit that was reached in January of this year. It now needs to be raised yet again, but the U.S.’s political system is doing what it does best: causing delays. The debt ceiling is often used as leverage for a party to make demands regarding other legislative issues, and it rarely gets raised without some sort of compromise. That’s where Congress is right now, with the two parties unable to agree on a budget and how it will be spent. It has left many wondering what would happen if an agreement can’t be reached. The answer is complicated, but it could eventually result in a cascade of catastrophes that would devastate the U.S. economy.
In 2011, the last time the U.S. took too long to raise the debt ceiling, the ratings agency Standard & Poor’s downgraded the country’s credit rating from AAA to AA+. That could happen again depending on how long it takes Congress this time, but even if lawmakers vote against raising the debt ceiling, there won’t be an immediate effect. Per the official projection released by the Congressional Budget Office, there may be enough funds left to last until September of this year, while U.S. Treasury Secretary Janet Yellen believes that the money could run out by June 5. Once funds run dry, the U.S. would default on its debts. Unable to pay all its bills, it would no longer be able to cover things like government salaries, Social Security, Medicare benefits, tax refunds, and other payments.
Defaulting would also affect everyone who bought securities sold by the U.S. government, like bonds. The U.S. sells bonds to cover its deficits, and bond buyers are paid back after a set period, with interest. If the U.S. proved unable to pay bondholders on time, it would have to offer higher interest rates on future bonds, meaning more money spent down the road. And, of course, the U.S. government’s credit rating would be downgraded, making it more difficult to borrow money in the future. Because U.S. debt is held by many parties and affects the value of the U.S. dollar, a default could have negative repercussions worldwide. Within the U.S., consumer loans would see a rise in interest rates, potentially leading to a recession. Since the U.S. dollar plays a pivotal role in global trade, a decrease in its value could devastate foreign economies.
Still, it’s more likely than not that Congress will reach an agreement and vote to raise the debt ceiling. After all, political posturing about the debt ceiling has happened before, while default never has. That doesn't mean that anything is guaranteed, however. So, here’s hoping that Congress gets a move on.
[Image description: A roll of five-dollar bills on an American flag.] Credit & copyright: Karolina Grabowska, Pexels
-
Music Appreciation PP&T Curio
“The beautiful thing about learning is nobody can take it away from you.” This nugget of wisdom from B.B. King, the undisputed King of the Blues, exemplified his approach to music and life in general. King passed away on this day in 2015, having spent decades reshaping America’s musical landscape. His music crossed cultural and political boundaries, and made the blues mainstream from sea to shining sea.
B.B. King was born September 16, 1925, on a cotton plantation near Itta Bena, Mississippi. He was raised around gospel music and Delta blues, and showed an interest in both at a young age. As a child, King started performing on the streets for money, traveling through as many as four towns a night to find an audience. But his first dive into the larger music industry came when he moved to Memphis, Tennessee, where he became part of a burgeoning blues movement centered on Beale Street. It was there he earned the nickname “Beale Street Blues Boy,” which he would later shorten to B.B.
In 1948, King began appearing on radio blues segments. Eventually, he landed a job at Memphis station WDIA as a disc jockey and singer. Using the connections he made there, King assembled a band and landed his first recording contracts, first with Bullet Records and then with RPM Records, in 1949. By the early 1950s, King’s music career was in full swing. He released a number of hits over the following decade, like You Know I Love You, Woke Up This Morning, and Three O'Clock Blues, the last of which earned him his first number-one spot on the Billboard R&B chart. King was even busier outside the studio, performing an average of 300 shows a year.
Although King was an icon in the blues scene, blues music wasn’t yet mainstream. King changed that with the release of his 1965 album Live at the Regal. Considered one of the greatest blues albums of all time, it introduced King to a whole new audience. This trend continued with his 1969 cover of The Thrill is Gone, released as a single. His rendition earned King a Grammy and turned the song into a blues standard. Around the same time, he gained further mainstream acclaim as rock musicians began to commonly cite him as an influence. King was known for his distinct guitar style, which was itself influenced by blues greats like Blind Lemon Jefferson and T-Bone Walker. Soon, others began imitating King’s method of producing vocal-like sounds and heavy, soulful vibrato with his strings. Today, many of the standard sounds in rock, soul, and R&B can be traced directly back to him.
King always performed with his signature guitar, Lucille, actually a series of identical black, maple-bodied Gibsons like the one he began his career with. His connection to his instrument was so great that, early in his career, he once ran into a burning club to retrieve it. When he learned afterward that the fire had started as a result of a fight between two men over a woman named Lucille, he gave that name to his guitar. He once said of his instrument, “When I sing, I play in my mind; the minute I stop singing orally, I start to sing by playing Lucille.”
The bluesman continued to tour extensively both in the U.S. and abroad. At the height of the Cold War, he even became the first blues musician to perform in the Soviet Union. At home, his music helped bridge a cultural gap between Black and white Americans. In 1984, King was inducted into the Blues Foundation Hall of Fame, and a few years later, in 1987, into the Rock and Roll Hall of Fame for his contributions to both genres. In 1991, he opened B.B. King’s Blues Club on Beale Street in Memphis, where he got his first big break, and the establishment remains a landmark in the area to this day, with additional locations opening in other parts of the country. King’s last major performance took place at the House of Blues in Chicago, Illinois, in 2014, the year before he passed away. The blues has sounded just a little more blue since then, but the thrill definitely isn’t gone.
[Image description: A black-and-white photo of B.B. King playing guitar and singing.] Credit & copyright: Eugene F. Tourangeau, 1971. Wikimedia Commons. This work is in the public domain in the United States because it was published in the United States between 1928 and 1977, inclusive, without a copyright notice.
-
US History PP&T Curio
During Asian American and Pacific Islander (AAPI) Heritage Month, there are amazing activists, artists, innovators, and military heroes to recognize. Among the latter group, none stands out more than Daniel Inouye, the late U.S. Senator who represented the state of Hawaii for over 50 years. His accomplishments as a political leader and war hero came about during a dark chapter in American history that is often overlooked. His life, in many ways, reflects the changing state of race relations in the U.S. over the last century.
Inouye was born on September 7, 1924, in Honolulu, Hawaii, to Hyotaro Inouye and Kame Imanaga, when the islands were still a U.S. territory. At the time of his birth, the Hawaiian islands contained a large population of Japanese Americans like Inouye, although they were segregated from the white population. Growing up, Inouye dreamed of becoming a surgeon, but when he was just 17 years old, his life changed forever. The U.S. military bases at Pearl Harbor were attacked by the Imperial Japanese Navy, and despite his eagerness to help and his status as an American citizen, Inouye soon found himself under attack as well. After Pearl Harbor, people of Japanese descent on the U.S. mainland were forcibly rounded up and relocated to internment camps without trial, charges, or a chance to appeal.
Those in Hawaii were spared, however, because Japanese Americans made up such a large percentage of the population. It was determined that to imprison them would devastate the economy, but they were still barred from enlisting in the military. Unlike German and Italian Americans, Japanese Americans were considered enemy aliens. Nevertheless, Inouye successfully petitioned to be allowed to join the military, albeit in a segregated battalion. Inouye’s unit, the 442nd Regimental Combat Team, became one of the most decorated in U.S. history by the end of World War II, but at great cost. Japanese-American soldiers were sent to the European theater, where they suffered heavy losses. It was there that Inouye performed an act that forever cemented his reputation as a war hero. While fighting against enemy machine gun teams on a ridge in Italy, Inouye, who was already suffering blood loss from gunshot wounds, extended his arm to throw a grenade. In that moment, his right arm was shot almost completely off. Still, he used his good hand to grab the grenade from his other, mangled hand, and threw it at the enemy before falling unconscious from his injuries.
Inouye survived the encounter, but the fight cost him his right arm, ending his aspirations of becoming a surgeon. Instead of pursuing medicine as he originally wanted, he studied government and economics at the University of Hawaii, and later earned a law degree from George Washington University Law School. Although he hadn’t had political ambitions before the war, he became active in politics in Hawaii. After Hawaii was granted statehood in 1959, he became its first member of the U.S. House of Representatives. A few years later, in 1962, he was elected to the U.S. Senate, where he served for the next five decades, never losing an election. He was popular in his home state largely due to his famously sunny disposition, but his career as the first Japanese-American in Congress was not without difficulties. In one notable instance, after he was elected to serve on the committee that investigated President Nixon’s involvement in the Watergate scandal, one of Nixon’s attorneys openly used a racial slur to describe Inouye to the press.
Yet, Inouye also witnessed great social progress for Asian Americans during his life. Starting in the 1990s, the U.S. military even began reviewing candidates from WWII who had been denied the Medal of Honor due to racism. In 2000, Inouye received the long overdue award along with 19 other members of the 442nd Regiment. After Inouye passed away in 2012, he was also posthumously awarded the Presidential Medal of Freedom. Since then, Honolulu International Airport has been renamed Daniel K. Inouye International Airport in his honor, and his name welcomes visitors to a nation that once branded him an enemy alien.
[Image description: A portrait of Daniel Inouye in front of American and military flags.] Credit & copyright: United States Senate, Wikimedia Commons, Image created by an employee of the U.S. government as part of their official duties, Public Domain
-
Mind + Body PP&T Curio
Spring is in full swing, and for a lot of people that means just one thing: horrible allergies. Puffy eyes and runny noses abound, but there are thankfully plenty of remedies to go around, from antihistamines to allergy shots. While it’s easy to take these over-the-counter medications and outpatient treatments for granted, let’s not forget all the work that went into creating them and figuring out just what allergies were in the first place.
Allergies occur when the immune system overreacts to a foreign substance, or allergen, such as pollen. While the most common types of allergies (like seasonal allergies) are annoying, others can be downright dangerous. That’s especially true for certain food allergies, which can cause a strong, life-threatening allergic reaction called anaphylaxis. Yet, allergies were not well understood for most of human history. The discovery of allergens began with an English doctor named Charles Harrison Blackley, who suffered from seasonal allergies himself. At the time, the consensus in the medical community was that instances of “hay fever” or “summer colds” were caused by warming temperatures or ozone. In 1869, Blackley performed the first allergy skin test, where he placed pollen into an open wound. The body’s reaction to the pollen at the wound site told him what he had suspected: pollen was the culprit behind his allergies. Alas, even with this knowledge, treatments for allergy symptoms were decades away.
The first of those treatments was developed by Leonard Noon and John Freeman in 1911. Together, they created a desensitization treatment in which patients were deliberately exposed to controlled amounts of pollen. Starting with a dose too small to cause symptoms and gradually increasing it, they successfully induced immunity to pollen in patients who were willing to stick to the frequent injections the regimen required.
Around the same time, histamines, chemical messengers involved in immune responses and allergic reactions, were discovered by Sir Henry Dale, though they would not be isolated until 1927. In 1942, France released Antergan, the first antihistamine drug. While many other antihistamines have been developed since, they all work on the same principle. Histamines cause inflammation, increasing the heart rate and dilating blood vessels. In people who have allergies, the body reacts to allergens with a histamine response that is out of control. Antihistamines simply block the histamines from making contact with histamine receptors, preventing the symptoms they cause. Corticosteroids, which work by alleviating inflammation directly, are another type of allergy drug. The first nasal sprays that used them were developed in the 1950s.
Today, allergy tests and treatments are more or less the same as they were in the early days of their development. Allergy specialists diagnose patients with a skin test not unlike Blackley’s, in which patients are pricked with proteins from common allergens to see the reaction. There is also a blood test, the radioallergosorbent test, which measures the antibodies responsible for causing allergic reactions. Severe allergic reactions that cause anaphylaxis can be treated with emergency epinephrine, which reduces symptoms until the patient can receive other treatment. Finally, patients with severe, chronic allergies can opt for immunotherapy. Much like Noon and Freeman’s early desensitization treatments, immunotherapy exposes the patient to purified extracts of allergens in controlled amounts through injections or sublingual tablets until they develop a tolerance. For thousands of years, people simply lived with or died from their allergies. Yet in less than two centuries, allergies went from debilitating to treatable. Sure, many of us might still have to keep tissues nearby when the pollen count is high or keep an epi-pen handy, but there’s no doubt it could be worse!
[Image description: A woman in a red jacket holds a tissue over her face while standing in front of yellow flowers.] Credit & copyright: cenczi, Pixabay
-
Music Appreciation PP&T Curio
They were stars of music, movies, and mayhem. Famed American punk band the Ramones quickly rose to fame after the release of their debut album, Ramones, on this day in 1976. While the band’s original lineup only lasted a few years, they recorded and toured for decades after setting the standard for early punk rock’s sound. In fact, the Ramones are considered by many to be the first true punk band.
After teaming up to form a band in Queens’ Forest Hills neighborhood, the Ramones came screaming onto the New York City music scene in 1974, wearing leather jackets and tattered jeans meant to evoke the greaser look of the 1950s. While their appearance was a throwback to days gone by, their sound was completely new—a mashup of raw, unfiltered garage rock and boppy, mainstream rock performed by giants like The Who. The band’s four original members took on monikers with the last name “Ramone,” a reference to Paul McCartney’s habit of checking into hotels under the pseudonym “Paul Ramon.” Jeffrey Hyman became Joey Ramone, Douglas Colvin became Dee Dee Ramone, Erdelyi Tamas became Tommy Ramone, and John Cummings became Johnny Ramone. At first, Joey played the drums while Tommy acted as the manager, but Joey eventually switched to vocals and Tommy took over on drums.
From the beginning, their songs were short (around two minutes per track), loud, and infectious, with their early sets lasting just 20 minutes. In those brief performances, the Ramones began defining what punk rock was supposed to sound like. Their signature three-chord compositions ensured that their music was accessible. It was also full of bombastic energy. While many saw their songs as too fun to be taken seriously, they were also too loud to ignore. The Ramones’ sound was a rebellion against the clean-cut, heavily-produced popular music of the era, and that rebellion became the core of the entire punk genre they helped spawn. Modern listeners might find the Ramones’ style more akin to pop than punk, but there’s no doubt that the band laid the foundation for today’s edgier, grittier, more political punk music.
The opening track of their first album just so happened to be Blitzkrieg Bop, which today is considered one of the most iconic songs of the 1970s. Modern fans might be amazed to discover, though, that both the song and the album failed to chart and were commercially unsuccessful. Yet, they were well-received in punk circles, both in the U.S. and the U.K. This new, underground musical movement was large enough that the Ramones were booked solid in both countries, and were able to follow up with their sophomore album, Leave Home, by January of 1977. Shortly after the release of that album, the band endured the first of its many lineup changes: Tommy Ramone (aka Erdelyi Tamas) was replaced by drummer Marc Bell of Richard Hell and the Voidoids. Bell took on the stage name Marky Ramone, and Tommy continued to be involved with the band as a manager and producer. Amid touring and recording, the band also starred in the 1979 Roger Corman cult classic Rock 'n' Roll High School, in which they play fictionalized versions of themselves.
In all, the Ramones released fourteen studio albums, with the last fittingly titled ¡Adios Amigos! in 1996. The band stopped touring in the ’90s, but were recognized for their contributions to music when they were inducted into the Rock and Roll Hall of Fame in 2002 and given a Grammy Award for lifetime achievement in 2011. Not too shabby for a bunch of punks!
[Image description: A stage featuring three guitars and other music equipment under a red decorative panel.] Credit & copyright: wayoutradio, Pixabay
-
Literature PP&T Curio
Of all the world’s literary characters, Odysseus probably needed directions the most. The Odyssey, by the ancient Greek poet Homer, tells of the final six weeks of Odysseus’ ten-year journey home, to the Island of Ithaca. Although the epic poem is filled with myth and legend, some elements of the tale may reflect reality. For example, an ancient city unearthed in modern-day Turkey may have been the city of Troy from the Odyssey. Further research has also led some historians to believe that the date Odysseus returned home, in the Odyssey, may have been this very day in 1178 B.C.E., which coincided with a real-life solar eclipse.
Considered one of the most important pieces of literature in history, the Odyssey spans 24 books. Epic poems are more narrative than traditional modern poems, meaning that they tell a single story from beginning to end. However, they have some characteristics still found in modern poetry, such as meter and alliteration, which add interest. These features also made epic poems easier to memorize, which was important since most of them were composed during a time when few people could write. Bards, or professional storytellers, would memorize and recite epic poems, and those who could write would sometimes transcribe them, as was the case with the Odyssey.
The poem’s story opens on Odysseus, king of the Island of Ithaca, having a rough time. He’s attempting to get back home following the bloody Trojan War, but is being prevented by Poseidon, god of the sea. Odysseus angered Poseidon by blinding Polyphemus, a one-eyed giant who happened to be Poseidon’s son. Meanwhile, Odysseus’ wife is having troubles of her own, back on the Island of Ithaca. With her husband gone, even her son, the 20-something-year-old Telemachus, can’t prevent the island’s rowdy suitors from clamoring for her hand in marriage. In fact, there are 108 suitors in all, and they’ve barged their way into Odysseus’ royal home, eating his food and selling his things as they harass his family. Luckily, Telemachus has a friend in the goddess Athena, who comes to his home in disguise to help him. She finds him a ship and crew so that he may sail to find allies, the first of whom is the great warrior Nestor. The second, Menelaus of Sparta, tells Telemachus that his father is being held captive by a nymph called Calypso.
Odysseus has indeed been Calypso's prisoner for seven years, and the Odyssey details how he has repeatedly refused to marry her, even in exchange for immortality. When she is finally ordered by the gods to release him, Odysseus builds a raft to escape, but is once again thwarted by Poseidon, who causes the raft to crash near the island of Scheria, where he manages to swim ashore. There, he is helped by the royal family, who agree to help him off the island, though they don’t know who he is. His identity is only revealed when he gets emotional during a retelling of the story of the Trojan Horse, since he was involved in those events during the Trojan War. Odysseus then recounts everything he has been through following the war, including the story of Polyphemus’ blinding, after which the cyclops demanded that his father, Poseidon, force Odysseus to wander for ten years. Among other stories, he tells of his troubles with the witch-goddess Circe, who turned some of his crew into pigs, and of his encounter with sirens and the six-headed monster Scylla. After hearing his tale, the king and queen of Scheria finally help Odysseus return to Ithaca, along with a healthy heaping of treasure. Aided by Athena, Odysseus and Telemachus kill the suitors, and the king is reunited with his wife, Penelope.
Today, the Odyssey remains a revered story of perseverance in the face of seemingly insurmountable odds. Its exploration of themes like free will and justice ensure that it will remain a hot topic among those who love literature. If only Homer could have known just how long his work would endure.
[Image description: A white, marble bust of Odysseus with wavy hair and a beard. The statue has a chipped nose.] Credit & copyright: Jastrow, Head of Odysseus, Wikimedia Commons, Public Domain Dedication
-
FREEWorld History PP&T CurioFree1 CQ
Bilbies, not bunnies! That’s the slogan of those in Australia who support the Easter Bilby, an Aussie alternative to the traditional Easter Bunny. Bilbies are endangered Australian marsupials with some rabbit-like features, such as long ears and strong back legs that make them prolific jumpers. This time of year, Australian shops sell chocolate bilbies and picture books featuring the Easter-themed marsupial. But the Easter Bilby isn’t just a way to Aussie-fy Easter. It helps bring awareness to two related environmental problems down under.
Bilbies are unique creatures, and some of the world’s oldest living mammals. They thrive in arid environments where many other animals have trouble surviving. Unlike rabbits, bilbies are omnivores who survive by eating a combination of plants, seeds, fungi, and insects. It’s no wonder that Australians are proud enough of this native animal to use it as a holiday mascot. As is fitting of such a whimsical character, the Easter Bilby was invented by a child. In 1968, 9-year-old Australian Rose-Marie Dusting wrote a short story called Billy The Aussie Easter Bilby, which she later published as a book. The book was popular enough to raise the general public’s interest in bilbies, and the Easter Bilby began appearing on Easter cards and decorations. The Easter Bilby really took off, though, when chocolate companies got on board and began selling chocolate bilbies right alongside the usual Easter Bunnies. Seeing that the Easter Bilby was quite popular, Australian environmentalists seized the opportunity to educate Australians about the bilby’s endangered status and the environmental problems posed by the nation's feral rabbits.
Bilbies were once found across 70 percent of Australia, but today their range has shrunk to just 20 percent of the country. Besides simple habitat encroachment, humans harmed bilbies in another big way: by introducing non-native species. Europeans introduced both foxes and domesticated cats to Australia in the 19th century. Today, foxes kill around 300 million native Australian animals every year, while cats kill a whopping 2 billion annually. While it’s obvious how predators like foxes and cats can hunt and kill bilbies, cute, fluffy bunnies pose just as much of a threat. On Christmas Day in 1859, European settler Thomas Austin released 24 rabbits into the Australian wilderness, believing that hunting them would provide good sport for his fellow colonists. He couldn’t have foreseen the devastating consequences of his decision. From his original 24 rabbits, an entire population of non-native, feral rabbits was born, and they’ve been decimating native Australian wildlife ever since. These rabbits gobble up millions of native plants. This not only kills species that directly depend on the plants for food, it also causes soil erosion, since the plants’ roots normally help keep soil compacted. Erosion can change entire landscapes, making them uninhabitable to native species. Unfortunately, rabbits helped drive one of Australia’s two bilby species, the Lesser Bilby, to extinction in the 1950s. Now, fewer than 10,000 Greater Bilbies remain in the wild.
When the conservation group Foundation for Rabbit-Free Australia caught wind of the Easter Bilby, they took the opportunity to promote it as an environmentally friendly alternative to the bunny-centric holiday. Their efforts led to more chocolate companies producing chocolate bilbies. Some even began donating their proceeds to help save real bilbies. Companies like Pink Lady and Haigh’s Chocolates have donated tens of thousands of dollars to Australia’s Save the Bilby Fund. Other Easter Bilby products include mugs, keychains, and stuffed toys. Some Australian artists create work featuring the Easter Bilby. Just like the Easter Bunny, the Easter Bilby is usually pictured bringing colorful eggs to children and frolicking in springtime flowers. If he’s anything like his real-life counterparts, he’d sooner eat troublesome termites than cause any environmental damage. Win-win!
[Image description: A vintage drawing of a bilby with its long ears laid back.] Credit & copyright:
John Gould, Mammals of Australia Vol. I Plate 7, Wikimedia Commons, Public Domain -
FREEMind + Body PP&T CurioFree1 CQ
If you used Google earlier this week, odds are that you saw a Google Doodle featuring Justine Siegemund, a woman who changed the medical world forever. Believe it or not, women were largely excluded from obstetrics (the branch of medicine concerned with childbirth and maternal health) for centuries throughout much of Europe, until Siegemund changed things. In 1690, she became the first person to publish a medical text about obstetrics from a woman’s perspective, and the first woman to ever publish a German medical text.
Not much is known about Siegemund’s early life, but she was still a young woman in 17th-century Germany when she began suffering from a prolapsed uterus. A uterus becomes prolapsed when the ligaments and muscles that hold the organ in place are weakened, causing tissue to bulge out of the vagina. At the time, the condition wasn’t well understood, so Siegemund had difficulty finding adequate help from doctors and midwives. Since women could be midwives but not doctors, Siegemund was forced to seek help mostly from men, who not only didn’t understand her condition but often misdiagnosed her as being pregnant. Frustrated by the poor quality of healthcare available to her, Siegemund decided to educate herself on obstetrics.
She quickly encountered two new obstacles: doctors weren’t eager to help a woman learn about the medical sciences, and midwives were secretive about their trade. Moreover, nearly everything to do with childbirth was spread by word of mouth, meaning that there was no formal or academic resource that Siegemund could consult. Nevertheless, she started practicing as a midwife with what she could learn. She began by delivering children among the poor, eventually earning a reputation as a reliable midwife. As her reputation grew, nobles and even royalty became her patrons, and she eventually became the official court midwife in Berlin.
Unlike most of her peers, who relied on drugs and surgical instruments to assist in childbirth, Siegemund found innovative ways to safely deliver babies, even in cases where her peers might have given up. For example, when a baby began to emerge shoulders-first, she developed a way to safely rotate it in the birth canal before pulling it out. Before then, babies born in such a position often died. She also found a way to more safely perform breech deliveries, which were particularly dangerous for birthing mothers.
But her greatest contribution to obstetrics and midwifery was that she was willing to openly share what she had learned through study and experience. Oddly enough, her innovations earned her many critics among traditional practitioners who believed her methods were dangerous and ineffective. This didn’t deter Siegemund, and she became so renowned in her field that she was encouraged by Queen Mary II of England to write down all she knew. This led her to publish The Court Midwife, which included not only text descriptions but also detailed illustrations. In fact, the Google Doodle that paid homage to her referenced some of these illustrations by drawing the first “O” in the form of a pair of hands reaching toward a baby in utero. March 28, when the Doodle was shown, was the anniversary of the day that her book was certified by the European University Viadrina Frankfurt as a medical textbook. In its time, the book was an instant success. Though it was originally published in German, it was soon translated into other languages, becoming the authoritative text on matters of midwifery throughout Europe. Quite a few family trees today likely owe many of their branches to her pioneering endeavors. Due to her medical issues, Siegemund was never able to have children of her own. Yet, through her tenacity, brilliance, and generosity, she personally delivered around 6,200 babies, and her written guidance led to the safe delivery of many more. No wonder she’s remembered as a mother of modern obstetrics.
[Image description: A black-and-white copper engraving of Justine Siegemund wearing a head covering, her body surrounded by an ovular frame.] Credit & copyright: Rodak, 1690, Wikimedia Commons, Public Domain, Image cropped for size. -
FREEActing PP&T CurioFree1 CQ
This action star’s year is off to a great start. Michelle Yeoh recently became the first Asian woman to win Best Actress at the 2023 Academy Awards. She received the Oscar for her role in the unconventional sci-fi film Everything Everywhere All at Once, in which she shows off her martial arts skills while playing an everyday woman on a multiverse-trekking adventure. Yeoh’s acting career had unconventional beginnings. In fact, the famously tough actress, known for performing her own fighting stunts, had a passion for ballet long before she developed a taste for theater or martial arts.
Born Yeoh Choo Kheng in 1962 in Ipoh, Malaysia, Yeoh showed an early passion for dance and studied ballet throughout her childhood, beginning at age four. After her family moved to the U.K. during her teenage years, she continued studying ballet in boarding school and then at the Royal Academy of Dance. Unfortunately, during an otherwise normal practice session, Yeoh injured her back, suffering a herniated disk. Doctors told Yeoh that, while she would recover from her injury, she would no longer be able to practice ballet rigorously every day, thus ending her dream of becoming a prima ballerina.
But Yeoh was still drawn to the stage. She went on to study acting in college and received a BA in Creative Arts with a minor in Drama. Her break into the acting world began when she co-starred in a television commercial for luxury watches alongside none other than martial arts film star Jackie Chan…though Yeoh had no idea, at first, that Chan would be there. The job listed Chan’s Cantonese name, Sing Long, as her co-star, and the part was offered to her via a phone call in Cantonese, which Yeoh could partially understand but not speak. After that role, Yeoh made it a point to learn Cantonese, in addition to the Malay and English she already knew.
Her commercial caught the eye of D&B Films, a film production company in Hong Kong. The company suggested that Yeoh use the name Michelle Khan for her acting roles, saying that the name would be more marketable to a western audience. By 1985, Yeoh was starring in martial arts films for D&B under the name. In an interview with NPR, Yeoh recounted how she came to do her own action scenes under unconventional circumstances: “My first movie, I played a social worker. And we were bullied by, you know, the juvenile delinquents who took great pleasure in teasing us and giving us a hard time. And then the guys who were the martial arts experts were the ones who would rescue us constantly. So when I watched them, I went to my producers, and I say, you know what? I would love to be able to try to do martial arts. They looked at me and thought I was insane…But then they thought, well, what do we have to lose?”
Yeoh excelled at action scenes, which helped her make the leap to Hollywood in 1997 with her role as Wai Lin in the James Bond movie Tomorrow Never Dies. Pierce Brosnan, who played Bond at the time, described Yeoh as a “female James Bond” because she didn’t use a body double for her action scenes. Yeoh would go on to say that her tolerance for pain was something she learned from ballet. However, it was Ang Lee’s 2000 martial arts hit Crouching Tiger, Hidden Dragon that truly catapulted Yeoh to international fame and even earned her a BAFTA nomination. Though she didn’t yet speak Mandarin when she got the role (she had to learn her lines phonetically), she went on to learn the language after filming, bringing the number of languages she could speak to four.
Over the next decade, Yeoh starred in action movies (and racked up some painful injuries from doing her own stunts), founded her own production company, and became the face of several brands. In 2018, she starred in the widely acclaimed Crazy Rich Asians, and in 2021 she made her way into the Marvel Cinematic Universe with her role as Ying Nan in Shang-Chi and the Legend of the Ten Rings. Then, in 2022, Yeoh starred in Everything Everywhere All at Once. In her role as laundromat owner Evelyn Wang, she travels the multiverse in a surreal, comedic, sometimes heart-wrenching attempt to connect with different versions of her daughter. The role earned her a multitude of awards, including a Golden Globe, a Screen Actors Guild Award and, to top it all off, the Academy Award for Best Actress. At 60 years old, Yeoh has become one of the most iconic female action stars in Hollywood, and her career only seems to be ramping up. Who knows what movie set she’ll be smashing her way through next!
[Image description: Michelle Yeoh speaks at a conference in Burma.] Credit & copyright: Wikimedia Commons, William Ng, Public Domain per 17 U.S.C. section 101 and section 105. -
FREESTEM PP&T CurioFree1 CQ
Which came first, the computer or the programmer? While looms and steam engines are far from modern technology, these devices helped people conceive of some of the earliest forerunners of modern-day computers. But a computer is nothing without someone to program it. This Women’s History Month, we’re celebrating the life of Ada Lovelace, considered by many to be the world’s first computer programmer.
Augusta Ada Lovelace was born in 1815 in London, England. Her father was the popular but eccentric poet Lord Byron, and her mother was Anne Isabella Noel Byron, Baroness Wentworth. Lord Byron left Ada behind in Britain when she was one month old and never saw her again. Lady Byron, fearing that her daughter’s imagination would drive her to the same eccentric behavior that her father was notorious for, insisted on an education focused on mathematics and science. Since girls and women were not allowed to pursue higher learning at universities, she hired private tutors to advance her daughter’s education. Not only was she strict, placing great pressure on her daughter to excel in all her studies, she also actively discouraged any whimsical or fanciful behavior in young Ada. Even when Ada began studying birds as a child and taking notes on how she might fly herself, her efforts were rebuked by her mother.
At 19, Ada married William King, the Earl of Lovelace. By then, she had also become acquainted with Charles Babbage, the Lucasian Professor of Mathematics at the University of Cambridge. At the time, Babbage had been working on a prototype of his “difference engine,” a machine that could accurately and consistently calculate mathematical sums. Inspired by the complexity of the device and the potential it held, Ada visited factories with her mother to observe steam-powered machines and gain a better understanding of mechanical principles. During one of these visits, she saw a Jacquard loom in action: a machine that could weave complicated patterns into fabric based on a series of punch cards. Upon seeing the punch cards, Ada began to think of them as a form of language that could be used to communicate with, and not merely control, machines.
This experience paid dividends when Babbage designed his “analytical engine.” Instead of performing simple calculations, this new device could hypothetically solve any problem, given sufficient input. It also had the capacity to store data as memory and was capable of loops and conditional branching. Essentially, it was a general-purpose mechanical computer—or it would have been, had Babbage ever managed to construct a working model. Due to logistical and financial issues, no working model was ever made, but the design inspired an Italian engineer named Luigi Federico Menabrea to write Sketch of Charles Babbage’s Analytical Engine, a paper in French describing its functions. When Ada read the paper, she was inspired to translate it into English, and in the process she wrote additional notes that were three times longer than the original text.
She came up with her greatest contributions to computer science while working with Babbage on the translated version. While correcting an error Babbage had made regarding his machine’s use of Bernoulli numbers, she wrote what is widely considered the first computer program: an algorithm for the Analytical Engine to calculate Bernoulli numbers. She also described how to make the engine repeat a series of operations, an early form of the loop, and devised ways for the machine to represent the alphabet and musical notation as numerical data. In essence, Ada had learned how to communicate with the machine, laying the foundations of computer programming before a working computer even existed, using principles that are still in use today.
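To give a sense of what that first program was actually computing, here is a minimal modern sketch in Python. It is purely illustrative and is not Lovelace’s original program (her Note G laid out the calculation as a table of operations for the Analytical Engine); this version simply uses the standard recurrence for Bernoulli numbers, under which B_1 comes out to -1/2.

from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    # Returns the exact Bernoulli numbers B_0 through B_n, using the
    # classic recurrence: for m >= 1, the sum over j of C(m+1, j) * B_j is 0.
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        total = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-total / (m + 1))
    return B

for i, b in enumerate(bernoulli_numbers(8)):
    print(f"B_{i} = {b}")  # prints B_0 = 1, B_1 = -1/2, B_2 = 1/6, B_3 = 0, ...

Lovelace’s published table carried out essentially this kind of step-by-step, repeatable arithmetic on the engine’s variables, which is why it is so often cited as the first computer program.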
Unfortunately, her work, along with Babbage’s machine, was too far ahead of its time to be fully appreciated during her lifetime. However, the U.S. Department of Defense named a new computer language “Ada” in her honor in 1980. Today, the significance of her work is much better understood and appreciated than it ever has been. Better late than never!
[Image description: A watercolor portrait of Ada Lovelace sporting an elaborate hairdo and floral headband.] Credit & copyright:
Alfred Edward Chalon, Wikimedia Commons, Public Domain -
FREEWorld History PP&T CurioFree1 CQ
It's Flashback Friday, and the anniversary of the first successful telephone call back in 1876. Enjoy these curios all about phones and communication!
They were called "Hello Girls" in the retro language of the day, but the patronizing label belies the key role they played in World War I. The Hello Girls ran the vital communications network that linked military command with supply depots, soldiers on the frontlines, and war officials. But 100 years after the war, few Americans know about the Hello Girls, or their role in the Allied victory.The U.S. was the first modern country to enlist women in the armed forces shortly before World War I. Women initially served stateside, but in 1917 General John Pershing put out a call for women to operate switchboards overseas. Many were eager to join, and the War Department was immediately flooded with applications. 223 women were ultimately recruited and sent to Europe to operate switchboards for the Army Signal Corps.
The Hello Girls were vital to the war effort. At that time, the U.S. had the most advanced telephone technology in the world. Telephone communication was faster and more secure than telegraph or radio, and it gave the Allies a clear edge in the conflict. But the work wasn't easy or glamorous. The switchboards were complicated, and the pace was relentless; at the height of the war, operators were fielding as many as 150,000 calls a day. Many women were stationed near the front lines and came under sustained mortar fire.
Despite their valor, the Hello Girls weren't welcomed home with fanfare or ticker-tape parades. In fact, for years they weren't recognized as veterans, even though they'd endured the rigors of military training, wore uniforms and dog tags, and were subject to military regulations. The government denied the Hello Girls veteran status simply because they were women, which meant they didn't qualify for medical care or other benefits. The women fought to be recognized for 60 years, but weren't granted veteran status until 1977; by that time many of them had already passed away.
To this day, the Hello Girls haven't received the credit they deserve. Still, their work helped women finally win the right to vote in 1920, as historian Elizabeth Cobbs makes clear in her book, The Hello Girls: America's First Women Soldiers. Says Cobbs: "With the war, the women could say, 'How can you deny us the vote if we're willing to lay down our lives?'"
Image credit & copyright: Lt. Fox, United States Army Signal Corps, Wikimedia Commons, Public Domain
-
FREEPhysics PP&T CurioFree1 CQ
When it came to scientific firsts, Marie Salomea Skłodowska–Curie, best remembered as Marie Curie, was in a league of her own. Not only was she the first woman to ever win a Nobel Prize, she was also the first person to win two Nobel Prizes, and she remains the only person to win Nobel Prizes in two different scientific fields. Her discovery of radium alongside her husband, fellow physicist Pierre Curie, redefined the world’s understanding of radiation. All the while, her dedication to physics forever changed the way that women in scientific fields were perceived.
Marie Curie (then known as Maria Skłodowska) was born on November 7, 1867, in Warsaw, Poland. At the time, Poland was under the control of the Russian Empire, which caused Curie’s family great strife. Although her parents had both been well-respected teachers before her birth, they lost their jobs and most of their savings in failed Polish uprisings against Russia. When Russian authorities forbade her father from continuing to teach math and physics at the two Warsaw boys’ schools where he served as director, he brought his laboratory equipment home rather than have it destroyed. There, he taught Curie and her four elder siblings how to use it, igniting her lifelong passion for science. Still, Curie’s childhood remained hard. When she was just ten years old, her mother died from tuberculosis. Not long after, her oldest sister, Zofia, died from typhus. As Curie grew older, it became difficult for her to pursue a career in science, since women were barred from most universities and science-related jobs. Still, she helped her father tutor for several years before finding a place to study at the Flying University, an underground Polish school that admitted women in defiance of Russian law.
In 1890, Curie began working in a chemical laboratory at the Museum of Industry and Agriculture in Warsaw. Normally, women in Poland were barred from working in laboratories, but this one was run by Curie’s cousin, Józef Boguski, who was himself a talented chemist. While working and continuing her studies, Curie saved money for a move to Paris, where women had much broader access to higher education. Her father supported her decision and, in 1891, he helped fund her move. That same year, she began studying at the Sorbonne, also known as the University of Paris. Student life was far from easy. Her small apartment had no heat, and the wages she made tutoring fellow students didn’t always cover both rent and food. Still, she earned a degree in physics in 1893 and another in mathematics the following year.
While employed studying magnetism at the Society for the Encouragement of National Industry in Paris, Curie was introduced to Pierre Curie, a fellow physicist. Pierre was immediately impressed with her talent, and the two began working together in his laboratory. Soon, they were romantically involved, but Curie was determined to return to Poland and continue her scientific work closer to her family. This was easier said than done, though, since no Polish universities would allow women to teach math or science. Not long after returning to Poland, Curie was convinced, by a letter from Pierre, to go back to Paris. The two were married in 1895, becoming the closest of scientific collaborators.
Just three years after getting married, Curie noticed that pitchblende, an ore containing uranium, was more radioactive than its uranium content alone could explain. She soon discovered that this was because of a new, radioactive element hidden in the ore. She named this new element polonium, after her native Poland. Just a few months after this discovery, the Curies discovered radium in a similar fashion, although Curie wasn’t able to isolate pure radium metal until 1910. For their work on radioactivity alongside fellow physicist Henri Becquerel, the Curies won the Nobel Prize for Physics in 1903, becoming the first married couple to ever win the prize. Curie’s isolation of radium also won her the Nobel Prize for Chemistry in 1911.
However, Curie’s devotion to her research ultimately came at a great cost. In 1934, Curie passed away from aplastic anemia, a rare condition caused, in her case, by prolonged exposure to radiation from her research materials. Since the dangers of radioactivity weren’t fully understood in Curie’s lifetime, she didn’t take any precautions to limit her exposure. In fact, even throughout her illness, she still went to work in the lab. Who would have expected any less from one of the most dedicated scientists of all time?
[Image description: A black-and-white photograph of Marie Curie wearing a dress and long necklace.] Credit & copyright: Unknown author, circa 1898, Wikimedia Commons, Image cropped for size, Public domain -
FREELiterature PP&T CurioFree1 CQ
It's Flashback Friday! In honor of Women’s History Month, enjoy these curios all about women and their impact.
"The more you know of your history, the more liberated you are." Poet, memoirist, and civil rights activist Maya Angelou once spoke those hard-earned words; and what a personal history she had. Turning 90 this week, the late writer endured countless torments at the hands of society, men—and even her family.
Born Marguerite Annie Johnson in St. Louis, Missouri, three-year-old Angelou was sent to her grandma's in Arkansas after her parents' chaotic divorce. There, she experienced rampant racism from locals, but it was her own family that damaged her most. When Angelou's mother visited, she brought along her boyfriend, who would rape Angelou. After Angelou's uncle killed the boyfriend in retaliation, the child took a vow of silence for five years.
Angelou refused to speak until she befriended Mrs. Flowers, an educated black woman who taught her in school. With Flowers acting as a guiding figure, Angelou excelled in class. She graduated from high school at 17, but only after bearing a son. To provide for the child, Angelou "worked as a shake dancer in night clubs, fry cook in hamburger joints, dinner cook in a Creole restaurant and once had a job in a mechanic's shop, taking the paint off cars with my hands."
The young mother struggled financially before joining the Harlem Writers Guild in the late '50s. As a member, she befriended famous African-American writers and activists like James Baldwin. She also worked as a civil rights activist for Martin Luther King, Jr., and temporarily relocated with her son to Africa. Eventually, she returned to the U.S. in 1964 and published her first autobiographical work in 1969, the gut-wrenching I Know Why the Caged Bird Sings. The book's unflinching depiction of her childhood and adolescence floored a generation of Americans.
The memoirist released six more acclaimed autobiographies and a plethora of poetry. These works championed black beauty, the strength of women, and the human spirit—all of them made excruciatingly palpable by her struggles. Angelou died in 2014 and is remembered as one of the greatest nonfiction writers of the 20th century, one who could make every word of her bittersweet books sing. But unlike most prolific writers, Angelou readily admitted how hard she worked at her craft—and she took great pride in that hustle. In her own words: "Nothing will work unless you do."
Image credit & copyright: Wikimedia Commons, Courtesy, William J. Clinton Presidential Library, Public Domain
-
FREEUS History PP&T CurioFree1 CQ
It was a cultural symbol and a two-time target of violence. While the terrorist attack of September 11 is the best-remembered disaster to strike the World Trade Center in New York City, it wasn’t the only one. On this day in 1993, a bombing of one of the towers left six dead and many more injured. Disturbingly, the attack was meant to be just one part of a larger plot, and one of the bombers even had familial ties to a participant in the eventual September 11 attack.
In the weeks leading up to the bombing, seven conspirators were engaged in the final stages of their plan, which was meant to topple the World Trade Center and cripple the U.S. economy. Within a rented garage in New Jersey, the group assembled a 1,500-pound bomb, while one of them went to the World Trade Center to scout the area. One of the men, Mohammad Salameh, also rented a van, which he reported stolen the day before the bombing in order to throw future investigators off his trail. Then, on the morning of February 26, Ramzi Yousef, the mastermind of the plot, drove to the World Trade Center along with three of the other conspirators. They parked the van, full of explosives, in the underground parking garage.
At 12:17 PM, once they were clear of the blast area, they detonated the explosives inside the van. The explosion created a 200-foot crater and instantly killed six people in the immediate vicinity. Falling debris and smoke led to around a thousand injuries. As rescue efforts promptly began, the New York Police Department (NYPD) and the FBI’s Joint Terrorism Task Force quickly reasoned that the explosion was the result of a terrorist attack. By then, the task force had been tracking the perpetrators for months, but they’d been too late to stop the attack.
In the days after the bombing, investigators discovered the wreckage of a vehicle at the epicenter of the explosion. Little remained of it, but at least one part of its vehicle identification number (VIN) survived. They tracked the VIN to the rental company that Salameh had used and arrested him when he showed up at the company’s office to get his $400 deposit back. More arrests followed shortly after, leading to four convictions in 1994. Each of the men was given a 240-year sentence for his involvement.
However, three more conspirators in the attack remained at large, including the mastermind, Ramzi Yousef. The FBI offered a $2 million reward for information leading to his arrest, and it paid off. Following a tip from a former associate, Yousef was found and arrested in 1995 while staying at a hotel in Pakistan. Another conspirator, Eyad Ismoil, was found later that year. Both were also given 240-year sentences, but one conspirator, Abdul Yasin, remains at large to this day.
Despite the capture of six of the seven terrorists responsible for the attack, the threat of terrorism on U.S. soil was far from gone. In the course of pursuing them, the FBI uncovered another group planning a simultaneous attack on New York landmarks, as well as plans by the original group to use cyanide gas in a chemical attack. The bombing itself hadn’t even gone as planned; Yousef had intended to topple both towers entirely. As devastating as the bomb had been, it was a failed attempt, and authorities worried that other terrorists might look to finish what the first group started. Indeed, Yousef’s uncle, Khalid Sheikh Mohammed, went on to be directly involved in planning the September 11 attacks just over eight years later. Today, the 1993 bombing serves as a reminder that terrorist targets can be, and often are, attacked more than once.
[Image description: A photo of the New York City skyline when the World Trade Center twin towers still stood.] Credit & copyright: geralt, Pixabay
-
FREEHumanities PP&T CurioFree1 CQ
It's Flashback Friday, and Twin Peaks Day! In honor of the hit television series Twin Peaks, enjoy these curios all about TV. In 1959, the United States came face to face with its greatest existential threat: animal nudity. Leading the charge against the bare-bottomed dogs and cows of the world was the newly-formed Society for Indecency to Naked Animals, or S.I.N.A., fronted by a mysterious man known as "G. Clifford Prout." The organization's motto was "A nude horse is a rude horse." Prout made the rounds on radio and daytime television, railing against animals that were "destroying the moral integrity of our great nation."
Of course, the thing was all one big joke. Prout was actually an actor named Buck Henry—later famous for writing the script for The Graduate. The hoax was masterminded by Alan Abel, a young prankster with a bone to pick with "conservative moralists." Still, thousands of Americans bought it, joining the Society and even offering financial support for the cause.
Abel ended the hoax a few years (!) later, but he was far from finished. In 1964, along with his wife Joanne, he started promoting the presidential candidacy of "Yetta Bronstein," a 48-year-old Jewish housewife from the Bronx. Bronstein, a member of the "Best Party," made the radio circuits, describing an absurd platform of "fluoridation, national bingo, sex education, and stronger government."
Hidden deep within every Abel scheme, no matter how absurd, was serious commentary. Many of his pranks were meant to call out the ease with which misinformation spread through the media. He held special disdain for the "lack of substance" in popular daytime talk shows.
Abel's coup de grâce came on January 2, 1980, when The New York Times ran his obituary. The article read: "Alan Abel, a writer, musician and film producer who specialized in satire and lampoons, died of a heart attack yesterday at Sundance, a ski resort near Orem, Utah, while investigating a location for a new film. He was 50." Abel held a press conference the next day to show he was alive and well. It was proof that even the most well-respected media publications can be had, and that Abel was truly a master of his craft. The prank did have one unintended negative consequence. "Now when I really die," Abel lamented, "I'm afraid no one will believe it."
Image credit & copyright: pasja1000, Pixabay
-
FREEPP&T CurioFree1 CQ
Before there was Galileo, there was Copernicus. Born on this day in 1473, Nicolaus Copernicus was a Polish polymath credited with discovering that the earth revolved around the sun, rather than the other way around. It was a very original (and somewhat controversial) idea for the time. In fact, Copernicus’s writings caused an entire scientific revolution…though he didn’t live to see the most exciting parts of it.
Copernicus was born in Toruń, a city in Poland. His father, Nicolaus Copernicus Sr., was a merchant, but not of the everyday variety: he traded in copper, which was less common and more valuable than it is today. Copernicus’s mother, Barbara Watzenrode, was herself the daughter of a wealthy merchant, so Copernicus and his siblings grew up well-off and well-educated despite the Thirteen Years’ War, which was raging at the time. Unfortunately, when Copernicus was just ten years old, his father died. Copernicus’s maternal uncle, Lucas Watzenrode the Younger, took charge of his education and immediately noticed that Copernicus was a bright boy skilled in mathematics. Watzenrode was Prince Bishop of Warmia, a historical region in Prussia, in modern-day Poland. This meant that he served as the region's civil ruler, and thus had connections with other Prince Bishops and high-ranking intellectuals, which allowed him to ensure that his nephew had access to advanced tutors throughout his childhood. Eventually, Copernicus went on to study astronomy and astrology at the University of Kraków and the University of Bologna. Though astrology (making predictions based on the positions of stars) is considered far from scientific today, in Copernicus’s time it was just as respected as astronomy. Copernicus also studied medicine at the University of Padua and was tutored extensively in economics.
During his time at the University of Bologna, Copernicus lived in the home of and studied closely under the university’s head astronomer, Domenico Maria de Novara. Part of Novara’s job was reading the stars to foretell important events that might affect Bologna. Copernicus began helping with this work, gaining plenty of genuine scientific knowledge along the way. Yet Copernicus had doubts and questions. Like other astronomers of his time, Novara adhered to the Ptolemaic system of cosmology, which had been the dominant model for over a millennium. Before Ptolemy, ancient astronomers and philosophers believed that the sun and the planets moved in perfectly circular paths around the Earth. This, however, did not explain why planets sometimes appeared to be in retrograde, or moving backwards. Ptolemy resolved the issue by devising a model in which each planet moved around the Earth while also moving along a smaller circle, called an epicycle. This was enough to satisfy astronomers, including Novara, until Copernicus came along.
Copernicus’s own observations led him to the heliocentric theory, which put the sun at the center of the known universe instead of the Earth. He wrote about this in Commentariolus (meaning “Little Commentary”), a short treatise that was not published until after his death. He withheld it not because of potential controversy, but because he felt it was an incomplete explanation of the universe; he could not fully account for gravity or for the growing evidence that the planets’ orbits were not perfect circles. The same doubts led him to delay the publication of his most comprehensive work, De revolutionibus orbium coelestium libri VI, or “On the Revolutions of the Heavenly Spheres,” until the end of his life in 1543. When it was published, it was not well received by religious leaders and thinkers of the time, such as Martin Luther, who considered it heretical. Still, it didn’t become truly controversial until 1610, when Galileo used it as inspiration for his own theories, officially kicking off what is now known as the Copernican Revolution. In 1616, the Vatican banned the book, but Galileo and Johannes Kepler were among its strongest advocates, and were perhaps more ardent in its defense than Copernicus himself would have been. Maybe the idea just needed a little more time for people to rotate around to it.
[Image description: A painting of Nicolaus Copernicus, surrounded by astronomical instruments, looking up at the sky with a cathedral in the background.] Credit & copyright: Jan Matejko, Wikimedia Commons, Public Domain
-
FREEReligious Studies PP&T CurioFree1 CQ
It's Flashback Friday, and World Human Spirit Day! Enjoy these curios about religion around the world.
Come on… come on… give us gimel! During the eight-day celebration of Chanukah, games of dreidel are customary among family and friends. The game revolves around the toy dreidel, Yiddish for "spinning top." But the humble gizmo isn't just a way to share a good time and a few laughs: this little spinner acts as a conduit for Jewish history and culture.
The four letters on the faces of the dreidel form an acronym for the phrase Nes gadol hayah sham, which means "a great miracle happened there." The miracle happened during the rebellion of the Maccabees, Jewish resistance fighters who defeated their Greco-Syrian oppressors around 160 B.C.E. At the time, the Torah had been outlawed and its practice made a crime punishable by death.
When the Maccabees achieved victory, the Jews reclaimed their homeland and rededicated the Holy Temple in Jerusalem. There, they built a new menorah and placed it within the temple, but lacked the oil to keep it burning for more than a day. And yet, the menorah's light burned for eight days—enough time for more oil to reach the temple. This is the miracle commemorated on the dreidel, and the source of the name "Festival of Lights."
Despite the symbolism of the dreidel's engravings, it's still a lighthearted game. To play, friends and family gather around the top in a circle. Players take turns spinning the dreidel and reap its rewards or misfortunes depending on which face lands facing up: nun means nothing happens, gimel means the spinner takes the whole pot, hay means the spinner takes half the pot, and shin means the spinner must give a predetermined amount to the pot. Whenever the pot empties, each player must supply a set amount to a new pot. Any player out of "currency" must forfeit if they land on shin or if another player lands on gimel.
And as far as the "currency" for dreidel goes, just about anything within reason works—be it pennies, nuts, or the customary coin-shaped chocolate gelt. Or, for the uber-competitive, there's real money to be won at the annual Major League Dreidel Spin-Off—which, in the past, has also supplied winners with a year's supply of gelt. Chag Sameach!
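For anyone who'd like to see the rules laid out step by step, here's a minimal sketch of the game in Python. The player names, starting pennies, and one-penny ante are made up for illustration, and the forfeit rule is skipped for brevity; only the four outcomes described above come from the game itself.

import random

FACES = ["nun", "gimel", "hay", "shin"]

def play_round(players, pot, ante=1):
    """Give every player one spin and return whatever is left in the pot."""
    for name in players:
        if pot == 0:
            # Whenever the pot empties, each player supplies a set amount to a new pot.
            for p in players:
                players[p] -= ante
                pot += ante
        face = random.choice(FACES)
        if face == "gimel":        # spinner takes the whole pot
            players[name] += pot
            pot = 0
        elif face == "hay":        # spinner takes half the pot (rounded down here)
            half = pot // 2
            players[name] += half
            pot -= half
        elif face == "shin":       # spinner pays a set amount into the pot
            players[name] -= ante
            pot += ante
        # "nun": nothing happens
        print(f"{name} spun {face}; the pot now holds {pot}")
    return pot

# Three hypothetical players, each starting with ten pennies.
players = {"Avi": 10, "Miriam": 10, "Noa": 10}
pot = 0
for _ in range(3):
    pot = play_round(players, pot)
print(players)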
Image credit & copyright: Ri_Ya, Pixabay
-
FREESTEM PP&T CurioFree1 CQ
What if this curio was written by a robot? It wasn’t, but in the near future, a lot of what you read online might be. Artificial intelligence software has been around for a while now. In fact, the world’s first rudimentary chatbot was created way back in 1966, before the world wide web was even invented. Recently, though, AI technology has improved to the point that it can mimic human writing patterns. There’s no better example of this than ChatGPT, a chatbot developed by AI research and deployment company OpenAI. Trained to understand human speech patterns and even to inject humor into its responses, ChatGPT can take just about any written command (such as “write an essay about Shakespeare”) and generate a somewhat-human-like response of up to around 500 words. For some, this means that writing social media posts for their online businesses or creating marketing material for websites is now easier than ever. For others, especially those in academia and publishing, life just got a whole lot more complicated.
ChatGPT was first launched on November 30, 2022, building off of OpenAI’s existing GPT-3.5 AI model. Envisioned as a way to change how people interact with computers, ChatGPT can answer questions about subjects like art and history in a conversational way that may help some users better understand the subject matter. This is because ChatGPT was not only trained by “reading” vast amounts of online text, from essays and articles to social media posts, it was also overseen by human trainers. These human trainers provided feedback to ChatGPT about what kind of responses were appropriate in which circumstances, which sentences were phrased best by the AI, and more. It’s no wonder that the chatbot quickly went viral online for its uncanny ability to sound human. It hasn’t been long since ChatGPT launched, yet it’s already one of the fastest growing internet services ever. In January 2023, it served around 100 million users, and Microsoft is planning to integrate the technology into some of its Office software and its search engine, Bing.
Being popular has its downsides, though. ChatGPT is already facing intense scrutiny from those in the academic community who fear that the chatbot could lead to a wave of cheating. After all, what’s to stop students from letting ChatGPT write essays for them? In the past, AI-generated content, like plagiarized content, was easier to detect because it was often nonsensical or sounded “off.” Not so with ChatGPT. This has led some teachers to employ AI-detecting software, like GPTZero, developed by entrepreneur Edward Tian. Unfortunately, GPTZero has problems of its own. It can’t definitively tell a user whether something was generated using AI. Instead, it tells a user the perplexity, or measure of randomness, of inputted text. If the text looks unpredictable to the detector’s model, it is probably human-written; if the text looks predictable and familiar, it will often be flagged as AI-generated. The trouble is that not much everyday writing looks unpredictable, especially not common, short sentences. Students are already coming forward to claim that they’ve been falsely accused of using AI to write essays. Some professional writers have also sounded the alarm about bosses and clients falsely flagging their content as AI-generated simply because it contained too many short sentences or common phrases.
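To make the idea of perplexity a little more concrete, here is a minimal sketch in Python. It assumes the Hugging Face transformers library and the small, public GPT-2 model; GPTZero’s own code and models aren’t public, so this only illustrates the general approach of scoring how predictable a passage is to a language model, not the tool itself.

import math
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Score how 'surprising' the text is to GPT-2; lower means more predictable."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Asking the model to predict each token from the ones before it yields
        # an average cross-entropy loss; exponentiating that loss gives perplexity.
        out = model(**enc, labels=enc["input_ids"])
    return math.exp(out.loss.item())

# Short, common sentences tend to score low (predictable), which is exactly why
# detectors built on this kind of signal can misfire on ordinary human writing.
print(perplexity("The meeting has been moved to next Tuesday."))
print(perplexity("Moonlight pooled in the cracked teacup like spilled mercury."))

GPTZero also reports how much that score varies from sentence to sentence, a measure it calls burstiness, but the underlying signal is the same kind of predictability score.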
Cheating isn’t the only danger when it comes to AI like ChatGPT. After all, the internet that helped train it isn’t always an unbiased place. ChatGPT’s ability to generate text in the style of certain people or publications can lead to chilling misinformation. For example, when credibility-assessment organization NewsGuard asked ChatGPT to write text in the style of a Russian propaganda news site, the AI created a short article full of misinformation. Even when not being asked to create text in a certain style, ChatGPT isn’t perfect. Frequent users have noticed that it sometimes has difficulty getting timelines and titles correct, making it particularly bad at giving correct information about authors. Just one more reason to hold off on letting ChatGPT do too much writing for us.
[Image description: A pair of hands types on a laptop, which sits on a wooden table with a geometric pattern.] Credit & copyright: Pexels, Pixabay
-
FREEWorld History PP&T CurioFree1 CQ
G’day mate, fancy a stroll in the Outback? The Australian Outback made international headlines recently, when mining company Rio Tinto lost a tiny, radioactive capsule in the expansive region. Luckily, government officials using specialized equipment were able to locate the capsule, but the situation led to many people outside of Australia learning more about the Outback, and just how big it is. In fact, the Australian Outback isn’t just one region. The term refers to any remote, sparsely-populated inland area, though it’s most commonly used to refer to arid regions in eastern and northern Australia, and the center of the nation’s Western Plateau. Four deserts sit in areas that are considered to be Outback: the Great Sandy, the Gibson, the Great Victoria, and the Tanami. Altogether, the Outback spans millions of square miles, covering most of the Australian continent, and it includes some of the hottest, harshest environments on Earth. In fact, the highest temperature ever recorded in the Outback was a whopping 123.3 degrees Fahrenheit. Yet, amazingly, animals and some humans have managed to thrive there for millennia.
The first people to settle in the Australian Outback arrived there from Southeast Asia around 60,000 years ago. Today, these Indigenous people are collectively known as Aboriginal Australians, though they are made up of many distinct groups. Despite the Outback’s harsh conditions, Aboriginal Australians thrived there, as well as on several of Australia’s islands, like Tasmania and the Tiwi Islands. Those living in the Outback became adept at collecting and storing water. Many also practiced controlled burning of undergrowth, which fertilized the soil, encouraged plant growth, and attracted more animals to hunt. When the first Europeans arrived in Australia in 1788, around 250,000 Aboriginal people lived there, many of them in the Outback. Unfortunately, Europeans brought diseases that devastated the Aboriginal population. They also violently drove many Aboriginal people from their land on the premise of “terra nullius,” a concept that meant the land belonged to no one. Only in recent years has the Australian government committed to giving back swaths of land, including some of the Outback, to its ancestral owners. Today, only around 5 percent of Australians live in the Outback.
The Outback is still home to a surprisingly diverse array of wildlife. Animals that live in the most arid parts of the Outback have special adaptations to manage desert life. These include odd-looking marsupials like the bilby. Bilbies’ name comes from a word in the Yuwaalaraay Aboriginal language meaning “long-nosed rat.” Though they’re not rats, or even rodents, bilbies do have long snouts which allow them to root through underbrush, and large ears which help to dissipate the intense heat of their desert environment. Their bodies are so adapted to desert life that they never need to drink liquid water. Their omnivorous diet, which includes insects, small mammals, plants, and fruits, provides all the water they need. Another, more famous marsupial native to the Outback is the black-flanked rock-wallaby. These small, kangaroo-like creatures make their homes on rock formations in the Outback, using their powerful legs to climb and their strong tails to balance. Like many other Outback-dwellers, they are nocturnal, which allows them to take advantage of lower nighttime temperatures. The Outback is home to many reptiles, like great desert skinks, which build burrows to escape high aboveground temperatures. Even some colorful birds are native to the Outback, like the pink cockatoo. These parrots live in large groups and spend their days sheltering in shady trees, but gather in large flocks in the evenings, often by watering holes. Their powerful beaks are adept at breaking open seeds and nuts, and tearing through thick fruit skins. One of the Outback’s most famous inhabitants is the dingo, an animal which isn’t technically native to Australia. These canines were brought to the continent by humans from East Asia between 4,000 and 8,000 years ago. Nevertheless, they are considered native wildlife under Australia’s Nature Conservation Act of 1992, and are protected in the country’s national parks. As far as we’re concerned, any creature hardy enough to call the Outback home deserves respect.
[Image description: A photo of the Australian Outback featuring tufts of grass and a large rock formation with a dead tree in the foreground.] Credit & copyright: lum-box, Pixabay