
The Horrifying Legacy of the Victorian Tapeworm Diet


From horrifying foot-binding practices in Imperial China to life-threatening surgeries in modernity, humanity has been finding harmful ways to modify the body since the dawn of civilization. The Victorians were no exception to this.

The Victorian era, roughly the 1830s to 1900, is notorious for its bizarre notions of beauty, and its even more bizarre secrets to attaining it. The ideal of the time was modeled after those afflicted by consumption (tuberculosis): pale skin, dilated eyes, rosy cheeks, crimson lips, and a meagre, fragile figure. To achieve this particular look, women of the era employed several harmful practices. From swallowing arsenic—which they knew to be poisonous—to mixing powdered charcoal into water before ingesting it, to using figure-molding corsets in a never-ending quest for the “perfect” 16-inch waist, there was no limit to what fashionable Victorians would do.


Most of these practices have, thankfully, gone out of style. We no longer swallow ingredients present in rat poison, and corsets no longer disfigure women’s internal organs. There is one gruesome dietary idea, however, that has managed to survive—the tapeworm diet.

The idea is simple, and gross. You take a pill containing a tapeworm egg. Once hatched, the parasite grows inside of the host, ingesting part of whatever the host eats. In theory, this enables the dieter to simultaneously lose weight and eat without worrying about calorie intake.

Both ideas fit nicely into Victorian ideals, as illustrated by The Ugly-Girl Papers by S.D. Powers, one of the most popular beauty guides of the era. First and foremost, the guide states that “it is a woman’s business to be beautiful.” Beauty takes time and effort, and no plain girl could forgo the tedium of beauty regimens if she wanted to find a husband. One can therefore conclude that Victorians were very much willing to make sacrifices to attain ideal beauty.

But the guide also recommends that women find a “healthy” balance in the pursuit of beauty. When it comes to maintaining the figure or losing weight, the author claims:

“If stout, [a girl] should eat as little as will satisfy her appetite; never allowing herself, however, to rise from the table hungry.”

The tapeworm diet may thus have been the perfect solution. Allegedly, a woman would never rise hungry from the table, yet she would continue losing weight. All concerns for health and discomfort could be dismissed with the claim that beauty is pain, and sacrifices must be made.


And sacrifices were most certainly made once the desired weight was achieved. To get rid of the now-unnecessary parasite, dieters would employ the same methods as those unwillingly afflicted by the worms. In Victorian England, this included pills or special devices. One such invention, created by Dr. Meyers of Sheffield, attempted to lure the tapeworm by inserting a cylinder with food via the digestive tract. It comes as no surprise that many patients choked to death before the tapeworm was successfully removed. Other folk cures prescribed holding a glass of milk at the end of either orifice and waiting for the tapeworm to come out. Whether this actually holds any validity remains an issue of debate, as we have yet to prove that parasites have a preference for bovine lactose.

What’s scariest about this diet is not that it may have been used by the same people who willingly ingested charcoal, but that the idea is still around. Like air pollution and zombie films, it simply refuses to die. Its persistence is evidenced by the numerous online forums dedicated to the question of the diet’s efficacy, and by the (fairly dubious) reports of Mexican clinics that will administer the treatment for a couple thousand dollars.

Among both Victorian and modern dieters, the actual popularity of this radical diet is murky. Historians disagree on whether people actually ingested tapeworm pills, or whether the advertised products were simply placebos meant to dupe desperate people. Likewise, reports on Mexican tapeworm clinics are hard to believe, as are most of the testimonies of the diet’s advocates. Moreover, rumors of stars like Maria Callas losing weight with the diet have often turned out to be simple manipulations of fact. It seems, then, that at no point in its history has the tapeworm diet been an actual fad.


This doesn’t negate, however, that there are people willing to try it. If the “tapeworm pills” of the Victorian era were indeed a farce, it does not change the fact that people bought and swallowed them in the hopes that a gigantic worm would live in their digestive system. Likewise, a simple Google search on the diet will pull up dozens of diet blogs that cover the topic. The comments sections will take you on a sadly humorous trip full of obvious scams and willing participants asking for more information.

Even reality TV star Khloe Kardashian suggested on Keeping Up With the Kardashians that she wanted to get a tapeworm. The statement prompted an article from Vice on the legitimacy and dangers of the diet. Citing concern for public health, the FDA has officially banned tapeworm pills. Unrealistic expectations of female beauty, it seems, retain their parasitic grip on pop culture.


What’s A Woggin? A Bird, a Word, and a Linguistic Mystery


On December 20, 1792, the whaling ship Asia was making its way through the Desolation Islands, in the Indian Ocean, when the crew decided to stop for lunch. According to the ship's log keeper, the meal was a great success: "At 1 PM Sent our Boat on Shore After Some refreshments," he wrote. "She returned with A Plenty of Woggins we Cooked Some for Supper."

Right about now, you may be feeling peckish. But you may also be wondering: What in the world is a woggin?

New species are discovered all the time. Unknown old species—extinct ones, found as fossils and then plugged into our historical understanding of the world—turn up a lot, too. But every once in a while, all we have to go on is a word. New or old, known or unknown, no one knew what a woggin was until Judith Lund, whaling historian, decided to find out.


Like all professionals, 18th-century whalers had their share of strange jargon. A "blanket" was a massive sheet of blubber. "Gurry" was the sludge of oil and guts that covered the deck after a kill, and a "gooney" was an albatross. Modern-day whaling historians depend on their knowledge of these terms to decode ship's logs—vital for understanding the sailors' day-to-day experiences, as well as gleaning overall trends. Being elbow-deep in whaleman slang is just part of the job.

So when Lund ran into a word she didn't know, it caught her eye. Lund was at the New Bedford Whaling Museum, trying to dig up some data on oil harvest rates. "I was reading a logbook and charging along beautifully," she says, "when I came across the fact that whalemen on that voyage were eating woggins and swile."

Lund had heard of swile—it's whaler slang for "seals"—but woggins were new. She asked the museum librarian, Michael Dyer, who didn't know either. "The woggin was a mystery to both of us," she says. So Lund did what any curious person would do: she started emailing everyone she could think of, asking if they had ever heard of it.


One of these people was Paul O'Pecko, the Vice President of Collections and Research at Mystic Seaport. "You know how once somebody mentions something to you, that the piece of information seems to jump off the page when you are not even looking?" he asks. This quickly happened with woggins. As soon as Lund's network was alerted, more mentions from ship's logs began flooding in. A Sag Harbor vessel sailing in 1806 "kild one woglin at 10 am." New Bedford sailors from 1838 describe "wogings in vast numbers & noisy with their shril sharp shreaking or howling in the dead hours of the night." In a 1798 diary entry, Christopher Almy of New Bedford writes of "one sort the whalemen call woggins," which have stubby wings. When they move over the rocks, he says, they "look like small boys a walking."

When Lund's inquiry hit O'Pecko's desk, something splashed out of his memory, too. Years before, in his own research, he had come across the story of Jack Woggin, a beloved ship's pet. He dug up the account from an 1832 whaling magazine, and sent it along. "A person looking overboard saw a Penguin (Genus aptenodytes), commonly called by the sailors a 'woggin,'" writes the author, explaining how Jack got on board. Here, finally, was the smoking gun: a woggin is a penguin. 


But the mystery was only half solved. Penguins, as we understand them, live in the Southern Hemisphere. And yet sailors in the north were also getting in on the action, reporting that they had "caught 10 wogens" or "saw wargins." "Whalemen were noticing them before they went far enough south to see true penguins," says Lund.

At this point, it was Storrs Olson's turn to snap into focus. Olson, an ornithologist with the Smithsonian Institution, had been added to the email chain early on, but the mystery hadn't gripped him. "It was something sailors ate, and they would eat almost anything," he says. "I did not pay a lot of attention at first."

But when it became clear that woggins were in the north, too, an intriguing suspect loomed: what if they were great auks? Also flightless, with large, hooked beaks and white eyespots, great auks went extinct in 1852, hunted to death for their fluffy down. (Arctic sailors also burned them for warmth, as there was often no wood where they were exploring.) As such, we know very little about them, and they have achieved near-mythical status among ornithologists, who grasp at every scrap of evidence about how they lived. 


Dyer—the librarian from New Bedford—had found another major clue: the notebook of a schoolboy named Abraham Russell, decorated with a careful sketch of a "Sea Waggin found on the banks of Newfound Land." The drawing looked as though it had been traced from a particular illustration of a great auk found in a popular navigational guide. Further finds reinforced this theory, and finally, the group of detectives nailed it down: A southern woggin is a penguin. A northern woggin was a great auk.

Lund and Olson released their first woggin exposé in 2007, in Archives of Natural History. A follow-up was published this month. (The new paper is a true rollercoaster—early on in the list of woggin cameos, an explorer from 1860 reports that the birds "excited my wonder and attention." Mere lines later, sealers from 1869 are showing off "a bag full of woggins' hearts, which we can roast on sticks, and who doubts that we shall make a heart-y supper?")


"Our paper was received with considerable interest by the editors of the Oxford English Dictionary and the Dictionary of American Regional English," it points out, suggesting that woggins may soon officially march into the historical lexicon.

Until then, Olson is using the woggins to learn more about great auks—he has already expanded their probable springtime range down to the coast of North Carolina, based on a sighting from 1762. And Lund keeps the word in her back pocket, a new species of diverting vocabulary. "I run across it occasionally, and it's amusing and interesting," she says. "The woggins live again."

Naturecultures is a weekly column that explores the changing relationships between humanity and wilder things. Have something you want covered (or uncovered)? Send tips to cara@atlasobscura.com.

Watch a Serial Killer Play the Dating Game


It's not just the stilted dialogue and cringe-worthy innuendoes that make this Dating Game clip from 1978 seem off somehow. Bachelor number one is Rodney Alcala, who would be sentenced to death just two years after this airing. He is believed to have murdered at least 50 people, and his true victim count could be as high as 130. When he appeared as a contestant on the show, he was a convicted rapist in the middle of his killing spree.

The bachelorette, Cheryl, doesn't pick up on any of this. In fact, she ends up choosing Alcala over the other two contestants despite his lackluster answers like, "Nighttime is the best time," and "I'm a banana... peel me." However, she cut things off immediately after their one date, claiming that she found him "creepy." How right she was.

Every day we track down a Video Wonder: an audiovisual offering that delights, inspires, and entertains. Have you encountered a video we should feature? Email ella@atlasobscura.com.

The Strange Story of Why Belize is Full of Chicago Cubs Fans


Less than four years after gaining independence from the United Kingdom, the Central American nation of Belize notched a smaller, yet somehow lasting, triumph in 1985. That winter, the Chicago Cubs sent their star outfielder Gary Matthews, Sr., to visit the country, which is often claimed to be the most Cubs-friendly land outside the Windy City. Matthews's visit was the culmination of a love affair that had begun in 1981, the year when Cubs games began being broadcast in the country.

The relationship has continued to this day. Right now, many Belizeans at home and in the United States are cheering on the Cubs, who have returned to the World Series for the first time since 1945 and are attempting to break a championship drought that stretches all the way back to the first Roosevelt administration, in 1908. The team finished the 2016 season with the best record in baseball, racking up 103 wins (and 58 losses), an achievement that put them a whopping 17.5 games ahead of their closest division rival, the St. Louis Cardinals.

Melanie Walker, who was born in Belize and lives in the Los Angeles area, is watching the Cubs' championship run with nostalgia. "Of course I'm a Cubs fan," she says. The 42-year-old remembers watching the team play during her childhood. Her husband has been taping the Cubs' playoff games while she works at the Little Belize restaurant in Inglewood, California. He's a Dodgers fan, but she's confident the Cubs are going to win it all this year.


Why the Cubs—and why Belize? Like a lot of stories from this part of the world, it began with pirates. In this case, however, the outlaws were local TV impresarios, not swashbuckling Johnny Depp look-alikes. In the early ’80s, there were no television stations in Belize, the only country in Central America whose official language is English. (Anyone with a set would use it to watch VHS tapes.) In 1981, however, Belize City business couple Arthur and Marie Hoare began transmitting the famous Chicago television channel WGN-TV via satellite, bringing programming to Belize. Channel 9, the Hoares' bootlegged Belize affiliate of WGN, brought Cubs and Bulls games into living rooms and bars throughout the country, sparking an interest in Chicago sports that has continued—with varying levels of enthusiasm—to this day.

"As [WGN's] signal was relayed into Belize City by the Hoares, 'world and country' were glued to their television sets to see the mighty Cubs win or lose (mostly lose)," remembered politician Michael Finnegan in a 2013 article in the Belizean paper Amandala.    

Belize was ripe for adopting a professional baseball team. In the early ’80s, the sport's popularity was picking up: It had a well-attended travel-baseball league, and game broadcasts from a handful of American cities could be heard on shortwave radios. But "when the Cubs came on, and it was the only station [people] could view, they naturally gravitated to the Cubs," says G. Michael Reid, a veteran Belizean journalist.


Other factors helped solidify the connection. There was (and still is) a large community of Belizeans in Chicago. "[They] started going to Cubs games and holding up signs like 'Hello, Belize!'" says Richard Wilk, a professor at Indiana University who was working in Belize at the time of Matthews's visit. "You could go out to the Cubs game and hold up a sign, and your grandma down in Belize would be able to see it." Eventually the Cubs organization—and even Harry Caray himself—began acknowledging its Belizean fanbase. The small country appreciated the recognition, says Wilk. "Belizeans are always getting lumped together with Jamaicans. They hate that."

Also, the Cubs were actually good. In 1984 they went all the way to the National League Championship Series, and three of their players—Ryne Sandberg, Rick Sutcliffe, and Matthews—finished in the top-five of National League MVP voting. (Sandberg, the Hall of Fame second baseman, would win the honor.) 

The country's tiny size may have kept American television executives at bay. In a 1989 Washington Post dispatch from Wrigley South, reporter William Branigin explained that "U.S. broadcasters consider the Belize market so small that trying to stop the operations would not be worth the trouble."


The turnout for Matthews—whom Belizeans often refer to simply as Sarge, his nickname—was enormous. During his parade through Belize City, "I couldn't get close to the road. I couldn't see a thing," says Wilk. Not too shabby for a country whose population was approximately 165,000 at the time. According to Wilk, more people came out for Matthews than for Princess Margaret, who had visited the year before, or even Queen Elizabeth, who visited in 1986. Matthews brought baseball equipment along with him, which he donated to youth league teams, further enhancing the Cubs' reputation.

As the Cubs faded back into mediocrity, however, and additional TV channels sprouted up in Belize offering more than just WGN, many Belizeans moved on. Basketball is probably the biggest sport in Belize right now, says Reid, who admits he's not a Cubs fan. He lived in New York City for some time, and is actually a fan of the Knicks, "which is kinda like basketball's version of the Cubs—they just can't win."

Most of the fans watching the Cubs in Belize today are Gen-Xers who grew up watching the team, says Jerry Martinez, a 36-year-old banker from Santa Elena, a city in the western part of the country. If the Cubs can lock up the World Series, Martinez thinks the romance may be rekindled. There are still diehard fans in the country, he says, but "people here usually ride with winners," especially younger Belizeans. 

However the season turns out, Martinez is determined to make his son a Cubs fan. "I grew up a Cubs fan and will die a Cubs fan," he says. "We're the lovable losers that introduced Belize to baseball."

Update, 10/26: An earlier version of this story identified Belize as a Caribbean nation; we've updated to more accurately place it in Central America. 

The Inept Story Behind 100 Missing Brains at the University of Texas


A version of this story originally appeared on Muckrock.com.

A couple years ago, a story started to make the “news of the weird” rounds about roughly 100 brains missing from the University of Texas at Austin’s psychology department.


Curious about what kind of work goes into nailing a brain napper, I filed a public records request for any reports regarding the theft of the specimens.

A couple weeks later, I heard back: there weren’t any.

Turns out that while this had been reported as the result of some recent Herbert West-ian nefariousness, the real story was that the university was never quite sure how many brains it had in the first place, or even who the original owners of the brains happened to be. The collection’s own curator expressed doubt that their rather morbid crown jewel, the brain of infamous clock tower shooter Charles Whitman, was ever actually among the specimens.


So when an audit revealed that half the collection was missing, rather than call the police, the department issued a mea culpa and promised to keep better track of the brains in the future. After all, they couldn’t really prove what was missing in the first place. Which just goes to show: inadequate bookkeeping is a far more terrifying threat than zombies.

Investigating the Mystery of One of America's Most Endangered Bees


A version of this story originally appeared on bioGraphic.com.

In 1998, a UC Davis entomologist named Robbin Thorp explored the forests of southern Oregon and northern California, hoping to learn more about a little-studied native pollinator that lived there. He visited nearly three dozen sites where museum records indicated the yellow-topped Franklin’s bumble bee had once been seen. “It wasn’t the most common bee I saw,” Thorp recalls, “but I could find it at all the sites where it was supposed to be”—and even in some places where it hadn’t previously been recorded.

The next year he visited the same spots. Again, he found the bee at all the study sites. But the year after, quite suddenly, “the bee became difficult to find,” says Thorp. Bumble bee populations fluctuate from year to year, so at first he wasn’t alarmed. But when numbers didn’t bounce back, he realized the species might be in serious trouble. “Something was going on,” he says. 

In 2003, he contacted other bumble bee specialists to see if they were seeing similar problems among the species they studied. They began looking, and concluded that three other species, all belonging to the subgenus Bombus sensu stricto, had also experienced sudden and steep declines. Last week, the U.S. Fish and Wildlife Service proposed that one of those species—the rusty patched bumble bee (Bombus affinis), named for the small, red-brown crescent on its back—receive federal protection as an endangered species.

(Video by Day’s Edge Productions)

There are 47 varieties of native bumble bee in the United States and Canada, and the International Union for Conservation of Nature (IUCN) estimates that more than a quarter of those species face the threat of extinction. But unlike honeybees—an imported species from Europe whose recent mass deaths have been well publicized and extensively researched—bumble bees receive scant attention. If the federal listing of the rusty patched bumble bee proceeds, however, that may change: It would be the first native bee in the continental United States to be protected under the Endangered Species Act.

The rusty patched bumble bee was once ubiquitous across a large, bat-shaped expanse that stretched from New England south through the Appalachians and into the Midwest, and southeastern Canada. Today, however, only a handful of genetically isolated populations survive in Wisconsin and parts of Minnesota. The Fish and Wildlife Service estimated in its listing proposal that populations have declined by as much as 95 percent since the late 1990s. “There are a few little spots where we know they are,” says USDA research entomologist Dr. James Strange, “but only a really few spots.”


What caused the rusty patched bumble bee to disappear? As with many ecological mysteries, there’s not one easy answer. Urban sprawl and agriculture’s continuing shift from small, diverse farms to vast swaths of single-plant monocrops have fragmented habitat and left fewer hedgerows and native plant blossoms to feed pollinators. Agricultural and garden pesticides can kill or weaken bees. And in the specific case of the rusty patched bumble bee, some scientists point to pathogenic intruders, particularly a fungal parasite that may have grown more virulent thanks to our love of year-round greenhouse tomatoes.

More than 85 percent of flowering plants require the help of pollinators to reproduce—that translates to one in three bites of food we eat. Farmers generally rent honeybees, which live in large, easily portable colonies, to pollinate crops such as almonds and cherries. But certain plants—such as tomatoes, sweet peppers, eggplants, cranberries, and blueberries—respond especially well to “buzz pollination,” a behavior unique to bumble bees, which latch on to a flower’s anthers with their mouthparts and vibrate their wings at a frequency that dislodges trapped pollen. Buzz pollination increases the weight of tomatoes by 5 to 16 percent, according to Strange.


In the 1990s, as the greenhouse tomato business grew from a boutique industry to a major source of year-round tomatoes, the commercial bumble bee industry grew along with it. Thorp believes those mass-produced bees carried with them a fungal microsporidian parasite called Nosema bombi, which caused a collapse in populations of commercially bred western bumble bees in the 1990s, and may have spread to bees in the wild as well. The rusty patched bee, along with the three other declining species in its subgenus, carry particularly high loads of the parasite. “Because all those species collapsed at the same time in such a dramatic way, the belief is that the subgenus is for some reason very susceptible to this pathogen,” says Rich Hatfield, a conservation biologist with the Xerces Society for Invertebrate Conservation who helped spearhead the petition to list the rusty patched bee. Other pathogens, such as viruses spread by managed bumble bees and honeybees, may also be a factor.

Federal protection could help the struggling bees in a number of ways, says Hatfield. “The measure that would help most would be to regulate the commercial bumble bee industry,” he says. “Nobody’s testing those commercial bumble bees for diseases.” In an email, Netherlands-based Koppert Biological Systems, the only company currently rearing bumble bees in the U.S., notes that “there is no proof for the invasive pathogen hypothesis,” and that their bees are raised in a “safe and controlled manner” in the company’s Michigan facilities, including frequent internal and outside audits, tests and inspections. In addition, the company’s bumble bee production is “inspected by Michigan State Department of Agriculture which certifies the bees as disease free for export purposes.”


An endangered designation would also bring additional research funds that would allow scientists to better understand these little-studied native bees, and would help to protect critical habitat and forage. The Fish and Wildlife Service proposal noted that bumble bees may be more vulnerable to pesticide exposure than honeybees. Hatfield hopes an endangered designation will force the Environmental Protection Agency to require that pesticides, particularly a newer class called neonicotinoids, be tested for their effects on native bees. “There is a whole suite of insecticides that are broadly used throughout North America and the only species that the toxicity has been tested on is the honeybee,” he says. “Using them as proxy for all the bee species in North America is not appropriate.”

The agriculture industry is likely to disagree. After the listing proposal, CropLife America, the trade group representing pesticide manufacturers, said in a statement that “field studies have consistently found no unreasonable adverse effects on pollinator populations when pesticides are applied according to label directions.” In comments opposing the original listing petition, the Independent Petroleum Association of America argued that programs already in place to preserve habitat for honey bees, monarch butterflies and northern long-eared bats were sufficient to protect bumble bee populations. 

But scientists believe that without endangered species protections, prospects are dim for the rusty patched bee. “We have a bee that is on the brink of extinction and now we have a chance to do something about it,” says Strange.

For the Franklin’s bumble bee, that chance has, in all likelihood, been lost. In 2006, Thorp visited a spot high in the Siskiyou Mountains in southern Oregon. Just below the summit of Mt. Ashland, along a Forest Service road about 50 yards above the Pacific Crest Trail, “there’s a seep area that keeps the vegetation moist,” says Thorp, “where plants keep flowering for a really long time.” As Thorp walked past the meadow, he saw a yellow-topped Franklin’s bee bumble by—the first he had seen in three years. “I wanted to photo-document it but I didn’t have my camera with me,” he says.

It was the last Franklin’s bumble bee anyone has seen. Scientists now believe it is extinct.

The proposed listing opens a 60-day period for the public to provide comments and additional information about the rusty patched bumble bee. The public comment period runs through November 21, 2016. To submit comments or to view documents and comments on the bumble bee listing, visit: https://www.regulations.gov/document?D=FWS-R3-ES-2015-0112-0028/.

The Nearly-Solved Mystery Behind the Missing Corpse of One of the Richest Men Ever


On the morning of November 7th, 1878, Frank Parker, the assistant sexton of Saint-Mark’s-Church-In-The-Bowery noticed a pile of fresh dirt at the center of the graveyard.

The flat tombstone beside the mound seemed undisturbed but, suspicious nonetheless, the sexton decided to investigate. With the help of a few other clergymen, he lifted the heavy stone bearing the name “STEWART” and was lowered down by a rope into the darkness.

What Parker found in the depths of that crypt, or rather what he didn’t find, sparked one of New York’s greatest mysteries. Two essential objects were missing from Stewart’s tomb at Saint Mark’s: an engraved silver nameplate and, more importantly, the body it identified.

And the body was not just any body. The missing corpse belonged (or used to belong) to the third-richest man in the United States. In fact, to this day, Alexander T. Stewart, the "Merchant Prince,” remains the seventh-richest American of all time. 

The father of the department store, Stewart made his fortune primarily in retail and manufacturing. When it came to fashionable clothes and dry goods in Manhattan, Stewart was the biggest game in town. So when he died in 1876, the size of his estate was a surprise to no one. Stewart left behind an empire at the height of its power; a 76-year-old widow, Cornelia; no children; and a massive personal fortune, worth about $46 billion by today’s standards.


Alexander Stewart made headlines as an entrepreneur and shrewd businessman, but his “resurrection” caused a media sensation unparalleled by anything he had experienced in life. Grave robbing was a reality of 19th-century life, but it usually involved the theft of fresh bodies from the poor and disenfranchised for medical experiments. The successful body-snatching of one of New York’s biggest names, in a bad economy—two years after a failed attempt to rob Lincoln’s tomb, no less—captured the zeitgeist. (Bess Lovejoy, author of Rest in Pieces: The Curious Fates of Famous Corpses, suggested in an interview that the Lincoln case may actually have served as direct inspiration for the Stewart robbers.)

The very same day Frank Parker made his discovery, an eager crowd surrounded Saint Mark’s cemetery, fueled by curiosity. The robbers’ trail was easy to trace. A line of foul-smelling stains crossed the stone porch, ending at the iron fence, where a few scraps of rotting flesh hung limply from its spikes. Detectives found a few other clues: an old copy of the Herald, a shovel, a lamp, a wooden board, and a length of a woman’s stocking. The 11th Street gate’s padlock was found on the sidewalk, unforced and intact. It was apparent that the “ghouls” (as the New York Times dubbed them) had a key.

It was impressive. Not only had the robbers persisted despite the smell of Stewart’s liquefying body, they had also managed to do so while completely evading detection.

Strangely, Stewart’s body had been scheduled to be exhumed and reburied that week at the Cathedral of the Incarnation in Garden City. Garden City was Stewart’s largest and least-understood project. Reporters openly questioned “Stewart’s Folly” when construction began on the ambitious project. After his death, the press’s incredulity only increased when Stewart’s widow set aside $1 million for a massive cathedral to be built there in her husband’s memory. The thieves may have known about the plans to relocate the body and preyed on the distraction, suggesting this was an inside job. Nonetheless, all the clergy and cemetery workers were cleared.

article-image

From its smell, detectives deduced that the thieves wiped their hands on the Herald after handling the body. The newspaper held other clues. It was dry, despite a light rain the night before. This gave investigators a timeline: the thieves had struck just after the storm passed at 3:00 a.m. This matched eyewitness accounts of a delivery wagon parked across the street that disappeared around 3:30 a.m. Where that wagon had gone was anyone’s guess. Because of the rancid smell, the robbers may have taken the body out of the city to avoid detection.

The police advised Mrs. Stewart and Stewart’s executor, “Judge” Henry Hilton, to wait for the grave robbers to contact them. Given Stewart’s decision not to be embalmed and the passage of two years, ransom seemed a more likely motive than medicine. Unless, as one source suggested at the time, someone wanted to study Stewart’s skull through the still-popular “science” of phrenology. “Stealing skulls for phrenology happened,” Lovejoy said in an interview, “But usually only to people considered geniuses… like Haydn and Mozart.” While phrenology was never as popular in America as in Europe, Lovejoy pointed out that, “Some people definitely thought it was worth studying the contours of a famous skull.”

Whatever the reason for the crime, Hilton told the New York Times, they would offer a $25,000 reward for help capturing the criminals.

Before his death, Stewart was seen as something of a miser, even Scrooge-like. Stories circulated that he’d once fired a carpenter for losing a single nail. Another rumor held that he had bankrupted the builder of his Fifth Avenue mansion by suing him for wartime construction delays. In his will, Stewart left no charitable donations to the city or any university.

article-image

After news of the reward spread, more than 700 letters flooded in to Hilton, to Mrs. Stewart, and to the police. Hundreds more appeared in the Herald personals section. All claimed to have information about the case.

Inspector Duke of the NYPD received one letter, written and addressed in cutout newsprint characters, claiming, “In eight hours I will be in Canada with A.T. Stewart’s body.” One letter published in the Herald said the body would be returned provided Mrs. Stewart donated $500,000 to any charity. Several spiritualists claimed to channel Stewart himself. In the onslaught, it was hard for investigators to tell what was authentic.

At least two men actually confessed to the crime under interrogation. Two small-time criminals named William Burke and Henry Vreeland offered to take detectives to the body’s hiding place in Chatham, New Jersey.  But after realizing they faced jail time and not a reward, the pair refused to cooperate. No evidence ever linked them to the case, but fortune hunters descended on Chatham anyway, digging holes and dredging the river in search of Stewart’s remains.

After the Herald’s favorite theory involving a famous resurrectionist and a Stuyvesant Street boardinghouse fell apart, so did public confidence. Articles providing advice on how to prevent grave robbing appeared in the Brooklyn Daily Eagle. The Herald Tribune took the position that Mrs. Stewart should publicly give up the search in order to end the public hysteria. By Christmas, the story dropped from the headlines, but according to journalist Jacob A. Riis, the damage was done. To him, the Stewart case was, “the dawn of Yellow Journalism.”

In January of 1879, Paul Henry Jones, postmaster of New York and former Civil War general, received a letter from Montreal. A man named Romaine claimed to have Stewart’s body, and asked Jones to serve as his attorney and negotiator. Jones wrote back asking for proof. In answer, he received a package containing Stewart’s missing nameplate. When Jones approached investigators, Hilton refused to pay and accused him of conspiracy. Negotiations faltered, and, unsatisfied, the alleged kidnappers went silent.

article-image

Five years after Stewart’s death, in 1881, Stewart and Company declared bankruptcy. That same year, police excavated portions of Brooklyn’s Cypress Hill Cemetery after a false tip that Stewart’s body had been stashed there. That was the last public news about the investigation.

Despite promises to Mrs. Stewart and the press, if Henry Hilton ever found the body, he never announced it. Rumors swirled that private investigators in his employ were still following leads as late as 1885.

In 1887, former NYPD Police Chief Walling published his memoirs and offered an ending to the story.

According to Walling, Mrs. Stewart personally reopened the negotiations with the robbers in 1884, two years before her death. She offered $20,000, and the thieves sent her a marked map of the Hudson Valley. On an appointed night, Mrs. Stewart’s nephew rode down the marked road after midnight, and eventually found a carriage blocking his path. A group of masked men emerged with a scrap of velvet coffin cloth and a bag of bones. After counting the money, they rode off into the night. Walling’s account states that Stewart’s bones were quietly laid to rest in the Cathedral of the Incarnation in 1885.

article-image

Others are skeptical. Several historians, including Wayne Fanebust, author of The Missing Corpse: Grave Robbing a Gilded Age Tycoon, believe the body was never recovered. The most compelling evidence includes the testimony of Henry Hilton’s personal assistant, Herbert Antsey, who stated in 1890, “No. The body was never recovered.” In fact, Fanebust suggests that Stewart might not have minded. “Stewart himself wouldn’t have paid the ransom,” he said. By not recovering his mentor’s body, Hilton may have been staying true to the man’s principles.

When Cornelia Stewart died in 1886, the New York Times expressed its own skepticism about the recovery, writing that she was buried, “beside the grave wherein Mrs. Stewart had always supposed that the remains of her husband reposed.” When Hilton died in 1899, the New York World remarked, “the body was never returned. Or perhaps it was returned— who knows?”

Walling’s story is reflected in the Cathedral’s records, but, in an interview, Michael Sniffen, Dean of the Cathedral of the Incarnation, admits the story, “sounds a little made up.” Regardless of Alexander Stewart’s final resting place, the Stewarts are not buried beneath their prominent floor marker under the Cathedral’s nave. To avoid another robbery, the exact location of their remains is a secret. If you believe the rumors, Mr. and Mrs. Stewart lie somewhere beneath the altar.

“Still, it makes you wonder,” Sniffen said. “What is in Stewart’s tomb?”

We Asked a Vexillologist How to Design a Great National Flag


The United States of America's flag is iconic, important, and more or less timeless. We are extremely proud of its design, and we are very attached to the story of Betsy Ross sewing it.

But let's take a step back. Is it any good?

Not many people really think about what it takes to design such a standard, and what makes our flag any better than any of the other national standards across the globe. Except for people like David F. Phillips. 

“I’m 72. I’ve been studying flags since I was about six years old,” says Phillips, a professional vexillologist, or someone who specializes in studying the design, meaning, and effectiveness of flags. “The thing that most interested me as a child, and what I think still interests me, is the way that flags and heraldry communicate complicated ideas through the use of color and line, without any words. That appeals to me.”

Among vexillologists, there are a few cardinal rules about flag design that make some flags clear winners, and leave others twisting in the wind. Phillips pointed us toward a short booklet, freely available online, that’s helpfully titled, Good Flag, Bad Flag. This short guide to flag design was put together by Ted Kaye, editor of Raven, the North American Vexillological Association’s official journal, and it lists, in no uncertain terms, what it takes to create a successful flag. Kaye’s guide breaks good flag design down into five essential criteria: simplicity, meaningful symbolism, no more than three colors, no lettering or seals, and unique design.

article-image

The first, and arguably most important factor is keeping your flag simple. “I feel like a flag, and this is Ted’s argument too, should be able to be drawn with crayons by a child,” says Phillips. “If it’s more complicated than that, it’s too complicated.” A simple design is essential because, as Phillips explained to us, the very purpose of a flag is to be read from a distance, unlike more complicated seals or heraldic coats of arms. Simplicity is also essential so that a flag can be easily remembered and be instantly recognizable. 

Then there is the meaning behind the colors and symbols on a flag. “On a good flag, you want the lines and the colors and the charges if any, to have some real significance for the nation or the institution that the flag represents,” says Phillips. Charges are symbols or geometric shapes that are separate graphic elements from the larger design, and can often hold a lot of meaning in just a single shape. Take, for instance, the Canadian maple leaf, or the Japanese sun. “You can picture it absolutely because it’s very clear, it’s unique, and also the sun has a cultural significance to the Japanese,” says Phillips. The color can be a shorthand, too, as in the case of Ukraine’s blue and yellow flag, which represents a blue sky over wheat fields.

But the color palette should still be limited and coherent. According to Kaye’s guide, a good flag should contain no more than three primary colors. Kaye lists the basic flag colors as red, blue, green, black, yellow, and white (“White and yellow are called ‘metals,’ after gold and silver,” says Phillips), and most good flags use some mix of these. But it also matters which colors you choose, and for the sake of a flag’s visibility, the name of the game is contrast. “In heraldry, there’s an informal rule: don’t put yellow on white, or white on yellow. And you don’t put red on blue, or blue on green,” says Phillips. “Blue on white, red on gold, that’s a lot easier.” This contrast also ensures that if the flag is reproduced in black and white, it doesn’t entirely lose its meaning.

article-image

The fourth rule to remember is that words and lettering have no place on a successful flag. The reasoning behind this, as pointed out in Good Flag, Bad Flag, is pretty simple. Words and detailed seals almost instantly blur and lose meaning at a distance. 

Lastly, make sure your flag is unique. It’s fine, and even somewhat encouraged, to allude to other flag designs, but in the end, it is most important that your flag cannot be easily confused for other flags. “If you look at some of the flags, for the African states for example, a lot of them use red, green, and gold,” says Phillips. “They use them vertically, they use them horizontally, they use them in different orders, some of them use them diagonally. Which one is that? You’re not sure.”

article-image

While vexillologists probably disagree as to the specifics, there are flags from nations across the globe that qualify as both bad and good. Phillips’ tastes tend to run towards simpler designs, but even he has his favorites. “My very favorite in the world is the Belgian flag,” he says. “It’s a vertical tri-color, black, gold, and red. These are based on the heraldic colors of the coat of arms of the principal province of Brabant. It’s modeled on the French flag, but it’s so vivid and so striking, and the yellow does that to a large extent. Yellow adds enormous power to any flag. It’s just so beautiful to look at and so distinctive.” In addition to the Belgian flag, Phillips identified other simple flags he finds particularly great, like those of Denmark, France, Switzerland, Canada, and Japan.

He even called out some more complicated flags, like the United Kingdom’s Union Jack and Sri Lanka’s national flag, as particularly successful. “It is a little bit complicated, although certainly every element has a meaning,” he says of the Sri Lankan flag. “The two stripes near the hoist represent ethnic minorities, and the lion is the lion which used to represent the old kingdom of Kandy, which was a kingdom in the center of what later became Sri Lanka. It has a definite relevance, but it’s also unique in the world.”

article-image

Then there are those flags that just don’t hit the mark. Whether the design is too basic, or not original enough, or just confusing, some flags just don’t work. Phillips brought up the flag of Kyrgyzstan as a good example of a bad flag. “It has a rather distinctive element in the center, which is the chimney of a yurt,” says Phillips. “But unless you’ve been in a Kyrgyz yurt, you don’t really know exactly what you’re looking at. It looks kind of like a gold blob on red. Then the red field suggests a Communist orientation, which the Kyrgyz don’t have anymore. That’s not such a great idea either.” Phillips also called out the flags of nations like Egypt, Iraq, Syria, Yemen, and the United Arab Emirates, for being too similar to be distinctive. 

Taking into consideration all of the hallmarks of a great flag, we finally asked Phillips what the national flag of Atlas Obscura might look like. “Atlas Obscura, I’m imagining in my mind, a book, partly obscured,” he says. So, knowing very little about our website, and based on the name alone, he suggested a diagonally bisected field of black and gold, with a book in the middle, which would be half hidden beneath the black half of the flag. “I’m not saying it’s the only solution, it’s just the first thing that occurs to me in five seconds.”

We took his advice, and without further ado, we present the official flag of the nation of Atlas Obscura:

article-image

Truly, that is a great flag. If you disagree, feel free to go design your own. At least now you know how.    


Watch the Domino Effect of 8,000 Matches Going Up in Flames


For reasons science has not been able to discern, we are all inexplicably fascinated by the domino effect. Like moths to a flame and vampires to a pulsing vein, we can't help but be transfixed by the organized disaster of individual objects falling in unison.

This video by LXG Design does the domino effect with a twist—or, shall we say, spark?

Using 8,000 matches to form a red flower figure, the visual rhapsody of flame begetting more flame is hypnotizing. To make the video even more appealing, LXG Design shows it to us a second time in slow motion. Now, it isn’t just the rapid consumption of the matches that is enthralling, but the intensity and movement of the flames.

It's as destructive as it is captivating. As with a car wreck, you just can’t look away.

Every day we track down a Video Wonder: an audiovisual offering that delights, inspires, and entertains. Have you encountered a video we should feature? Email ella@atlasobscura.com.

How A Fake British Accent Took Old Hollywood By Storm


If you’ve ever seen a movie made before 1950, you’re familiar with the accent used by actors like Cary Grant, Katharine Hepburn, and Ingrid Bergman: a sort of high-pitched, indistinctly-accented way of speaking that also pops up in recordings of politicians like FDR and writers like Gore Vidal and William F. Buckley, Jr. It’s easy to gloss over, because movies have captured a few different accents that aren’t really present today, like the Borscht Belt Jewish accent of Mel Brooks and the old New York “Toity-Toid Street” accent. Is it British? Is it American? Is it just “rich”?

But the accent we’re talking about here is among the weirdest ways of speaking in the history of the English language. It is not entirely natural, for one thing: the form of the accent was firmly guided by certain key figures, who created strict rules that were aggressively taught. And it also vanished quickly, within the span of perhaps a decade, which might be related to the fact that it isn’t entirely natural.

Today this accent is sometimes called the Mid-Atlantic Accent, which is deeply offensive to those, like me, from the actual Mid-Atlantic region of the United States.

What that name means in this case is that the accent can be placed somewhere in the middle of the Atlantic Ocean, halfway between New England and England. Its popularity in pop culture, though, can be tied to one American woman, and a very strange set of books.


In the 1800s, once relationships with England began to normalize following the Revolutionary War and War of 1812, the cities of Philadelphia, Boston, and, especially, New York City quickly became the new country’s most powerful. Financial and cultural elites began constructing their own kind of vaguely-British institutions, especially in the form of prestigious private schools. And those schools had elocution classes.

The entire concept of an elocution class is wildly offensive to most of the modern linguists I know; following the rise of super-linguist Bill Labov in the 1960s, the concept that one way of speaking is “better” or “worse” than another is basically anathema. But that wasn’t at all the case for the rich kids of Westchester County, Beacon Hill, or the Main Line (those would be the home of the elites of New York, Boston, and Philadelphia, respectively).

article-image

“There's a long history of dialect features of Southeast England in Eastern New England dialects, tracing back directly to the colonial era,” writes James Stanford, a linguist at Dartmouth College, in an email. “European settlers throughout New England on the east side of Vermont's Green Mountains tended to stay in closer touch with Boston, which in turn stayed in touch with Southeast England through commerce and education.”

The upper-class New England accent of that time shares some things with modern New England accents. The most obvious of those is non-rhoticity, which refers to dropping the “r” sounds in words like “hear” and “Charles.”

But while parts of those accents are natural—some New Yorkers and many Bostonians still drop their “r” sounds today—the elite Northeastern accent was ramped up artificially by elocution teachers at boarding schools. Miss Porter’s School in Connecticut (where Jackie Onassis was educated), the Groton School in Massachusetts (FDR), St. Paul’s School (John Kerry), and others all decided to teach their well-heeled pupils to speak in a certain way, a vaguely British-y speech pattern meant to sound aristocratic, excessively proper, and, weirdly, not regionally specific. A similar impulse created the British Received Pronunciation, the literal Queen’s English, though RP’s roots arose a bit more gradually and naturally in Southeastern England.

article-image

 


The book that codified the elite Northeastern accent is one of the most fascinating and demanding books I’ve ever read, painstakingly written by one Edith Skinner. Skinner was an elocutionist who decided, with what must have been balls the size of Mars, to call this accent “Good Speech.” Here’s a quote from her 1942 book, Speak With Distinction:

"Good Speech is hard to define but easy to recognize when we hear it. Good Speech is a dialect of North American English that is free from regional characteristics; recognizably North American, yet suitable for classic texts; effortlessly articulated and easily understood in the last rows of a theater."

Skinner is now woefully outdated and many of her ideas are so contrary to the way modern linguists think that her books are no longer taught. (To find a copy of Speak With Distinction, I had to hunt through a performing arts library in New York City’s Lincoln Center plaza.) She’s what’s known now as a linguistic prescriptivist, meaning that she believed that some variations of English are flat-out superior to others, and should be taught and valued as such. I mean, come on, she named this accent, “Good Speech.”

article-image

Her influence was felt in filmmaking in a very roundabout way. Film began in New York, only moving en masse to Los Angeles in the mid-1910s. Skinner was born in New Brunswick, Canada, but studied linguistics at Columbia and taught drama for many years at Carnegie Mellon, in Pittsburgh, and Juilliard, in New York City, all highly elite schools. It was in the Northeast that she created Speak With Distinction: an insanely thorough linguistic text, full of specific ways to pronounce thousands of different words, diagrams, lessons on the International Phonetic Alphabet, and exercises for drama students.

Yep, drama: by this point, movies with sound had begun to hit theaters, and then came the disastrous story of Clara Bow. Bow was one of the silent film era’s biggest stars, a master of exaggerated expressions. When the talkies came along, audiences heard her voice for the first time and it was a nasal, honking Brooklyn accent. Though the idea that speaking roles killed her career in film is not entirely accurate (there were plenty of other factors, ranging from drug problems to insane pressures of film studios), it’s certainly true that her career took a nosedive around the time audiences heard her voice, possibly creating a cautionary tale for newly heard actors.


It’s now the 1930s, and Edith Skinner is Hollywood’s go-to advisor for all things speech-related. And Edith Skinner has extremely strong opinions, bred in the elite universities of the Northeast, about exactly how people should speak. So she forced her own “Good Speech” accent on stars, and other voice coaches, and soon her accent became the most popular accent in Hollywood.

Speak With Distinction is incredibly dense, but it’s also very thorough. You can see very clearly, right there on the beat-up pages, why Katharine Hepburn speaks the way she does. “In Good Speech, ALL vowel sounds are oral sounds, to be made with the soft palate raised. Thus the breath flows out through the mouth only, rather than through the mouth and nose,” she writes. (She capitalizes things a lot.) “Each vowel sound is called a PURE SOUND, and the slightest movement or change in any of the organs of speech during the formation of a vowel will mar its purity, resulting in DIPHTHONGIZATION.”

article-image

She demands that “r” sounds be dropped. She demands that the “agh” sound, as in “chance,” should be halfway between the American “agh” and the British “ah.” (Interestingly, this is very different than the typical New England accent today, which is highly “fronted,” meaning that the vowel sound is made with the tongue very close to the teeth in words like “father.” The British, and Mid-Atlantic, vowel is pronounced with the tongue much further back.) She requires that all “t” sounds be precisely enunciated: “butter” cannot sound like “budder,” as it mostly does in the US. Words beginning in “wh” must be given a guttural hacking noise, so “what” sounds more like “ccccchhhhwhat.” She bans all glottal stops—the cessation of air when you say “uh-oh”—even between words, as in this phrase, direct from her book: “Oh, Eaton! He’d even heave eels for Edith Healy!” Go ahead, try to say that without any glottal stops. It’s enormously difficult.

She cracks down on the most obvious of regional cues, railing against what’s now called the “pin-pen merger.” Today, the pin-pen merger—in which the word “pen” sounds like “pin”—is a very easy indicator that a speaker is from the American South. Yech, the South. That will not do for Edith Skinner.


Because Skinner was so influential, and her “Good Speech” was so prominent in movies, it began to leak out into the drama world at large. Other teachers began teaching it. In fact, even up until just a few decades ago, this accent, now called “Mid-Atlantic,” was being taught in drama schools. Jaybird Oberski, who teaches acting at Duke University, got his MFA at Carnegie Mellon in 1997, and he says the class was, amazingly, still being taught then. (He isn’t a fan of the accent.) “The Mid-Atlantic accent is considered the neutralization of regionalization, to bleach out character so everybody sounded the same,” he says.

Weirdly enough, this accent class was called a “neutralization technique” at Carnegie Mellon: theoretically, the idea is that it removes regional signifiers like the pin-pen merger. But there is no “neutral” or “accentless” accent; you can replace one accent with another, but the idea that there is some perfect, unaccented variety of English is a myth that’s long been squashed.

article-image

This particular accent, too, is far from neutral. It’s immediately recognizable and strange, a take on a clipped upper-class New England accent with even more Britishisms tossed in the mix. In her efforts to create a neutral accent, Skinner created one of the most non-neutral accents in the past few centuries.

The film craze of Mid-Atlantic English was short-lived. By the late 1960s, the New Hollywood movement, complete with innovative, gritty directors like Francis Ford Coppola and John Cassavetes, began to depict the world as it was, rather than the fantasy lives presented by earlier films. That goal necessitated the dropping of the Mid-Atlantic accent; there’s no point in showing the grim realities of Vietnam War-era America if everyone is going to talk like they went to Choate Rosemary Hall, so the actors in those films just...didn’t. And elocution classes at those schools began to be dropped as well. “The prestige of non-rhoticity and other British-related features began to change in the mid-20th century, and scholars suspect it may be due to the role of WWII and American national identity—a new identity on the world stage, no longer so closely tied to England for national identity,” writes Stanford.

The accent vanished quickly, now only surviving as a weird hallmark of that era of filmmaking; the only time you hear it now, really, is if a movie is set in Hollywood, in the film industry, prior to 1960. The real Mid-Atlantic accent, the accent of Philadelphia and Baltimore, luckily, lives on.

Was it Hershey or Reese That Made Peanut Butter Cups Great?


The Indigo Girls sing about them, Run DMC raps about them, artists draw them. Their Facebook page has over 12 million likes. They’ve been mixed into cocktails, baked into pies, and stuffed into burgers. No mass-market candy bar inspires such intense passion as Reese’s Peanut Butter Cups.

The combination of peanut butter and chocolate is quintessentially American, like macaroni and cheese. Unlike other combinations, however, peanut butter and chocolate is synonymous with a corporate brand name. The Reese’s brand has so few serious competitors that the ad campaign no longer really tries. The slogan is one word: “perfect.”

It wasn’t always this way. The first national ad campaign for Reese’s in 1970, seven years after Hershey bought the company, was based on the opposite premise: that the idea of peanut butter and chocolate together was so revolting to consumers that they would only try it if they literally fell on top of it.

According to the campaign’s creative director, Billings Fuess, “[Reese’s] was a brand-new product that Hershey had just bought from this farmer.” Hershey had to convince customers “that these two things taste good together.”

Yet Fuess was twisting the truth. Americans had been eating peanut butter cups for over six decades when the Reese’s campaign debuted. In fact, the peanut butter cup was so iconic that in 1962 Pop artist Roy Lichtenstein painted one.

Why then did Hershey claim that the peanut butter cup was new?  Where exactly did the peanut butter cup come from? And why has it continued to be wildly popular?  


The history of Reese’s Peanut Butter Cups begins over a century ago, when Americans invented peanut butter. Food historians debate just who created it: either the Kellogg brothers, of corn flakes fame, or snack-food promoter George Bayle. Regardless, peanut butter began its life in the mid-1890s as a health food, promoted as a nutrient-rich protein source. 

article-image

Like other health foods before it, peanut butter was soon incorporated into candy. But first its cultural status had to drop. Before 1900, peanut butter was expensive and noshed on primarily by rich sanitarium residents, who, following Kellogg’s tenets, shunned meat. But peanut butter quickly became more affordable, and by 1900 it was within reach of the middle class. Within a year, recipes for peanut-butter cups entered America’s cookbooks. These early cups were simply solid peanut butter, without chocolate.

It wasn’t until chocolate became available to the masses a few years later, thanks in part to Milton Hershey and his five-cent milk chocolate bar, that some unknown genius came up with the idea to cover peanut-butter cups in chocolate.

Chocolate-covered peanut-butter cups were sold commercially as early as 1907. They stood out somewhat because their ingredients straddled the boundary between health food and indulgence. Chocolate and peanuts were treats, but nutritionists also thought of them as nourishing foods because they were calorie-dense, and food scientists thought of all calories as equal.

But these early peanut butter cups never became very popular. They were just one of many candies sold in bulk at drugstores, unwrapped in a jar. Bulk candy began losing popularity after World War I, supplanted by the individually wrapped candy bar, which had been a part of soldiers’ rations. Chocolate-covered peanut-butter cups remained a novelty.


It was during this period of change in the candy market that Harry Burnett Reese decided to start a candy business. In 1919, 40-year-old H.B. Reese was laid off from the Hershey Chocolate Company, where he managed Hershey’s experimental dairy farm. Reese needed to support his 10 children and legally blind wife Blanche Edna Hyson. He’d spent half his life schlepping his family around Pennsylvania from one job to another, mostly in farming, but nothing had stuck. So in 1919 Reese followed in his mother’s footsteps and began making candy.

article-image

The town of Hershey was a great place to start a candy company, with easy access to high-quality chocolate, workers, and engineers. But it was also a town where a candy entrepreneur would be competing against the largest chocolate company in America. Milton Hershey began building the town in 1903, completing his chocolate factory in 1905. By the time Reese moved to Hershey it was fully established, with a trolley system, schools, and parks.

Reese's first venture involved the candies his mother had made and sold: chocolate-covered almonds and raisins. His business, like many other small candy companies, failed. Reese left Hershey to find work, returning in 1921 when his father-in-law bought Reese a house for his family.

Reese took a job with Hershey again, yet he still could barely support his family. He began making after-dinner peppermints in his living room, stretching the taffy with hooks. But the peppermints were no match for the chocolate that dominated the town.

In 1923, Reese quit his job with Hershey and returned to chocolate making, asking Hershey for permission first. Hershey said yes, but with one stipulation: Reese must buy all his chocolate from him. And with a handshake, the H.B. Reese Candy Company was born.

Reese set up his candy workshop in the basement of his house. His first product was a boxed assortment filled with everything from chocolate-covered honeydew melon to enrobed marshmallows. But it was difficult to succeed just by selling chocolate assortments. The real money was in candy bars. So he began to make coconut-caramel chocolate bars, which he named after his children, Johnny and Lizzie. The bars were popular, and as sales increased, he moved his operations to the basement of an Italian restaurant. Sales kept ramping up, so in 1926 Reese built his first real factory in Hershey and moved his family next door. 

article-image

The Reese Company was doing well, but it still didn’t have a blockbuster product like Hershey’s Kisses. Then one day in 1928, everything changed. While on a sales trip in nearby Harrisburg, Reese chatted with a store owner who told him that he couldn’t keep peanut butter cups in stock. He asked Reese if he could make something similar.

Without hesitation, Reese said yes. He left the store and immediately purchased a 50-pound can of peanut butter. When he got home, he began experimenting with peanut butter balls covered in chocolate. But although tasty, they were difficult to manufacture on a large scale. Reese switched to traditional cups, and began making his own peanut butter, roasting the peanuts so they were almost burnt. According to family lore, it's this roasted taste that makes a Reese’s a Reese’s.

A year after Reese began making his peanut-butter cups, the Great Depression hit and Reese almost went bankrupt. He couldn’t pay his bills and began paying his employees in candy. When the sheriff came looking for him, Reese absconded to his family’s farm. Eventually Reese secured loans, and the company was saved.

Reese’s sales remained strong but paled in comparison to Hershey’s. Then in 1933, one of Reese’s salesmen, Mr. Houston, urged Reese to sell peanut butter cups individually because his customers thought they were the best piece in the assortment. Reese was hesitant because they were his least favorite piece. But Houston convinced Reese to sell the cups individually for a penny. And the iconic Reese’s cup was born, along with its first ad campaign: a picture of Reese’s wife and many children with the tagline "16 Reasons to Eat a Reese’s."

Reese’s peanut butter cups were an immediate hit, selling enough that by 1935 Reese paid off his debts. The company was running smoothly until 1941 when the U.S. entered World War II, and sugar was rationed. Although Hershey lent Reese sugar, it wasn’t enough. Reese decided to eliminate his other lines and produce only one candy: the candy that required the least amount of sugar, the peanut butter cup. It was the best decision he ever made. 

article-image

As Reese’s peanut-butter business boomed, other candy companies began producing their own cups. Reese responded aggressively, sending drugstores threatening letters in 1954 that claimed it owned the trademark to the term “Peanut Butter Cup.” The letters demanded that stores stop “advertising and selling any candy product, except ours, as Peanut Butter Cups.” After a lawsuit by a Reese competitor, the letters stopped.

In spite of the competition, Reese was doing a booming business in peanut butter cups, selling nationally to both small stores and large retailers like Sears Roebuck. The company decided to build a huge factory in Hershey, but H.B. Reese didn’t live to see it finished in 1957. He had died a few months earlier, plunging the company into chaos. Reese left his daughters out of the will, and instead left the company to his six sons—who didn’t exactly get along.

As Reese’s children jostled for control, the company finally entered the modern era with an automated production line in the new factory. In a reversal from its earlier boast that the cups were hand-coated, Reese crowed that its cups were “untouched by human hands.” By the late 1950s, Reese’s had become a respected national brand.

Yet as Reese’s sales soared to over $15 million annually in the early 1960s, the brothers fought over the company’s future. The eldest brothers wanted to sell Reese, but the younger brothers wanted to keep it in the family. They turned down a bid from British American Tobacco, and rumors got back to Hershey about a possible Reese sale. A merger with Hershey made sense: the companies were located in the same town, used the same chocolate, and their founders were good friends.

After much argument, Reese decided to sell to Hershey. Whether or not Reese got the best possible deal is unclear. With the 1963 sale, Reese's sons received over 666,000 shares of Hershey’s stock, valued at over $24 million. A few of the Reese brothers became Hershey board members. Reese’s Peanut Butter Cups gained Hershey’s distribution power, but the merger placed the legacy of H.B. Reese in peril.

article-image

Peanut Butter Cups were an immediate success for Hershey. In 1969 the product was Hershey’s top seller, yet the corporation was slipping from dominance. Mars, maker of Snickers, was closing in. Hershey sought to keep its candy crown by launching its first national ad campaign for its three top brands. Hershey hired famed ad agency Ogilvy and Mather, whose team made an unusual discovery about Reese’s.

“When you told folks that we had this marvelous candy bar that had a combination of chocolate and peanut butter, they did not take to that kindly,” recalls creative director Billings Fuess. “It was trouble to persuade them to … try it.”

Although Fuess claims consumers disliked the combination, Reese’s was already selling over 300 million cups annually—more than the Hershey Bar. So why did Ogilvy and Mather and the Hershey company want to position the combo as new? They could have done the opposite, focusing on the enormous popularity of the chocolate-peanut butter combo. But such a tactic would not have allowed Hershey to lay claim to it. By pretending it was new, they could brand it as Hershey’s.  

They did so by diminishing the importance of Reese. Fuess said that while doing research, the creative team “went to see Mr. Reese’s little production line there, behind his farmhouse.” He was perhaps misremembering: Reese had been dead for well over a decade at this point, and Reese’s “little production line” was an enormous factory. Yet by positioning Reese as a small-time candy company, Hershey was able to portray itself as rescuing peanut butter cups from obscurity and bringing them to the masses.

article-image

The “Two Great Tastes” campaign that emerged from Ogilvy and Mather’s research was brilliant in its simplicity. The classic example is the “Manhole” ad, which features a man walking down the street, inexplicably spooning peanut butter from a jar. He falls into a manhole, toppling into a construction worker eating a chocolate bar. “Hey, you got peanut butter on my chocolate,” says the worker, as if such things happen to him every day. “Hey, you got chocolate on my peanut butter,” the other man shoots back. After the initial tussle, they discover to their surprise that the combination is delicious. “You get two great tastes that taste great together,” plays the jingle over the happy men eating Reese’s.

The campaign made a decades-old product seem new and exotic. But to make this work, Hershey had to erase Reese’s history. While branding Reese’s, Hershey did something even more powerful: it laid claim to the chocolate-peanut butter combination.

article-image

The sweet and salty taste sensation built for the American palate and beloved for generations was now Hershey’s. Following the campaign, sales for Reese skyrocketed so high that Hershey thought it had made an accounting error. Hershey ended the campaign two years later, but Reese’s continued to soar.


Reese’s became the best-selling candy brand in the U.S. and remains so today. As other cup brands fizzled out, Hershey ramped up the Reese’s brand. If you own the peanut-butter chocolate combination, there are few limits to the possible line extensions. Baking chips debuted in 1977; Reese’s Pieces followed a year later. By the end of the 20th century the Reese’s brand had become a behemoth, as supermarket shelves filled with Reese’s frosting, sprinkles, peanut butter, ice cream, and cereal.

article-image

Why has Reese achieved such sustained success? It’s probably some combination of Hershey’s marketing and distribution muscle and Reese’s distinctive taste. Since Reese’s inception, no other company has been able to duplicate its taste, although many have tried. Why other companies can’t make a cup as delicious as Reese’s is a mystery. Reese’s ingredients are certainly not high quality: the peanuts used in its filling are of the bland “runner” type; the chocolate is of the pedestrian milk variety. Yet the chocolate and peanut butter interact in an alchemical way that transcends the quality of the two ingredients.

Making peanut-butter cups with organic ingredients doesn’t improve them. I have tried nearly every artisanal or organic peanut-butter cup available: Justin’s, Theo’s, Colt’s Bolts, those made in tiny hometown candy stores. All leave me unsatisfied, yearning for the processed taste of a Reese’s.

Adding new ingredients also doesn’t work. Hershey has mixed cookies, banana cream, caramel, honey nut flavor, and Reese’s Pieces with the peanut butter, and covered the cups in dark and white chocolate.  Artisanal companies make peanut butter cups filled with bacon, chia seed, coconut, and caramelized bananas. None appear to have supplanted Reese’s taste. Reese’s cups resist such experimentation. Alternatives wither; something inexplicable is lost when the ingredients change. 

article-image

Although Reese’s Peanut Butter Cups are now sold in nearly every gas station, vending machine, and supermarket in the U.S., the essence of the original cup, which was hand-crafted in H.B. Reese’s basement 89 years ago, has endured. But Reese’s history has not.

Even though Reese created his multimillion-dollar peanut-butter cup empire in Hershey, built two factories, and employed hundreds of residents, no monuments to him exist. While the century-old smokestacks of the former Hershey’s factory stand proudly as a marker of the company’s history, Reese’s old factory is unmarked. Sure, you can buy Reese’s in Hershey’s gift shops and read a tiny placard in the museum about Reese. But H.B. Reese is an afterthought in Hershey, his impact diminished, as if giving Reese his due would undermine Milton Hershey’s legacy.

Reese deserves better than simply his name on the package. Even though Reese’s would probably not have grown as successful without Hershey, Hershey is not the sole reason for Reese’s sustained popularity. Reese’s Peanut Butter Cups are unique; their peanut butter is unmatched, their ratio of ingredients divine. Their success is perhaps caused by an even greater force: nostalgia. Peanut butter and chocolate taste of childhood and love and family. Even though H.B. Reese didn’t invent the peanut butter cup, nor Americans’ love of its two main ingredients, he deserves his rightful place in history for perfecting it.

In 1961, Roald Dahl Hosted His Own Version of 'The Twilight Zone'

article-image
article-image

Roald Dahl was many things. A fighter pilot, a renowned author, a spy. But few people know that he was also the host of his very own Twilight Zone–style sci-fi/horror anthology show, Way Out, a macabre program that ran for a single season and almost gave Rod Serling’s more famous program a run for its money. And it all began with a terrible game show.

In 1961, Honeymooners star Jackie Gleason had moved on from his career-defining role as the cantankerous bus driver Ralph Kramden and become a roving host and guest, appearing on variety shows, specials, and game shows. One of these endeavors was a game show called You're in the Picture. Intended to display Gleason's skills as a raconteur and show host, it had a panel of celebrities stick their heads through a famous image, then question Gleason to figure out what image they’d stuck their heads through. It wasn’t a hit.

You’re in the Picture ran for exactly one episode, and received such negative reviews that when the next episode was set to air, instead of the game show, viewers were greeted by a half-hour apology delivered by Gleason himself. After expressing regret for dropping “the biggest bomb,” Gleason changed the format to a talk show to limp through the rest of the initial episode order, but producers at CBS needed a new show to fill Gleason’s spot, and fast.

article-image

Under the gun, some enterprising producers at the network began dreaming up a creepy drama show to fill the time slot, and they went right to Dahl. While he is best remembered today for his timeless works of children’s literature like Matilda and Charlie and the Chocolate Factory, for a good portion of his writing career, he was better known as an author of twisted, devilish fiction. As explained in an article originally published in Filmfax Magazine, Dahl jumped at the chance to develop the series, spurred on by the fact that the show’s time slot (9:30 p.m. on Fridays) fell right before another thematically similar little CBS show, The Twilight Zone.

The resulting half-hour show was titled Way Out—strangely the opening screen of the show displayed the title with an apostrophe preceding it, ‘Way Out. The format was set up much like the already successful Twilight Zone series, with Dahl in the Rod Serling role.

The black-and-white show would begin with what became its signature image, a slow pan over a series of mist-shrouded, disembodied hands, before resting on one which would burst into flames as the title came onscreen. Then, flexing his dry British charm like a more cosmopolitan Vincent Price, Dahl would give a short intro to each episode. The bulk of the program consisted of the main tale, usually a short morality play with an ironic or surprising ending or element, which often dipped into the supernatural. Then Dahl would close out the show with another direct epilogue, much like the Cryptkeeper of the later Tales From the Crypt.

Dahl also smoked like a chimney throughout his segments, which served the dual purpose of providing a mysterious haze around the host and plugging the show’s main sponsor, L&M Cigarettes. In fact, just about everyone in Way Out really enjoys cigarettes.

Initially the producers wanted to adapt some of Dahl’s pre-existing stories, but in the end only the first episode ended up being written by Dahl, with the remainder of the series’ 14-episode run being authored by other people. The bench of talent never quite equalled that of The Twilight Zone.

The first story, and the only one based on one of Dahl’s stories, was called “William and Mary.” In the episode, a controlling jerk of a husband, William, lies on his deathbed barking insults and commands at his long-suffering wife, Mary. Her torment seems to be at an end when he dies and she is free to smoke refreshing L&M Cigarettes, play cards with friends, and even wear lipstick. But—twist—William has opted to keep his brain alive after death so that he can still keep watch over Mary! But—double twist—with no mouth or body, William finds himself a captive witness to his newly liberated wife’s new life. The tale ends with Mary gleefully blowing smoke into William’s helpless robotic eye. Diabolical fate.

The stories got more outlandish. In the episode “The Croaker,” a mysterious man begins manipulating a young boy to help him turn the residents of their town into frogs, but the enterprising young lad has some strange plans of his own. In the episode “Side Show,” a woman with a light bulb for a head is held against her will in a circus sideshow. When an audience member falls in love and decides to free her, he may be in for a shock. In the episode “False Face,” an actor pays a deformed homeless man to be his model for some Quasimodo make-up, but the effects turn out to be a bit too real. In the world of Way Out, fate always has a cruel sense of humor.

article-image

The show received positive press as it aired from March to July of ‘61, and even today, episodes of Way Out still hold up surprisingly well as tightly drawn, macabre vignettes. But at the time, its high quality didn’t translate to sufficient ratings, even with Dahl’s unforgettable segments. Way Out was cancelled after just one short season.

Today, you can find some of the episodes on YouTube, and the entire collection is held by the Paley Center for Media, although it has never been formally released. The episodes are a must-see for any fans of Dahl’s gruesome sense of irony, or fans of The Twilight Zone. Take a look this Halloween, and, in the words Dahl himself used to close every episode, “Good night, and sleep well.”

Found: The Original Walls of the Cave Where Jesus Was Buried

article-image

Inside the Church of the Holy Sepulcher, there’s a large rotunda, with stories of arches flying high. At the center sits a small, freestanding structure called the Edicule, which contains the slab where Jesus is believed to have been laid to rest. For the first time in almost two centuries, the marble Edicule is being restored, and in the process scientists have found what they believe could be the walls of the cave where the Resurrection is supposed to have taken place, as National Geographic reports.

The Church of the Holy Sepulcher is itself an astonishing structure, first built in the 4th century and renovated, rebuilt, and restored many, many times over the centuries since. It’s been more than 460 years since the last time the burial shelf in the original cave was uncovered. The marble Edicule that surrounds the shelf was built in 1810, after a fire damaged the previous structure. In 1927, the “little house” was damaged during an earthquake, and for the past seven decades it has been propped up by metal girders.

In March, the different religious orders that control the church agreed to renovate the Edicule. As part of the preparation, radar tests showed that there could still be “hidden layers” behind the marble walls, the AP reports. When scientists moved the marble slab off the burial shelf, they discovered a layer of debris and another marble slab, grey, with a cross etched into it, that could date back to the 1100s. Beneath that, Bennett told National Geographic, was a “whitish layer”—and there may still be more beyond that.

Behind the marble walls of the small building, they also found walls of the original cave. These walls were thought to have long since collapsed or disintegrated, but they are still standing, six feet high.

All this work has had to happen in a very short window: the overseers of the church gave the scientists just 60 hours to work on the inner sanctum. Already they’ve discovered materials no one thought were there, and the restored Edicule will allow visitors a glimpse—they’re leaving a window cut into the marble slab of the wall to show the original cave walls.

The Forgotten Halloween Games That Predict Who You'll Marry—And When You'll Die

article-image
article-image

If sticking your face in a barrel filled with cold water and trying to grab an apple with your mouth seems a touch sadistic for a casual game, then you're really not going to like its sister pagan ritual. It's sometimes called "snap-apple," and like bobbing for apples, it comes from a pantheon of mostly forgotten All Hallows' Eve traditions from the British Isles.

In snap apple, the goal is still to grip an apple in your mouth, no hands allowed. But in this game, the apple is attached to one end of a stick. On the other end, there's a lit candle. The stick is hung from the ceiling by a rope and set spinning. The player tries for the apple. Any mangled attempt (too slow, too fast, too aggressive, not aggressive enough) can result in a burning candle in the face.

In America today, Halloween is most often associated with pumpkins and candy. For centuries in the British Isles, though, this eerie night was linked with apples, nuts, and cabbage, all of which were given secret powers. And like snap apple, the games played with these foods had much more dramatic stakes than the ones we play today. The risk wasn't just about getting physically burnt, though.

All Hallows' Eve is a time when the veil between the worlds of the living and the dead is thinner than usual, and these forgotten rituals banked on that closeness to predict the future—you could find out who you might marry or whether it'd never happen for you, if you'd be a widow, or when you might die.

article-image

Among all the food traditions of Halloween, apples appear most often. There are loose theories for why apples are so special on this day. They've long been thought to have an uncanny magic: in Welsh folklore, for instance, they carry a whiff of immortality. In the Book of Hallowe'en, an early 20th century study of October 31 traditions, Ruth Edna Kelley writes that apples came to Halloween through the colonizing Romans, who linked their fall apple festival with the Celtic Samhain.

In the 20th century, games like snap apple and bobbing for apples were seen as diversions, but earlier they were more serious. Consider that in Scotland, bobbing for apples is sometimes called dooking or douking for apples, the same word used to describe dunking a woman in water to test if she might be a witch. Bobbing for apples, or "snapping" one suspended from the ceiling on a string (a safer version of snap apple), could reveal the future of your love life: in one version, if you got the apple first, you'd have happy love—or, in another, you'd be the first married.

Apple games could also make matches, or predict whether a person would have a good or bad love. Sometimes the apples would be labeled or marked by young men and women before they were put in a tub of water: the person who caught your apple could be your mate. In another version of snap apple, a hoop is suspended from the ceiling, and different treats and tricks, including cake, candies, bread, apples, and peppers, are stationed along its rim. The one a player caught with their teeth would foretell the nature of their love—would it be sweet, spicy, too hot? Would it nourish or burn them?

There are other ways, too, to coax an apple into telling the future on Halloween. You could peel it all in one strip and throw that peel over your shoulder: if it stayed in one piece, it would form the initial of a future lover. Or you could take an apple to a mirror. Shine a candle in the mirror and eat the apple, and you would see your future husband over your shoulder. In one version of the tradition, the apple gets cut into nine pieces, and it's only after the eighth is eaten that the lover appears, asking for the ninth.

article-image

Nuts could also tell the future, and in some places, Halloween was called "nutcrack night" or the "sacrifice of nuts." If a man brought a woman some walnuts, it would mean he was her true love. But it was also possible to use nuts to see the future of a love. Nuts would be named and set by the fire to see how they burned. If they roasted slowly, together, that would promise a good, strong love. If they crackled and popped and jumped, it was a bad sign.

A girl could test out different suitors this way—the nuts could give a hint to how the relationship might go. Kelley saw a sinister shadow in this: "Who sees in the nuts thrown into the fire, turning in the heat, blazing and growing black, the writhing victim of an old-time sacrifice to an idol?" she wrote.

If none of these fortune telling methods went your way, you could still resort to cabbage. One tradition involved pulling kale stalks from a field: the nature of the stalk, its weight, length, girth, and taste, would hint at one's future spouse. If the roots still had plenty of earth hanging on them, that could augur a hearty dowry. If you were a woman, you could also steal a cabbage; whoever you met on the way home would hint at who your future husband might be. In Ireland and Newfoundland, cabbage and potatoes together, a dish called colcannon, would have a ring or button hidden in it, and whoever found the ring could expect to be married soon. The button was less lucky—it would mean you'd never marry.

If colcannon isn't appealing, cake is another alternative. Some people would bake a ring and nut into the cake—the finder of the nut would be a widow or widower. In another version, the cake had a coin, a sloe berry and a bit of wood baked in. The coin meant wealth, the sloe meant you'd outlive all your friends and the wood meant you'd die within the year. There were other Halloween augurs, too, that could foretell death: like snap apple, these games were not kidding around. Compared to the risk of having your death foretold, getting a little wet while bobbing for apples seems pretty tame.  

There's Only One State Embassy In Washington. Of Course, It's Florida's

article-image

In the 1960s, a station wagon full of overtired Florida children looped endlessly around Washington, D.C.'s bewildering traffic circles. The campground where they'd planned to stay was gone. The car made its way down Embassy Row, and the children thought of something. Where was Florida's embassy?

"We explained to them that the different states don't have embassies," their mother, Rhea Chiles, told a reporter in 2003. "They thought that was short shrift." 

Today, Florida House is a reality. Visitors rap on the door with a seashell-shaped knocker, and a staffer welcomes them into a bright foyer. That leads into a living room inset with color-drenched stained glass. Florida art hangs on the walls; an ocean-themed mantel frames a fireplace. Orange juice is always offered.

Different versions of the founding story have appeared in many articles about Florida House over the years; there's even a video of Chiles telling one. They vary, but they all come to the same point: because of the children's guileless reasoning, Chiles listened. She founded Florida House so other Floridians would have a place to call home when they were in D.C. It sits right on Capitol Hill, and it is the only state "embassy."

article-image

The actual founding, though, was years in the making. Rhea Chiles returned to Washington in the early ‘70s, when her husband, Lawton Chiles, was a senator. (People called him "Walkin' Lawton" because, during his campaign, he walked from Pensacola to Key West.) She saw a run-down Victorian on her walks to his office. Boards covered the windows, and it stood in a part of the city many considered unsafe. But Chiles liked the house. She saw potential, raised money, and bought it for $125,000 in 1973.

Florida House stands on a Capitol Hill corner; the other three corners at the intersection house the Library of Congress, the Supreme Court, and the Folger Shakespeare Library. Its builder was Edwin Manning, an architect who worked on the Library of Congress. In 1939, the house was owned by North Carolina Senator Robert Rice Reynolds, a Reynolds tobacco heir. His isolationist newspaper, American Vindicator, listed the house as its address. On September 11, 2001, federal buildings closed, and senators Bob Graham and Bill Nelson sheltered at Florida House with about 35 staffers. During the anthrax scare in the Senate office building, Nelson's staff decamped to Florida House for a couple of weeks.

article-image

Despite the Chiles children's question, Florida House is not, of course, technically an embassy. But it does perform many of the same functions that an embassy from a foreign country might. It's a sanctuary for Floridians with its themed art and gracious living spaces. People host receptions and lunches. There's an intern seminar series, and rotating exhibits of Florida artifacts. Floridians can use the computer, desk, and phone.

The house doesn't get any tax money. Instead, private donations keep it running. And despite being on Capitol Hill, Florida House takes no sides. "You leave your hat at the door," CEO and president Bart Hudson says, "because once you come through the door you are a Floridian."

article-image

Chiles had a larger vision of Second Street as a state embassy row, says Hudson, and he thinks that would not be a bad idea. He points out that many states have societies, which have, for example, their own softball teams. States also maintain a presence in the Hall of States, where many state governments have Washington offices. There was a brief moment in 1994 when it seemed like there would be an Illinois house, but that passed.

Ask Hudson what he'd like people to know about Florida House, and he has a simple answer: "That we exist." Florida House is 43 years old, he says, so the fact that so many Floridians don't know it's there is "disheartening and challenging and our goal to correct." Hudson says they see probably close to 15,000 people a year, but since there are 19 million people in Florida, he'd like more.


This is How New York City Celebrated Halloween in 1993

article-image

Twenty-three years ago, in 1993, Gregoire Alessandrini was a student living in New York City. As a new arrival from France, he found the city to be an intoxicating mix of constant surprises—ones that he photographed whenever possible. Halloween in particular was an event that caught his attention: "Halloween in New York is such a treat. It feels like everything is permitted and that everyone has the right to be who they want to be…whatever it is!"

But on Halloween in 1993, Alessandrini, rather than film the parade, decided to walk around the West Village, which was "where the real show was!" Shooting with an old Contax camera and flash, Alessandrini captured an atmosphere he recalls as one of “happiness, tolerance and eccentricity.”

Some of the photos also hold clues to the time period: in one image, a reveler wears a Ross Perot costume. What’s also striking about the photographs is what we don’t see: no one is checking their smartphones or posing for selfies.

From 1991 to 1998, Alessandrini captured New York scenes across different seasons and different neighborhoods, some of which have now changed almost beyond recognition. His online archive, New York City in the 1990s, currently has around 1,500 images of New York, all scanned from negatives and slides that he had stored for years in an old suitcase.

Ahead of Halloween weekend, Atlas Obscura takes a look back at how New York celebrated in 1993.   

article-image
article-image
article-image
article-image
article-image
article-image
article-image
article-image
article-image
article-image
article-image
article-image
article-image

Exhausted Geese Are Falling Out of The Sky in Canada

article-image

It’s raining geese on Canada’s Sunshine Coast. According to the Coast Reporter, exhausted geese have been dropping out of the sky and landing in people’s yards, simply too tired to continue flying.

Over the past week, at least three snow geese have been brought into the Gibsons Wildlife Rehabilitation Centre, having been found mysteriously grounded. While tests have not yet been conducted to see whether there is some internal reason for their unexpected landings, representatives of the rescue center speculate that the birds were simply too tired to keep going.

The Sunshine Coast, in British Columbia, lies along the 3,000-mile migratory path of the snow geese as they travel from Russia to the George C. Reifel Migratory Bird Sanctuary, where they will spend the winter months in the company of friends and relatives.

No reason was given as to why these birds might have become too tired to fly, although it could have to do with a lack of food along some leg of the journey, forcing them to land at random intervals to search for sustenance anywhere they can.  With luck this small epidemic of falling fowl is just an anomaly, but hey, it’s been a long week. We’re all tired.

Watch This Artist Create Kaleidoscope Masterpieces With Single-Celled Algae


Algae kaleidoscopes were among the many creatively biological ways that Victorian scientists entertained themselves. Using the end of a piece of hair, they moved tiny single-celled algae known as diatoms on a slide, arranging them into beautiful, symmetrical patterns that amused wealthy amateur naturalists at social gatherings.

Now, one artist in England, Klaus Kemp, continues this Victorian art of diatom arrangement.

“The first time I saw a diatom, I was 16,” Kemp says in the video by Matthew Killip. “It was love at first sight.”

Kemp spent eight years researching how to create these microscopic masterpieces. He spends much of his time hunting for diatoms in bodies of water: horse troughs, ditches, and gutters. Kemp takes samples of the algae and cleans them in his studio before he starts the arduous process of arranging each single-celled organism.

At the 2:28 mark, we get a rare glimpse of his method. Unlike the Victorians, who used hair, Kemp works with a precise needle. He patiently moves the circle-shaped diatom across the slide, each nudge just microns of movement, he says.

The glass shells of the diatoms gleam under the lens as Kemp’s careful hand creates stunning, brightly colored patterns. It’s hard to believe that such delicate and extravagant pieces are made from organisms dwelling in murky puddles of water.

Every day we track down a Video Wonder: an audiovisual offering that delights, inspires, and entertains. Have you encountered a video we should feature? Email ella@atlasobscura.com.

In the 1800s, Sick People Would Consult Cookbooks Before Doctors


In most middle-class British households in the 18th and 19th centuries, you would find a booklet filled with recipes collected and curated over generations. But these guides were more than the cookbooks or housekeeping guides you’d find in today’s bookstores. Tucked between recipes for roast goose and apple dumplings were instructions for making medical remedies—everyday ingredients whipped into salves for bruises and syrups for coughs.

"People mostly, especially people of modest means, didn't go to the doctor if they could help it," says Arlene Shaner, historical collections librarian at the New York Academy of Medicine. "These are kind of home family guides with directions of how to take care of common ailments." 

Over the years, these all-purpose guides, which were published throughout Europe and in the United States, went through different iterations and included more recipe and how-to categories. Alongside stews and pies, the books also describe how to carve, pickle, make ink, brew beer, manage bees, and heal sprains. The treasured volumes, passed from generation to generation, give a window into the concerns, common illnesses, activities, and interests of middle-class modern homes.

“Sometimes they were called a cabinet or a Queen’s closet,” says Shaner. “It’s this idea that this is a little bit of a secret, a trade secret that someone shares with you that you wouldn’t be able to get from another place.”


Gathering medical information in short, concise recipes has a long tradition stretching from the ancient period to the late 19th century, writes Elaine Leong, a researcher at the Max Planck Institute for the History of Science. “Historians of medicine see the early modern home as one of the main sites for medical intervention and health promotion. Householders were not only quick to combine self-diagnosis and self-treatment with commercially available medical care but many also produced their own homemade medicines.”


Geared more toward people of modest means, these books were written mainly by women, though some popular cookbooks were also penned by men. One popular English cookbook, The Prudent Housewife, or Compleat English Cook, published by Lydia Fisher in 1800, went through at least 24 editions.

Reading through The Prudent Housewife, some medical treatments sound whimsical while others appear dangerous. For a sprain, The Prudent Housewife advises a soak in warm vinegar, followed by a daily application of a paste of stale beer grounds, oatmeal, and hog’s lard until the pain and swelling go away. Hiccups call for a tasty-sounding syrup of liquid cinnamon on a lump of sugar, while heartburn requires a glass of water or chamomile tea with scraped chalk. To rid themselves of giddiness, people would drink 20 drops of castor oil mixed in water, and “the smoke of tobacco blown into the ear is an excellent remedy” for an earache.


While it’s clear today that some of the ingredients listed in the medical recipes would not be safe to ingest, people at the time used the items and knowledge they had access to, Shaner explains.

“There are some of these remedies that you read that are distressing a little bit,” says Shaner. “If you find a recipe for a cough remedy that has syrup of poppies in it, you can guess whether or not it’s going to work.”

In the appendix of some of the books, like The Prudent Housewife, there is a list of standard base ingredients that every household should have. These would be added to different remedies, such as opodeldoc, a popular lotion or soap used to relieve rheumatic and arthritic pain.


Families often chose one notebook to record and annotate recipes, and add other bits of practical knowledge, writes Leong. On some of the cookbooks, autobiographical notes, confessions, and spiritual meditations are scribbled in the margins or inscribed in the blank pages left at the end of some of the books. Recipes were shared within circles of friends and books were kept within families. In some cases, writes Leong, they "were considered worthy enough to be mentioned in wills and bequests alongside other household objects of value."

These antique cookbooks were cherished items that also serve as useful pieces of history. Some scholars even consider them the first genre of women’s medical writing, and a mark of a shift away from male dominance within the household, Leong explains.


Today, large comprehensive housekeeping guides exist, but few if any also contain medical remedies. The all-in-one books of yore are still of value, but more for their historical intrigue than their medical utility. Leeches behind the ears will not, in all likelihood, cure your headache.

Inside the Final Days of New York City's Last Dairy


On a recent Friday between 2:00 a.m. and 9:30 a.m., Anthony Vasquez wound his truck through midtown Manhattan, making 40 stops to deliver milk for Bartlett Dairy. New York City is America’s largest, most complex dairy market. A bodega or deli might order a few quarts; Starbucks might need a truckload of gallons. The range in the scale of customers, the congestion, and the confounding parking regulations all contribute to the market’s eccentricities.

Delivering milk has always been hard work because time is critical and milk is heavy. In New York City it also requires strategic thinking. As Vasquez wheeled stacks of 50-pound crates on a handcart or lugged them up and down stairs, he fine-tuned his route for maximum efficiency and minimum parking tickets. (He only got one that morning, his first in a week.) And he reflected on an upheaval in the city’s dairy market that will affect the route he has been driving for 12 years: Elmhurst Dairy, New York City’s last milk processing facility, will be shutting down for good this weekend.

“It is going to have a big impact on our small business,” Vasquez said. “I think it is going to hurt a lot.”


Based in Jamaica, Queens, Elmhurst supported a unique network of distributors, like Bartlett, provided about 270 manufacturing jobs in a city where such jobs are scarce, and was the sole supplier of more than 110 million half-pints annually to public schools. Its closure, which seemed inevitable to some observers and has shocked others, is rippling out across the city and state. Such disruption is nothing new—New York City’s dairy industry has been undergoing disruption since it was born. Its peculiarities have shaped aspects of national dairy policy and its goings-on have sometimes been at odds with milk’s wholesome image: poison, death, violence, and Pete Seeger all appear in its two-century history.

But the forces contributing to Elmhurst’s closure are hardly local. Nationwide, the story is the same. Cows are producing more, consumers are drinking less—much as they did in the past, before the ascendancy of milk. Elmhurst’s demise represents a kind of coming full circle, in New York City and across the country.  


Cows’ milk first came of age as a consumer product in New York City in the mid-19th century. Before then, by and large, urbanites ate durable dairy: cheese and butter. And urban babies drank mother’s milk. Pace University sociologist E. Melanie DuPuis, author of Nature’s Perfect Food, points to many reasons nursing was outsourced to the cow—among them, middle class women’s expanding roles outside the home. Cows provided a degree of liberation.

As the city’s population grew, pasture dwindled. Remaining herds were packed in small quarters where disease spread and where the nearest, cheapest feed was distillery or brewery mash—a nutritious slop that rendered milk blue, redolent of alcohol, and clumpy. Chalk, magnesia, or plaster were often required to transform the milk into a substance that looked like milk. The contaminants and the bacteria thriving in the unsanitary dairies contributed to skyrocketing infant mortality. Two of today’s federally required tests, for water and for fat content, are a direct legacy of this era. “New York City was a pioneer and innovator in solving problems,” said Andrew Novakovic, an agricultural economist at Cornell University. “Sometimes it was solving stuff that New York shouldn’t be proud of. But that is part of the story.”


By the early 20th century, temperance advocates and public health activists had made “swill milk” illegal in New York City and pasteurization was the new cause. Nearly all the city’s milk came via train or truck from upstate, restoring milk’s rural, wholesome image. As was true in other regions, this flow was described in the same terms as the flow of water: as a milkshed.

The industry’s reputation soon regressed. Farmers, squeezed by low prices set by growing conglomerates, began striking in upstate New York. In a famous 1933 essay Edmund Wilson described law enforcement’s brutal response—employing “sub-machine guns, gas bombs and riot sticks”—to one strike. A later strike had vocal support from Pete Seeger and his troupe, the Vagabond Puppeteers. Industrialization was also up-ending consumers’ bucolic image of rustic farms and ambling cows; the Rotolactor, an iteration of the mainstay of mechanized milking today, had been recently invented. At the 1939 World’s Fair in Flushing, Queens, Borden—one of the conglomerates—introduced its wide-eyed mascot, Elsie the Cow, in an effort, according to historian Anna Thompson Hajdik, to recoup some pastoral cred.


The doyenne of what was to become the Elmhurst Dairy was at that fair too, dropping by to demonstrate how to milk a cow by hand. Dora Krout ran one of the last dairy farms in the city and although the cows ultimately had to go, she and her relatives merged their various dairy companies, focusing on processing and distribution. Her son-in-law and, then, her grandson, Henry Schwartz—the dapper, 82-year-old chief executive officer of Elmhurst—steered the family business through many other disruptions—including regulatory changes as well as regional and global milk-market restructuring that, respectively, opened the city’s dairy companies to outside competition and led to a decline in U.S. milk prices overall.


In the 1980s, many city processors began to fold. Some of their related distributors, including Bartlett, moved their offices and trucks onto the Elmhurst compound, which covers 15 acres in a neighborhood rich in automotive repair shops. “I don’t know any place in the world that had the system Henry had,” Novakovic said. Those sub-dealers “were part of the ecosystem of the Elmhurst plant and allowed him to stay alive.” As many as 20 tankers a day would travel from upstate to unload milk to be separated, homogenized, pasteurized, packaged, and then delivered by city-savvy drivers, like Vasquez, who could handle narrow streets, unruly traffic, and an eclectic clientele.


Many observers thought Elmhurst would survive because it was the last plant standing. But Americans have been drinking less milk and increasingly consume dairy in other forms—just as they did before the 19th century—with many opting for organic, this era’s pastoral icon. And milk prices, long volatile, have plummeted recently. “It finally got to a point where it didn’t make sense to continue any longer,” Schwartz said, “and so I have had to reluctantly decide to close this business.”

The ecosystem Elmhurst cultivated has come apart. It owned the city’s only N-8s—elegant, mesmerizing machines that fill half-pints—and the Department of Education will now be getting half-pints from several distant plants, one of them in Buffalo. Bartlett intends to build its own milk processing facility with half-pint capability near JFK airport by 2020. But for now, it will no longer deliver to all public schools, only to those in Queens and Staten Island. “I’ve got one little school; the lady likes her milk before 8:00,” Vasquez said. He parked the truck, unpacking what would be one of his last deliveries at a public school in Chelsea: two crates of Elmhurst half-pints, light blue cartons for skim, purple for one percent. “I love my job,” he said. “I love my route.”
