The United States of America's flag is iconic, important and more or less timeless. We are extremely proud of its design, and we are very attached to the story of Betsy Ross sewing it.
But let's take a step back. Is it any good?
Not many people really think about what it takes to design such a standard, and what makes our flag any better than any of the other national standards across the globe. Except for people like David F. Phillips.
“I’m 72. I’ve been studying flags since I was about six years old,” says Phillips, a professional vexillologist, or someone who specializes in studying the design, meaning, and effectiveness of flags. “The thing that most interested me as a child, and what I think still interests me, is the way that flags and heraldry communicate complicated ideas through the use of color and line, without any words. That appeals to me.”
Among vexillologists, there are a few cardinal rules about flag design that make some flags clear winners, and leave others twisting in the wind. Phillips pointed us toward a short booklet, freely available online, helpfully titled Good Flag, Bad Flag. This short guide to flag design was put together by Ted Kaye, editor of Raven, the North American Vexillological Association’s official journal, and it lists, in no uncertain terms, what it takes to create a successful flag. Kaye’s guide breaks good flag design down into five essential criteria: simplicity, meaningful symbolism, no more than three colors, no lettering or seals, and unique design.
The first, and arguably most important, factor is keeping your flag simple. “I feel like a flag, and this is Ted’s argument too, should be able to be drawn with crayons by a child,” says Phillips. “If it’s more complicated than that, it’s too complicated.” A simple design is essential because, as Phillips explained to us, the very purpose of a flag is to be read from a distance, unlike more complicated seals or heraldic coats of arms. Simplicity is also essential so that a flag can be easily remembered and instantly recognizable.
Then there is the meaning behind the colors and symbols on a flag. “On a good flag, you want the lines and the colors and the charges, if any, to have some real significance for the nation or the institution that the flag represents,” says Phillips. Charges are symbols or geometric shapes on flags that are separate graphic elements from the larger design, and they can often hold a lot of meaning in just a single shape. Take, for instance, the Canadian maple leaf, or the Japanese sun. “You can picture it absolutely because it’s very clear, it’s unique, and also the sun has a cultural significance to the Japanese,” says Phillips. The color can be a shorthand, too, as in the case of Ukraine’s blue and yellow flag, which represents a blue sky over wheat fields.
But the color palette should still be limited and coherent. According to Kaye’s guide, a good flag should contain no more than three primary colors. Kaye lists the basic flag colors as red, blue, green, black, yellow, and white (“White and yellow are called ‘metals,’ after gold and silver,” says Phillips), and most good flags use some mix of these. But it also matters which colors you choose, and for the sake of a flag’s visibility, the name of the game is contrast. “In heraldry, there’s an informal rule that you don’t put yellow on white, or white on yellow. And you don’t put red on blue, or blue on green,” says Phillips. “Blue on white, red on gold, that’s a lot easier.” This contrast also ensures that if the flag is reproduced in black and white, it doesn’t entirely lose its meaning.
The fourth rule to remember is that words and lettering have no place on a successful flag. The reasoning behind this, as pointed out in Good Flag, Bad Flag, is pretty simple. Words and detailed seals almost instantly blur and lose meaning at a distance.
Lastly, make sure your flag is unique. It’s fine, and even somewhat encouraged, to allude to other flag designs, but in the end, it is most important that your flag cannot be easily confused for other flags. “If you look at some of the flags, for the African states for example, a lot of them use red, green, and gold,” says Phillips. “They use them vertically, they use them horizontally, they use them in different orders, some of them use them diagonally. Which one is that? You’re not sure.”
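Kaye’s five criteria read almost like a checklist, so, just for illustration, here is one way they might be encoded as one: a minimal Python sketch in which the field names, the element-count threshold (a rough stand-in for "simple enough for a child to draw"), and the example data are all our own assumptions rather than anything taken from the guide itself.

```python
from dataclasses import dataclass, field

# The six basic flag colors Kaye lists.
BASIC_FLAG_COLORS = {"red", "blue", "green", "black", "yellow", "white"}

@dataclass
class FlagDesign:
    colors: set                 # basic colors used on the field
    element_count: int          # distinct graphic elements (stripes, charges, etc.)
    has_lettering: bool         # words or seals on the field
    symbols_have_meaning: bool  # do the charges/colors carry real significance?
    confusable_with: list = field(default_factory=list)  # flags it resembles

def evaluate(flag):
    """Return a list of rule violations; an empty list suggests a 'good flag'."""
    problems = []
    if flag.element_count > 4:  # arbitrary proxy for Kaye's simplicity rule
        problems.append("too complex to draw from memory")
    if not flag.symbols_have_meaning:
        problems.append("colors and charges carry no real significance")
    if len(flag.colors) > 3:
        problems.append("more than three basic colors")
    if not flag.colors <= BASIC_FLAG_COLORS:
        problems.append("uses colors outside the standard palette")
    if flag.has_lettering:
        problems.append("lettering or seals blur at a distance")
    if flag.confusable_with:
        problems.append("easily confused with: " + ", ".join(flag.confusable_with))
    return problems

# Japan's flag passes every check.
japan = FlagDesign(colors={"red", "white"}, element_count=1,
                   has_lettering=False, symbols_have_meaning=True)
print(evaluate(japan))  # -> []
```

Run the same check on a crowded, seal-bearing state flag and it would flag several violations at once.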
While vexillologists probably disagree as to the specifics, there are flags from nations across the globe that qualify as both bad and good. Phillips’ tastes tend to run towards simpler designs, but even he has his favorites. “My very favorite in the world is the Belgian flag,” he says. “It’s a vertical tri-color, black, gold, and red. These are based on the heraldic colors of the coat of arms of the principal province of Brabant. It’s modeled on the French flag, but it’s so vivid and so striking, and the yellow does that to a large extent. Yellow adds enormous power to any flag. It’s just so beautiful to look at and so distinctive.” In addition to the Belgian flag, Phillips identified other simple flags that he finds particularly great, like those of Denmark, France, Switzerland, Canada, and Japan.
There were even some more complicated flags, like the United Kingdom’s Union Jack or Sri Lanka’s national flag, that he called out as being particularly successful. “It is a little bit complicated, although certainly every element has a meaning,” he says of the Sri Lankan flag. “The two stripes near the hoist represent ethnic minorities, and the lion is the lion which used to represent the old kingdom of Kandy, which was a kingdom in the center of what later became Sri Lanka. It has a definite relevance, but it’s also unique in the world.”
Then there are those flags that just don’t hit the mark. Whether the design is too basic, or not original enough, or just confusing, some flags just don’t work. Phillips brought up the flag of Kyrgyzstan as a good example of a bad flag. “It has a rather distinctive element in the center, which is the chimney of a yurt,” says Phillips. “But unless you’ve been in a Kyrgyz yurt, you don’t really know exactly what you’re looking at. It looks kind of like a gold blob on red. Then the red field suggests a Communist orientation, which the Kyrgyz don’t have anymore. That’s not such a great idea either.” Phillips also called out the flags of nations like Egypt, Iraq, Syria, Yemen, and the United Arab Emirates for being too similar to be distinctive.
Taking into consideration all of the hallmarks of a great flag, we finally asked Phillips what the national flag of Atlas Obscura might look like. “Atlas Obscura, I’m imagining in my mind, a book, partly obscured,” he says. So, knowing very little about our website, and based on the name alone, he suggested a diagonally bisected field of black and gold, with a book in the middle, which would be half hidden beneath the black half of the flag. “I’m not saying it’s the only solution, it’s just the first thing that occurs to me in five seconds.”
We took his advice, and without further ado, we present the official flag of the nation of Atlas Obscura:
Truly, that is a great flag. If you disagree, feel free to go design your own. At least now you know how.
On the morning of November 7th, 1878, Frank Parker, the assistant sexton of Saint Mark’s Church in-the-Bowery, noticed a pile of fresh dirt at the center of the graveyard.
The flat tombstone beside the mound seemed undisturbed, but suspicious nonetheless, the sexton decided to investigate. With the help of a few other clergymen, he lifted the heavy stone bearing the name “STEWART” and was lowered down by a rope into the darkness.
What Parker found in the depths of that crypt, or rather what he didn’t find, sparked one of New York’s greatest mysteries. Two essential objects were missing from Stewart’s tomb at Saint Mark’s: an engraved silver nameplate and, more importantly, the body it identified.
And the body was not just any body. The missing corpse belonged (or used to belong) to the third-richest man in the United States. In fact, to this day, Alexander T. Stewart, the “Merchant Prince,” remains the seventh-richest American of all time.
The father of the department store, Stewart made his fortune primarily in retail and manufacturing. When it came to fashionable clothes and dry goods in Manhattan, Stewart was the biggest game in town. So when he died in 1876, the size of his estate was a surprise to no one. Stewart left behind an empire at the height of its power; a 76-year-old widow, Cornelia; no children; and a massive personal fortune, worth about $46 billion by today’s standards.
Alexander Stewart made headlines in life as an entrepreneur and shrewd businessman, but his “resurrection” caused a media sensation unparalleled by anything he had experienced while alive. Grave robbing was a reality of 19th-century life, but it usually involved the theft of fresh bodies from the poor and disenfranchised for medical experiments. The successful body-snatching of one of New York’s biggest names, in a bad economy—two years after a failed attempt to rob Lincoln’s Tomb, no less—captured the zeitgeist. (In an interview, Bess Lovejoy, author of Rest in Pieces: The Curious Fates of Famous Corpses, suggested that the Lincoln case may actually have served as direct inspiration for the Stewart robbers.)
The very same day Frank Parker made his discovery, an eager crowd surrounded Saint Mark’s cemetery, fueled by curiosity. The robbers’ trail was easy to trace. A line of foul-smelling stains crossed the stone porch, ending at the iron fence, where a few scraps of rotting flesh hung limply from its spikes. Detectives found a few other clues: an old copy of the Herald, a shovel, a lamp, a wooden board, and a length of a woman’s stocking. The 11th Street gate’s padlock was found on the sidewalk, unforced and intact. It was apparent that the “ghouls” (as the New York Times dubbed them) had a key.
It was impressive. Not only had the robbers persisted despite the smell of Stewart’s liquefying body, they had also managed to do so while completely evading detection.
Strangely, Stewart’s body had been scheduled to be exhumed and reburied that week at the Cathedral of the Incarnation in Garden City. Garden City was Stewart’s largest and least-understood project. Reporters openly questioned “Stewart’s Folly” when construction began on the ambitious project. After his death, the press’s incredulity only increased when Stewart’s widow set aside $1 million for a massive cathedral to be built there in her husband’s memory. The thieves may have known about the plans to relocate the body and preyed on the distraction, which suggests this was an inside job. Nonetheless, all the clergy and cemetery workers were cleared.
From its smell, detectives deduced that the thieves had wiped their hands on the Herald after handling the body. The newspaper held other clues. It was dry, despite a light rain the night before. This gave investigators a timeline: the thieves had struck just after the storm passed at 3:00 am. This matched eyewitness accounts of a delivery wagon parked across the street that disappeared around 3:30 am. Where that wagon had gone was anyone’s guess. Because of the rancid smell, the robbers may have taken the body out of the city to avoid detection.
The police advised Mrs. Stewart and Stewart’s executor, “Judge” Henry Hilton, to wait for the grave robbers to contact them. Given Stewart’s decision not to be embalmed and the passage of two years, ransom seemed a more likely motive than medicine. Unless, as one source suggested at the time, someone wanted to study Stewart’s skull through the still-popular “science” of phrenology. “Stealing skulls for phrenology happened,” Lovejoy said in an interview, “But usually only to people considered geniuses… like Haydn and Mozart.” While phrenology was never as popular in America as in Europe, Lovejoy pointed out that, “Some people definitely thought it was worth studying the contours of a famous skull.”
Whatever the reason for the crime, Hilton told the New York Times, they would offer a $25,000 reward for help capturing the criminals.
Before his death, Stewart was seen as something of a miser, even Scrooge-like. Stories circulated that he’d once fired a carpenter for losing a single nail. Another rumor held that he had bankrupted the builder of his Fifth Avenue mansion by suing him for wartime construction delays. In his will, Stewart left no charitable donations to the city or any university.
After news of the reward spread, more than 700 letters flooded in to Hilton, to Mrs. Stewart, and to the police. Hundreds more appeared in the Herald personals section. All claimed to have information about the case.
Inspector Duke of the NYPD received one letter—written and addressed in cutout newsprint characters—claiming, “In eight hours I will be in Canada with AT Stewart’s body.” One letter published in the Herald said the body would be returned provided Mrs. Stewart donated $500,000 to any charity. Several spiritualists claimed to channel Stewart himself. In the onslaught, it was hard for investigators to tell what was authentic.
At least two men actually confessed to the crime under interrogation. Two small-time criminals named William Burke and Henry Vreeland offered to take detectives to the body’s hiding place in Chatham, New Jersey. But after realizing they faced jail time and not a reward, the pair refused to cooperate. No evidence ever linked them to the case, but fortune hunters descended on Chatham anyway, digging holes and dredging the river in search of Stewart’s remains.
After the Herald’s favorite theory involving a famous resurrectionist and a Stuyvesant Street boardinghouse fell apart, so did public confidence. Articles providing advice on how to prevent grave robbing appeared in the Brooklyn Daily Eagle. The Herald Tribune took the position that Mrs. Stewart should publicly give up the search in order to end the public hysteria. By Christmas, the story dropped from the headlines, but according to journalist Jacob A. Riis, the damage was done. To him, the Stewart case was “the dawn of Yellow Journalism.”
In January of 1879, Patrick Henry Jones, postmaster of New York and former Civil War general, received a letter from Montreal. A man named Romaine claimed to have Stewart’s body, and asked Jones to serve as his attorney and negotiator. Jones wrote back asking for proof. In answer, he received a package containing Stewart’s missing nameplate. When Jones approached investigators, Hilton refused to pay and accused him of conspiracy. Negotiations faltered, and, unsatisfied, the alleged kidnappers went silent.
Five years after Stewart’s death, in 1881, Stewart and Company declared bankruptcy. That same year, police excavated portions of Brooklyn’s Cypress Hill Cemetery after a false tip that Stewart’s body had been stashed there. That was the last public news about the investigation.
Despite promises to Mrs. Stewart and the press, if Henry Hilton ever found the body, he never announced it. Rumors swirled that private investigators in his employ were still following leads as late as 1885.
In 1887, former NYPD Chief George Walling published his memoirs and offered an ending to the story.
According to Walling, Mrs. Stewart personally reopened the negotiations with the robbers in 1884, two years before her death. She offered $20,000, and the thieves sent her a marked map of the Hudson Valley. On an appointed night, Mrs. Stewart’s nephew rode down the marked road after midnight, and eventually found a carriage blocking his path. A group of masked men emerged with a scrap of velvet coffin cloth and a bag of bones. After counting the money, they rode off into the night. Walling’s account states that Stewart’s bones were quietly laid to rest in the Cathedral of the Incarnation in 1885.
Others are skeptical. Several historians, including Wayne Fanebust, author of The Missing Corpse: Grave Robbing a Gilded Age Tycoon, believe the body was never recovered. The most compelling evidence includes the testimony of Henry Hilton’s personal assistant, Herbert Antsey, who stated in 1890, “No. The body was never recovered.” In fact, Fanebust suggests that Stewart might not have minded. “Stewart himself wouldn’t have paid the ransom,” he said. By not recovering his mentor’s body, Hilton may have been staying true to the man’s principles.
When Cornelia Stewart died in 1886, the New York Times expressed its own skepticism about the recovery, writing that she was buried, “beside the grave wherein Mrs. Stewart had always supposed that the remains of her husband reposed.” When Hilton died in 1899, the New York World remarked, “the body was never returned. Or perhaps it was returned— who knows?”
Walling’s story is reflected in the Cathedral’s records, but, in an interview, Michael Sniffen, Dean of the Cathedral of the Incarnation, admits the story “sounds a little made up.” Regardless of Alexander Stewart’s final resting place, the Stewarts are not buried beneath their prominent floor marker under the Cathedral’s nave. To avoid another robbery, the exact location of their remains is a secret. If you believe the rumors, Mr. and Mrs. Stewart lie somewhere beneath the altar.
“Still, it makes you wonder,” Sniffen said. “What is in Stewart’s tomb?”
In 1998, a UC Davis entomologist named Robbin Thorp explored the forests of southern Oregon and northern California, hoping to learn more about a little-studied native pollinator that lived there. He visited nearly three dozen sites where museum records indicated the yellow-topped Franklin’s bumble bee had once been seen. “It wasn’t the most common bee I saw,” Thorp recalls, “but I could find it at all the sites where it was supposed to be”—and even in some places where it hadn’t previously been recorded.
The next year he visited the same spots. Again, he found the bee at all the study sites. But the year after, quite suddenly, “the bee became difficult to find,” says Thorp. Bumble bee populations fluctuate from year to year, so at first he wasn’t alarmed. But when numbers didn’t bounce back, he realized the species might be in serious trouble. “Something was going on,” he says.
In 2003, he contacted other bumble bee specialists to see if they were seeing similar problems among the species they studied. They began looking, and concluded that three other species, all belonging to the subgenus Bombus sensu stricto, had also experienced sudden and steep declines. Last week, the U.S. Fish and Wildlife Service proposed that one of those species—the rusty patched bumble bee (Bombus affinis), named for the small, red-brown crescent on its back—receive federal protection as an endangered species.
(Video by Day’s Edge Productions)
There are 47 varieties of native bumble bee in the United States and Canada, and the International Union for Conservation of Nature (IUCN) estimates that more than a quarter of those species face the threat of extinction. But unlike honeybees—an imported species from Europe whose recent mass deaths have been well publicized and extensively researched—bumble bees receive scant attention. If the federal listing of the rusty patched bumble bee proceeds, however, that may change: It would be the first native bee in the continental United States to be protected under the Endangered Species Act.
The rusty patched bumble bee was once ubiquitous across a large, bat-shaped expanse that stretched from New England south through the Appalachians and into the Midwest, and southeastern Canada. Today, however, only a handful of genetically isolated populations survive in Wisconsin and parts of Minnesota. The Fish and Wildlife Service estimated in its listing proposal that populations have declined by as much as 95 percent since the late 1990s. “There are a few little spots where we know they are,” says USDA research entomologist Dr. James Strange, “but only a really few spots.”
What caused the rusty patched bumble bee to disappear? As with many ecological mysteries, there’s not one easy answer. Urban sprawl and agriculture’s continuing shift from small, diverse farms to vast swaths of single-plant monocrops have fragmented habitat and left fewer hedgerows and native plant blossoms to feed pollinators. Agricultural and garden pesticides can kill or weaken bees. And in the specific case of the rusty patched bumble bee, some scientists point to pathogenic intruders, particularly a fungal parasite that may have grown more virulent thanks to our love of year-round greenhouse tomatoes.
More than 85 percent of flowering plants require the help of pollinators to reproduce—that translates to one in three bites of food we eat. Farmers generally rent honeybees, which live in large, easily portable colonies, to pollinate crops such as almonds and cherries. But certain plants—such as tomatoes, sweet peppers, eggplants, cranberries, and blueberries—respond especially well to “buzz pollination,” a behavior unique to bumble bees, which latch on to a flower’s anthers with their mouthparts and vibrate their flight muscles at a frequency that dislodges trapped pollen. Buzz pollination increases the weight of tomatoes by 5 to 16 percent, according to Strange.
In the 1990s, as the greenhouse tomato business grew from a boutique industry to a major source of year-round tomatoes, the commercial bumble bee industry grew along with it. Thorp believes those mass-produced bees carried with them a fungal microsporidian parasite called Nosema bombi, which caused a collapse in populations of commercially bred western bumble bees in the 1990s, and may have spread to bees in the wild as well. The rusty patched bee, along with the three other declining species in its subgenus, carries particularly high loads of the parasite. “Because all those species collapsed at the same time in such a dramatic way, the belief is that the subgenus is for some reason very susceptible to this pathogen,” says Rich Hatfield, a conservation biologist with the Xerces Society for Invertebrate Conservation who helped spearhead the petition to list the rusty patched bee. Other pathogens, such as viruses spread by managed bumble bees and honeybees, may also be a factor.
Federal protection could help the struggling bees in a number of ways, says Hatfield. “The measure that would help most would be to regulate the commercial bumble bee industry,” he says. “Nobody’s testing those commercial bumble bees for diseases.” In an email, Netherlands-based Koppert Biological Systems, the only company currently rearing bumble bees in the U.S., notes that “there is no proof for the invasive pathogen hypothesis,” and that their bees are raised in a “safe and controlled manner” in the company’s Michigan facilities, including frequent internal and outside audits, tests and inspections. In addition, the company’s bumble bee production is “inspected by Michigan State Department of Agriculture which certifies the bees as disease free for export purposes.”
An endangered designation would also bring additional research funds that would allow scientists to better understand these little-studied native bees, and would help to protect critical habitat and forage. The Fish and Wildlife Service proposal noted that bumble bees may be more vulnerable to pesticide exposure than honeybees. Hatfield hopes an endangered designation will force the Environmental Protection Agency to require that pesticides, particularly a newer class called neonicotinoids, be tested for their effects on native bees. “There is a whole suite of insecticides that are broadly used throughout North America and the only species that the toxicity has been tested on is the honeybee,” he says. “Using them as proxy for all the bee species in North America is not appropriate.”
The agriculture industry is likely to disagree. After the listing proposal, CropLife America, the trade group representing pesticide manufacturers, said in a statement that “field studies have consistently found no unreasonable adverse effects on pollinator populations when pesticides are applied according to label directions.” In comments opposing the original listing petition, the Independent Petroleum Association of America argued that programs already in place to preserve habitat for honey bees, monarch butterflies and northern long-eared bats were sufficient to protect bumble bee populations.
But scientists believe that without endangered species protections, prospects are dim for the rusty patched bee. “We have a bee that is on the brink of extinction and now we have a chance to do something about it,” says Strange.
For the Franklin’s bumble bee, that chance has, in all likelihood, been lost. In 2006, Thorp visited a spot high in the Siskiyou Mountains in southern Oregon. Just below the summit of Mt. Ashland, along a Forest Service road about 50 yards above the Pacific Crest Trail, “there’s a seep area that keeps the vegetation moist,” says Thorp, “where plants keep flowering for a really long time.” As Thorp walked past the meadow, he saw a yellow-topped Franklin’s bee bumble by—the first he had seen in three years. “I wanted to photo-document it but I didn’t have my camera with me,” he says.
It was the last Franklin’s bumble bee anyone has seen. Scientists now believe it is extinct.
The proposed listing opens a 60-day period for the public to provide comments and additional information about the rusty patched bumble bee. The public comment period runs through November 21, 2016. To submit comments or to view documents and comments on the bumble bee listing, visit: https://www.regulations.gov/document?D=FWS-R3-ES-2015-0112-0028/.
If you’ve ever seen a movie made before 1950, you’re familiar with the accent used by actors like Cary Grant, Katharine Hepburn, and Ingrid Bergman: a sort of high-pitched, indistinctly accented way of speaking that also pops up in recordings of politicians like FDR and writers like Gore Vidal and William F. Buckley, Jr. It’s easy to gloss over now, because movies have captured a few different accents that aren’t really present today, like the Borscht Belt Jewish accent of Mel Brooks and the old New York “Toity-Toid Street” accent. Is it British? Is it American? Is it just “rich”?
But the accent we’re talking about here is among the weirdest ways of speaking in the history of the English language. It is not entirely natural, for one thing: the form of the accent was firmly guided by certain key figures, who created strict rules that were aggressively taught. And it also vanished quickly, within the span of perhaps a decade, a speed that may well be tied to that artificiality.
Today this accent is sometimes called the Mid-Atlantic Accent, which is deeply offensive to those, like me, from the actual Mid-Atlantic region of the United States.
What that name means in this case is that the accent can be placed somewhere in the middle of the Atlantic Ocean, halfway between New England and England. Its popularity in pop culture, though, can be tied to one American woman, and a very strange set of books.
In the 1800s, once relations with England began to normalize following the Revolutionary War and the War of 1812, the cities of Philadelphia, Boston, and, especially, New York City quickly became the new country’s most powerful. Financial and cultural elites began constructing their own kind of vaguely British institutions, especially in the form of prestigious private schools. And those schools had elocution classes.
The entire concept of an elocution class is wildly offensive to most of the modern linguists I know; following the rise of super-linguist Bill Labov in the 1960s, the concept that one way of speaking is “better” or “worse” than another is basically anathema. But that wasn’t at all the case for the rich kids of Westchester County, Beacon Hill, or the Main Line (those would be the homes of the elites of New York, Boston, and Philadelphia, respectively).
“There's a long history of dialect features of Southeast England in Eastern New England dialects, tracing back directly to the colonial era,” writes James Stanford, a linguist at Dartmouth College, in an email. “European settlers throughout New England on the east side of Vermont's Green Mountains tended to stay in closer touch with Boston, which in turn stayed in touch with Southeast England through commerce and education.”
The upper-class New England accent of that time shares some things with modern New England accents. The most obvious of those is non-rhoticity, which refers to dropping the “r” sounds in words like “hear” and “Charles.”
But while parts of those accents are natural—some New Yorkers and many Bostonians still drop their “r” sounds today—the elite Northeastern accent was ramped up artificially by elocution teachers at boarding schools. Miss Porter’s School in Connecticut (where Jackie Onassis was educated), the Groton School in Massachusetts (FDR), St. Paul’s School (John Kerry), and others all decided to teach their well-heeled pupils to speak in a certain way, a vaguely British-y speech pattern meant to sound aristocratic, excessively proper, and, weirdly, not regionally specific. A similar impulse created the British Received Pronunciation, the literal Queen’s English, though RP’s roots arose a bit more gradually and naturally in Southeastern England.
The book that codified the elite Northeastern accent is one of the most fascinating and demanding books I’ve ever read, painstakingly written by one Edith Skinner. Skinner was an elocutionist who decided, with what must have been balls the size of Mars, to call this accent “Good Speech.” Here’s a quote from her 1942 book, Speak With Distinction:
"Good Speech is hard to define but easy to recognize when we hear it. Good Speech is a dialect of North American English that is free from regional characteristics; recognizably North American, yet suitable for classic texts; effortlessly articulated and easily understood in the last rows of a theater."
Skinner is now woefully outdated and many of her ideas are so contrary to the way modern linguists think that her books are no longer taught. (To find a copy of Speak With Distinction, I had to hunt through a performing arts library in New York City’s Lincoln Center plaza.) She’s what’s known now as a linguistic prescriptivist, meaning that she believed that some variations of English are flat-out superior to others, and should be taught and valued as such. I mean, come on, she named this accent, “Good Speech.”
Her influence was felt in filmmaking in a very roundabout way. Film began in New York, only moving en masse to Los Angeles in the mid-1910s. Skinner was born in New Brunswick, Canada, but studied linguistics at Columbia and taught drama for many years at Carnegie Mellon, in Pittsburgh, and Juilliard, in New York City, all highly elite schools. It was in the Northeast that she created Speak With Distinction: an insanely thorough linguistic text, full of specific ways to pronounce thousands of different words, along with diagrams, lessons on the International Phonetic Alphabet, and exercises for drama students.
Yep, drama: by this point, movies with sound had begun to hit theaters, and then came the disastrous story of Clara Bow. Bow was one of the silent film era’s biggest stars, a master of exaggerated expressions. When the talkies came along, audiences heard her voice for the first time, and it was a nasal, honking Brooklyn accent. Though the idea that speaking roles killed her career in film is not entirely accurate (there were plenty of other factors, ranging from drug problems to the insane pressures of film studios), it’s certainly true that her career took a nosedive around the time audiences heard her voice, possibly creating a cautionary tale for newly heard actors.
It’s now the 1930s, and Edith Skinner is Hollywood’s go-to advisor for all things speech-related. And Edith Skinner has extremely strong opinions, bred in the elite universities of the Northeast, about exactly how people should speak. So she pressed her own “Good Speech” accent on stars and on other voice coaches, and soon it became the most popular accent in Hollywood.
Speak With Distinction is incredibly dense, but it’s also very thorough. You can see very clearly, right there on the beat-up pages, why Katharine Hepburn speaks the way she does. “In Good Speech, ALL vowel sounds are oral sounds, to be made with the soft palate raised. Thus the breath flows out through the mouth only, rather than through the mouth and nose,” she writes. (She capitalizes things a lot.) “Each vowel sound is called a PURE SOUND, and the slightest movement or change in any of the organs of speech during the formation of a vowel will mar its purity, resulting in DIPHTHONGIZATION.”
She demands that “r” sounds be dropped. She demands that the “agh” sound, as in “chance,” be halfway between the American “agh” and the British “ah.” (Interestingly, this is very different from the typical New England accent today, which is highly “fronted,” meaning that the vowel sound is made with the tongue very close to the teeth in words like “father.” The British, and Mid-Atlantic, vowel is pronounced with the tongue much further back.) She requires that all “t” sounds be precisely enunciated: “butter” cannot sound like “budder,” as it mostly does in the US. Words beginning with “wh” must be given a guttural hacking noise, so “what” sounds more like “ccccchhhhwhat.” She bans all glottal stops—the cessation of air when you say “uh-oh”—even between words, as in this phrase, direct from her book: “Oh, Eaton! He’d even heave eels for Edith Healy!” Go ahead, try to say that without any glottal stops. It’s enormously difficult.
She cracks down on the most obvious of regional cues, railing against what’s now called the “pin-pen merger.” Today, the pin-pen merger—in which the word “pen” sounds like “pin”—is a very easy indicator that a speaker is from the American South. Yech, the South. That will not do for Edith Skinner.
Because Skinner was so influential, and her “Good Speech” was so prominent in movies, it began to leak out into the drama world at large. Other teachers began teaching it. In fact, even up until just a few decades ago, this accent, now called “Mid-Atlantic,” was being taught in drama schools. Jaybird Oberski, who teaches acting at Duke University, got his MFA at Carnegie Mellon in 1997, and he says the class was, amazingly, still being taught then. (He isn’t a fan of the accent.) “The Mid-Atlantic accent is considered the neutralization of regionalization, to bleach out character so everybody sounded the same,” he says.
Weirdly enough, this accent class was called a “neutralization technique” at Carnegie Mellon: theoretically, the idea is that it removes regional signifiers like the pin-pen merger. But there is no “neutral” or “accentless” accent; you can replace one accent with another, but the idea that there is some perfect, unaccented variety of English is a myth that’s long been squashed.
This particular accent, too, is far from neutral. It’s immediately recognizable and strange, a take on a clipped upper-class New England accent with even more Britishisms tossed in the mix. In her efforts to create a neutral accent, Skinner created one of the most non-neutral accents in the past few centuries.
The film craze of Mid-Atlantic English was short-lived. By the late 1960s, the New Hollywood movement, complete with innovative, gritty directors like Francis Ford Coppola and John Cassavetes, began to depict the world as it was, rather than the fantasy lives presented by earlier films. That goal necessitated the dropping of the Mid-Atlantic accent; there’s no point in showing the grim realities of Vietnam War-era America if everyone is going to talk like they went to Choate Rosemary Hall, so the actors in those films just...didn’t. And elocution classes at those schools began to be dropped as well. “The prestige of non-rhoticity and other British-related features began to change in the mid-20th century, and scholars suspect it may be due to the role of WWII and American national identity—a new identity on the world stage, no longer so closely tied to England for national identity,” writes Stanford.
The accent vanished quickly, now only surviving as a weird hallmark of that era of filmmaking; the only time you hear it now, really, is if a movie is set in Hollywood, in the film industry, prior to 1960. The real Mid-Atlantic accent, the accent of Philadelphia and Baltimore, luckily, lives on.
Stone circles appear throughout history and across many cultures, with examples in Portugal, Ethiopia, the Golan Heights, and even Massachusetts. Some of the more obscure, and least understood, are the Stone Circles at Odry in central Poland.
Often called the "Polish Stonehenge," this vestige of the Iron Age is shrouded in legend, with a healthy sprinkle of mysticism. Dating to the time of the Goths (the first or second century A.D.), the 40-acre site comprises 12 circles, each with a large central stone called a stela that is ringed by 16 to 29 boulders. Scattered between and around the circles are over 600 small burial mounds called barrows, each believed to contain the skeletal remains of between one and three people.
Odry is Europe’s second-largest collection of stone circles left nearly intact, a condition perhaps attributable to hundreds of years of avoidance by locals, who have been warded off by tales of magic, witchcraft, and evil lurking in the surrounding forest.
In the late 19th and early 20th centuries archeologists began studying the formations, looking for clues of their exact origin and purpose. Early theories revolved around some kind of calendar or astronomical configuration, but these have been mostly debunked as the area has undergone more rigorous study.
The circles do hold mystery for many visitors, some of whom report a flowing sense of calm and positive energy, relaxing to the mind and body. It’s also become a favorite spot for dowsers and diviners, and some claim that by standing or sitting within the circles, ailments such as headaches and fatigue can be healed.
OK, so the stones may or may not spell relief for your migraine or help you with your dowsing practice, but research to explain both their origin story and their transcendent quality is ongoing, and recent satellite imagery suggests there may be more circles, barrows, and ancient clues yet to find. They’re just waiting to be unearthed.
Placed in the center of the city, the Collégiale Sainte-Croix de Liège (Collegiate Church of the Holy Cross of Liège) was intended to be a religious and civic point of focus when the town was an important station of the Holy Roman Empire.
Building started around 976 A.D. in the Romanesque style and continued for many years. As architectural tastes changed over the long construction, the style became more Gothic. Collégiale Sainte-Croix was the crowning jewel of the city, visible from just about any part of town. It was also well-known for housing St. Hubert's Key, a charm used for the treatment of rabies in the Middle Ages.
In the 1960s a highway was built to snake directly around the church. While this served commuters well, it essentially cut Sainte-Croix off from the rest of the city. Within a decade the church had been almost totally abandoned. Despite repeated conservation efforts, it fell into serious disrepair. In 2014 the mayor of Liège declared the church unsafe to visit, as its structure was crumbling. There are still movements to reconstruct the historic church, but as of now it still sits tucked between expressways, falling apart.
At the end of the Cammino di San Tommaso, a 196-mile pilgrimage from Rome to Ortona, Italy, are the relics of St. Thomas the Apostle. It’s a long, thirsty walk to get there. If only there were a water fountain.
Turns out, there's something even better: a wine fountain. If you make it all the way to the small village of Villa Caldari di Ortona, a local winery has got you covered. A few miles before you reach the end of the route—the reliquary at Basilica Cattedrale di San Tommaso Apostolo—there is a fountain of wine available 24 hours a day, 7 days a week.
To help out parched travelers, Dora Sarchese Vini has erected the fountain outside of its tasting room, just off the Strada Statale (State Road) 538. Built inside of a mock wine cask, the fountain consists of two brass spigots set in a stone basin, streaming free glasses of red wine for all who stop by—be they pilgrims or just thirsty tourists.
The fountain is a joint venture between the vintner/owner of Dora Sarchese and the organization promoting and maintaining the Cammino di San Tommaso (the Way of St. Thomas). The hope is to draw more travelers to follow the pilgrim route that was established in 1365 by Saint Bridget of Sweden, who twice walked from Rome to Ortona in honor of St. Thomas, and to see the relics at the Basilica.
Dora Sarchese Vini sees the fountain as a gift for the Cammino di San Tommaso, and as a way for the vintner to help spread the word. The hope is to welcome more visitors to Ortona, and to provide them with a little sustenance to make it the last few miles. But please: one glass each. The owner says it’s not for “drunkards or louts.”
The Indigo Girls sing about them, Run DMC raps about them, artists draw them. Their Facebook page has over 12 million likes. They’ve been mixed into cocktails, baked into pies, and stuffed into burgers. No mass-market candy bar inspires such intense passion as Reese’s Peanut Butter Cups.
The combination of peanut butter and chocolate is quintessentially American, like macaroni and cheese. Unlike other combinations, however, peanut butter and chocolate is synonymous with a corporate brand name. The Reese’s brand has so few serious competitors that the ad campaign no longer really tries. The slogan is one word: “perfect.”
It wasn’t always this way. The first national ad campaign for Reese’s in 1970, seven years after Hershey bought the company, was based on the opposite premise: that the idea of peanut butter and chocolate together was so revolting to consumers that they would only try it if they literally fell on top of it.
According to the campaign’s creative director, Billings Fuess, “[Reese’s] was a brand-new product that Hershey had just bought from this farmer.” Hershey had to convince customers “that these two things taste good together.”
Yet Fuess was twisting the truth. Americans had been eating peanut butter cups for over six decades when the Reese’s campaign debuted. In fact, the peanut butter cup was so iconic that in 1962 Pop artist Roy Lichtenstein painted one.
Why then did Hershey claim that the peanut butter cup was new? Where exactly did the peanut butter cup come from? And why has it continued to be wildly popular?
The history of Reese’s Peanut Butter Cups begins over a century ago, when Americans invented peanut butter. Food historians debate just who created it: either the Kellogg brothers, of corn flakes fame, or snack-food promoter George Bayle. Regardless, peanut butter began its life in the mid-1890s as a health food, promoted as a nutrient-rich protein source.
Like other health foods before it, peanut butter was soon incorporated into candy. But first its cultural status had to drop. Before 1900, peanut butter was expensive and noshed on primarily by rich sanitarium residents, who, following Kellogg’s tenets, shunned meat. But peanut butter quickly became more affordable, and by 1900 it was within reach of the middle class. Within a year, recipes for peanut-butter cups entered America’s cookbooks. These early cups were simply solid peanut butter, without chocolate.
It wasn’t until chocolate became available to the masses a few years later, thanks in part to Milton Hershey and his five-cent milk chocolate bar, that some unknown genius came up with the idea to cover peanut-butter cups in chocolate.
Chocolate-covered peanut-butter cups were sold commercially as early as 1907. They stood out somewhat because their ingredients straddled the boundary between health food and indulgence. Chocolate and peanuts were treats, but nutritionists also thought of them as nourishing foods because they were calorie-dense, and food scientists thought of all calories as equal.
But these early peanut butter cups never became very popular. They were just one of many candies sold in bulk at drugstores, unwrapped in a jar. Bulk candy began losing popularity after World War I, supplanted by the individually wrapped candy bar, which had been a part of soldiers’ rations. Chocolate-covered peanut-butter cups remained a novelty.
It was during this period of change in the candy market that Harry Burnett Reese decided to start a candy business. In 1919, 40-year-old H.B. Reese was laid off from the Hershey Chocolate Company, where he managed Hershey’s experimental dairy farm. Reese needed to support his 10 children and his legally blind wife, Blanche Edna Hyson. He’d spent half his life schlepping his family around Pennsylvania from one job to another, mostly in farming, but nothing had stuck. So in 1919 Reese followed in his mother’s footsteps and began making candy.
The town of Hershey was a great place to start a candy company, with easy access to high-quality chocolate, workers, and engineers. But it was also a town where a candy entrepreneur would be competing against the largest chocolate company in America. Milton Hershey began building the town in 1903, completing his chocolate factory in 1905. By the time Reese moved to Hershey it was fully established, with a trolley system, schools, and parks.
Reese's first venture involved the candies his mother had made and sold: chocolate-covered almonds and raisins. His business, like many other small candy companies, failed. Reese left Hershey to find work, returning in 1921 when his father-in-law bought Reese a house for his family.
Reese took a job with Hershey again, yet he still could barely support his family. He began making after-dinner peppermints in his living room, stretching the taffy with hooks. But the peppermints were no match for the chocolate that dominated the town.
In 1923, Reese quit his job with Hershey and returned to chocolate making, asking Hershey for permission first. Hershey said yes, but with one stipulation: Reese must buy all his chocolate from him. And with a handshake, the H.B. Reese Candy Company was born.
Reese set up his candy workshop in the basement of his house. His first product was a boxed assortment filled with everything from chocolate-covered honeydew melon to enrobed marshmallows. But it was difficult to succeed just by selling chocolate assortments. The real money was in candy bars. So he began to make coconut-caramel chocolate bars, which he named after his children, Johnny and Lizzie. The bars were popular, and as sales increased, he moved his operations to the basement of an Italian restaurant. Sales kept ramping up, so in 1926 Reese built his first real factory in Hershey and moved his family next door.
The Reese Company was doing well, but it still didn’t have a blockbuster product like Hershey’s Kisses. Then one day in 1928, everything changed. While on a sales trip in nearby Harrisburg, Reese chatted with a store owner who told him that he couldn’t keep peanut butter cups in stock. He asked Reese if he could make something similar.
Without hesitation, Reese said yes. He left the store and immediately purchased a 50-pound can of peanut butter. When he got home, he began experimenting with peanut butter balls covered in chocolate. But although tasty, they were difficult to manufacture on a large scale. Reese switched to traditional cups, and began making his own peanut butter, roasting the peanuts so they were almost burnt. According to family lore, it's this roasted taste that makes a Reese’s a Reese’s.
A year after Reese began making his peanut-butter cups, the Great Depression hit and Reese almost went bankrupt. He couldn’t pay his bills and began paying his employees in candy. When the sheriff came looking for him, Reese absconded to his family’s farm. Eventually Reese secured loans, and the company was saved.
Reese’s sales remained strong but paled in comparison to Hershey’s. Then in 1933, one of Reese’s salesmen, Mr. Houston, urged Reese to sell peanut butter cups individually because his customers thought they were the best piece in the assortment. Reese was hesitant because they were his least favorite piece. But Houston convinced Reese to sell the cups individually for a penny. And the iconic Reese’s cup was born, along with its first ad campaign: a picture of Reese’s wife and many children with the tagline "16 Reasons to Eat a Reese’s."
Reese’s peanut butter cups were an immediate hit, selling enough that by 1935 Reese paid off his debts. The company was running smoothly until 1941 when the U.S. entered World War II, and sugar was rationed. Although Hershey lent Reese sugar, it wasn’t enough. Reese decided to eliminate his other lines and produce only one candy: the candy that required the least amount of sugar, the peanut butter cup. It was the best decision he ever made.
As Reese’s peanut-butter business boomed, other candy companies began producing their own cups. Reese responded aggressively, sending threatening letters to drugstores in 1954 that claimed that it owned the trademark to the term “Peanut Butter Cup.” The letters demanded that stores stop “advertising and selling any candy product, except ours, as Peanut Butter Cups.” After a lawsuit by a Reese competitor, the letters stopped.
In spite of the competition, Reese was doing a booming business in peanut butter cups, selling nationally to both small stores and large retailers like Sears Roebuck. The company decided to build a huge factory in Hershey, but H.B. Reese didn’t live to see it finished in 1957; he had died a few months earlier, plunging the company into chaos. Reese left his daughters out of the will, and instead left the company to his six sons—who didn’t exactly get along.
As Reese’s children jostled for control, the company finally entered the modern era with an automated production line in the new factory. In a reversal from its earlier boast that the cups were hand-coated, Reese crowed that its cups were “untouched by human hands.” By the late 1950s, Reese’s had become a respected national brand.
Yet as Reese’s sales soared to over $15 million annually in the early 1960s, the brothers fought over the company’s future. The eldest brothers wanted to sell Reese, but the younger brothers wanted to keep it in the family. They turned down a bid from British American Tobacco, and rumors got back to Hershey about a possible Reese sale. A merger with Hershey made sense: the companies were located in the same town, used the same chocolate, and their founders were good friends.
After much argument, Reese decided to sell to Hershey. Whether or not Reese got the best possible deal is unclear. With the 1963 sale, Reese's sons received over 666,000 shares of Hershey’s stock, valued at over $24 million. A few of the Reese brothers became Hershey board members. Reese’s Peanut Butter Cups gained Hershey’s distribution power, but the merger placed the legacy of H.B. Reese in peril.
Peanut Butter Cups were an immediate success for Hershey. In 1969 the product was Hershey’s top seller, yet the corporation was slipping from dominance. Mars, maker of Snickers, was closing in. Hershey sought to keep its candy crown by launching its first national ad campaign, covering its three top brands. Hershey hired famed ad agency Ogilvy and Mather, whose team made an unusual discovery about Reese’s.
“When you told folks that we had this marvelous candy bar that had a combination of chocolate and peanut butter, they did not take to that kindly,” recalls creative director Billings Fuess. “It was trouble to persuade them to … try it.”
Although Fuess claims consumers disliked the combination, Reese’s was already selling over 300 million cups annually—more than the Hershey Bar. So why did Ogilvy and Mather and the Hershey company want to position the combo as new? They could have done the opposite, focusing on the enormous popularity of the chocolate-peanut butter combo. But such a tactic would not have allowed Hershey to lay claim to it. By pretending it was new, they could brand it as Hershey’s.
How they did so was by diminishing the importance of Reese. Fuess said that while doing research, the creative team “went to see Mr. Reese’s little production line there, behind his farmhouse,” perhaps misremembering, as Reese had been dead for well over a decade at this time and Reese’s “little production line” was an enormous factory. Yet by positioning Reese as a small-time candy company, Hershey was able to portray itself as rescuing peanut butter cups from obscurity and bringing them to the masses.
The “Two Great Tastes” campaign that emerged from Ogilvy and Mather’s research was brilliant in its simplicity. The classic example is the “Manhole” ad, which features a man walking down the street, inexplicably spooning peanut butter from a jar. He falls into a manhole, toppling into a construction worker eating a chocolate bar. “Hey you got peanut butter on my chocolate,” says the worker, as if such things happen to him every day. “Hey you got chocolate on my peanut butter,” the other man shoots back. After the initial tussle, they discover to their surprise that the combination is delicious. “You get two great tastes that taste great together,” plays the jingle over the happy men eating Reese’s.
The campaign made a six-decades-old product seem new and exotic. But to make this work, Hershey had to erase Reese’s history. While branding Reese’s, Hershey did something even more powerful: it laid claim to the chocolate-peanut butter combination.
The sweet and salty taste sensation built for the American palate and beloved for generations was now Hershey’s. Following the campaign, sales for Reese skyrocketed so high that Hershey thought it had made an accounting error. Hershey ended the campaign two years later, but Reese’s continued to soar.
Reese’s became the best-selling candy brand in the U.S. and remains so today. As other cup brands fizzled out, Hershey ramped up the Reese’s brand. If you own the peanut-butter chocolate combination, there are few limits to the possible line extensions. Baking chips debuted in 1977; Reese’s Pieces followed a year later. By the end of the 20th century the Reese’s brand had become a behemoth, as supermarket shelves filled with Reese’s frosting, sprinkles, peanut butter, ice cream, and cereal.
Why has Reese’s achieved such sustained success? It’s probably some combination of Hershey’s marketing and distribution muscle and Reese’s distinctive taste. Since Reese’s inception, no other company has been able to duplicate its taste, although many have tried. Why other companies can’t make a cup as delicious as Reese’s is a mystery. Reese’s ingredients are certainly not high quality: the peanuts used in its filling are of the bland “runner” type; the chocolate is of the pedestrian milk variety. Yet the chocolate and peanut butter interact in an alchemical way that transcends the quality of the two ingredients.
Making peanut-butter cups with organic ingredients doesn’t improve them. I have tried nearly every artisanal or organic peanut-butter cup available: Justin’s, Theo’s, Colt’s Bolts, those made in tiny hometown candy stores. All leave me unsatisfied, yearning for the processed taste of a Reese’s.
Adding new ingredients also doesn’t work. Hershey has mixed cookies, banana cream, caramel, honey nut flavor, and Reese’s Pieces with the peanut butter, and covered the cups in dark and white chocolate. Artisanal companies make peanut butter cups filled with bacon, chia seed, coconut, and caramelized bananas. None appear to have supplanted Reese’s taste. Reese’s cups resist such experimentation. Alternatives wither; something inexplicable is lost when the ingredients change.
Although Reese’s Peanut Butter Cups are now sold in nearly every gas station, vending machine, and supermarket in the U.S., the essence of the original cup, which was hand-crafted in H.B. Reese’s basement 89 years ago, has endured. But Reese’s history has not.
Even though Reese created his multimillion-dollar peanut-butter cup empire in Hershey, built two factories, and employed hundreds of residents, no monuments to him exist. While the century-old smokestacks of the former Hershey factory stand proudly as a marker of the company’s history, Reese’s old factory is unmarked. Sure, you can buy Reese’s in Hershey’s gift shops and read a tiny placard in the museum about Reese. But H.B. Reese is an afterthought in Hershey, his impact diminished, as if giving Reese his due would undermine Milton Hershey’s legacy.
Reese deserves better than simply his name on the package. Even though Reese’s would probably not have grown as successful without Hershey, Hershey is not the sole reason for Reese’s sustained popularity. Reese’s Peanut Butter Cups are unique; their peanut butter is unmatched, their ratio of ingredients divine. Their success is perhaps caused by an even greater force: nostalgia. Peanut butter and chocolate taste of childhood and love and family. Even though H.B. Reese didn’t invent the peanut butter cup, nor Americans’ love of its two main ingredients, he deserves his rightful place in history for perfecting it.
The Alcazar of Jerez de la Frontera is a Moorish palace built in the 11th century. It's a very nice palace, complete with a vaulted mosque, a system of wells, elaborate gardens filled with fragrant trees, and even a multi-room hamam for performing ablutions in style. But it wasn't until this century that a camera obscura was put in place.
Taking advantage of the fact that the alcazar occupies the highest point in the ancient city, and that the tower is the highest point of the palace, a system of two lenses and a large periscope mirror was installed at the very top of the tower. The optical contraption is controlled from below by a pair of long wooden-handled levers, casting live images of the surrounding town onto a large parabolic table around which people gather.
In a modern age where the resolution and saturation of the images we see on our myriad screens are constantly questioned and evaluated, the lenses' projected picture on the table before the viewer is dazzlingly sharp. The mirror swivels around and zooms in on impossibly small details of the buildings of historic Jerez, the sprawling sherry warehouses and the fields beyond. Birds fly past, cars rush by, and the effect is thrilling, like the gods atop Olympus watching the minutiae of their human world below in the surface of a birdbath.
You know those sweet little Hummel figurines that you may have found around your grandma's house? Well, America has its own home-grown version, created by Samuel J. Butcher, known as Precious Moments. Tucked away in the southwestern corner of Missouri, these little figurines are on display in a rather remarkable attraction.
Mr. Butcher's art spawned a whole new craze among a new generation of "collectibles" collectors, or simply anyone with a soft spot for really, really sweet figurines, instantly recognizable to any of us who lived through the 1980s and '90s. As his success grew, Mr. Butcher decided to create a complex of chapel and gardens to celebrate what his creations meant to him.
The centerpiece is the Precious Moments Chapel, instantly recognizable as an homage to nothing less than the Sistine Chapel at the Vatican... but populated instead by the children and cherubs of the Precious Moments tribe. The chapel is surrounded by gardens (including memorial gardens for deceased children and relatives). There is a visitor center with exhibits that chronicle the development of Mr. Butcher's artwork and industry, as well as a gift shop and café. All in all, the complex is so sweet you may want to check your blood-glucose level before you visit.
If you cross the Thames River on the downstream side of the Golden Jubilee Bridges, you will pass the curious skateboard graveyard where broken boards are laid to rest—or rather, thrown to rest, as the graveyard sits on one of the bridge's freestanding support structures in the river.
The original Hungerford Bridge, erected in 1845, was a suspension bridge designed by the renowned engineer Isambard Kingdom Brunel. It was later replaced with a design by Sir John Hawkshaw, which kept only Brunel's original buttresses and added narrow, rattly walkways close to the train tracks on each side. (The chains from the original suspension bridge were reused for Brunel's most famous bridge, the Clifton Suspension Bridge in Bristol.)
In the mid-1990s, it was decided to replace the walkways with footbridges, a decision hastened by the brutal, drug-fueled mugging and killing of 24-year-old skater Timothy Baxter, who, along with his friend Gabriel Cornish, was beaten and thrown into the river from the narrow walkway in the summer of 1999. Only Cornish survived.
The two new footbridges officially opened in 2002 and were given the name "Golden Jubilee Bridges" in honour of Queen Elizabeth II's 50 years on the throne, though they are also commonly known as the Hungerford Footbridges. The Golden Jubilee Bridges are separate bridges partially attached to Hungerford Bridge, but also supported by pylons set into isolated support structures in the river. (This proved complicated for a couple of reasons: the Bakerloo Underground line runs under the Thames directly beneath the bridge, and there were fears of setting off unexploded World War II bombs buried in the muddy riverbed.)
Not far from the bridge, under the Southbank Centre, you'll find the hugely popular Southbank Skatepark. Sometime around 2008, the first broken skateboards appeared on one of the bridge's flat support structures, the one closest to the skatepark. The idea caught on, and more and more boards followed, reportedly thrown there in memory of the skater Timothy Baxter.
Every now and again, the council removes the skateboards, or somebody climbs down to rearrange them into letters or numbers, but the graveyard soon reverts to its natural state. It's doubtful that the Skateboard Graveyard was started as a deliberate memorial, since it didn't appear until nine years after the killing. But in a city that embraces its psychogeography, the past can change, and the Skateboard Graveyard has become a poignant monument.
One of the lesser-known museums in Barcelona, the European Museum of Modern Art (Museu Europeu d'Art Modern, or "MEAM") packs a punch, displaying some of the most skillful figurative artists of recent generations.
The museum occupies an 18th-century palace tucked away on an alley-sized street next to the Picasso Museum, showcasing 20th- and 21st-century contemporary figurative paintings and sculptures. Some exhibits rotate through, but many are permanent installations.
A statement from the museum's website puts it this way: "The MEAM is the new temple of this new religion. The religion of those who believe, in the XXI century, in Art. An Art that is not content with experimentation turned into an end in itself, or the permanent trial without reaching any definite forms, or the cult of noise by noise, or with video montages burdened by boredom. In short, an art that justifies itself as something direct, explicit, categorical, absolute, real, intelligible and brilliant."
When the London Underground was being constructed in the 1860s, rather than boring deep beneath existing buildings, workers cut trenches right through the city and then covered them over again. But not all of the properties razed to make way for the railway were rebuilt.
Case in point: the houses at 23 and 24 Leinster Gardens, which were demolished to build the tunnel connecting Paddington with Bayswater. The Underground uses this open-topped portion of the line to ventilate a large section of the surrounding Tube system, so the sacrificed homes were never reconstructed. This left a rather unsightly hole in an otherwise very sightly block of five-story houses.
And so, a false façade was constructed to conceal the wound. It matches its neighbors in every important detail, except that the windows are painted on, rather than being made of glass. They did a really nice job; you could live in that neighborhood for years, walking past this address frequently, and still not notice the deception if you weren't looking for it.
From the back of the block, however, you can see what's going on. The houses on either side are braced against each other by a number of sturdy steel struts, and the Underground tracks are visible—and very audible—just below.
The downstairs of the Fort Washington branch of the New York Public Library feels big and bright, with tall ceilings and sweeping windows meant to keep the building light and cool. The bottom two floors are an open, book-lined space. Walking up that last flight, however, feels like fading into a different building. A water stain darkens the wall, and the etched steps are dusted with chips of peeled paint, fallen like dandruff from above.
At the top, the stairway opens into a large, shabby room with high ceilings. To enter, you pass through a well-crafted wooden frame of what was once a wall; now there is empty space where the door and windows were. The front room is brown and full of the textures of abandonment—the walls and ceiling look like they're sloughing off dead skin. Once, the library hosted performances in this space, and dances, but now the prettily molded ceiling is covered partway with rectangular metal chutes.
When New York City's branch libraries were first built, about a century ago, they needed people to take care of them. Andrew Carnegie had given New York $5.2 million, worth well over $100 million today, to create a city-wide system of library branches, and these buildings, the Carnegie libraries, were heated by coal. Each had a custodian, who was tasked with keeping those fires burning and who lived in the library, often with his family.
But since the '70s and '80s, when the coal furnaces started being upgraded and library custodians began retiring, those apartments have been emptying out, and the idyll of living in a library has disappeared. Many of the apartments have vanished, too, absorbed, through renovations for more modern uses, back into the buildings. Today there are just 13 library apartments left in the New York Public Library system. Some have spent decades empty and neglected.
The apartment doesn't feel haunted, exactly, but lonely and left behind. There is, however, a mysterious black door, with three sections, and a row of bells alongside it. No one knows where it leads, and it's jammed shut. It's the sort of door someone opens at the beginning of a horror movie, releasing a demon or hungry creature.
Wrenched open, the middle section reveals a wall, brown and textured like washed-up seaweed. It's the back of a shaft. Look up, and there's a plate of glass keeping the rain out. Look down, and the hole plummets to the basement.
Death chute aside, this would have been one of the nicer library apartments to live in. Often the flourishes that made the Carnegie libraries special—the large windows and decorative molding—were left out of the custodial apartments, but this one has some nice details. Finding this much empty space anywhere in Manhattan is a rarity; walking upstairs in a well-used building and finding an empty floor feels like being in on a great secret.
Roald Dahl was many things. A fighter pilot, a renowned author, a spy. But few people know that he was also the host of his very own Twilight Zone–style sci-fi/horror anthology show, Way Out, a macabre program that ran for a single season and almost gave Rod Serling’s more famous program a run for its money. And it all began with a terrible game show.
In 1961, Honeymooners star Jackie Gleason had moved on from his career-defining role as cantankerous bus driver Ralph Kramden and become a roving host and guest, appearing on variety shows, specials, and game shows. One of these endeavors was a game show called You're in the Picture. Intended to display Gleason's skills as a raconteur and show host, the show had a panel of celebrities stick their heads through holes in a famous image, then question Gleason to determine which image they'd stuck their heads through. It wasn't a hit.
You’re in the Picture ran for exactly one episode, and received such negative reviews that when the next episode was set to air, viewers were greeted not by the game show but by a half-hour apology delivered by Gleason himself. After expressing regret for dropping “the biggest bomb,” Gleason changed the format to a talk show to limp through the rest of the initial episode order, but producers at CBS needed a new show to fill Gleason’s spot, and fast.
Under the gun, some enterprising producers at the network began dreaming up a creepy drama show to fill the time slot, and they went right to Dahl. While he is best remembered today for timeless works of children’s literature like Matilda and Charlie and the Chocolate Factory, for a good portion of his writing career Dahl was better known as an author of twisted, devilish fiction. As explained in an article originally published in Filmfax Magazine, Dahl jumped at the chance to develop the series, spurred on by the fact that the show’s time slot (9:30 p.m. on Fridays) fell right before another thematically similar little CBS show, The Twilight Zone.
The resulting half-hour show was titled Way Out—strangely, the show's opening screen displayed the title with a preceding apostrophe: ‘Way Out. The format was set up much like the already successful Twilight Zone, with Dahl in the Rod Serling role.
The black-and-white show would begin with what became its signature image: a slow pan over a series of mist-shrouded, disembodied hands, before resting on one that would burst into flames as the title came onscreen. Then, flexing his dry British charm like a more cosmopolitan Vincent Price, Dahl would give a short intro to each episode. The bulk of the program consisted of the main tale, usually a short morality play with an ironic or surprising ending or element, which often dipped into the supernatural. Then Dahl would close out the show with a direct epilogue, much like the Cryptkeeper of the later Tales From the Crypt.
Dahl also smoked like a chimney throughout his segments, which served the dual purpose of providing a mysterious haze around the host and promoting the show’s main sponsor, L&M Cigarettes. In fact, just about everyone in Way Out really enjoys cigarettes.
Initially the producers wanted to adapt some of Dahl’s pre-existing stories, but in the end only the first episode was written by Dahl, with the remainder of the series’ 14-episode run authored by others. The bench of talent never quite equaled that of The Twilight Zone.
The first story, and the only one based on one of Dahl’s stories, was called “William and Mary.” In the episode, a controlling jerk of a husband, William, lies on his deathbed barking insults and commands at his long-suffering wife, Mary. Her torment seems to be at an end when he dies and she is free to smoke refreshing L&M Cigarettes, play cards with friends, and even wear lipstick. But—twist—William has opted to keep his brain alive after death so that he can still keep watch over Mary! But—double twist—with no mouth or body, William finds himself a captive witness to his newly liberated wife’s new life. The tale ends with Mary gleefully blowing smoke into William’s helpless robotic eye. Diabolical fate.
The stories got more outlandish. In “The Croaker,” a mysterious man begins manipulating a young boy to help him turn the residents of their town into frogs, but the enterprising young lad has some strange plans of his own. In “Side Show,” a woman with a light bulb for a head is held against her will in a circus sideshow; when an audience member falls in love and decides to free her, he may be in for a shock. In “False Face,” an actor pays a deformed homeless man to be his model for some Quasimodo makeup, but the effects turn out to be a bit too real. In the world of Way Out, fate always has a cruel sense of humor.
The show received positive press as it aired from March to July of ‘61, and even today, episodes of Way Out still hold up surprisingly well as tightly drawn, macabre vignettes. But at the time, its high quality didn’t translate to sufficient ratings, even with Dahl’s unforgettable segments. Way Out was cancelled after just one short season.
Today, you can find some of the episodes on YouTube, and the entire collection is held by the Paley Center for Media, although it has never been formally released. The episodes are a must-see for fans of Dahl’s gruesome sense of irony, or of The Twilight Zone. Take a look this Halloween, and, in the words Dahl himself used to close every episode, “Good night, and sleep well.”
Cross a closed bridge, hike two miles deep into the forest, and you'll find Lester, or what's left of it.
Originally a logging camp, it was founded as a town for workers rather than a family village, though a little community did grow up there. It got its name, Lester, from a telegraph operator, after the Northern Pacific Railway Company set up camp there in 1892 to build a railroad across mountainous Stampede Pass.
The logging industry was driven out by forest fires in 1902, but it became the town's primary industry again in the 1940s and '50s. At its peak, Lester was home to about a thousand people. But the rail industry waned and jobs dried up. Tacoma cut off access to the one road leading to Lester to protect the quality of drinking water in the Green River Watershed. People moved away, until eventually there were only five students left in the Lester school district and the state closed it down. Now all that remains are dilapidated sheds and houses, the phantom of a once-thriving community.
Lester's last resident, Gertrude Murphy, died in 2002 at age 99. Without anyone living there, Lester died too. Gertrude remembered Lester fondly, as a bucolic forest town. "Once, just once, I saw the fog freeze on the trees, it was so cold," she said. "It was lacy and light and feathery, just beautiful. In the fall, when the vine maples came in, they were like big bouquets all over the hills."
Inside the Church of the Holy Sepulcher, there’s a large rotunda, with stories of arches flying high. At the center sits a small, freestanding structure called the Edicule, which contains the slab where Jesus is believed to have been laid to rest. For the first time in almost two centuries, the marble Edicule is being restored, and in the process scientists have found what they believe could be the walls of the cave where the Resurrection is supposed to have taken place, as National Geographic reports.
The Church of the Holy Sepulcher is itself an astonishing structure, first built in the 4th century and renovated, rebuilt and restored many, many times over the centuries since. It’s been more than 460 years since the last time the burial shelf in the original cave was uncovered. The marble Edicule that surrounds the shelf was built in 1810, after a fire damaged the previous structure. In 1927, the “little house” was damaged during an earthquake and for the past seven decades it has been propped up by metal girders.
In March, the different religious orders that control the church agreed to renovate the Edicule. As part of the preparation, radar tests showed that there could still be “hidden layers” behind the marble walls, the AP reports. When scientists moved the marble slab off the burial shelf, they discovered a layer of debris and another marble slab, grey, with a cross etched into it, that could date back to the 1100s. Beneath that was a “whitish layer,” National Geographic says, and there may still be more beyond that.
Behind the marble panels of the small building, they also found walls of the original cave. These walls were thought to have long since collapsed or disintegrated, but they are still standing, six feet high.
All this work has had to happen in a very short window: the overseers of the church gave the scientists just 60 hours to work on the inner sanctum. Already they’ve discovered materials no one thought were there, and the restored Edicule will allow visitors a glimpse: a window is being cut into the marble wall to show the original cave walls.
If sticking your face in a barrel filled with cold water and trying to grab an apple with your mouth seems a touch sadistic for a casual game, then you're really not going to like its sister pagan ritual. It's sometimes called "snap-apple," and like bobbing for apples, it comes from a pantheon of mostly forgotten All Hallows' Eve traditions from the British Isles.
In snap apple, the goal is still to grip an apple in your mouth, no hands allowed. But in this game, the apple is attached to one end of a stick, and on the other end there's a lit candle. The stick is hung from the ceiling by a rope and set spinning. The player tries for the apple. Any mangled attempt (too slow, too fast, too aggressive, not aggressive enough) can result in a burning candle to the face.
In America today, Halloween is most often associated with pumpkins and candy. For centuries in the British Isles, though, this eerie night was linked with apples, nuts, and cabbage, all of which were given secret powers. And like snap apple, the games played with these foods had much more dramatic stakes than the ones we play today. The risk wasn't just about getting physically burnt, though.
All Hallows' Eve is a time when the veil between the worlds of the living and the dead is thinner than usual, and these forgotten rituals banked on that closeness to predict the future: you could find out whom you might marry, or whether marriage would never happen for you; whether you'd be widowed; or when you might die.
Among all the food traditions of Halloween, apples appear most often. There are loose theories for why apples are so special on this day. They've long been thought to have an uncanny magic: in Welsh folklore, for instance, they carry a whiff of immortality. In The Book of Hallowe'en, an early-20th-century study of October 31 traditions, Ruth Edna Kelley writes that apples came to Halloween through the colonizing Romans, who linked their fall apple festival with the Celtic Samhain.
In the 20th century, games like snap apple and bobbing for apples were seen as diversions, but earlier they were more serious. Consider that in Scotland, bobbing for apples is sometimes called dooking or douking for apples, the same word used to describe dunking a woman in water to test whether she might be a witch. Bobbing for apples, or "snapping" one suspended from the ceiling on a string (a safer version of snap apple), could reveal the future of your love life: in one version, if you got the apple first, you'd have happy love—or, in another, you'd be the first married.
Apple games could also make matches, or predict whether a person would have a good or bad love. Sometimes the apples would be labeled or marked by young men and women before they were put in a tub of water: the person who caught your apple could be your mate. In another version of snap apple, a hoop is suspended from the ceiling, and different treats and tricks, including cake, candies, bread, apples, and peppers, are stationed along its rim. The one a player caught with their teeth would foretell the nature of their love—would it be sweet, spicy, too hot? Would it nourish or burn them?
There are other ways, too, to coax an apple into telling the future on Halloween. You could peel it all in one strip and throw that peel over your shoulder: if it stayed in one piece, it would form the initial of a future lover. Or you could take an apple to a mirror. Shine a candle in the mirror and eat the apple, and you would see your future husband over your shoulder. In one version of the tradition, the apple gets cut into nine pieces, and it's only after the eighth is eaten that the lover appears, asking for the ninth.
Nuts could also tell the future, and in some places, Halloween was called "nutcrack night" or the "sacrifice of nuts." If a man brought a woman some walnuts, it would mean he was her true love. But it was also possible to use nuts to see the future of a love. Nuts would be named and set by the fire to see how they burned. If they roasted slowly, together, that would promise a good, strong love. If they crackled and popped and jumped, it was a bad sign.
A girl could test out different suitors this way—the nuts could give a hint of how the relationship might go. Kelley saw a sinister shadow in this: "Who sees in the nuts thrown into the fire, turning in the heat, blazing and growing black, the writhing victim of an old-time sacrifice to an idol?" she wrote.
If none of these fortune-telling methods went your way, you could still resort to cabbage. One tradition involved pulling kale stalks from a field: the nature of the stalk (its weight, length, girth, and taste) would hint at one's future spouse. If the roots still had plenty of earth hanging on them, that could augur a hearty dowry. A woman could also steal a cabbage and note whom she met on the way home, to find out who her future husband might be. In Ireland and Newfoundland, cabbage and potatoes together, a dish called colcannon, would have a ring or button hidden in it, and whoever found the ring could expect to be married soon. The button was less lucky: it meant you'd never marry.
If colcannon isn't appealing, cake is another alternative. Some people would bake a ring and a nut into the cake; the finder of the nut would be a widow or widower. In another version, the cake had a coin, a sloe berry, and a bit of wood baked in: the coin meant wealth, the sloe meant you'd outlive all your friends, and the wood meant you'd die within the year. There were other Halloween augurs, too, that could foretell death; like snap apple, these games were not kidding around. Compared to the risk of having your death foretold, getting a little wet while bobbing for apples seems pretty tame.
In the 1960s, a station wagon full of overtired Florida children looped endlessly around Washington, D.C.'s bewildering traffic circles. The campground where they'd planned to stay was gone. As the car made its way down Embassy Row, the children thought of something: Where was Florida's embassy?
"We explained to them that the different states don't have embassies," their mother, Rhea Chiles, told a reporter in 2003. "They thought that was short shrift."
Today, Florida House is a reality. Visitors rap on the door with a seashell-shaped knocker, and a staffer welcomes them into a bright foyer. That leads into a living room inset with color-drenched stained glass. Florida art hangs on the walls; an ocean-themed mantel frames a fireplace. Orange juice is always offered.
Different versions of the founding story have appeared in many articles about Florida House over the years; there's even a video of Chiles telling one. They vary, but they all turn on the children's guileless reasoning—and Chiles listened. She founded Florida House so other Floridians would have a place to call home when they were in D.C. It sits right on Capitol Hill, and it is the only state "embassy."
The actual founding, though, was years in the making. Rhea Chiles returned to Washington in the early ‘70s, when her husband, Lawton Chiles, was a senator. (People called him "Walkin' Lawton" because, during his campaign, he walked from Pensacola to Key West.) On her walks to his office she passed a run-down Victorian. Boards covered the windows, and it stood in a part of the city many considered unsafe. But Chiles liked the house. She saw potential, raised money, and bought it for $125,000 in 1973.
Florida House stands on a Capitol Hill corner; the other three corners at the intersection house the Library of Congress, the Supreme Court, and the Folger Shakespeare Library. Its builder was Edwin Manning, an architect who worked on the Library of Congress. In 1939, the house was owned by North Carolina Senator Robert Rice Reynolds, a Reynolds tobacco heir. His isolationist newspaper, American Vindicator, listed the house as its address. On September 11, 2001, federal buildings closed, and senators Bob Graham and Bill Nelson sheltered at Florida House with about 35 staffers. During the anthrax scare in the Senate office building, Nelson's staff decamped to Florida House for a couple of weeks.
Despite the Chiles children's question, Florida House is not, of course, technically an embassy. But it does perform many of the same functions that an embassy from a foreign country might. It's a sanctuary for Floridians with its themed art and gracious living spaces. People host receptions and lunches. There's an intern seminar series, and rotating exhibits of Florida artifacts. Floridians can use the computer, desk, and phone.
The house doesn't get any tax money. Instead, private donations keep it running. And despite being on Capitol Hill, Florida House takes no sides. "You leave your hat at the door," CEO and president Bart Hudson says, "because once you come through the door you are a Floridian."
Chiles had a larger vision of Second Street as a state embassy row, says Hudson, and he thinks that would not be a bad idea. He points out that many states have societies, which have, for example, their own softball teams. States also maintain a presence in the Hall of States, where many state governments have Washington offices. There was a brief moment in 1994 when it seemed like there would be an Illinois house, but that passed.
Ask Hudson what he'd like people to know about Florida House, and he has a simple answer: "That we exist." Florida House is 43 years old, he says, so the fact that so many Floridians don't know it's there is "disheartening and challenging and our goal to correct." Hudson says they see probably close to 15,000 people a year, but with 19 million people in Florida, he'd like to see more.
In 2003, South African pilot Johan Heine was flying over the hills of the gorgeous Mpumalanga region when he crashed his plane into a mountainside. After exiting the plane, Heine saw before him three monolithic, five-ton dolomite stones sticking out of the ground, and behind them a giant stone circle. The befuddled pilot didn't know it yet, but he had just discovered what is arguably the oldest manmade structure in the entire world.
Known to only a select few and accessible solely by rough dirt roads past the wild horses of Kaapschehoop, Adam’s Calendar is possibly the world's only megalithic stone calendar, and, according to controversial archaeologist Michael Tellinger, it may also be the first mark ever made by humankind.
Roughly circular, with a diameter of 100 feet, the site is nicknamed the “Birthplace of the Sun.” As the sun casts shadows on the rocks, the calendar tells what day of the year it is. The stone circle is also aligned with the equinoxes, the solstices, and the cardinal directions of north, south, east, and west.
According to Tellinger, of the more than 100,000 ancient stone structures in the Mpumalanga region, Adam’s Calendar, dubbed “Africa’s Stonehenge,” is the oldest. Tellinger notes that the stones of Adam’s Calendar are aligned with Orion's Belt, whose position in the sky shifts over a roughly 26,000-year precessional cycle. This leads Tellinger to believe that the calendar is as much as 250,000 years old, although the most commonly cited figure is 75,000 years—a full 16 times older than the Great Pyramids of Giza.
As of now, the true age of Adam's Calendar is highly disputed, and it remains a mystery that is yet to be solved.