In the 1920s, the streets of Chicago were home to hundreds of secret clubhouses and hideouts—home bases to thieving, violent gangsters. Between 1923 and 1926, one sociologist at the University of Chicago endeavored to track down and map the favorite haunts and hang-outs of more than 1,300 gangs for the project Chicago’s Gangland.
“No less than 1,313 gangs have been discovered in Chicago and its environs!” Frederic Thrasher wrote at the time. “Their distribution as shown on the accompanying map makes it possible to visualize the typical areas of gangland and to indicate their place in the life and organization of Chicago.”
Published in 1927, Chicago’s Gangland is tucked into the back of Thrasher’s seminal book on urban crime and ethnography, The Gang: A Study of 1,313 Gangs in Chicago. Chock-full of gray and red demarcations, the hand-drawn, multi-layered map provides detailed insider knowledge of Chicago’s gang activity. Gangs are not only tied to their environment, but play a significant role in a city’s social distribution and structure, according to Thrasher. The map shows where certain gangs rule within the Gangland “empire,” which he explains isn’t just based on geography, but on “fissures and breaks in the structure of social organization.”
Thrasher’s map is extremely detailed. He depicted parks, boulevards, cemeteries, and railroads in different shades of gray, and even showed if hang-outs had clubrooms by marking them in red triangles and circles. He also inscribed important locations and gang territories, such as “No Man’s,” “Gang Camp” and “Death Corner.” The gangs each had their own distinct characteristics, creating a mosaic of regions.
“No two gangs are just alike,” Thrasher wrote. “Some are good; some are bad; and each has to be considered on its own merits.”
During the years between World War I and the Great Depression, many cities in the United States experienced a boom in both population and industrial production. As immigrants flooded into these urban centers, major shifts in social and spatial organization gave birth to gangs. Thrasher estimated (conservatively) that there were 25,000 boys and young men in gangs in Chicago. His work focused not on the likes of Al Capone’s mob, but rather on the gangs of second-generation immigrant youths who were thrown into the “seediest aspects of American culture,” writes Greg Dimitriadis in the journal Cultural Studies, Critical Methodologies.
The mini-societies that sprung up around gang life led to what Thrasher refers to as “interstitial” areas of Chicago—a gangland of deteriorating neighborhoods, shifting populations, and impoverished and disorganized slums.
Thrasher’s cartographic representation constructs “a stage on which the scenes that are actually of sociological interest unfold,” writes Rolf Lindner in The Reportage of Urban Culture. Other sociology studies of the 1930s contain maps, but Thrasher’s Chicago’s Gangland is considered one of the first thematic maps that surveyed organized criminal activity. His investigation gives a rare glimpse of what gang life was like in 1920s Chicago.
“Gangland is a phenomenon of human ecology,” Thrasher concludes. “The gang develops as one manifestation of the economic, moral, and cultural frontier which makes the interstice.”
In the 3rd or 4th century A.D., in a farming community near the river Nene, in what’s now Stanwick, England, a man was buried in an unusual fashion—face down, with a stone in his mouth where his tongue should be.
He was probably in his thirties, and was likely considered some sort of threat to the community. When people were buried face down in Roman Britain, it often meant they were unusual in some way. But the stone replacing his tongue was even more unusual: Archaeologists have never before seen an example of this burial practice in this part of the world, at this time, The Guardian reports.
The skeleton was first excavated in 1991, from a burial ground in Northamptonshire, about a two-hour drive north of London. Only recently did a team of archaeologists have the chance to examine the skeleton systematically, though, and they discovered the stone tongue.
They believe that the man’s own tongue may have been amputated. There are other examples from Roman Britain of severed body parts being replaced by objects during burial; most commonly, says The Guardian, pots or stones have been found in the place where the person’s head should be. The team also found evidence of infection on the skeleton, which is consistent with a tongue amputation, since such a wound often leads to infection.
The harder question to answer is why the man’s tongue was missing. Did he perhaps bite or cut it off himself, during a seizure or because of other mental health issues? Was it cut out as punishment? What was so dangerous about him that he needed to be buried face down? All archaeologists can say for sure right now is that there was something strange enough about him that the people who lived here found it necessary to bury him in this unusual way.
In 1978, Georgi Markov was on his way to work at the BBC in London when he felt a sharp sting on his thigh. Behind him, he saw a man picking up an umbrella. The man, who spoke with a foreign accent, apologized and hurried into a cab that whisked him away.
That night Markov came down with a fever; four days later he was dead from ricin poisoning. A medical examiner found a tiny pellet, less than 2 millimeters in diameter, in his leg. Markov, a dissident novelist who had defected from Bulgaria, had been assassinated. Based on the details he remembered before he died, investigators developed a theory of how he had been shot.
That umbrella, they thought, was not a normal umbrella, but one that had been transformed into a gun.
Umbrella guns are by no means the only type of disguised weapon. “Man has attempted to disguise firearms into just about everything you can possibly imagine,” says David H. Fink, a collector in Georgia who has written about disguised guns for the American Society of Arms Collectors. Guns have been hidden in pillboxes, a scribe’s casing, a flute, a pencil, a Pepsi can. There have been pocket-watch guns, ring guns, bike-pump guns, and lipstick guns.
But perhaps no other type of disguised gun has captured the imagination of spies, writers, and conspiracy theorists like the umbrella gun. As a weapon, it is both a little bit ridiculous and deviously clever, and since its use in Markov’s assassination, it has taken its place in the villainous weapon hall of fame.
The umbrella gun was invented in the 19th century, as a variant of the more popular cane gun. First patented in 1823, cane guns were relatively simple weapons, disguised to look like walking canes—a gentleman’s weapon. The umbrella gun took the same idea and applied it to another personal item.
The earliest extant example of an umbrella gun may be one in Fink’s collection, dated to 1860. The umbrella was made in London and marked “Armstrong reg. British Make.” The handle has its own marking: “Richard Grinell 1860.” (Grinnell was probably the owner.) The shaft of the umbrella is actually a rifle.
Fink also has a different type of umbrella gun in his collection, from 1892. In this design, the umbrella shaft itself isn’t a gun. Instead, it contains a small revolver that slides in. To fire it, you pull the gun out of the umbrella’s top.
“These two are the only two honest umbrella guns I’ve seen, and I’ve been collecting for over 50 years,” Fink says. “The survival rate of these things is not very good. The guns may survive, but the umbrellas fall apart.”
In the first part of the 20th century, umbrella guns got little play: in 1917 Popular Science described a “toy gun for the pacifists” that would shoot an umbrella instead of a bullet, and in 1928 Popular Mechanics described how to turn an old umbrella into a spring gun. The most famous user of the umbrella gun, up until the 1970s, was the Penguin, the portly, besuited supervillain of the Batman comics.
In his first appearance, in 1941, the Penguin carried an umbrella with a hollow shaft, which he used to steal art, and Batman soon discovered that his enemy had a giant collection of inventive umbrellas, including one that fired poison gas. Over the many reinventions of the Batman universe, the Penguin has hidden all manner of weapons in his umbrellas, including a flame-thrower and a machine gun.
In the 1960s, though, American intelligence agencies were developing real-life versions of the Penguin’s sneaky weapons. The CIA reportedly created a stun-gun umbrella that would shoot poison darts from its tip, similar to the umbrella used to assassinate Markov.
That umbrella, sometimes called the “Bulgarian umbrella,” may not have technically been a gun: it didn’t work by setting off gunpowder. The tiny pellet that killed Markov might have been propelled into his leg by compressed air, a hypodermic needle, or another injecting device. The device itself was likely designed by the KGB for Bulgaria’s secret service.
After the “umbrella murder” became famous, the umbrella gun enjoyed a bit of a renaissance in the popular imagination. One theory about President John F. Kennedy’s assassination centered on an umbrella gun. In 1985, a company called J. Wilson built a very classy model. On television and in film, the umbrella gun became a more common murder weapon and spy tool; most recently, Colin Firth was equipped with a multifunctional umbrella gun in the 2015 movie Kingsman: The Secret Service.
Outside of fiction, though, umbrella guns are still relatively rare, although in 2014 a man in Seattle was charged with using a rifle “stuffed into the sleeve of a Nike golf umbrella,” reports SeattlePI. At this point in the history of the umbrella gun, it's more of a cliché than a clever trick: if anyone suspicious points the tip of an umbrella at you, you know to watch out.
Ravens like to hang out in the parking lots of Yellowknife, in Canada's Northwest Territories, perching on cars and foraging for snacks. Last Thursday, one of them made its opinions on government bureaucracy known: it took a parking ticket in its beak and shredded it.
Annemieke Mulders, a local raven fan, caught the event on video and shared it with the CBC. "I watched the little monster take the ticket from under the windshield wiper and shred it, and (I think) eat some of it," she told the outlet.
In the end, it's unclear whose side the raven is on—according to the Yellowknife director of public safety, the driver still has to pay the ticket.
Before 1972, the Pittsburgh Steelers had never won an NFL championship in their 40-year history. But on December 23, 1972, the team was playing for its first playoff victory in franchise history. Down by one point against the Oakland Raiders, facing 4th down with only seconds to go in the 4th quarter, the Steelers pulled off one of the most famous—and controversial—plays in American sports history.
Known as the Immaculate Reception, the play is memorialized on the exact spot where it happened, on the site of the old Three Rivers Stadium between today’s Heinz Field and PNC Park in Pittsburgh.
The Steelers were hosting the game, and the Raiders were leading with 22 seconds to go—enough time for one last play. Quarterback Terry Bradshaw threw a pass to receiver John "Frenchy" Fuqua, and that’s where opinions start to diverge. The pass was either batted away by Raiders safety Jack Tatum, or the ball bounced off of him and was scooped up by fullback Franco Harris just before it hit the turf. However it happened, Harris ran the ball for a touchdown, and a Steelers victory, just as the clock ran out.
It took 15 minutes to clear fans off the field so the Steelers could kick the extra point. They won the game, and even though they would lose the next week to the Miami Dolphins, the Immaculate Reception signaled a start to the Steelers' dominance of the NFL in the 1970s.
The play earned its name from a caller on broadcaster Myron Cope's radio show, but not without controversy as to whom the ball actually touched first. The game tape is murky and the rules fairly complicated, but with no instant replay review at the time, the call on the field stood. Still, the play is better known to Raiders fans as the Immaculate Deception.
In addition to the monument here, the city has two statues of Harris catching the ball, one even greeting travelers at Pittsburgh International Airport, right alongside George Washington.
This space age building overlooking the Black Sea would be the perfect hideout for a villain on vacation. Indeed, when Druzhba was built in 1986, designed by famous Soviet architect Igor Vasilevsky, the structure was so ominous that Turkish spies assumed it was a secret military building.
They couldn't have been more wrong. The building, whose name means "friendship," is actually a health spa.
The strange spaceship design allows for private balconies and amazing views from most of the rooms. On the outside it appears harsh and blocky, but on the inside the circular spa is open, light, and airy. The center of the building contains social amenities like a saltwater pool, a cinema, and cafes.
The artistic concept draws heavily from science fiction depictions of futuristic cities, as well as the Soviet architectural tendencies to maximize public space in a space age style. Vasilevsky credited flying saucers and time machines as inspiration for his design.
Today the spa remains open and as popular as ever, catering mostly to Russian tourists.
The National Archives building in Washington, D.C. houses some of the United States' most foundational texts, including the Constitution, the Bill of Rights, and the Declaration of Independence. These three documents are collectively known as the Charters of Freedom, and could be the most closely guarded pieces of paper on the planet.
During the day these important texts are available for public viewing under bulletproof glass and constant guard. But every night (and at the press of a button, should the need arise) a special elevator pulls them underground into a custom-built armored vault.
The original vault was built in 1953 by the Mosler Safe Company. The firm was the logical choice, having previously taken on notable achievements like the gold bullion vault at Fort Knox, and a bank vault in Hiroshima that survived an atomic bomb.
The original, 55-ton Mosler Vault was the size of a walk-in closet and employed a 20-foot scissor jack to raise and lower the Charters of Freedom; a 1953 documentary shows the lift in operation. The Mosler Vault was replaced in the early 2000s as the National Archives underwent a major $110 million renovation. The current vault, designed by Diebold, is still shrouded in secrecy.
In 1908, the U.S. suffered its first outbreak of a horrendous disease called pellagra. The nation’s first response? Arguing about cornbread recipes.
The pellagra outbreak was confined to the South, which happened to be the only region of the country where people ate large quantities of cornmeal. Those two facts were thought to be related, and suspicion fell quickly upon cornbread as a vector of disease.
Southerners grew defensive. Corn itself wasn’t the problem, they said: Pellagra emerged instead from faulty ways of growing corn, or grinding meal, or mixing dough. Most important of all, they said, was who did the baking.
Like so many problems in America, the pellagra epidemic was tangled up with slavery, racism, and poverty. Unlike most of those problems, its solution was thought to lie in fingerprints embedded in the crust of corn bread.
Allow me to explain. According to the USDA, in the first half of the 20th century, families in the North, whether rich or poor, ate just a few ounces of cornmeal per week. By contrast, the poorest farm families in the South consumed as much as 12 pounds of cornmeal a week, while the richest consumed 8 or 9 pounds.
How do we account for this? For the most part, the reason was poverty. Incomes in the South were far lower than those in the North: In the richest state, New York, the average worker earned $929 a year. In the poorest, Alabama, he earned $321. To stretch that income, you bought corn. In 1909, 25 cents would buy you 7 pounds of wheat flour—but 10 pounds of cornmeal. Why was the South corn-fed? Because cornmeal was cheap, and most Southerners were very poor. Many subsisted on little more than cornmeal and molasses.
So let’s refine the question: Why did wealthy Southerners eat cornmeal?
A clue lies in a dreadful, anonymous poem called “The Cornbread Country,” first published in the Baltimore Sun and then widely reprinted across the South:
Oh, for the cornbread country,
The jasmine land I see,
Down there in the dreams of Jackson,
Down there with the friends of Lee.
Indeed, for the past two centuries, the North has recognized the South, and the South has recognized itself, as the land of corn-eaters. I speak not of corn-on-the-cob but of the many items made from ground corn: corn pone, corn pudding, corn dodgers, corn cakes, cracklin’ bread, johnny cakes, hoe cakes, grits, hasty pudding, and spoon bread. Southerners in the U.S. have long embraced corn-eating as a matter of identity. In doing so, they even occasionally weaponized cornbread for use in ideological battle.
The skirmish I’m referring to took place in 1909, just after pellagra was first diagnosed in the U.S.
It is a terrible disease: a blistering rash followed by diarrhea, dementia, and death. It had first been identified nearly two centuries earlier—first in Spain, then in Italy—and it was most common in areas where people survived on a diet of corn. In the U.S., pellagra likely killed 100,000 people and sickened 3 million in the early 20th century.
We know now that pellagra is caused by a dietary deficiency of niacin, which is absent from cornmeal but found in fresh meat, milk, eggs, and nuts. But pellagra’s true dietary origins weren’t widely accepted until the late 1920s. When the disease first emerged in the U.S., most doctors believed that it was somehow caused by the cornmeal that was central to the Southern diet. According to one newspaper, “the panic has reached such a stage that … corn pone and corn cake have gone out of fashion.”
Most Southerners, though, weren’t ready to give up their cornbread. They had their own theory about the disease. In Americus, Georgia, a grocer told the Times-Recorder, “Practically every bushel of meal sold here … is ground from Western corn.” By Western he meant what we would call Midwestern—the Corn Belt. Georgians once grew their own corn, but now they planted cotton right up to their doorsteps, and bought cheap imported corn. On its journey from the West to the South, the corn spoiled and became toxic, and those who ate it developed pellagra. So went the theory, at least. As the New York Sun put it at the time, Southerners saw diseased corn as “a sectional conspiracy against the South.”
By 1909, Southerners had absorbed some hard lessons about sectional conflict. Rather than retaliate with force, they looked inward. By buying Midwestern corn, they had despoiled their heritage. How could Southern culture cure itself? By returning to the old ways.
And thus news articles about the horrors of pellagra soon seamlessly transformed themselves into lifestyle pieces about the proper techniques for making cornbread. The Montgomery Advertiser in Alabama insisted that badly made cornbread “may produce pellagra or anything else. It is not cornbread.” A writer in the Charlotte Daily Observer agreed. “Corn meal … mixed up with milk, eggs and soda with a spoon and baked in a stove … ought to cause just such ailments as is charged to it. It is a clear case of retribution on the part of the bread.” Proper cornbread contained meal from a local mill, salt, and spring water—nothing more.
And the loaves must be shaped by hand: “The prints of the fingers are left in longitudinal corrugations,” according to the Montgomery Advertiser. “The absence of finger marks is just grounds for suspicion.” The Daily Observer agreed that the cook must carefully shape the pones, “leaving fingerprints on each.”
Those fingerprints served as evidence of who was missing: The cooks of the Old South. The Macon Daily Telegraph explained that real cornbread required “a hickory wood fire, an iron skillet and lid, and an old negro mammy. … She will mix the meal and water, fashion it into pones in her hands, drop the pones into the hot skillet, [and] pat them with her hands.”
The Civil War, by freeing enslaved cooks, had deprived white Southerners of proper bread. And it wasn’t just the cook who was missing—it was an entire social fabric, imagined through a fanciful vision of antebellum racial harmony known as the “Lost Cause”—the belief that Southern ideals had been sanctified by the blood of the fallen, that slavery civilized the enslaved, that God had ordained white supremacy.
In 1909, the same year Southerners panicked about pellagra and cornmeal, the NAACP was founded—a response to, among other crimes, the lynching of more than 900 black men over the previous decade. Racial order in the South was enforced through terror, and justified through storytelling. When newspaper editors celebrated old-fashioned cornbread and lamented the disappearance of enslaved cooks, they buttressed the myths of a happy antebellum South.
Those myths obscured violence—the overt violence of lynching, and the quieter violence of economic exploitation through the sharecropping and tenant farming systems. Poor families spent 40 to 50 percent of their income on food—at least when they had income. Spikes in pellagra tracked the years of economic troubles and crop failures—1909, 1915, 1921, 1930. In 1922, the Charlotte Chronicle noted that tenant farmers had been “compelled to return to … corn bread and molasses for most meals.” The result was yet another pellagra epidemic. As newspaper editors lamented the loss of black cooks, the children and grandchildren of those cooks died of malnutrition.
Eventually, the U.S. halted pellagra. The cure didn’t require the return of black cooks or corn pones decorated with fingerprints. It did, however, involve a new recipe for bread—just not the antebellum-style corn pone editorialists had promoted. The key, instead, was a new ingredient: State and federal laws required that niacin be added to commercial meals and flours, so the pellagra-preventing nutrient was baked into every loaf. (When you buy “enriched” bread today, you are eating a legacy of the pellagra epidemic.)
Through public health laws, the U.S. wiped out pellagra. But we left in place a social system that makes people vulnerable to a new array of nutritional diseases. Eventually, the same quality that made corn an ideal crop for so many generations of Americans—abundant harvests—made it the perfect crop for industrial agriculture, where it yielded raw materials for processed foods. By the late 20th century everyone in America—and many others around the world—started eating a great deal of corn, not as cornbread but as corn-fattened beef and pork, corn oil, and corn syrup. We are all corn-fed now, and the result is an entirely new set of nutritional challenges.
The World Health Organization recently called on nations to tax sugary drinks and subsidize fresh fruits and vegetables. The symptoms of our current malnutrition are not dermatitis and dementia but hypertension and heart disease. The costs—in medical expenses, lost productivity, and human misery—are enormous, and they are borne disproportionately by the poor.
America is a wealthy country, with plenty of food to go around, but the bounty has never been shared. Pellagra, like scurvy or beriberi, is known as a “deficiency” disease—you get it from the lack of a certain dietary nutrient. But the root cause of public health disasters, then and now, is not the lack of certain nutrients. It is a deficiency of justice.
Dr. Joseph Goldberger, the New Yorker who solved the mystery of pellagra in the 1910s and 1920s, once examined an asylum in Milledgeville, Georgia, where pellagra was epidemic among patients—but nonexistent among staff. Doctors earlier had ruled out diet as a cause because staff and patients ate at the same cafeteria. Goldberger noted, though, that the staff ate first. The fresh meat and milk disappeared before the patients dined. It took an outsider to point out that the common meal was not equally distributed.
On a street in Beijing, an elderly Chinese man blows into a hollow lump of hot, melted sugar. A minute or two later, he holds a bunny. Or a giraffe. Or a goldfish. All are shaped via the traditional art of sugar blowing, an increasingly rare sight on the streets of China.
Like glass blowing, the sugar version involves inflating a molten blob into a bubble, then shaping it before it cools. Sugar blowers tend to make animals, with creatures from the Chinese zodiac—dragons, rabbits, pigs, and monkeys—being especially popular. Though the finished creations are edible, they are considered to be art rather than food.
“Living in a bubble” is typically associated with people sheltered from the perils of society, but in southeastern France, the concept is taken to a new, literal level. Every night, tourists in the Marseille area swap a conventional hotel for a transparent, climate-controlled bubble.
The Attrap’Rêves Bubble Hotel in Allauch, France features a field of plastic, soundproof pods in a forested environment, giving guests an unobstructed, 360-degree view of the surrounding nature. It’s the ultimate way to immerse yourself in the woods—from the luxury of a king-sized bed.
Each bubble is a mere 13 feet in diameter and costs just over $100 per night. The glorified fishbowls feature tables for two, electricity, showers, and scenic viewpoints in all directions. A silent wind-blower keeps the cozy cocoon inflated, and a telescope in each room lets lovers gaze at the stars.
In addition to Allauch, Attrap’Rêves Bubble Hotels can be found in four other locations throughout southeastern France, all of which ask guests to sacrifice their personal privacy for a magical view.
In the 1950s, Japanese women seeking a new life in America had to learn about more than just visa requirements. They also had to learn how to cook hamburgers, entertain neighbors, and confidently walk in high heels. Eyeliner application was, apparently, a vital skill.
These immigrants weren’t just any women. They were the “war brides” of American G.I.s, and some of them learned these lessons at the American Red Cross, which ran schools designed to prepare them for domestic life in the United States.
The American Red Cross Bride Schools sprang up in response to the wave of marriages between American soldiers and Japanese citizens following World War II. Thousands of G.I.s were stationed in Japan during the postwar Allied occupation, which led to many romances with local women. Although the statistics vary, scholars estimate somewhere between 30,000 and 50,000 such marriages took place through the 1950s.
Getting hitched was a headache, considering the stack of documents any Japanese woman had to provide to wed an American soldier. The complicated process took some couples over a year to complete—and it was especially hard on the women. Military officials thoroughly investigated them and their families for any trace of “Communism, venereal diseases, tuberculosis, or anything that would label her as undesirable.” The husband merely needed to prove he was single, a U.S. citizen, and willing to provide financial support.
But the paperwork wasn’t the only problem. Most war brides had little to no concept of what the United States was like. What they did know came from movies and whatever their husbands told them. So the American Red Cross offered lessons on what women could really expect in the U.S. These “bride schools” opened in cities like Tokyo, Sendai, and Yokohama beginning in 1951.
The instructors were usually American wives of stationed military men. Their lessons covered cooking, baby care, etiquette, and everything in between—but despite the educational intentions, the schools took on an unmistakably patronizing tone. “The war bride schools are a great vehicle for neatly encapsulating what we thought of ourselves as Americans at that time and place,” says Lucy Craft, a co-director of the documentary Fall Seven Times, Get Up Eight: The Japanese War Brides. “We won the war and decided that not only had we won the war, but that everything about us was superior to every other civilization, particularly the people who lost the war.”
This superiority complex is evident in a 1952 Saturday Evening Post article reporting on the American Red Cross Bride Schools. The story is full of anecdotes about clueless Japanese students wearing too many slips under their dresses and slapping raw fish right on the stove. “They’ve been children in a nation’s defeat, have gone hungry, have cared for smaller brothers and sisters with the aid of a couple of old kimono sleeves in contrast to the dozens of diapers they’re now given for their own children,” the Post wrote. “Some are quick, some stupid, many average.”
In order to teach their pupils how to be Americans, the instructors had to emphasize the country’s gender roles. And as so many disappointed Rosie the Riveters learned after V.E. Day, America wanted its women back in the home. These Japanese war brides were destined to be housewives, just like their instructors. A typical class might include a tutorial on washing machines, or how to get crisp hospital corners when making the bed. Topics like U.S. history were covered. But cooking was perhaps the biggest part of the curriculum.
“I have a Japanese war bride mother and I grew up with a culinary repertoire of Sloppy Joes, pineapple upside-down cake, spaghetti, and tuna casserole,” says Elena Creef, a professor of women’s and gender studies at Wellesley College. “It’s kind of hysterical. How did my mother learn to master these really basic, slightly awful all-American dishes? Well, she was trained. I guess those lessons were taught very well.”
Classes typically ran for three to six weeks, ending with a graduation ceremony during which students received diplomas. It’s difficult to gauge exactly how many Japanese women completed the American Red Cross Bride School training. In a 1956 article, Ebony reported that “more than 2,000” had enrolled. Japanese scholar Shigeyoshi Yasutomi put the number at 4,500 to 5,000 graduates in 2015. Either figure would constitute just a small fraction of the total number of Japanese war brides. Those who never attended the Red Cross classes turned instead to texts (like The American Way of Housekeeping), their husbands, or their eventual American neighbors.
So did these women actually learn from the bride schools, or was it all paternalistic nonsense? Creef insists that, although her mother recalls the classes “with a great deal of laughter, because of the insulting irony,” the American Red Cross was “fulfilling a need.” Craft says the few bride school students she interviewed had positive experiences, while noting that her war bride mother “had zero interest in it.”
The American Red Cross replicated this model across the world to aid all kinds of war brides in their move to the United States—even the ones who already spoke the language. But few crystallized the problems of the postwar globe as clearly as the schools in Japan. The war brides there left believing they’d have more opportunity in America than their economically depressed, spiritually defeated nation could offer. Yet as Craft notes, “Going to a new country just meant that instead of using a broom to clean your house, you got to use a vacuum cleaner. That was your option, not choosing whether to work inside or outside the home.”
It's easy to miss the ghost-white plaster hand that rests under a plexiglass box at D’Arcy McGee’s in Ottawa, Canada, where it sits at the top of a small flight of stairs that constitutes the bar’s entrance. The "death hand," as it is known, frequently goes unnoticed among the swarms of politicians and government workers drinking pints after a long day on Parliament Hill.
But the hand itself is no random curiosity; it’s a re-creation of the hand of the pub's namesake, Thomas D'Arcy McGee, an Irish revolutionary turned Canadian Father of Confederation. It was cast after his assassination in April 1868, just months after Canada officially became a self-governing dominion under British rule, a status he had helped lay the foundation for.
By the time of McGee’s killing, death masks had become a common Victorian practice, used either to commemorate the dead or to help solve crimes, serving, in many cases, the same function as crime-scene photography. But McGee was shot in the head, making his face unrecognizable and forcing castmakers to cast the next best thing: his hands.
The one at D’Arcy McGee’s is actually a copy; the original sits, not far away, at the Bytown Museum, which took possession of it in 1920. Which is fitting in its own way, as D'Arcy McGee's itself isn't very old, having been established in 1996, with the entire bar designed and built in Ireland before being sent to Canada to be reassembled, not unlike McGee himself.
McGee was born in Ireland in 1825, later coming to the United States as a teenager and first making his name as a newspaper editor. In 1845 he returned to his homeland, only to flee again after a warrant was issued for his arrest following his involvement in the Young Irelander Rebellion of 1848.
Back in the U.S., McGee again took up writing and editing, before moving to Canada in 1857, where he became a Canadian nationalist, and, eventually, member of Parliament.
But just a year after he was elected to office, on April 7, 1868, he was assassinated. An Ottawa journalist named Patrick J. Whelan was later convicted of the crime; several witnesses testified that Whelan had professed his hatred for McGee and had planned to one day kill him over political disagreements. Whelan, for his part, admitted to being present at the assassination but denied pulling the trigger. He was hanged on February 11, 1869, before a crowd of thousands.
McGee’s wife considered him ugly by the standards of the time, which he made up for with his pen, as a prolific writer and poet. That also makes McGee's hand, and not his face, an appropriate legacy, even if most drinkers at D'Arcy McGee's may not notice it at all.
In the Baltimore Chief Medical Examiner's Office, the largest of its kind in the United States, one room has seen more violence than any other room in the city.
The Medical Examiner's Office is best known for housing the Nutshell Studies of Unexplained Death, the faithfully recreated murder scenes in miniature that helped to further forensic science. Though the Nutshell Studies are still used to train detectives, down the hall is their life-sized counterpart: the Scarpetta House, used to train forensic investigators by staging bloody scenes based on real crimes.
One half of a large room in the office is set up as a small model house with white wooden siding. Its furnishings, though bland, are all real, and little details are added to the scenery depending on the case: a box of cereal atop the fridge, children's toys scattered around the bedroom floor, a trash can full of garbage. The scenes encompass anything and everything medical examiners might encounter in the field. Mothers and children murdered in their beds, mass cult suicides—it has all happened in the Scarpetta House.
The space was donated by mystery novelist Patricia Cornwell and named after Dr. Kay Scarpetta, a medical examiner character in Cornwell's books. Occasionally volunteers are required when the victim count exceeds the number of mannequins on hand or when live actors are needed. A little secret is that these are often the teenage children of the doctors who work in the Chief Medical Examiner’s Office.
In the Wan Chai District of Hong Kong Island, a chorus of smacking shoes reverberates against the underpass of the Canal Road Flyover. Here, people take vengeance on their villains by seeking out a group of primarily elderly women known as the "villain-hitters."
The villain-hitters of Hong Kong have been placing curses on rivals and foes for more than 50 years. Deriving their actions from a centuries-old folk religion in southern China, the villain-hitters are paid to perform an enemy-hexing ritual that requires beating a long strip of paper called "villain paper" with a client's old slipper or shoe.
"I don't really 'hit' people, I just scare away petty spirits and other nasty things," said the villain-hitter, Grandma Yeung, in the South China Morning Post video above.
While rituals vary, generally the villain-hitter will chant a series of incantations, burn incense, throw divination blocks, and make tributes and prayers to different gods. A client can write down the name or information about the targeted villain on the paper effigy. Some even bring a photo of the person they want punished—whether that's an ex-lover or a political leader. Clients can also pay to curse general villains and drive away evil spirits, explains South China Morning Post. The villain paper is pounded until nothing but scraps is left.
“People want to weaken others through villain hitting to achieve peace of mind," villain-hitter Wong Gat-lei told The Guardian. "But it’s more about achieving peace of mind by releasing your anger.”
In addition to getting back at enemies, villain-hitters cast healing spells, help souls cross over, and contact the deceased, reported The Guardian. Yeung, who has been a villain-hitter for 10 years, said that her clients seek jobs, have sick loved ones, are facing lawsuits, or have cheating spouses. She even admitted that her practice is a scam, her self-taught incantations nothing but rubbish.
"You can come to 'hit' other people, but other people can curse you too," said Yeung. "After all, it's not good to hit people."
As villains go, Fantômas is a nasty one. Created in 1911, he is a gentleman criminal who perpetrates gruesome, elaborate crimes with no clear motivation. He hangs a victim inside a church bell so that when it rings blood rains on the congregation below. He attempts to kill Juve, the detective on his trail, by trapping the man in a room that slowly fills with sand. He skins a victim and makes gloves from the dead man’s hands in order to leave the corpse’s fingerprints all over the scene of a new crime.
His creators called him the “Genius of Evil” and the “Lord of Terror,” but he remained a cipher with so many identities that often only Juve would recognize him. The book that first introduces him begins with a voice asking: Who is Fantômas? There’s no real answer:
"Nobody.... And yet, yes, it is somebody!"
"And what does the somebody do?"
"Spreads terror!"
But Fantômas was incredibly popular in his day—a now-obscure villain who helped define fictional bad guys for the 20th century. His influence shows up everywhere from surrealist paintings to Hitchcock movies and the X-Men comics. Fantômas was mysterious enough that he could be reinvented many times over. But in all those iterations, no one quite recaptured the pure, chaotic evil that defined the original character.
Fantômas was created by two writers in Paris, Pierre Souvestre and Marcel Allain, who first started working together as journalists covering the nascent car culture of the early 20th century. They would sometimes fill space with dashed-off detective stories, which attracted the attention of a publisher trying to make it rich on mass-market fiction. He hired Souvestre and Allain to write a series of gripping novels; their contract required them to produce one a month. They invented Fantômas on the way to their meeting with the publisher and spent the next three years churning out fantastic stories about their arch-villain.
Fantômas was most easily characterized by his crimes, which were aggressively anti-social. He stole; he dissembled; he killed frequently and almost indiscriminately. In one story, a broken wall starts spewing blood from the many victims hidden there. His motivation seems to be the joy of the crime itself.
As a character, he has few distinguishing features. Even in the original books, Fantômas’s identity is malleable. He changes aliases many times over, and often only Juve, the detective obsessed with him, would recognize him in his new guise. He’s so mysterious that at times it seems, as the scholar Robin Walz wrote, that Juve might have made him up, or be ascribing the crimes of many men to one fabricated villain. When Fantômas does appear as himself, he’s shrouded in black, and a mask obscures his face. “At the end of a thirty-two book cycle Fantômas remains as much a mystery as at the start,” wrote film scholar David Kalat.
This shadowy villain, though, captured the hearts and minds of the French public in the early 1910s. The book series was an immediate hit, as audiences devoured the crime stories, as over-the-top as they were. Film companies battled for the production rights, and within a few years Fantômas had his first reinvention, as the subject of a series of silent films. The books were published with great success in Italy and Spain, where in 1915 Fantômas became the subject of a musical. In the years before World War I, Fantômas was everywhere.
From the beginning of his existence, Fantômas attracted unexpected fans who recruited him for their own purposes. Guillaume Apollinaire, the experimental poet, loved the series: he called it “one of the richest works that exists.” He and the poet Max Jacob started a fan club, La Société des Amis de Fantômas, the Friends of Fantômas Society. The Surrealist movement that followed in their footsteps became obsessed with Fantômas, and René Magritte once recreated the cover of the first novel as a painting. It was a crime of his own—a theft of the original art.
The Surrealists were so attracted to Fantômas in part because his world accorded with the one they were creating in their art. It followed its own logic rather than the rational and buttoned-up rules of polite society. In one Fantômas film, Juve seizes Fantômas at a restaurant, only to find himself holding a pair of fake arms—the villain had escaped! “But how come Fantômas just happened to have a spare set of fake arms with him at the time? If you need to ask questions like these, the magic of Fantômas will elude you,” Kalat wrote. The Surrealists loved it.
Because the original Fantômas series was so popular, it quickly spread across Europe, to Italy, Spain, England, Germany, and Russia, as film scholar Federico Pagello documented. He was one of the first arch-villains to make it into the movies, and the film series starring him was directed by Louis Feuillade, who pioneered the thriller genre of movies. The Fantômas series was one of his first big projects, and in it he experimented with storytelling techniques he’d use in his famous Les Vampires, which features a whole gang of Fantômas-like villains, dressed all in black. The techniques Feuillade invented influenced Fritz Lang, the director most famous for Metropolis, and, in turn, Alfred Hitchcock.
As thrillers grew as a genre, Fantômas and his imitators were spreading around the world. In Italy the character Za la Mort took up the Fantômas mantle; in England, a director created Ultus, who was meant to be a conscious copy of Fantômas. After the real-life evils of World War II, Fantômas’ extravagant villainy had less appeal, though, and he went quiet until the ‘60s, when he was revived in a French movie series, a Turkish movie, and an Italian comic book, as Diabolik. In 1975, a Spanish movie, Fantomas Versus the Multinational Vampires, was made in homage to Feuillade.
Even from his first reimagining, though, when the novels were turned into films, Fantômas was softened. “On the film poster, the arch-villain's kid-gloved right hand was merely a clenched fist, whereas on the cover of the novel he held a deadly dagger,” wrote Walz. The plot changed, too: in the original story, Fantômas escapes execution by having an actor play his role, and the actor is beheaded before anyone realizes the mistake. In the movie, Juve figures out the plot before the actor is killed and saves his life.
More often, though, Fantômas is given a valiant motivation. The director who transformed him into Ultus considered Fantômas a Robin Hood character, with noble motivations, Pagello wrote. When Fantômas came to the U.S., he was cast as more of a gentleman thief than a black-hearted nihilist. When he was revived as the star of a series of Mexican comic books in the 1970s, Fantômas was more of a hero than a villain; in the X-Men comics, where a character named Fantomex first appeared in 2002, he tries to act as a good-hearted thief but is quickly revealed to have been created as part of a government weapons program.
Even though Fantômas was an iconic villain early in the 20th century, he was too evil to survive in his original form. Writers preferred to make their villains a little bit more knowable, a little bit more rational, and, ultimately, a little less dark.
In Arizona’s Tator Hills, the Arizona Geological Survey has located a giant fissure in the earth. It’s two miles long; according to a local news station, that’s a half mile longer than any other fissure in the area. These cracks in the ground can be dangerous and unstable, but the geological survey was able to explore the length of the fissure using a drone, which captured the footage above.
Fissures like this one first started appearing in the Arizona desert in the early decades of the 20th century. They’re created when people pump water from ground aquifers faster than the aquifers can replenish. In some places in Arizona, groundwater’s been drawn from the earth 500 times faster than the aquifer’s rate of renewal. When the water disappears, the ground subsides into the empty space, and fissures form at the edges of alluvial basins or at places where bedrock is close to the surface. In Tator Hills, fissures first appeared in 1977; there are now 11 miles of fissures in the area, according to the Arizona Geological Survey.
These fissures can be deep and dangerous to humans. They can often widen and deepen suddenly, particularly after heavy rains. They’ve ruined houses and highways, and for about a decade the Arizona government has been mandated to identify the locations of fissures and share that information with the public. Even after fissures are located, though, there’s not much to do about them, besides stay away.
It’s no secret that the KGB used assassination, often by poison, to silence political dissidents who spoke out against the Soviet regime; such operations were known within the agency as "liquid affairs." What remains shrouded in secrecy to this day, however, is the mysterious laboratory where the Soviets invented new methods of poisoning enemies of the state without leaving a trace.
The Soviet Union’s secret poison factory was established in 1921, not long after an attempted assassination of Vladimir Lenin via poison-coated bullets. Originally dubbed the “Special Room,” it was later called Laboratory No. 1, Lab X, and Laboratory No. 12 before becoming known simply as the Kamera or "the Chamber" under Joseph Stalin.
When it comes to murdering your enemies for political power, poison has been a popular weapon of choice since ancient times. But the goal of the Chamber was to devise a poison that was tasteless, odorless, and could not be detected in an autopsy, so as to protect the anonymity of the assassin. This led to such innovations as a cyanide that could be deployed as a mist, a poison that made the cause of death appear to be a heart attack, and a gas pistol that could shoot liquid up to 65 feet away. One politician was killed by a poison sprayed onto his reading lamp; the heat from the bulb dispersed it through the room without a trace.
As for the lab itself, very little is known to this day, including the exact location. KGB agents were not allowed to enter the lab or ever told of its whereabouts; only Chamber staff and high-level officials were allowed in. Some disturbing details were revealed in 1954 by a KGB defector, who admitted that poisons were tested on political prisoners and described the lab as being near the secret police headquarters in Lubyanka.
The Soviet government, for its part, had claimed just the previous year that the lab had been abolished. But many believe it may still be functioning in some form today, the lethal innovations developed there still in use. Though it’s been some 30 years since the fall of the Soviet Union, even within the last decade enemies of the Kremlin have been found dead in mysterious circumstances, including some, apparently, by poison.
In 1972, the graphic novelist Art Spiegelman was asked to draw something for an animal-themed comic book. As he brainstormed his submission, he recalled in a 2011 interview, he searched for a way to zoomorphize a seminal horror: "Nazis chasing Jews, as they had in my childhood nightmares."
He dove into all the available archives, looking for inspiration. "As I began to do more detailed and more finely grained research," he said, "I found how regularly Jews were represented literally as rats… posters of killing the vermin and making them flee were part of the overarching metaphor." In Nazi propaganda, Jewish people were rats. In Spiegelman's artwork—which eventually became the enduring Holocaust epic, Maus—they would be mice.
This Everymouse proved popular: "Aesop's fables traveled the world [from Greece] and were reinterpreted by different cultures," says Owen. Soon, mice were rescuing elephants in India's Panchatantra, and befriending crows in the Middle East's Kalīlah wa Dimnah.
Although they didn't show up much in Aesop, ancient rats alternated between wreaking havoc and teaching life lessons. "From a cultural point of view, the rat is a highly charged figure that can warn and threaten, yet also bring salvation and good fortune," writes Jonathan Burt in Rat. In the Old Testament, rats are unclean, unfit for touching or eating. But in Ancient Greece and Rome, a group of rats was a portent, signifying joy and plenty. In India, they were considered helpful, and mythological rats would gnaw people or other animals out of tricky situations.
Rats and mice on their own are one thing, but when the two appear together, they invite comparison. "I think often, certainly in the past, these two animals lived right around each other," says Matthew Combs, a doctoral student at Fordham University who focuses on the brown rat. "One house would have to deal with both problems. You have your mice some nights and your rats some nights… it makes sense to compare them, and to turn them into characters."
In such a scenario, says Combs, mice are going to win the public opinion poll. Your average mouse eats two or three grams of food per day—a crumb-sized amount—while a rat needs 30 to 50 grams, a human portion. They also pursue opposite strategies for getting this food: "I almost think about mice as these little borrowers, sort of benign," says Combs. "Rats will disassemble the container that you built to keep them out, and rip food apart." Where a mouse makes a demure mess, perhaps a neat hole in a box of crackers, a rat will leave you with an anarchic one—a ripped-up box with the crackers all gone, and a screw-you smattering of droppings.
If you happen to catch a glimpse of either perpetrator, it won't help the rat's cause. "Looking at rats makes people uncomfortable," says Combs. Where mice have proportionately large ears and heads—both of which, to humans, code for "cute"—rats have small heads, small ears, and large bodies. Combs and Owen agree that the tail is the worst part. "It doesn't really match with the body you look at," says Combs. "It almost looks like human skin, but it's much more gross." Rats, especially city rats, are also more likely to get scabby and lose their fur. "That beat-up look shows up in stories and characters," says Combs—like Ratigan, the villain of The Great Mouse Detective, who grows increasingly mangy as his evil plots advance.
When fictional mice evolve, it's often in the other direction. As Stephen Jay Gould pointed out in "A Biological Homage to Mickey Mouse," Mickey's eyes, ears, and snout got larger and more rounded as the Disney brand became more overtly family-friendly.
These physical characteristics affect how we interpret rat and mouse behavior. If you corner a rat, it might leap at you, and take a chunk out of you with its impressive teeth. If you corner a mouse, it will scamper off and hide—objectively cowardly, but courageous in context. "There are little things they do that are actually quite brave when you consider their environment, and how low they are on the food chain," Owen points out. "They have so many predators, but they still run around." Small creatures who take risks make great role models for human children, which likely explains everything from C. S. Lewis's warrior mouse, Reepicheep, to E.B. White's adventurous Stuart Little.
Of course, these hero-mice require human authors, who can amp up some of their natural characteristics while downplaying others. And rats, too, have attributes that deserve a more positive spin, says Combs. Despite loner literary rats like Templeton of Charlotte's Web, real rats are very social, he says. "They'll do lots of play-fighting and grooming and touching, and a lot of affectionate behaviors. You could cast them that way, but often that's not what we're given."
Their intelligence, evidenced by their skill at breaking into food stores and out of traps, is often spun as a sort of sinister cleverness, rather than admirable smarts. In Brian Jacques's Redwall series, for instance, Methuselah the old mouse is wise and learned, while Cluny the Scourge, an evil rat, is conniving, even insane.
But even the oldest tropes get nibbled through eventually. Combs and Owen see rodent reputations slowly changing, both in quantity and quality. Contemporary mouse storytelling is becoming, to Owen's trained eye, "a bit repetitive," while rats are swarming in to fill the void: "In 20th century literature, you have rats more than mice," she says, citing Dostoevsky's Notes from Underground, Orwell's 1984, and Camus's The Plague.
As the 21st century scampers on, rat heroes are moving into the spotlight. "Recently, people are a little more accepting of rats as having some good qualities," says Combs. "There's movies like Ratatouille. And there's all this research where they're using rats as models for human physiology and human medicine." A recent study shows that, when tickled, rats giggle and jump around. Although scientists could have cast this response as a malevolent cackle, they didn't—and public response was swift and positive, says Combs. Maybe there's room for the rat in the hero's seat after all.
Over a week ago, a surfer later identified as a Japanese man named Toru was plucked from the ocean by a cargo ship, having spent 16 hours alone on his surfboard some three-and-a-half miles off the coast of Bulli, which is about 40 miles south of Sydney.
But this week, in speaking to the media for the first time since the ordeal, Toru thought he would clear some things up. It wasn't a current, he explained, but his attraction to fear and the Moon that led him to paddle out that far. (The video above is Toru singing a song to a local reporter, who was there to interview him.)
"Scary is a very important feeling … I like to fight against scary, [to] fight against the enemy inside," he told ABC.
Toru claims he gets by on busking, and, for now, is camping on a beach not far from where he set out paddling that night.
He spoke freely with the media when asked about his "beautiful" experience on the water, but said that he would be sticking to day-time surfing for now.
Police also advised against trying to recreate Toru's adventure for yourself.
Just a two-hour drive from Johannesburg towers a giant castle, accessible through a path lined with a herd of elephant statues. This castle's main fountain spouts water out of the horns of sable antelope, and the courtyard contains a life-sized replica of Shawu, who had some of the longest tusks of any elephant ever discovered in South Africa’s Kruger National Park.
This safari-themed castle in the South African bush is the Sun City Resort, a marvelous palace with a bizarre geopolitical history.
Sun City Resort was built in 1979 by business magnate Sol Kerzner just northwest of Johannesburg. At the time, this region was part of Bophuthatswana, a nominally independent homeland created under apartheid. Kerzner chose the controversial region as the site of his palace because it was the nearest place to Johannesburg that permitted topless venues.
Kerzner had always dreamed of building a “mythical royal residence built by a lost tribe” in the middle of the South African bush, and that is just what he did—Kerzner’s brainchild would make both Indiana Jones and Walt Disney proud. In addition to the countless statues of safari animals, the “Big Five”—lions, leopards, rhinos, elephants, and Cape buffalo—can be found in living form at the resort. In the adjacent Pilanesberg Game Reserve, visitors to the resort can drive their own cars through the African bush.
Past the majestic Palace of the Lost City and the sandy poolside beach, visitors can explore the largest maze in the entire Southern Hemisphere, a 300,000-piece mosaic of a lion killing a wildebeest, lamps held up by tiny monkey statues, and a 1.25-mile zipline, the fastest in the world.