The most influential freshly departed you’ve never heard of

People who died: 2007

When Karlheinz Stockhausen died December 5, we thought immediately of G.K. Chesterton’s well-worn quip that, “Journalism consists largely in saying, ‘Lord Jones is dead’ to people who never knew Lord Jones was alive.” Serious newspapers across the globe marked the composer’s death with lengthy obituaries touting his vast influence on modern music, which, indeed, can’t be overstated. His radical experiments with electronics and his unorthodox composition, arrangement, and performance techniques shaped the 20th-century avant-garde as well as the musical consciousness of pop icons such as the Beatles, Miles Davis, and Frank Zappa, to name a few. And yet few outside of the more rarefied cadres of intellectuals and music nerds have ever heard his music, unless by bizarre accident. (Nicolas Roeg used a snatch of “Hymnen” on the soundtrack to his 1971 film Walkabout, perhaps Stockhausen’s widest audience exposure ever.) He’s more widely known for a taken-out-of-context quip he made about the September 11 attacks than for any music he ever made. The world wouldn’t be the same without Stockhausen, but now that people know he’s dead, they still don’t know how.

That’s the sort of thing that inspired our annual salute to folks who somehow made a mark with their lives and yet didn’t have those lives properly appreciated once they were done. Plenty of tears and ink were spilled when Norman Mailer and Kurt Vonnegut passed, not to mention Ingmar Bergman and Michelangelo Antonioni, Andrew Hill and Max Roach, Beverly Sills and Luciano Pavarotti, Marcel Marceau and Deborah Kerr, Evel Knievel and Boris Yeltsin, Ike Turner and Pimp C, Anna Nicole Smith and Jerry Falwell.

But once again, we opt to celebrate a few lesser-known lives that ended over the past 12 months. (Ironically, Stockhausen was too famous.)

Lee Gardner

Used his noodle: Momofuku Ando

Imagine a world in which ramen, that simple and salty noodle soup most of us can get for 35 cents a package at any supermarket or convenience store in America, had to be purchased on the black market. That’s the world Momofuku Ando lived in, and the one that inspired him to find a way to get soup to the people.

Just after the end of World War II, Taiwan-born Ando was living in Japan and he saw people standing in line at a black-market stall where fresh ramen — Japanese noodle soup — was being made. Food shortages were plaguing the country and Ando decided to create an easy-to-prepare noodle soup that could be eaten anytime, anywhere, without need for scarce fresh ingredients.

Ando, whose expertise was in running clothing companies, formed the Nissin Food Products company in 1948. According to the Nissin web site, Ando’s corporate philosophy was “peace will come to the world when the people have enough to eat.” A decade later, in 1958, Nissin introduced its first ramen product: instant chicken ramen, the world’s first instant-noodle product, made by boiling ramen noodles with flavorings then deep-frying them with palm oil to dry them out.

Ando’s dehydrated noodle soup wasn’t cheap, at first; Japanese grocery stores sold their ramen at one-sixth of the cost of Ando’s revolutionary but pricey instant concoction, so Nissin’s ramen was initially sold as a luxury item. Despite that, people bought the product, and other companies soon followed in Ando’s footsteps, marketing their own versions of instant ramen products.

It wasn’t until the 1970s that Ando’s ideas became truly mainstream; that’s when Nissin Foods introduced Cup o’ Noodles, a cheaply produced, inexpensively priced chicken-flavored noodle soup packaged in a Styrofoam cup. The product was wildly popular overseas and introduced the instant-soup revolution to the United States.

Ando died of a heart attack January 5, but he lived long enough to see his company introduce a specially designed instant ramen called Space Ram for astronauts. In July 2005, Japanese astronaut Soichi Noguchi took the instant ramen product with him on a mission aboard the U.S. space shuttle Discovery. Ando was quoted as saying, “I’m happy I’ve realized my dream that noodles can go into space.”

Erin Sullivan

Something clicked: Robert Adler

If you’re like most Americans, when you get home you flop down on the couch, turn on the television, and start searching around to see if there’s anything good on. You perform those unthinking actions in that accustomed order in large part thanks to Robert Adler. Adler was the co-inventor of the first successful wireless TV remote control, a device that has, for better or worse, affected the lives of almost everyone living in the developed world.

Adler’s personal biography, as available to the casual researcher, is spartan and not atypical of many scientists of his generation. Born in Vienna, Austria, in 1913, Adler earned his Ph.D. in physics from the University of Vienna in 1937 before joining the exodus of affluent, educated European Jews to the United States. He went to work for Zenith in 1941 and stayed with the company for almost 60 years in various capacities, including director of research.

Like most American scientists at the time, Adler mostly worked to further “the war effort” during World War II, in his case improving aircraft radios. When the war ended, he and many of his colleagues turned to the then-emerging technology of television. It seems TV sets weren’t around long before manufacturers went looking for ways to prevent viewers from having to get up from their seats to operate them. Zenith introduced the first TV remote in 1950; the company called the bulky box connected to the set by a wire “Lazy Bones.” Even if the insulting name didn’t limit sales, the wire did, and the company introduced the wireless “Flashmatic” in 1955; the new iteration worked with a light beam that triggered photovoltaic cells, but the cells reacted to other light sources as well, making the setup unpredictable.

Adler had already introduced several improvements to basic television design before tackling the problem of a reliable remote control. Along with Eugene Polley, the man behind the Flashmatic, Adler wrestled with the problem until he came up with a solution: ultrasound. On his “Space Command” remote, introduced in 1956, the buttons cued tiny hammers to strike aluminum rods inside the housing; the resulting tones triggered vacuum tubes inside the TV set that changed channels, adjusted the volume, etc. The tones were too high-frequency for humans to hear, but the mechanism did make a faint clicking sound (hence “clicker”). It weighed half a pound and cost $100 (the equivalent would buy a wall-sized LCD flat screen in 2007 dollars), but people bought it. Remotes would eventually adopt transistorized ultrasonic signals and infrared technology (most common in remotes today), but Adler and Polley’s invention established the remote control as a viable, and desirable, feature.

Every bio of Adler mentions the Space Command as his claim to fame, but he enjoyed a long and distinguished career apart from that one soon-outmoded device. He published dozens of articles, won numerous awards and honors (including an Emmy), and kept innovating, including seminal work on acoustic-wave technology, the field that underpins another everyday device: the touch screen. He held more than 180 patents: His last was awarded in 2006 and related to touch-screen technology; he applied for another just two weeks before he died on February 15, at age 93. Otherwise, the scanty accounts of his nonlaboratory life mention a devoted wife, a pilot’s license, and an enthusiasm for the outdoors. He apparently watched little TV.

Lee Gardner

The Normal Heart (and Mind): Barbara Gittings

Growing up in Wilmington, Delaware, Barbara Gittings knew she was different early on. She found herself having crushes on female students at school; her father found out she was reading a novel about lesbians and wrote his daughter a letter asking her to burn the book. In 1949 she enrolled in Northwestern University as a theater major but flunked out after a year. Rather than studying theater, she had spent all her time researching what exactly it meant to be a lesbian.

“There was nobody I could talk to, so I went to the library for information,” Gittings said in a 1999 American Libraries magazine interview. “That was what I was raised to do, but it wasn’t very much help. I had to find bits and pieces under headings like ‘sexual perversion’ and ‘sexual aberration’ in books on abnormal psychology. I kept thinking, ‘It’s me they’re writing about, but it doesn’t feel like me at all.’”

In 1956, she traveled to California and met the founder of the Daughters of Bilitis, which would become the first national lesbian organization in the United States. Gittings had found her people and was asked to start a New York chapter two years later. She met her partner of 46 years, photographer and writer Kay Lahusen, at a Daughters of Bilitis picnic. She even became the editor of the national Daughters magazine, The Ladder, in 1963. Over the three years she ran the magazine, she changed the cover art from drawings of women to photographs of actual lesbians.

Gittings took part, in 1965, in what is believed to be the first protest at the White House for gay rights and continued to march in front of Philadelphia’s Independence Hall every Fourth of July throughout the ’60s. “It was called annual Reminder Day,” Gittings told Philadelphia City Paper in a 1999 interview. “The purpose was to remind the public that the guarantees of life, liberty, and the pursuit of happiness that are in the documents we celebrate on July 4 are not extended to gay people.” In 2005, a plaque was erected across the street from Independence Hall to commemorate the protests.

Perhaps Gittings’ greatest accomplishment was her work to get homosexuality removed from the American Psychiatric Association’s list of mental disorders. In 1972, she was invited to sit on a panel on the topic at an APA symposium. “My partner Kay said, ‘This isn’t right. Here you have two psychiatrists pitted against two gays, and what you really need is someone who is both,’” Gittings recalled while accepting an award from the APA in 2006. Finding someone willing to come forward proved difficult, but one gay psychiatrist finally agreed to speak, though he insisted on wearing a mask and wig and disguising his voice. He was called Dr. H. Anonymous and his testimony was seen by many as a turning point. The APA voted to take homosexuality off its list of mental illnesses a year later.

It was a major victory for the gay-rights movement, but Gittings continued working. She won numerous awards; she and her partner donated books, writings, and paraphernalia from the gay-rights struggle to libraries, to ensure that future generations won’t have to hunt for information about their history and identity as she did; and she continued speaking out even as she battled the breast cancer that ended her life February 18 at age 74.

Anna Ditkoff

Workers of the World: Ousmane Sembene

The inspiration to be drawn from Senegalese filmmaker Ousmane Sembene, who passed away June 9 at his home in Dakar after a protracted illness, isn’t so much what he achieved over his 84 years. Yes, he was a vanguard filmmaker who broke countless barriers — the first black African to make a feature-length movie in sub-Saharan Africa, the first African to win France’s Prix Jean Vigo, the first filmmaker to make a movie entirely in Wolof, the native language of his home — but he never set out to be the first anything. His works were merely the expressions of the political and personal ideals to which he dedicated his life. And that constant engagement is what is most impressive about Sembene. He was an artist who tirelessly pursued his creative and intellectual faculties, not publishing his first novel until he was in his early 30s and not taking up filmmaking, for which he is globally renowned, until his 40s.

The world in which he traveled before pursuing art obviously shaped his ideas. Sembene was born in Ziguinchor, the capital city of the Casamance region of southwest Senegal. The city is located on a river, and his father and grandfather were fishermen, the family trade Sembene entered after leaving formal school at 14. Seasickness prevented him from making fishing his livelihood, so he moved to Dakar in pursuit of work.

Over the next 20 years Sembene would earn his keep with the sweat of his labor and prime his mind in books, seminars, and movie houses, and among the various political parties and organizing efforts of the era. Like many Africans, he was drafted into the Free French Army during World War II, working in Niger and France, before settling in France in 1947, where he found work first at a Paris Citroën factory and then as a stevedore in Marseilles. In 1950 he joined the French Communist Party and he routinely took part in various protests and strikes.

He fused his own constantly evolving political awareness with storytelling for his partially autobiographical first novel, 1956’s The Black Docker, which was written in French. His literary pursuits brought him his first acclaim, but like many politically minded emerging artists of this era — such as Jean-Luc Godard and Pier Paolo Pasolini, his politically engaged filmmaking peers — Sembene wanted his message to reach wider audiences and knew more people went to the movies than read. And so Sembene studied film at Moscow’s Gorky Film Studio in the early 1960s.

His 1966 La Noire de ... (Black Girl), awarded the Prix Jean Vigo for independent spirit, officially marked Sembene’s emergence as a mature artist and African filmmaking pioneer. Over the next nearly 40 years Sembene would be the chief filmmaking force from the continent, starting a production company and a film and television festival, and crafting his keenly intelligent, politically acute, and fearlessly satirical movies, which were often the first African movies to which many Westerners were exposed.

The best entryway into his artistic greatness, though, is found in his writing. Sembene’s 1960 novel, Les Bouts de bois de Dieu (God’s Bits of Wood), is a fictional account of the 1947 strike by West African railway workers. It’s not only Sembene’s masterpiece — he was as gifted a writer as he was a filmmaker, if not more so — but also one of the greatest labor stories ever, on par with John Sayles’ Matewan, Denise Giardina’s Storming Heaven, Thomas Bell’s Out of This Furnace, Chester Himes’ Lonely Crusade, or anything by Upton Sinclair.

Bret McCabe

Unflinching: Paul W. Tibbets

In the weeks after the end of World War II, the U.S. military put the ruined cities of Nagasaki and Hiroshima “off limits.” When reporters snuck in and dispatched reports detailing the devastation, including the then-unknown phenomenon of radiation sickness, the U.S. military censored the story and countered with propaganda. Americans would not be told of nuclear weapons’ lingering effects for years.

The story we learned was this: Because brave men dropped the first atomic bombs on a dauntless enemy, America did not have to invade the Japanese mainland, saving at least one million lives on both sides of the conflict. It is an enduring lie.

We know it is a lie because, 62 years on, historians (see Joseph C. Grew, Gar Alperovitz, Greg Mitchell, James Hershberg, Martin Sherwin, etc.) have long reported what Dwight Eisenhower and Harry Truman knew before the bomb was dropped: Japan was about to surrender anyway. But this fact is entombed in thick books with copious footnotes, so most Americans still don’t know, and many of them deny it bitterly despite the evidence. Their patron saint is Paul W. Tibbets, who died in his Columbus, Ohio, home on November 1, at the age of 92.

Courageous, precise, a consummate pilot, Tibbets assembled and commanded the 12-man crew of the Enola Gay, the B-29 bomber he named for his mother. The Illinoisan was chosen over two higher-ranked pilots because, as he told an interviewer, “they were looking for someone who wouldn’t flinch.” On August 6, 1945, Tibbets and crew dropped a 20-kiloton uranium bomb on Hiroshima.

Hailed as a hero taking credit for ending World War II, Tibbets served in the Air Force for another 20 years and never doubted the rightness of the bombing even in the face of declassified documents and the unfolding history — political and medical — of what atomic weapons wrought. He was the solid fulcrum of the nuclear-war age, and in the mid-1990s he fought a final battle to keep his reputation unsullied by unflattering facts.

When the Smithsonian Institution commissioned an exhibit of Enola Gay’s forward fuselage, National Air and Space Museum director Martin Harwit also wanted to put the bombing into historical context. Veterans’ groups protested; Tibbets took point.

Calling the proposed exhibit “a package of insults,” Tibbets gave no quarter to the historians: “Today, on the eve of the 50th Anniversary of the end of World War II, many are second-guessing the decision to use the atomic weapons. To them, I would say, ‘STOP!’” He had learned all the facts he’d ever need from his superiors, in the months before he dropped the bomb, and he had no use for any more. Tibbets vanquished the historians and, as he never tired of saying, “never lost a night’s sleep” over the bomb.

But he did not leave it at that. He joyfully re-enacted the bombing and hawked triumphalist memorabilia from his “official website,” including signed books, photos, and a 10-inch scale replica of the “Little Boy” atom bomb ($275 plus shipping).

In 2002, as the “war on terror” got under way, Studs Terkel put his microphone before Tibbets. “One last thing,” he asked the old hero. “When you hear people say, ‘Let’s nuke ’em, let’s nuke these people,’ what do you think?”

“Oh, I wouldn’t hesitate if I had the choice,” Tibbets replied. “I’d wipe ’em out. You’re gonna kill innocent people at the same time, but we’ve never fought a damn war anywhere in the world where they didn’t kill innocent people. If the newspapers would just cut out the ****. ‘You’ve killed so many civilians.’ That’s their tough luck for being there.”

Edward Ericson, Jr.


Bombshell: Liz Renay

It’s not like being famous for pneumatic body parts or cashing in on a slutty reputation is anything new. Decades before the likes of Kim Kardashian or Paris Hilton courted paparazzi lenses, Liz Renay made a career out of making a literal spectacle of herself with all manner of self-exploitation: Vegas showgirl, modeling, game shows, stripping, Z-movies, tell-all memoirs, general public nudity, and more stripping. In between, she dated mobsters (and a lot of other guys, too), went to prison, and married at least seven times. It’s amazing she made it to 80.

To paraphrase the old rock ’n’ roll tune, the girl couldn’t help it. Pearl Elizabeth Dobbins was born into a fundamentalist Christian household in Chandler, Arizona, in 1926. Despite regular churchgoing and strict rules (no movies, for example), Dobbins was eager for bright lights; as a teenager she started running away to Las Vegas to try to break in as a showgirl. She hoped that her 44DD breasts and passing bottle-job resemblance to megastar Marilyn Monroe would get her noticed, and they did. She had already been married twice, had two kids, and won a bra-modeling contest when she landed a job as an extra on a film being shot in Phoenix in 1950; she somehow parlayed this modest exposure into a Life magazine photo spread called “Pearl’s Big Moment.” Her first moment, anyway.

Redubbed Liz Renay, she worked as a stripper and developed a taste for Mafia men — first Murder Inc.’s Tony Coppola in New York, then, after relocating to Los Angeles, West Coast kingpin Mickey Cohen (see the collected works of crime novelist James Ellroy, in which Renay appears as a character). She showed up in magazines as a model and in tabloid photos on Cohen’s arm. She was eventually called to testify in Cohen’s 1959 tax-evasion trial and lied on the stand about helping him launder money. She wound up spending 27 months in the federal penitentiary at Terminal Island for perjury.

Her conviction killed whatever legitimacy there was to her career, but she pressed on. In addition to stripping, she started appearing in low-budget exploitation films with titles like The Thrill Killers, Hot Rods to Hell, and Blackenstein. She was not above the occasional attention-getting stunt: In 1974 she tried to draw attention to the premiere of one of her films by running naked down Hollywood Boulevard. She also published a memoir in 1971 called My Face for the World to See. Her breathless account of her adventures and trials (literally) made it a cult fave, and the book helped win Renay the role of which she was most proud: Muffy St. Jacques, the lipstick half of the leading Mortville lesbian couple in John Waters’ Desperate Living. The film renewed and solidified her demimonde icon status.

Though past 50 when Desperate Living was released in 1977, Renay continued stripping and making appearances. She wrote another memoir with the arresting title My First 2,000 Men; though she later acknowledged that the title figure was a little high, the book detailed trysts with Cary Grant, Burt Lancaster, and Regis Philbin, among many, many others.

Her life was not without its share of tragedies: She and her daughter Brenda had a mother-daughter strip act until Brenda killed herself in 1982. But Renay continued marketing her ample flesh and va-va-va-voom mystique throughout her twilight years, which she spent in Las Vegas, the city of her original dreams of stardom. She died January 22. — Lee Gardner

 

Solar Sister: Alice Coltrane

Jazz giant John Coltrane changed thousands of lives, but none more than Alice Coltrane’s. While they were partners, musical and domestic, for just a few short years before he died in 1967, the spiritual and musical growth she experienced with Coltrane shaped the rest of her life. Yet Alice Coltrane was not a behind-every-great-man booster like Sue Mingus, or a dubiously talented spouse brought along for company, à la Linda McCartney. Her husband may have set her on the paths she followed in the years after his death, but she traveled them on her own, and no one has followed since.

Born in Detroit in 1937, Alice McLeod grew up steeped in gospel and jazz, but her musical training, beginning at age 7, was in classical piano. Her brother’s influence eventually won out, and by her early 20s, she had traveled to Paris to study with bop legend Bud Powell and was gigging professionally in Detroit and then New York. She met John Coltrane backstage at Birdland in 1962; the two introspective introverts were soon inseparable and married in 1965.

By that time, John Coltrane had driven his music near what was then the outermost edge of jazz, with harsh solo cries, expansive pieces, and a palette of exotic influences as broad as the saxophonist’s pantheism. When pianist McCoy Tyner quit, the saxophonist asked his wife to join the greatest quartet of the time, maybe all time. “I just said, ‘Are you sure? Is this what you want?’” she recalled for a 2002 interview in The Wire magazine, “and he said, ‘I’m positive.’” Alice’s more expansive approach better suited her husband’s increasingly protean sound, but their journey together was cut short by his liver cancer.

Alice Coltrane found herself a widowed mother of four (including a son from a brief prior marriage) and, almost as unexpectedly, a solo artist. Her debut as a leader, 1968’s A Monastic Trio, delivered a stripped-down, more emollient take on Coltrane’s late-period explorations, and she sometimes doubled on harp, an instrument that remains almost unknown otherwise in jazz. Subsequent albums such as Ptah, the El Daoud, and Journey in Satchidananda brought in more blatant Eastern influences as well as more evident spiritual concerns. She reached back to her classical training to bring in vast string sections for ambitious albums such as World Galaxy (on which she reprised her husband’s signature tune, “My Favorite Things,” in an arrangement for strings and organ) and Lord of Lords.

Coltrane made 11 albums in 10 years and then effectively retired from music to devote herself to her spiritual interests. A disciple of Indian guru Swami Satchidananda, she founded a still-extant ashram in Southern California in 1983 and lived a quiet life with her family. (She never remarried and reportedly took a vow of celibacy after Coltrane’s death.) Her music, a little outré even for the ’70s, was largely dismissed as jazz took a neo-classical turn. But, by the turn of the century, crate diggers and a new generation of more open-minded jazz fans had discovered the mystical yet enduring beauties of her omnivorous sound. After more than 20 years, she was lured into a modest comeback by saxophonist son Ravi, releasing the spry Translinear Light in 2004 and performing several concerts. She appeared to enjoy the notice, but she didn’t seem to need it; she more or less returned to humble obscurity before dying of a lung ailment on January 12. Her music continues to speak for her. — Lee Gardner

 


Gittings took part, in 1965, in what is believed to be the first protest at the White House for gay rights and continued to march in front of Philadelphia’s Independence Hall every Fourth of July throughout the ‘60s. “It was called annual Reminder Day,” Gittings told Philadelphia City Paper in a 1999 interview. “The purpose was to remind the public that the guarantees of life, liberty, and the pursuit of happiness that are in the documents we celebrate on July 4 are not extended to gay people.” In 2005, a plaque was erected across the street from Independence Hall to commemorate the protests.

Perhaps Gittings’ greatest accomplishment was her work to get homosexuality removed from the American Psychiatric Association’s list of mental disorders. In 1972, she was invited to sit on a panel on the topic at an APA symposium. “My partner Kay said, ‘This isn’t right. Here you have two psychiatrists pitted against two gays, and what you really need is someone who is both,’” Gittings recalled while accepting an award from the APA in 2006. Finding someone willing to come forward proved difficult, but one gay psychiatrist finally agreed to speak, though he insisted on wearing a mask and wig and disguising his voice. He was called Dr. H. Anonymous, and his testimony was seen by many as a turning point. The APA voted to take homosexuality off its list of mental illnesses a year later.

It was a major victory for the gay-rights movement, but Gittings continued working. She won numerous awards; she and her partner donated books, writings, and paraphernalia from the gay-rights struggle to libraries, to ensure that future generations won’t have to hunt for information about their history and identity as she did; and she continued speaking out even as she battled the breast cancer that ended her life February 18 at age 74.

— Anna Ditkoff

 

Workers of the World: Ousmane Sembene

What inspires us about Senegalese filmmaker Ousmane Sembene, who passed away June 9 at his home in Dakar after a protracted illness, isn’t so much what he achieved over his 84 years. Yes, he was a vanguard filmmaker who broke countless barriers — the first black African to make a feature-length movie in sub-Saharan Africa, the first African to win the Prix Jean Vigo at Cannes, the first filmmaker to make a movie entirely in Wolof, the native language of his home — but he never set out to be the first anything. His works were merely the expressions of the political and personal ideals to which he dedicated his life. And that constant engagement is what is most impressive about Sembene. He was an artist who tirelessly cultivated his creative and intellectual faculties, not publishing his first novel until he was in his early 30s and not taking up filmmaking, for which he is globally renowned, until his 40s.

The world in which he traveled before pursuing art obviously shaped his ideas. Sembene was born in Ziguinchor, the capital city of the Casamance region of southwest Senegal. The city is located on a river, and his father and grandfather were fishermen, the family trade Sembene entered after leaving formal school at 14. Seasickness prevented him from following it, so he moved to Dakar in pursuit of work.

Over the next 20 years Sembene would earn his keep with the sweat of his labor and prime his mind in books, seminars, and movie houses, and among the various political parties and labor organizing of the era. Like many Africans, he was drafted into the Free French Army during World War II, working in Niger and France, before settling in France in 1947, where he found work first at a Paris Citroën factory and then as a stevedore in Marseilles. In 1950 he joined the French Communist Party, and he routinely took part in protests and strikes.

He fused his own constantly evolving political awareness with storytelling for his partially autobiographical first novel, 1956’s The Black Docker, which was written in French. His literary pursuits brought him his first acclaim, but like many politically minded emerging artists of this era — such as Jean-Luc Godard and Pier Paolo Pasolini, his politically engaged filmmaking peers — Sembene wanted his message to reach wider audiences and knew more people went to the movies than read. And so Sembene studied film at Moscow’s Gorky Film Studio in the early 1960s.

His 1966 La Noire de . . . (Black Girl), awarded Cannes’ Prix Jean Vigo for independent spirit, officially marked Sembene’s emergence as a mature artist and African filmmaking pioneer. Over the next nearly 40 years Sembene would be the chief filmmaking force from the continent, starting a production company and a film and television festival, and crafting his keenly intelligent, politically acute, and fearlessly satirical movies, which were often the first African movies to which many Westerners were exposed.

The best entryway into his artistic greatness, though, is found in his writing. Sembene’s 1960 novel, Les Bouts de Bois de Dieu (God’s Bits of Wood), is a fictional account of the 1947 strike by West African railway workers. It’s not only Sembene’s masterpiece — he was as gifted a writer as he was a filmmaker, if not more so — but also one of the greatest labor stories ever, on par with John Sayles’ Matewan, Denise Giardina’s Storming Heaven, Thomas Bell’s Out of This Furnace, Chester Himes’ Lonely Crusade, or anything by Upton Sinclair.

— Bret McCabe

 

Bon Vivant: Jean-François Bizot

In 1968, a little-known magazine seller named Felix Dennis became the publisher of the London-by-way-of-Australian underground counterculture magazine Oz. And over the next five years Oz became one of the hippest things about post-Swinging London, tapping into and celebrating music, art, and some of the most blown-to-bits graphic design of any era. (These days, Dennis is the mogul behind Maxim.) In 1967, a British DJ who had spent some time in the United States started broadcasting for the BBC. His name was John Peel, and for the next 37 years he exercised one of the broadest, most acute, and most enthusiastically discerning ears for good music on the planet.

In the English-speaking world, both Peel and Dennis are fairly renowned. In Paris, those roles were filled by one staggeringly hip man. Jean-François Bizot, who passed away in Paris September 8, after a protracted bout with cancer, at the age of 63, founded both Paris’s leading counter-everything periodical of the 1970s, Actuel, and the city’s most outward-looking radio station of the 1980s, Radio Nova. His idea of “alternative” wasn’t some narrowly defined windbag of a marketing strategy; Bizot thought more in terms of throwing irreverent hand grenades into any situation. According to David Byrne, when Actuel put him, Brian Eno, and Jon Hassell on the cover upon the 1981 release of their African-rhythms exploration My Life in the Bush of Ghosts, the headline read “the whites think too much.”

Bizot came from a stolidly bourgeois home, the son of a Catholic family in Lyon, and studied literature and economics at Ecole Nationale Superieure des Industries Chimiques de Nancy. After a brief career as an economist, he was a journalist with L’Express, France’s sort-of-lefty Time, from 1967 to 1970. He founded Actuel in May 1970, and over its run — 1970-75, 1979-94 — the magazine explored and celebrated environmentalism, feminism, gay activism, squatters’ rights, anti-racism, rock ‘n’ roll, psychedelia, fashion, comics, visual art, and other such underground, countercultural views, movements, and ideas. He distilled his own experiences and the general gestalt of the times into his books, 2001’s Underground: The History and 2006’s Free Press: Underground and Alternative Publications, 1965-1975.

He created the multicultural, free-form Radio Nova in 1981 following then-President François Mitterand’s deregulation of French airwaves. It was a station that celebrated music from Africa, India, and other international locales long before “world music” became a record-store section, alongside such other foreign sounds as hip-hop, jungle, and techno.

What’s especially remarkable about Bizot is not only what he achieved with his life’s pursuits, but how far his influence reached beyond the underground. He was a man known and admired in the so-called aboveground culture, and following his death, weekly French newsmagazine Le Nouvel Observateur published reactions to the news from esteemed journalists such as Philippe Gavi (a co-founder of French daily paper Libération) and Ariel Wizman, French Minister of Culture Christine Albanel, the Syndicat Interprofessionnel des Radios et Télévisions Indépendantes (a union of independent radio and TV operators), and his friend Bernard Kouchner, the doctor, diplomat, and Doctors Without Borders co-founder, who said Bizot was the “man who traveled these countercultures from which gushes life.” (Clumsy, nonidiomatic translation this writer’s.)

And he did it with such an impish humanity. In 2003, he published A Moment of Weakness, a book about his own battle with cancer. In it, he named his tumor “Jack the Squatter.” May we all stare our mortality in the face with such absurd defiance.

 

Unflinching: Paul W. Tibbets

In the weeks after the end of World War II, the U.S. military put the ruined cities of Nagasaki and Hiroshima “off limits.” When reporters snuck in and dispatched reports detailing the devastation, including the then-unknown phenomenon of radiation sickness, the U.S. military censored the story and countered with propaganda. Americans would not be told of nuclear weapons’ lingering effects for years.

The story we learned was this: Because brave men dropped the first atomic bombs on a dauntless enemy, America did not have to invade the Japanese mainland, saving at least 1 million lives on both sides of the conflict. It is an enduring lie.

We know it is a lie because, 62 years on, historians (see Joseph C. Grew, Gar Alperovitz, Greg Mitchell, James Hershberg, Martin Sherwin, etc.) have long reported what Dwight Eisenhower and Harry Truman knew before the bomb was dropped: Japan was about to surrender anyway. But this fact is entombed in thick books with copious footnotes, so most Americans still don’t know, and many of them deny it bitterly despite the evidence. Their patron saint is Paul W. Tibbets, who died in his Columbus, Ohio, home on November 1, at the age of 92.

Courageous, precise, a consummate pilot, Tibbets assembled and commanded the 12-man crew of the Enola Gay, the B-29 bomber he named for his mother. The Illinoisan was chosen over two higher-ranked pilots because, as he told an interviewer, “they were looking for someone who wouldn’t flinch.” On August 6, 1945, Tibbets and crew dropped a 20-kiloton uranium bomb on Hiroshima.

Hailed as a hero and taking credit for ending World War II, Tibbets served in the Air Force for another 20 years and never doubted the rightness of the bombing even in the face of declassified documents and the unfolding history — political and medical — of what atomic weapons wrought. He was the solid fulcrum of the nuclear-war age, and in the mid-1990s he fought a final battle to keep his reputation unsullied by unflattering facts.

When the Smithsonian Institution commissioned an exhibit of Enola Gay’s forward fuselage, National Air and Space Museum director Martin Harwit also wanted to put the bombing into historical context. Veterans’ groups protested; Tibbets took point.

Calling the proposed exhibit “a package of insults,” Tibbets gave no quarter to the historians: “Today, on the eve of the 50th Anniversary of the end of World War II, many are second-guessing the decision to use the atomic weapons. To them, I would say, ‘STOP!’” He had learned all the facts he’d ever need from his superiors, in the months before he dropped the bomb, and he had no use for any more. Tibbets vanquished the historians and, as he never tired of saying, “never lost a night’s sleep” over the bomb.

But he did not leave it at that. He joyfully re-enacted the bombing and hawked triumphalist memorabilia from his “official website,” including signed books, photos, and a 10-inch scale replica of the “Little Boy” atom bomb ($275 plus shipping).

In 2002, as the “war on terror” got under way, Studs Terkel put his microphone before Tibbets. “One last thing,” he asked the old hero. “When you hear people say, ‘Let’s nuke ‘em, let’s nuke these people,’ what do you think?”

“Oh, I wouldn’t hesitate if I had the choice,” Tibbets replied. “I’d wipe ‘em out. You’re gonna kill innocent people at the same time, but we’ve never fought a damn war anywhere in the world where they didn’t kill innocent people. If the newspapers would just cut out the ****. ‘You’ve killed so many civilians.’ That’s their tough luck for being there.” — Edward Ericson Jr.

