Intellectuals

6 Oct

What does "cool" even mean in 2013?

By Carl Wilson

Slate Magazine

Last month the electro-psychedelic band MGMT released a video for its “Cool Song No. 2.” It features Michael K. Williams of The Wire as a killer-dealer-lover-healer figure stalking a landscape of vegetation, narcotics labs, rituals, and Caucasians. “What you find shocking, they find amusing,” the singer drones in Syd Barrett-via-Spiritualized mode. The video is loaded with signposts of cool, first among them Williams, who played maybe the coolest TV character of the past decade as the gay Baltimore-drug-world stickup man Omar Little. But would you consider “Cool Song No. 2” genuinely cool, or is it trying too hard? (Is that why it’s called “No. 2”?)

The very question is cruel, of course, and competitive. You can praise the Brooklyn band’s surreal imagination, or you can call it a dull, derivative outfit renting out another artist’s aura to camouflage that it has none of its own. It depends which answer you think makes you cooler.

If that sounds cynical, cynicism is difficult to avoid when the subject of cool arises now. Self-conscious indie rockers are easy targets, vulnerable to charges of recycling half-century-old postures that arguably were purloined from African-American culture in the first place. But what is cool in 2013, and why are we still using this term for what scholar Peter Stearns pegged as “a twentieth-century emotional style”? Often credited to sax player Lester Young in the 1940s, the coinage was in general circulation by the mid-1950s, with Miles Davis’s Birth of the Cool and West Side Story’s finger-snapping gang credo “Cool.” You’d be unlikely to use other decades-old slang—groovy or rad or fly—to endorse any current cultural object, at least with a straight face, but somehow cool remains evergreen.

The standard-bearers, however, have changed. Once the rebellious stuff of artists, bohemians, outlaws, and (some) movie stars, coolness is now as likely to be attributed to the latest smartphone or app or the lucre they produce: The iconic statement on the matter has to be Justin Timberlake as Sean Parker saying to Jesse Eisenberg as Mark Zuckerberg in The Social Network, “A million dollars isn’t cool. You know what’s cool? A billion dollars.” That is, provided you earn it before you’re 30—the tech age has also brought on an extreme-youth cult, epitomized by fashion blogger and Rookie magazine editor Tavi Gevinson, who is a tad less cool now at 17 than she was when she emerged at age 11. What would William S. Burroughs have had to say about that? (Maybe “Just Do It!”)

Cool has come a long way, literally. In a 1973 essay called “An Aesthetic of the Cool,” art historian Robert Farris Thompson traced the concept to the West African Yoruba idea of itutu—a quality of character denoting composure in the face of danger, as well as playfulness, humor, generosity, and conciliation. It was carried to America with slavery and became a code through which to conceal rage and cope with brutality with dignity; it went on to inform the emotional textures of blues, jazz, the Harlem Renaissance, and more, then percolated into the mainstream.


Category : Culture | Intellectuals | US History | Blog
3 Oct

On the 100th anniversary of the birth of the famed novelist, our reporter searches the North African nation for signs of his legacy

  • By Joshua Hammer
  • Smithsonian magazine, October 2013,

The Hotel El-Djazair, formerly known as the Hotel Saint-George, is an oasis of calm in the tense city of Algiers. A labyrinth of paved pathways winds through beds of hibiscus, cactuses and roses, shaded by palm and banana trees. In the lobby, bellhops in white tunics and red fezzes escort guests past Persian carpets and walls inlaid with mosaics. Beneath the opulence, violence lurks. During the week I was there, diplomats descended on the El-Djazair to repatriate the bodies of dozens of hostages killed in a shootout at a Sahara natural-gas plant between Al Qaeda in the Islamic Maghreb and the Algerian Army.

Violence was in the air as well in January 1956, when the celebrated writer Albert Camus checked into the Hotel Saint-George. The struggle against French colonialism was escalating, with civilians becoming the primary victims. Camus was a pied-noir—a term meaning “black foot,” perhaps derived from the coal-stained feet of Mediterranean sailors, or the black boots of French soldiers, and used to refer to the one million colonists of European origin living in Algeria during French rule. He had returned after 14 years in France to try to stop his homeland from sliding deeper into war. It was a perilous mission. Right-wing French settlers plotted to assassinate him. Algerian revolutionaries watched over him without his knowledge.

The Casablanca-style intrigue—freedom fighters, spies and an exotic North African setting—seemed appropriate. Camus, after all, was often thought of as a literary Humphrey Bogart—dashing, irresistible to women, a coolly heroic figure in a dangerous world.

Camus is regarded as a giant of French literature, but it was his North African birthplace that most shaped his life and his art. In a 1936 essay, composed during a bout of homesickness in Prague, he wrote of pining for “my own town on the shores of the Mediterranean…the summer evenings that I love so much, so gentle in the green light and full of young and beautiful women.” Camus set his two most famous works, the novels The Stranger and The Plague, in Algeria, and his perception of existence, a joyful sensuality combined with a recognition of man’s loneliness in an indifferent universe, was formed here.

In 1957, Anders Österling, the permanent secretary of the Swedish Academy, acknowledged the importance of Camus’ Algerian upbringing when he presented him with the Nobel Prize in Literature, a towering achievement, won when he was only 43. Österling attributed Camus’ view of the world in part to a “Mediterranean fatalism whose origin is the certainty that the sunny splendor of the world is only a fugitive moment bound to be blotted out by the shades.”

Camus is “the single reason people outside Algeria know about this country,” says Yazid Ait Mahieddine, a documentary filmmaker and Camus expert in Algiers, as we sit beneath a photograph of the writer in the El-Djazair bar, alongside images of other celebrities who have passed through here, from Dwight Eisenhower to Simone de Beauvoir. “He is our only ambassador.”


Category : Culture | Intellectuals | Middle East | Philosophy | Terror and Violence | Blog
26 Sep

Senator Joseph McCarthy

Inventing the Egghead: the Battle over Brainpower in American Culture

Author: Aaron Lecklider
University of Pennsylvania Press

Reviewed by Todd Gitlin

Aaron Lecklider, who teaches American studies at the University of Massachusetts, Boston, proposes to stand the last century of American intellectual life on its head, or at least on its side. In keeping with Antonio Gramsci’s project of looking beyond the world views of traditional intellectuals – the ones who get paid to write and talk – he wants to resurrect the working class’s organic intellectuals, the non-professionals who exercise ‘brainpower’ even if they’re not credited for it by snobbish conservateurs who carve out exclusive domains where cultural capital confers privilege upon the best and the brightest. Popular culture, Lecklider writes, has been for the last century ‘a critical site in shaping American ideas about brainpower’ (p. 225).

Intelligence, he argues, is a contested domain. The town has as much of it as the gown. This is a clever idea, and Lecklider, frequently original, carries it a considerable distance—sometimes farther than the evidence warrants. His starting – and finishing – point is that the charge of ‘anti-intellectualism’ famously and exhaustively leveled by Richard Hofstadter against American culture is actually self-fulfilling, for Hofstadter and his allies, failing to acknowledge that intellectual life could be conducted by non-professionals, ‘opened historians to attack by ordinary women and men for attempting to preserve an elitist category, creating a cycle of misunderstanding that continues to manifest in contemporary American life’ (p. 222). Hofstadter, from this point of view, ‘bracketed off intellect from the brainpower of ordinary women and men and divorced intelligence from working-class cultural politics’ (p. 222). By implication, it’s no wonder the left has been crammed into the margins of history. But Lecklider has prepared a clever flanking movement. The conflict over who is entitled to be regarded as intelligent may even culminate in a happy ending:

    Reclaiming the history of an organic intellectual tradition in American culture represents a starting point for envisioning intelligence as a shared commodity across social classes; wrested from the hands of the intellectuals, there’s no telling what the brainpower of the people has the potential to accomplish (p. 228).

Lecklider begins his counter-history in the early decades of the 20th century.

Even as managers downgraded ordinary workers, adopting Taylorist methods to ‘transform’ themselves into ‘scientists’ (p. 26), vast numbers of working-class Americans refused to believe that managers and their hired hands held a monopoly on brains and intellectual interests. Institutions including amusement parks, comic books, public lectures, and summer schools cultivated the sort of intelligence that did not need – indeed, might actively resist – the sort of formal education on offer in the decades before 1920, when fewer than one 18–24-year-old in 20 was enrolled in college. Brainpower, Lecklider insists, was the subject of class struggle. Contra Hofstadter – who looms in the shadows as Lecklider’s foil throughout, emerging as an explicit bête noire in the epilogue – America as a whole was not ‘anti-intellectual.’ Rather, at least at the turn of the 20th century, ‘anti-intellectualism coexisted with representations of an intellectually gifted working class’ (p. 8). The history of intelligence in American culture, he argues, is ‘tortuous’, ‘considerably more complicated’ than the straightforward declinist narrative embraced by scholars such as Hofstadter, Lasch, Lewis Coser, C. Wright Mills, Herbert Marcuse and – odd company on this list – Reinhold Niebuhr (p. 224).


Category : Education | Intellectuals | Technology | Working Class | Blog