Thursday, August 30, 2012

Film Friday: and Man Created Woman -- the Case of Ruby Sparks

(it's a tad early, but hey, I've got a root canal early in the morning...) 

In 1960, an episode of The Twilight Zone -- in fact, the last aired episode of the great first season -- told an amusing tale of a playwright whose Dictaphone appears to have special powers.   Written by Richard Matheson, "A World of His Own" starred Keenan Wynn as the playwright, who must explain to his wife (when she sees him sharing a drink with another woman) that whatever he describes into his machine can magically appear.  And to make it disappear, he just has to toss the tape recording into the fire, and zap! he/she/it's gone.   (That's why the other woman wasn't in the room when the wife entered.)

The wife doesn't believe it -- and when the playwright shows her an envelope with her name on it, containing a piece of tape that he says is the description he wrote of her, she angrily tosses it into the fire...and disappears.  Wynn's character frantically turns to the machine to re-describe her -- and then changes his mind, describing his new wife with the characteristics of the other woman. 

Men have long been obsessed with creating life -- womb envy, probably.   And in this summer's modest hit romantic comedy, we have the same story played out again: a writer who creates a fictional character who suddenly comes to life. 

But there's an interesting twist: Ruby Sparks, directed by Jonathan Dayton and Valerie Faris, was written by a woman, Zoe Kazan.  And to her credit, Kazan manages in her script and in her acting -- she plays the fictional gal-come-to-life -- to create something very fresh and imaginative. 

One of Kazan's "tricks" is to make her male creator, Calvin Weir-Fields (Paul Dano, Kazan's boyfriend), walk the line between sympathetic loner and egomaniacal control freak.  Calvin is feeling the angst of what Fitzgerald called "early success": he's written a Salinger-esque first book, but nothing has worked since.  Calvin is not likely to follow in Scott's footsteps by drinking himself into oblivion while gallivanting across Europe -- much easier simply to get a dog and name him after Fitzgerald.

Working off dreams he has been having, he begins to create the girl he sees: the dream even gives her a name.  Soon, she is given flesh and personality.   But even as he comes to accept that somehow Ruby is real, Calvin tries hard to shield her. After letting his brother and sister-in-law meet her (mostly to prove to his brother that he's not gone batty), Calvin reluctantly agrees to have Ruby meet his mother (Annette Bening) and her beau (Antonio Banderas -- who seems to be in everything lately). 

Mom, it seems, has gone all hippie since Calvin's dad's passing, but the place she and her woodworker boyfriend share in Big Sur is one of great comfort and affection.  The boyfriend Mort's creations affirm the vision of the artist, but there is a certain kind of freedom to the art and to his relationship with Calvin's mom that Calvin can't wrap his mind around. (Not surprising, again, given his name.) 

Creations usually rebel against their creators (I have two kids, I know!).  As Eliza did to Professor Higgins, as Annie Hall did to Alvy Singer, so Ruby does to Calvin.   Eventually, Calvin confronts Ruby with his author-ity much the way Keenan Wynn's playwright did his wife somewhere in the Twilight Zone.  But where the TV episode plays this for humor, Calvin's yo-yo-ing of Ruby comes across as that of a horrific puppet master.  This reinforces the arguments made by Calvin's ex-girlfriend, who herself has become a noted writer since their relationship ended.  

Ruby Sparks is a charming update on the age-old theme of the desire to create art that can supplant life. Think of Alvy Singer's play about his and Annie's romance: he gives it a happy ending, because while real life is not in our control, our art is.  The author-as-God metaphor is a common enough one, too; again, Allen's The Purple Rose of Cairo is his most poignant expression of it.  Kazan gives the film a reasonably happy ending, though it does re-emphasize Calvin as Author, despite his having traded in what evidently was a magical typewriter for a MacBook Pro.  

The film also turns back to the notion of the fictional-character-as-alive.   The narrator of Calvin's second novel, The Girlfriend (based on his relationship with Ruby), denies that his gifts as a writer are of Salinger's magnitude, but praises the great writers for making their best characters -- Holden Caulfield, Huck Finn, Sethe from Toni Morrison's Beloved -- become a living part of our world.  And indeed it is that engagement with the real world that is the lesson Calvin ultimately learns.  Keenan Wynn's playwright humorously stayed in a world of his own: even when Rod Serling himself shows up at the end of the episode, he's able to make Rod disappear! Calvin does not have the pleasures enjoyed by the playwright. He has isolated himself in his sterile home; Ruby -- like all great works of art -- gives Calvin back his life.  And that sense of life is what Zoe Kazan brings us here.




Sirens, Silence, and September

Last night -- just a few hours ago, really, since it's now four in the friggin' morning -- I lay in bed and heard lots of sirens not too far from my house.  Police and fire ones, I believe.  I've heard more of them lately, or is it that as we close in on another anniversary of September 11, I hear them more acutely?

There has been a much more visible police presence in my neighborhood, since several local shopkeepers have been shot and killed, apparently by the same assailant.  And of course, at around the time I drop my kids off at martial arts class, the traffic police are around looking for expired meters and busting fools who run red lights or make illegal u-turns. 

But last night, the rushing sirens on their rushing vehicles took me back to that "stupid bloody Tuesday" eleven years ago.   I remember, walking the big street near my house at about ten o'clock that night, how stunningly quiet it all was.   The stores were all closed.  The highway was absolutely empty.   Anyone else walking around was silent, dazed.   And then from time to time, the sirens smashed the silence -- by then, it seemed, carrying the workers scrambling to remove the rubble, hoping for someone to rescue. 

I remember sending an e-mail late that night, or maybe the next day, letting people know we were okay, and to give blood -- a gesture that was tragically unnecessary at that moment, since the hospitals too were waiting for more people to come in that night, but so few ultimately did.  Phone lines were generally useless, overused and/or down. (How different would that day have unfolded had Facebook and Twitter existed?  Might a few more lives have been saved? What if someone had been able to Tweet on the first plane, heading for the North Tower? Might as well wonder "if only" the polio vaccine had been discovered forty years earlier than it was.) 

I sit here writing, dazed as I was eleven years ago, seven miles from death.  Dazed from lack of sleep, dizzy from a splitting headache, hoping the sun rises soon, waiting for my kids to wish me good morning, as my oldest, then only 16 months old, managed to do on 9/12. 


Tuesday, August 28, 2012

Getting your news from Facebook?

When Orson Welles' famous newspaper magnate first takes over the New York Inquirer, he tells his editor that one of the changes he's planning on making is based on the idea that "the news happens 24 hours a day."   Charles Foster Kane, like his real-life counterparts, was well aware of this fact and of the power of his medium to sway public opinion.   Kane's creator, of course, was also well aware of the power of the newer mass media, radio and film, to mold that public opinion, and, in the case of radio, to connect to that public instantaneously. 

Television brought image and sound and simultaneity together, but broadcasting, like Hollywood, remained in the hands of a small number of corporate entities, thus limiting the flow of information, which ran largely in one direction.  

But even in the era of the classic three-network structure, Americans didn't always get their news from the classic anchormen of the era.   For years, people reported that they got much of their topical news from watching Johnny Carson's monologues on The Tonight Show.  As late-night comedians proliferated in the eighties, the names changed -- Letterman, Leno, Conan, Kimmel, Arsenio, Craig (first Kilborn, now Ferguson), Chevy Chase (yeah, I know) -- but the idea was the same: open the show with some jokes about current events.   It wasn't Mort Sahl, who seemed to create spontaneous comic riffs as he went through the paper that he brought onto the nightclub stage -- it was highly polished and, this being television, rigorously timed so that no ads would be missed.


But more and more, we are getting our information by logging in to Facebook and other social media to see what our friends are posting.   When you sift through the cute pictures of animals and little children -- and okay, I have both animals and children in my house, so I understand -- you also find numerous links to web sites that your friends frequent, or find out about from their FB friends. 

One of the things about Facebook's News Feed is that Facebook personalizes it based on what it thinks you would be most interested in, rather than posting materials from all your friends in straight chronological order.  (You can create a feed that will do this, but it is a bit time-consuming if you have a lot of friends.)    But as TED lecturer Eli Pariser noted, one problem with such a feed is that if you are, say, a liberal, you will see mostly the latest posts of those who share your ideology, and the posts from your more conservative friends will appear less frequently, unless you make some adjustments of your own.  
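The difference between the two kinds of feed can be sketched in a few lines of Python. To be clear, this is a toy model, not Facebook's actual algorithm: the friend names, the post data, and the "affinity" scores are all invented for illustration.

```python
# Toy model: chronological feed vs. personalized ("filtered") feed.
# In a real system, affinity scores would be inferred from your
# clicks, likes, and comment history; here they're just made up.

posts = [
    {"friend": "liberal_pal",      "time": 3, "text": "Latest poll numbers!"},
    {"friend": "conservative_pal", "time": 2, "text": "Op-ed you'll disagree with"},
    {"friend": "liberal_pal",      "time": 1, "text": "Petition to sign"},
]

# How much the system thinks you "like" each friend.
affinity = {"liberal_pal": 0.9, "conservative_pal": 0.2}

# Chronological feed: newest first; every friend gets seen in turn.
chronological = sorted(posts, key=lambda p: p["time"], reverse=True)

# Personalized feed: ranked by predicted interest; dissenting
# voices sink toward the bottom, whatever their recency.
personalized = sorted(posts, key=lambda p: affinity[p["friend"]], reverse=True)

print([p["friend"] for p in chronological])
# conservative_pal appears in the middle of the chronological feed,
# but last in the personalized one.
print([p["friend"] for p in personalized])
```

Even in this tiny example, the conservative friend's post drops from second place to last once "interest" replaces recency as the sort key, which is the filter-bubble effect in miniature.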

Thus, despite the sense of autonomy that you might feel as you surf the web, it's important to remember that there is always some method of selectivity, and as search engines become more and more familiar with your interests, there is a risk of becoming increasingly narrow-minded in your views.   

The other thing I've seen in recent time spent on Facebook is that I'm still getting largely soundbite/blurb culture.   I'm getting quick glimpses of the news, not always the full context, and no matter what side of the spectrum an FB post comes from, it's usually designed to evoke a strong emotional response, like this post about Congressman Akin's non-existent public statements about breast milk curing male homosexuality.  On my phone, I didn't see that it was coming from a satiric web site, though I could see that more easily when I got to my computer.   No intention to deceive on the part of my FB friend -- at least, I don't think so -- but it sure got the reaction.  More than a few people posted it at their various blogs and Twitter feeds as news.   

Just because we are able to have greater editorial control over the news we consume, and to contribute more easily to its production and distribution, does not mean we will have a better informed, more democratic public sphere.   If navigating the infinite news streams were as easy as navigating the travel web sites for the best deal, perhaps we would have a better handle on what is really going on.  

Saturday, August 25, 2012

Been a bad month if you're a cultural icon...

Or am I just so old now that I know more of them...?

Just in the last thirty days "we" lost:

Nora Ephron, Andy Griffith, Ernest Borgnine, Sally Ride, Chad Everett, Sherman Hemsley, Gore Vidal, Marvin Hamlisch, Helen Gurley Brown, Ron Palillo, Phyllis Diller, Jerry Nelson, and Neil Armstrong.

Doesn't that seem like a lot?  

But I'm sure that the list of "who's alive and who's dead" is pretty stable over any given twelve-month period.   So the real difference is me.  In my twenties I was going to lots of weddings.   Now, it's funerals.   And the figures who loomed so important in my younger days are reaching the end of their road, and I have to accept it.  

I remember that in the same month in 2002, Billy Wilder and Chuck Jones died.  Billy was 95. Chuck was 89.   Okay, not a bad run.   But when I think of how many movies these two men made that formed my personality, it was devastating.   Of the people on the above list, six were past 80.   Three were in their 70's.  The rest in their 60's.   A generation or two older than I am, but now I'm old enough to be aware of my own mortality even as I see that through their work they have achieved a kind of immortality.   (For Pete's sake, Armstrong is one of only twelve humans ever to walk on the moon!)

Think about this: there are only two living Beatles AND only two living Sweathogs.  

All that's left from Mayberry are Opie and Gomer Pyle.

And who's going to teach us how to count?




Friday, August 24, 2012

Film Friday: First things First

"I grew up on American movies. Sue me!" -- Martin Scorsese

I've decided that I'm going to dedicate Fridays to talking about movies.  My posts may concern current films in theatrical release, new video releases, movies from earlier eras, the industry, film festivals, stars, whatever. 

I try to explain to my kids how different entertainment is today.   They can watch pretty much whatever they want whenever they want.  The only limits are their parents' rules, not an arbitrary schedule.  Audiovisual entertainments are commonplace.  I suppose they were to me, too, since I watched more movies on tv than at the theater.   But when I sit at the multiplex with them, waiting for the trailers to finish, I still get the goosebumps that something awesome is going to appear before my eyes for the next ninety minutes, like Hugo.  (Or just sitting in the theater next to my wife, watching a movie for grownups -- really awesome because so rare!)

Movies are precious.  Woody Allen once said (I'm paraphrasing) that to do great work on television is like being a great artist whose medium is sand.   Allen was referring to the ephemeral, temporary quality of television (though I imagine some Zen Buddhists might have some interesting responses to Allen).   In its massive quantities of material, its hours and hours of programming for a single series, television depreciates its own worth.  This is epitomized by the phrase "jump the shark," referring to the point in the run of a series where the show has become lame.   This happens even to the very best of television series; their popularity ultimately keeps them on the air too long, and that level of excellence is hard to maintain.

Of course, movies can be incredibly repetitive, too, with sequels and formulas.   And before television, movies came in serials, too, cheaply produced, mainly aimed at kids.   (Watch the old Batman serial from the forties -- he's fat!)  But it still seems easier to sift through the sand and find the movie treasures buried there than it does to find tv treasures.  I've taught courses in film history and television history, and I've also taught general survey courses in American literature. While obviously no one course of fourteen weeks will cover the breadth of any of these subjects, to do so for television seems much more of an exercise in futility.  

This is not some kind of "purist" or "elitist" argument about one medium over another.   It's a matter of uniqueness.   While the metaphor may not be as relevant as it once was, Raymond Williams' term "flow" still applies to a large degree for television; it's something you just tap into and dive in at any point.  Movies, like literature, can be taken out of the flow of culture.  Libraries by their very nature encase culture in an historical box.   Film collections do the same thing.  Television is too unwieldy -- at least as a field of study.  (The digital age has made storage much easier, but that doesn't mean all of tv can be studied.  To watch the entire run of a daytime soap would take years, as Robert C. Allen noted.)

In this media-conglomerated, synergistic universe, all forms of entertainment overlap.   And maybe these distinctions become irrelevant -- movies, television, YouTube, etc.   But even if we accept the blurring of these boundaries, one has to find a launch point in order to make sense of things -- while we might agree that language systems are full of arbitrary symbols, we still use them to present an order to our world.   It's important, of course, to understand the politics of such order, just as it is important to understand the politics of canonization.

For me, movies at their best maintain a mythic power; theaters are temples, altars of dreams, both materialistic and otherwise.   And on Fridays, I'll offer a sermon or two.  





Tuesday, August 21, 2012

Pussy Riot: "it's different for girls"?

In reading this piece from Sarah Kendzior, I found myself harkening back to a noted essay I first read in grad school some twenty years ago.  Kendzior is writing about the Western response to the plight of the Russian rock band Pussy Riot, now sentenced to two years' imprisonment for performances/stunts/whatever-you-want-to-call-it offending both the Orthodox church and the Putin government.

Kendzior's main point is that the West has misunderstood the various implications of the Pussy Riot story.  In referring to the women in the group as "girls," as "punkettes," the western media are contributing to the same kind of diminution of the group as the Russian pro-Putin media.  Worse, however well-meaning the worldwide support for the group has become, many of the various "reenactments" taking place strip the group's original provocative acts of their specific political context.  

These are not "girls" who are "innocent victims" of a repressive regime, Kendzior says.  These are women who engaged in a deliberately provocative act, knowing there would be consequences for their actions.  But the media response outside Russia has been to diminish their agency. 

The gender politics Kendzior raises reminded me of an early essay by the British media scholar Angela McRobbie.  "Settling Accounts with Subcultures" first appeared in Screen Education in 1980; I came across it about ten years later, in Simon Frith and Andrew Goodwin's anthology On Record: Rock, Pop, and the Written Word.   Her overall argument concerns the subcultural theorists of the seventies -- especially Dick Hebdige, whose Subculture: The Meaning of Style has remained a classic of academic literature on youth culture. 

McRobbie is dead-on when she claims that writers like Hebdige and Paul Willis essentially reproduce the divide between the public and private spheres characteristic of patriarchal discourses.  The sociologists looking at the youth subcultures of the time did not write about the "lads" as they lived their lives at home.   "Only what happened on the streets mattered," McRobbie wrote. 

Many subcultural groups in modernized societies tend to reproduce those societies' divisions of labor, and McRobbie points out that the opportunities for women in the subcultural scenes were often just as limiting as they were in so-called "straight" society.  The idea of a girls' subculture seemed inconceivable to the theorists of the time.  And even though punk afforded women a space they had not previously enjoyed, lines were nevertheless drawn.  McRobbie, responding to Hebdige's notions about how subcultural expressions are incorporated into the society through various ideological means, points out something very important: had the Sex Pistols been an all-female group behaving just as Johnny Rotten, Steve Jones, Paul Cook, and Sid Vicious did, the reaction would have been much more apocalyptic, a sign of a "major moral breakdown." 

Thirty years later, the case of Pussy Riot resonates through McRobbie's arguments.   This is not to say that Pussy Riot is the band that McRobbie hypothetically described in 1980; Kendzior points out that it's not even accurate to call them a punk rock band.   But the gender-based politics of reaction remain very much the same today as they were in the late seventies.   When faced with a group that so seriously confounds long-held expectations, society still seeks -- unconsciously, perhaps, but given the GOP War on Women (more on that another day), consciously, too -- methods of reinforcing the "norms."  

Sunday, August 19, 2012

This one is for the kids in my Freshman Seminar!

To students currently enrolled in SJC 100, Section 10: The Medium is the Message,

Welcome!  I hope you've all had an enjoyable summer and are excitedly preparing for your first term at SJC.  I wanted to introduce myself and the course you are about to take, so that we can start on the right foot the first day our class meets, September 6.

First, let me just point out the two required textbooks for the course. 

Levinson, Paul, New New Media, 2nd EDITION.   It is URGENT that you get a copy of the second edition of this book, which has just been published.  It is on order at the bookstore; you can rent it for about $20.  The first edition is from 2009, and since its subject matter is the most recent media, it's already out of date!  The bookstore is also selling this edition for $40, and Amazon has a Kindle edition for about $29. The ISBN for the new edition is 978-0205865574. 

The other required book, McLuhan's Understanding Media, comes in a Critical Edition for about $25 at the bookstore.   I strongly urge you to get this edition, because it contains a useful set of introductions for each chapter and helps give a better understanding of McLuhan's work, but I will not absolutely insist on it. 

The other major topic I wanted to mention concerns the importance of technology in our course.  Because the class is all about contemporary social media, a significant component of our work will involve using it.  We're going to blog, tweet, examine Wikipedia (maybe even work on some articles for it!), podcast, vidcast, whatever we can manage! This is not a "techie" course; that is, I'm not teaching anyone how to host a web page or create a video and upload it to YouTube.  But I am expecting two things:

1. Some fundamental abilities in using social media and its technology.  You probably are more adept at technologies than I am, but if you have trouble using them, you might want to reconsider your seminar choice.
2. A willingness to participate in these various "new new" media.  You have to be willing to use Twitter and Facebook; you have to blog; you have to write and receive comments via these various media.   If you are a "private" person who doesn't want to use these media, then again, you might want to consider registering for a different section of the seminar.  I too put off using these media, and I'm an insanely private person. But in order to understand all this "stuff," you have to be in it; that's what the course is about. 

I am looking forward to meeting all of you in September.  Please e-mail me if you have any questions, as I will not be in my office until August 30.

Do we really WANT to know our politicians' music tastes?

Near the end of the Coen Brothers' O Brother, Where Art Thou?, our main characters, escaped convicts who happen to record a hit single while on the run, find themselves on stage performing their hit... and right in the middle of a political campaign.  It is one of the two gubernatorial candidates, the challenger Homer Stokes, who exposes the "Soggy Bottom Boys" as detrimental to society, having interfered with a lynching.  Stokes explains:



As you can see in the above clip, the crowd rejects Stokes' condemnation of the group and literally rides him out on a rail... as a live radio audience listens to the excitement. The incumbent, "Pappy" O'Daniel, seizes the opportunity his opponent has given him and pardons the cons live on the air.   (The medium is the message!)

Oh, well. This Homer (the film is loosely based on The Odyssey) should have kept his mouth shut. And many of us would prefer our politicians not discuss their music tastes on the trail. To wit:

Recently it's come out that  presumptive Republican VP nominee Paul Ryan's favorite band is Rage Against the Machine. Rage's guitarist Tom Morello has published a short piece at Rolling Stone's web site, in which he wonders what could be the band's appeal to Ryan, since, in Morello's view, Ryan supports the very things that the band rages against!

Of course, Ryan isn't the first politico -- or human being, for that matter -- to like an artist whose ideological views differ from his own.  And we know, of course, that politicians will exploit a pop star's fame if it will help their own campaign.  It was true of the fictional governor, and it was true of President Reagan when he tried to show his appreciation for Bruce Springsteen, who asked Reagan not to mention him on the trail, since Reagan apparently didn't really understand "Born in the U.S.A."   And Springsteen is always a good example here, since so many of the working-class fans of his music still consider themselves old-school Republicans.  True, Springsteen's audience is probably not as large as it once was, since he came out and endorsed Democratic party candidates like John Kerry and Barack Obama, but that's a separate issue: the role of the pop star in politics.

Since the election of the first Boomer to the White House, Bill Clinton, rock music tastes have become a part of the candidates' profiles.  Clinton's Vice President, Al Gore, was a big fan of Frank Zappa -- ironic, since Zappa offered testimony at Senate hearings on censorship spurred on by Tipper Gore's Parents Music Resource Center.  But as is often the case, we tend to get embarrassed when politicians display their ignorance in the cultural sphere: think of those Congressional hearings on steroid use in baseball, where representatives didn't know names or history.  

So Paul Ryan joins a list of rock fans who don't care what their favorite band's message is.  But I must admit this is pretty funny.  

Thursday, August 16, 2012

Fun story: Lazy Kid asks for help with book report, and book's author "helps"!


No, the author doesn't give the "meaning" away.  But this story, which I saw thanks to a Facebook friend, really amused me.

As an academic, I'm well aware of students being lazy and using whatever means they can to get out of doing the actual work.  Sometimes that effort is more than the actual effort required to do what I've assigned, but they never see it that way.  And of course sometimes they just plain cheat and plagiarize. 

In this hi-tech social media world, it's much easier for kids to get too much help with their work.  The famous kid Calvin of Calvin and Hobbes would often ask his classmate Susie Derkins to do all his work for him, which of course she refused.   Today's Calvins don't have to worry about that.  They just ask Yahoo, or go to About.com, or simply Google it. 

But sometimes interesting things happen! Of course, I'm tempted now to pose as a student who hasn't had a chance to read some of my favorite authors' books to see if any of them might contact me! Jhumpa Lahiri? Sherman Alexie?  Snooki?

Message to all my students, past and present:  if the kiddies can find it on the internet, so can the teachers!

Wednesday, August 15, 2012

The NYC Olympics 2012... Sigh...

I first heard that New York was putting together a bid for the 2012 games sometime in the late 1990's.  I was thrilled.   I had visions:

The stretch of the Brooklyn-Queens Expressway called the Gowanus Expressway would have been completely redone.   Built on the same trestles that once held the elevated train line over 3rd Avenue in Sunset Park (a neighborhood whose economic life was destroyed by the building of the highway above it, much as whole neighborhoods were gutted to create the Cross Bronx Expressway -- if you've seen Ken Burns' documentary, you've heard that story), the highway is called the Gowanus because it goes over a polluted canal of that name.  But the real reason, as I've told my kids, is that people always say, "I don't wanna go on 'is expressway!!!" It is the most evil stretch of road on the planet.   My mantra for years, until they went with London, was "IOC come fix my roads!"

That Second Avenue subway line that started when Roosevelt was President? (The books say FDR, but I suspect it was really Teddy.)  Done.  Brand new.  And all the other subway lines up and running at full capacity and not smelling of urine.  

A full rebirth of Coney Island.  Not just partial, half-starts.  Brand new amusement parks.   Maybe even real jobs for the poorer residents.  (You know, just like Bruce Ratner promised with his whole Atlantic Yards project.  What? did I say something funny?)

And yes, price-gouging everywhere! Oh, yes, I'd have rented my house for the month and made a bundle! I'd have provided car service to the venues! Sold shirts from my stoop!

It would have been glorious.  And I was sure we had a good shot; after all the IOC is supposedly a pretty corrupt organization, susceptible to bribery, and who better than New York City politicians to pull something like that off?  Sadly, Ed Koch was no longer Mayor during this process.   Alas, it was not to be.  London got the Games.  

I actually traveled to London last year, and I could not really tell you if its infrastructure had been markedly improved because of the Olympics.   But of course, I don't live there.   I would have noticed the impact had NYC been chosen.  

My home's loss probably contributed to why I didn't watch a moment of the games or any of the ceremonies as they happened (or as they were broadcast on tape delay by NBC).  But then again, I've really not paid all that much attention to the Olympics since the original Dream Team.  It was a comment a friend of mine made on Facebook that drew me to skim some videos of the closing ceremonies, and my sadness was rekindled a bit.

My friend was complaining because she'd found out after watching NBC's coverage that the network cut Ray Davies' performance of "Waterloo Sunset" and a few other great moments to make room for promoting a new sitcom.  Not everyone was happy with the whole ceremony, as one blog suggests. (NBC has taken a beating in the twittersphere, but they did have the last laugh: ratings were phenomenal.)  As I went to YouTube to see it for myself, I could not help but smile wistfully, as Ray got out of that limo to roaring applause and sang the loveliest English ballad of the 20th century. (I use the term "ballad" in its traditional sense.) It was so sweet to hear the crowd sha-la-la-ing together, a collective celebration with a song all about loneliness.  And as I saw the clip of John performing "Imagine," I began to think about what the NYC 2012 Olympic closing ceremony would have looked like...

The video screen slowly fades in to an early sunrise.  And yes, it's Henry Mancini's score, minus Johnny Mercer's words.  And Audrey Hepburn gets out of the old yellow cab, right in front of Tiffany's.

Segue into Gershwin, and the black and white cinematography of Gordon Willis as Woody Allen gushes on about adoring Manhattan. 

Follow it with Bill Lee's jazzy score: Bed-Stuy in the heat, a slow mo of the fire hydrant cooling off the 'hood. 

Suddenly, the Bee Gees.  And Travolta in the white suit.  You should be dancing.

Finally, on the stage in Central Park, Paul and Artie sing a song about a bridge to Queens. Or maybe singing about counting the cars on the New Jersey Turnpike.  Followed by:

A nervous, twitching David Byrne suddenly appearing on stage: "this ain't no Mudd Club, or CBGB, I ain't got time for that now..."

A video montage set to Petula Clark's "Downtown."

Some version of the Drifters -- there are about thirty different groups out there touring under that name -- singing "On Broadway."

The gal from Brooklyn named Carole, singing that there's room enough for two up on the roof.


A fond video remembrance of Joey, Dee Dee and Johnny, performing "Rockaway Beach." 

John and John, once from New England, now the pride of Brooklyn: They Might Be Giants.  Let 'em sing whatever they want!

Lots of breakdancing.   As long as the phrase "Electric Boogaloo" appears somewhere in neon.

And then maybe Lou Reed gets everyone to sha-la-la-la to his great New York ballad: "Street Hassle."

Of course, none of that would have really happened.  Though maybe Bruce would have sung about being frozen out on 10th Avenue.

But at least I could have seen it all from up on the roof somewhere overlooking the brand new Gowanus Expressway.  The stars putting on a show for free.


Sunday, August 12, 2012

Negative ads don't work, but...

Because I live in a state that tends toward the same political party every Presidential election, I don't get to see that many ads for national office.   But I did live in a "swing" state for 9 years, and  pretty much every election at every level got nasty as the spring gave way to summer gave way to November.  So it was a bit of a throwback visiting my family in said swing state and hearing the attack ads against the two main Presidential candidates.   (I won't dignify either side by embedding the ads here.)

As I considered the ads, I did some quick checking for articles about negative ad campaigning and its effects on voter choices.  My special interest was in the most notorious ad from the first time I was eligible to vote for President, 1988: the Willie Horton ad.  This was an ad produced by supporters of Republican nominee George H.W. Bush that depicted Democratic nominee Michael Dukakis as soft on crime, symbolized by the furlough program Dukakis oversaw as Massachusetts governor.  The symbol of Dukakis's dangerous liberalism was convicted felon Willie Horton, who committed more horrific crimes while released from prison on a furlough.  The ad notoriously preyed on White America's fears of Black crime.

But most of what I discovered in my quick research is that the claims that the Horton ad doomed the Dukakis campaign are very much myths, that Dukakis had been losing ground to Bush before the ads came out.  Some further research via academic databases uncovered a number of studies that demonstrate that negative ads have no effect on voter selection, that at most, they merely discourage participation in the electoral process by creating an overall mood of pessimism that "they're all no good, so why bother going to the polls?"

In many respects this makes sense: when I see an ad attacking a candidate I support, I tend to tune out the negative message, dismissing it as propaganda.  And I think most of us are like that: we've already made up our minds, at least in a general sense, about our political values, so ads are not likely to sway us.  Although some voters always tell pollsters they are undecided, there is still no evidence that enough of them are swayed one way or another by an ad campaign to change the fate of an election.

So why does the negativity continue?  Why spend all this money tearing down your opponent if it doesn't translate to more votes? 

I suspect that the medium has a lot to do with the continued presence of negative ads on our TV screens and radio speakers.  These media thrive on drama and conflict.  Angry ads get our attention, no matter the context.  We can say we hate ads all we want, but we still watch them.  ESPN Radio talk show host Colin Cowherd often makes this point on his program: everyone says they are tired of hearing about this or that star, but the numbers don't bear it out.  When his show talked about Brett Favre's final years (leaving, not leaving, retiring, not retiring), his numbers were up.  Everyone loves to talk about a Cinderella story at the NCAA basketball tournament, but when tiny George Mason made it to the Final Four, the ratings for their game were the lowest in years.  We can say we love stories like George Mason's, but for the big games, we want to see Duke, UConn, Kansas, and Carolina.

Still, I wonder if the point of negative ads is not to change anyone's mind at all.  The ads contribute to the overall structures that govern the society.  They help reinforce a lot of our perceptions about the process, and help maintain the status quo, no matter who wins.  Would the lives of Americans have been measurably better if Michael Dukakis had won the election?  With all respect to those who support the Democratic party, I'm not so certain.  What the ads succeed at is contributing to a common sense of our political system, making it seem as if it were some "natural" thing like the weather, when it is of course a careful, elaborate construction.

In such a system, most of us will lose.

Friday, August 10, 2012

Wikipedia in convenient book form? That'll be $47, please!

I was at the Barnes and Noble web site looking for a cheap widescreen-format DVD of James Mangold's Walk the Line.  Being in a hurry, I just typed the title in the search line without specifying format, so I got DVDs, Blu-Rays, books, etc.  And here was an interesting title: Accolades Received by Walk the Line.  Curious, I clicked on the title.  Here's the cover:

Accolades Received by Walk the Line

See that in the corner?  It reads: HIGH QUALITY CONTENT by WIKIPEDIA ARTICLES.  Yes, that's right.  The entire book consists of material from Wikipedia.  How many pages' worth of Wikipedia articles are in this book?  80.  How much does this publisher charge for a collection of Wikipedia articles?  $47 US.  Yep, that's right.  That's nearly sixty cents per page, and even more once you figure that some of the pages actually don't have content.

Now, Alphascript Publishing, responsible for this and numerous other books containing Wikipedia content, has been around for a while.  I'm late to the game on this, and here's a nice little rant from three years ago about it.  Nevertheless, it was news to me: what a stroke of evil genius!  Take content that's free on the internet, bind it into a book, and sell it for a ridiculous amount of money.  (Did they come up with bottling water, too?)



Thursday, August 9, 2012

Sight and Sound's 50-Greatest List: Boys' Club?

Last week the noted British film magazine Sight and Sound published its decennial 50 Greatest Films of All-Time list.   The magazine has done this every ten years since 1952.   The list is determined by a critics' poll; the magazine often includes a directors' poll and readers' poll, and in this issue, lists from individual contributors. 

The big talk about this year's list is that for the first time in the poll's history, Orson Welles' Citizen Kane (1941) didn't finish on top.  It now sits at #2, behind Alfred Hitchcock's Vertigo (1958).  But in a few places around the web, we find a different conversation taking place.

If you click on this link, you'll be taken to the introduction to the list at the magazine's web site.   See the cartoon there?  Okay, for those who didn't link, here it is:



From left to right, with their films in the top 50 in parentheses: Stanley Kubrick (2001: A Space Odyssey, #6 on the list); Federico Fellini (8 1/2, #10, and La Dolce Vita, #39); Orson Welles (Citizen Kane, #2); John Ford (The Searchers, #7); Alfred Hitchcock (Vertigo, #1, and Psycho, #35) 

The cartoon serves as an indication of the obvious: the list compiled by the magazine's critics is a guys' list.  Of the fifty films in the poll, only one was directed by a woman: Chantal Akerman's Jeanne Dielman, 23 quai du Commerce, 1080 Bruxelles (1975).

Alyssa Rosenberg, writing in Slate, notes that this should not come as a surprise, given the astonishingly low representation of women directors in the major studios in the US and world-wide.  But veteran critic and current Yahoo blogger Thelma Adams makes a strong point about the poll's "inflammatory list": since it is "culled from a predominantly male, older-skewing clan of cineastes," of course it's going to have a bias.

There have been many excellent films directed by women throughout film's history, as Adams points out with a quick list of her own.  And certainly some of them could have been included on the Sight and Sound list.  Rosenberg is optimistic that as more and more women make films, the works of contemporary directors like Kathryn Bigelow (The Hurt Locker, 2009), Jane Campion (The Piano, 1993) and Lynne Ramsay (Ratcatcher, 1999) will appear on this list -- as might those of older generations like Lina Wertmuller (Swept Away, 1974).  (It may also happen as some of these older critics, well, die off.)

But pointing out the bias of the critics' poll should not be about guy-bashing.  What's at stake is who gets to decide what goes in the canon of important films.  The Sight and Sound poll isn't the only locus of canonization.  The U.S. Library of Congress honors and preserves films every year for their historical importance.  The Oscars are also a form of canonization.  And so is academia, where I work and where I have been on the front lines of canon debates in film and in literature.  (I know more than a few colleagues teaching in English departments who don't teach any literary works by women -- they didn't learn them when they were undergrads, didn't study them as doctoral students, and have little or no interest in teaching them today.)

Obviously, I have my own prejudices that are reflected in my syllabi.  As a professor, I'm inclined to teach what I know best, going back to high school.  But one of the things I learned as an undergrad was that what we learn and how we learn each has a political element, deliberate or not.  When a list like this is so lopsided in terms of gender representation -- to say nothing of racial representation -- we owe it to our students to have a dialogue about it, to encourage them to ask: why?  Especially when the film on top is arguably one of its director's most misogynistic films.  (I think it's second only to Marnie.)  It's not about tossing out this or that film from a list; it's about examining how one comes to make a list in the first place.




Monday, August 6, 2012

My Blog was plagiarized!

All my course syllabi now contain a clear policy on plagiarism.  I quote the Student Handbook or the current course catalog, where the term is defined and the punishments are described.  Usually the rules offer a range of possible punishments, from failing an assignment to expulsion from the school; this gives initial discretion to the faculty member.  (Expulsion hearings usually require the Dean to petition the administration to convene one.)  My usual rule is this: if you are found to plagiarize, you fail.


I tell all my students: if it's easy for you to find something on the web, it's even easier for me to find it, too.  (Not because I'm smarter than they are, but because I'm putting in a specific phrase from a suspect paper, so I get exactly what I'm looking for right off the bat.)  That doesn't stop students, of course; a case has just come up for me, with the usual results.  But it got weird for me last year.

I do have a few articles published in scholarly journals, and it would not be too difficult for students to find them via databases like Academic Search or JSTOR (which they have free access to via the library).  Of course, plagiarizing the professor's own words would probably be a brazenly stupid thing to do.  But what if you don't know it's his/her words you're stealing?

Back in 2011, I discovered that a post I'd written for a long-deleted blog was for sale at about a dozen different "free term papers"-type sites.  Of course, not every site is giving my post away; some are selling it for as much as fifteen dollars.  And no, I don't see any of that action, nor would I want to cash in and allow someone to steal my ideas for a good grade.

Is it worth my effort to try and have this essay removed?  Probably not.  Somebody else can pass it along, if they've used it, and it might take me too long to prove that I was the originator of the essay.  But what I can do is publicize the fact that this essay is mine, in the hopes of scaring people away from using it.  In my own courses, I post links to the various pages where this essay is available, and I tell my students that these sites are stealing my ideas, and so are their clientele.  The fact that I know this goes on makes the students aware that I am aware.  Just that fact helps reduce the problem, the way seeing a cop car on the highway makes everyone slow down to the speed limit.

I can also post something on some of the various academic listservs that I belong to, and post at the many online forums of the professional societies I have joined, telling people that if they come across an essay with the same title as my post, it's a case of plagiarism.  It's much easier to combat this sort of thing with more information, rather than to attempt to suppress it.

The name of the post?  "School of Rock: Selling it to the Man?"  Go ahead.  Search for it.  Yep, I wrote that about seven or eight years ago.  It is pretty damn good, though not laden with academic jargon and maybe not apropos for a film class assignment; it's more or less a review.  But it's out there, and it seems that anyone can pass it off to their teachers and professors and claim it is theirs.  The best that I can do is raise hell about it without reaching for the phone and calling my lawyer.

Sunday, August 5, 2012

Who complains? Who cares? Well...

Read an interesting Times interview with Chris Rock today.   It's hard to take Rock seriously when he claims that Grown Ups is a better movie than The Artist, but I get his point.    What interests me are his comments concerning how everyone can tweet their outrage so easily when someone says something offensive.  

Q. On July 4 you tweeted: “Happy white peoples independence day the slaves weren’t free but I’m sure they enjoyed fireworks.” Were you surprised at the outrage that stirred up?
A. That’s the kind of joke I would have told on Letterman. We just live in a world where the audience gets a say now. My actual belief? Only fans should be allowed to criticize. Because it’s for the fans. When I hear somebody go, “Country music [stinks],” I’m like, well, country music’s not for you. You’re just being elitist. Only a fan of Travis Tritt can say the record [stinks], because he’s got every one. Same thing with jokes. You’re a fan of mine, that joke’s not even a single, it’s a B-side that never gets released. It’s no big whoop.
Q. Whether it’s your tweet, or Daniel Tosh joking about rape, or Tracy Morgan saying he’d kill his son if he came out to him, does it seem like the Internet is just adding more fuel to these fires?
A. Are they real fires? Or are people just reacting to something? Just because there’s an alarm going doesn’t mean it’s a fire. And I think that people are confusing the two. It’s only a fire when it offends the fans, and the fans turn on you. Tosh has fans, and they get the joke. If you’ve watched enough Tracy Morgan, you let the worst thing go by. When did Tracy Morgan become Walter Cronkite? You have to mean something to me to offend me. You can’t break up with me if we don’t date. 

Twitter is a locus where people complain about someone saying something stupid, like Ashton Kutcher's (literally) ignorant remarks about Joe Paterno.  But Rock questions its relevance.  I'm not sure he's entirely correct, but my first thought was Rush Limbaugh.  This is a guy who says things that offend a lot of people -- but his core audience remains loyal to him.  One of his jobs is to offend liberals.  He doesn't have to care if they don't like him.

Late tonight, I came across this piece in Yahoo News.  The satiric newspaper The Onion, in their weekly news roundup video, ran a story about "Sears extremists" planning to fly a plane into the Willis Tower, which until not too long ago was called the Sears Tower.  The video features an image of a plane with the Sears logo on it just moments away from crashing into the building.  The humor of the piece is rooted in the absurdity of extremists from a department store, and the absurd resentment the company must feel for having had the building's name changed.

However, the image so strongly resonates with the very recent memory of planes attacking New York and Washington on 9/11 that it's easy to see why the social mediasphere found the image in very poor taste.  (I certainly did.)  The Yahoo story reported over 4,000 posts in response to the image on the paper's Facebook wall; that number has probably grown since then.

The article quotes a few of the posts, and what they suggest to me is that Rock's argument seems to hold up: the most insightful posts are written by people who genuinely love the paper and who still feel that it crossed the line of good taste.  Consider this post:

"I've been an Onion fan for years, but I think this crosses the line," Kelly Davis wrote. "I still love you though, like a child that was really really bad, like got arrested for pot bad. But seriously, I would not like seeing something like this again please."


While some may claim they will never read the paper again, I doubt that to be the case.   That doesn't mean the paper will avoid ending up in the media trash-heap, but I think that some of the twittermoaning is just that, noise, static interference.   If the core signal reaches the desired audience, the medium will remain strong, be that medium a comedian, movie star, or newspaper staff.  

Saturday, August 4, 2012

...and speaking of instant gratification...

My kids just got back from camp today.  They were away two weeks, and they seem to have had a pretty fabulous time.   I'm sure they'd like to show us pictures of the good times they had, but that's going to have to wait.   About two weeks in fact.   Whaaa?

The official camp rules prohibit the kids from bringing digital cameras (and many other types of electronic devices).  So we sent ours off with disposable cameras from the drug store.  I was stunned to discover, when I brought the cameras in to be developed, that the store, a major retail chain, no longer processes roll film; they send it to the manufacturer, where it is developed and sent back.

Such is the fate for many who use roll film, although for the professionals who still use it, there are plenty of labs, especially in a big city, where they can get their work done right and fast.  Now, some major drug store chains will still develop roll film and offer you one-hour photo services.  And big warehouse stores like Costco do it, too.  So all is not totally lost.  But for how much longer?  Let's face it, even people of my mom's generation (to give you a sense of her age: she voted for Nixon three times) use digital cameras these days.  (Of course, they also remember when you'd drop your roll of Instamatic film off and wait two weeks to get it back...)

What's really annoying is that some kids at camp did in fact bring digital cameras!  We could have seen pictures instantly had we done the same.  But no, we followed the rules, and now we're going to wait till summer's almost over to see all the fun.  It's just so funny that in an era when stuff like camp pictures can go viral in an instant, I've suddenly re-entered the 1970s.

Friday, August 3, 2012

Angry you can’t watch your favorite Olympic events live? You’re probably too young…


NBC can’t make everyone happy, so... 

Okay, that was a story from the Onion’s web site, but I’m sure some execs probably feel that way about the complaints people are making that the network is not giving them some of the major events live.  Even funnier are the people who tweeted their complaints about NBC reporters tweeting event results because they didn’t want to know them before watching them later.   Let me get this straight: you’re using Twitter, a font of instantaneous information, to complain about receiving instantaneous information.   Okay then.

Complaints aside, NBC’s numbers have been pretty excellent, live broadcasts or delayed.   Indeed this is why a noted match between Michael Phelps and Ryan Lochte was broadcast in prime time, hours after the event actually took place in London: there are more eyeballs still watching their TV sets at home than are using their mobile devices.   The diehard fan of a given event – swimming, gymnastics – is probably diehard enough to watch that event via live streaming.   The more casual viewer is more likely to be drawn in by the stories the network presents as they rebroadcast coverage from earlier in the day.   And the Olympics are pretty unique as sporting events go; we don’t necessarily feel that knowing the result ruins our experience of watching them afterward. 

Part of that’s because the Olympics have so many individual sporting competitions, in contrast to the dominant television sports that are team-oriented.   It’s easier to be drawn in to a compelling human story, like that of Kayla Harrison, who survived sexual abuse to become the first American to win gold in judo.   Part of the appeal is also national pride.   The most dramatic Olympic story of my youth was the Miracle on Ice, and that was broadcast on a tape delay.  I knew the U.S. had beaten the Russians, but it was still an unbelievably exciting thing to watch.   (Things like that only happened every twenty years, you know!)

I am of the last generation of viewers who grew up with only three commercial broadcast networks.   Many events I saw via programs like ABC’s Wide World of Sports were tape-delayed.  Indeed, there was a time when NBA Finals games were shown on a tape delay.  (And I lived on the east coast.   For those living out west, a LOT more stuff was broadcast on a delay.)  Yes, it was also easier to avoid news about “the game” before you got your chance to see it, but it wasn’t like the local media were going out of their way not to report the scores.   So people my age don’t have as big a problem with NBC’s Olympics coverage as the young’uns do.

Having said all that, there’s no question that with increased mobility and “on demand” entertainment, the younger generation of Olympics viewers will have less tolerance for watching tape-delayed track and field events, or for that matter speed skating.  (The next Winter Games are in Sochi, Russia, so NBC better be ready for angry tweets in two years.)  As information has become increasingly mobile, the viewer is not going to need to come home in the evening and watch events he/she has already watched during the day.  (Of course, worker productivity might be down a lot as more and more of us do that.)  As that time comes, the networks that broadcast the Games will have to find more innovative packaging to keep that audience interested – or just hope that people of my generation live a lot longer.


Wednesday, August 1, 2012

Blogging, or, as we used to call it: TALKING TO YOURSELF!

Yes, before people had a place to express their thoughts, they just said them out loud, and we looked at them on the street and thought they were pretty weird -- unless we were the ones talking to ourselves.

This blog begins as part of a process growing out of my teaching a course on what is being called by some the "new new" media.  I have worked on blogs before, even got paid to write one for a short while, but I wanted to commit myself to writing again and to working with the various new new media in preparation for working with a group of 18 year olds who probably are much more proficient than I am tech-wise.  I'm pretty savvy, though, a quick study, and it's more a matter of remembering where all the controls are than one of "how do I work this?"

I wanted to give myself a full month to get moving, and so here it is August 1, and it's time to get started.  At the very least, I wanted to set this up, announce it, and see how it looks out there. 

Much of what this is will be what might be called "cultural commentary."  It's what I do.  I'm an academic.  (My name's Friday.)  I study popular culture.  I consume it mindlessly, I reflect on it thoughtfully, the whole gamut.  This blog will likely contain some lecturing, some arguing, a spot of ranting, and, when I run out of my own ideas, some linking to other good stuff out there in cyberspace.  Oh, and of course, lots of talking to myself.

Feel free to listen in.