Monday, November 14, 2016

Thunder-"Struck"? Or Just Aww-"Struck"?


A brief shoutout this week to the newest struggling major-player in the Streaming Subscription-verse--
Filmstruck, a new service that debuted on Oct. 19, finally made its move toward being a professional service this past week, by officially cutting its old exclusive ties with HuluPlus.  It's now a new monthly subscription-streaming film service on its own, picked up by the doctor and slapped on its hinder, and ready to be cord-cutting viewers' latest monthly a la carte television substitute.
(The mission statement by co-partner Criterion:  FilmStruck launches October 19, Criterion.com)

I'm not paid to promote it; in fact, I'm still the grudging skeptic who thinks it's not quite what it's cracking itself up to be yet, until all the kinks are ironed out...BUT, knowing where it comes from, I'm still willing to give it the benefit of the doubt, and push that doubt on others.  The service is still a desktop/tablet service, although in the rush to launch, better apps for the living room are still in development--We're promised AppleTV in December, and Roku boxes, Chromecast and Xbox/PlayStation game-console streaming apps "early in '17".

For all the misleadingly gratuitous, snooty "mission" press about "Bringing independent films to the public" (we're already drowning up to our online eyeballs in "independent films", thank you, and not good ones, now that the studios have taken the mainstream films away from Netflix and Amazon Prime), we're getting this for a much simpler and more realistic reason:
Criterion and HuluPlus used to have a partnership, back when Hulu Non-Plussed was still a struggling, mostly public-domain streaming service, and was grateful for some actual exclusive deal (in addition to a short-lived deal for Miramax films) that would give them some competition with the Instant Netflix titan.  Well, that was then--Hulu is now its own corporate giant, has since folded its modest old free-with-ads desktop roots to become the nation's Network-Rerun Binge-Addiction on Roku's and smart-TV's, and partnerships are a little more of a serious negotiation.
Turner Classic Movies tried to branch out with its own streaming service, but like most private channel-streaming, e.g. HBO or Disney Channel, never got past a few desktop/streaming apps, and still required a premium-cable subscription to the network...Which was becoming more of a self-defeating purpose now that viewers were using streaming as an excuse to drop their cable services.  The new mission was to partner the two streaming orphans to create a TCM and Criterion service, although the "and" is still a bit forthcoming in the works.

New subscribers are offered a choice between monthly streaming of the "Filmstruck Collection" at $6.99/mo., or an upgraded "The Criterion Channel", including Criterion's collection of rare classics and Janus foreign films, for $10.99/mo.
At the moment, I don't quite know what the difference between the two is, as TCM won't be involving itself for another few months (until it can fold its own service)--And apart from a few rare, inexplicably random films that used to play Hulu's otherwise-empty Movie page, and might or might not show up on Criterion's label next year (Mad Max?  Shakespeare in Love?  On a Clear Day You Can See Forever?), much of Filmstruck's catalog as we speak seems to be just a more limited selection of what's already on the Criterion Channel page.  If that's what you get for your ten bucks, well, that's as much as most already pay monthly for Netflix, and I can GUARANTEE you'll find more actual vintage movies.
Those with a shelf of Blu-rays already know the Criterion name, and buying a Criterion disk to some fans was like buying a rare wine bottle at auction, whether you'd ever drunk that vintage or not.  Those who don't know, oh, come out from hiding under the bed, it's not all Ingmar Bergman and Max Von Sydow grim-reapers playing chess.  (Although they do have a bad habit of forcing "The Seventh Seal" down our throats.)  Criterion's mission is to create a sort of "in-home film school", with all the titles your professor might make you write end-of-term essays upon, and takes a wide intuitive sweep of what films meet that bit of history--We get almost the entire filmographies of Bergman, Federico Fellini and Akira Kurosawa available, right next to David Cronenberg's "Scanners" and "The Brood", the comedies of Charlie Chaplin and Harold Lloyd, "Time Bandits", "Gimme Shelter", and even the Beatles running through the train station in "A Hard Day's Night".  If it's classic--and more accurately, if it's had an arthouse restoration in the last twenty years--it's probably on Criterion disk.  

Just to go off on a nostalgic side note:  When I was in college, movie study WASN'T in the home.  If you (like me) took a semester of film at NYU or Boston's Emerson College, it was like signing on to some mad-monk discipline, and you did it because you were mad and monkish enough to--They didn't give you a syllabus of DVD's, or even VHS tapes, to look up at the college library, for the simple reason that they couldn't back then.  It was the early 80's, and if you had to watch a film for that week's lesson, you sat in an audience every Tuesday and Friday and watched it in a theater.  Fortunately, the college had its own private screening-room theater to watch it in...That was the cool part.  (Oh, you chose a Friday-afternoon class, if you could help it.)
Remember when you were in sixth grade, and the substitute teacher just turned out the lights and put on a video instead?  That wasn't a class, that was a reprieve, a period you sighed with relief at getting to blow off.  To go to a film class, you sat in the private campus screening room with your classmates (the newer big spiffy universities would have their own big stadium-seated theater, but small enough to be private; the one we had at Emerson was an old Boston-antique room on the third floor fitted for old wooden theater-seats), and watched a big movie screen on the wall, where film would be projected.  Yeah, watching movies for our credits, we thought, we're doing something right--And some of us had actually taken the major or minor not to be lazy about it.  We knew we'd be there to watch the WHOLE movie, no switching the channel or bathroom breaks, because it was a movie...And as such, we were all conscious we'd have to watch Dr. Strangelove or Mean Streets or La Strada knowing we smart cool people would be asked to analyze it later for what the professor would insist we learn from it.  Ohh, sure, sure, we'll be analyzing it, but right now, we're having fun being a room full of Smart Movie People, with no disrespectful cellphone idiots who don't understand the true faith.  It became a sort of collective attempt to see who could foster a better hive-mind sense of Movie Smartness.
I haven't been back to film college since then, so I don't know if the technology is still there since the DVD and video projector, but it instilled in me the last great generation of the Cool Audience:  If you're going to watch an old movie right, watch it in the dark holy padded-seat temple with thirty or forty other people who are there to improve themselves with it.

Of course, that was then.  Like the Millennials say, we had to.
Nowadays, Film Study has pretty much become a correspondence-course:  You take your disk home and watch it, and write your essay later.
If that's the field we play on now--and if streaming makes the school syllabus as available in the home as Blu-ray disk does (not a replacement, just a free sample!)--it doesn't make it any less of an in-home course.  I know I'm probably losing a few people here by saying "Try an in-home study course in the classics!" and making it sound like online Phoenix University, but it's really a lot more fun than that.  It's still movies, after all.  You're still in one big, big campus screening-room theater, just that you can't see the other people in the audience.
If you see a classic-movie press photo for Filmstruck's Criterion movies, there's a 2 out of 3 chance it'll be from either A Hard Day's Night or The Seven Samurai...Well, there's a reason for that:
Some classics aren't just classics, they're also GOOD--And they'll leave you a lot more energized in your seat than trying to punish yourself binging the third season of Daredevil.  If you've ever met a foreign-movie buff who tried to get you to watch Da Classics, he's probably tried to sell you on at least one Kurosawa film to start off with.  It takes only one viewing of Samurai, Yojimbo, Sanjuro, Throne of Blood (the Shakespeare one) or The Hidden Fortress (yes, "the Star Wars one") to convince yourself that other countries with other cultures and histories knew how to make "real" genre movies--yes, with action, not just standing around to make arcane visual symbolism--fifty or sixty years ago, the same as we did in commercial Hollywood...Just that they were there, and you were here, and you probably didn't find out, is all.
And as we already mentioned a few months ago, even a Silent or two, courtesy of Chaplin, Lloyd or Fritz Lang, won't hurt either.

Cutting the Cable Cord can mean a handful of monthly streaming subscriptions, and even if having only two or three (with the freedom to drop the services you don't watch) can feel like too many, at least you get some actual CHOICE for your "New online revolution of choices".  You can watch a film that people before you have heard of, or you can chat online about how a season binge of Stranger Things is the coolest thing on Netflix right now, just because it's the only thing you can find on Netflix right now...Your call.
Education begins with curiosity.

Sunday, November 6, 2016

Election Edition: I'm the Movie Activist, and I....oh, YOU know.

Just taking a break from Movie Activism to encourage the rest of the activists out there:

Get out and vote.  
Nobody--I repeat, NOBODY--will give rattus tucchus uno if you try to show off your Righteous Anger With the System by saying "I don't like either of them, I'm staying home, so there!"  That's like protesting the Atlantic Ocean by refusing to drink a glass of water.  
Seriously, grab a twelve-foot Olympic vaulting pole and get over yourself:  If you're going to be a smug self-righteous jerk, at least be a jerk and do something that will have some actual effect.
(Like the days of folksy, outhouse-nutty ol' Ross Perot, many people right now are making big shows of singing the praises of Libertarian Gary Johnson or the Green Party's Jill Stein without really having a clear idea of their respective political platforms, just to express their own tantrum at having to choose between Clinton and Trump.  Regardless of their relative chances, voting Third-Party without knowing the platforms, for no other reason than to thumb rebel-patriot noses back at the established Two-Party candidates, is a bit like the high-school girl who dates the class nerd just to get back at her boyfriend:  It doesn't improve the relationship, it gives the wrong guy a lot of false attention she'll regret later, and it's just not worth the date itself simply to prove some point.)


There's a reason it was called a Secret Ballot when it was written into our system, and why they put dividers and curtains up at polling stations:  
Nobody is going to think badly of you whoever you vote for, because nobody is going to KNOW who you voted for.  Nobody can tell you before you do, nobody's in there with you while you do, and nobody cares after you do.  Trump aside, you're not staking your immortal soul for now and all time on a stand for Good vs. Evil, or Patriarchy vs. Feminism, or Team Iron Man vs. Team Cap, you're just participating in a democratic political rotation that takes place every four years.  
And we've had Family Guy on TV for seventeen years and survived, you'll live through four.

We've got it easy in our country.  Our politicians aren't like Hollywood studio executives, in that we can vote them out of office as easily as we can put them in:  
No politician brags about being a "shark", or that they live by the laws of their own "jungle".  
No politician is so dismissive of domestic trade that he shrugs "Hey, if US customers won't buy our goods, just make all our industries sell to China, they'll buy anything."
Our politicians are not self-styled gangster-boss dictators who set themselves up in private dominions answerable to no one, free to use as much propaganda and cooked numbers as they can spin to convince the people of what they "should" think, and who gloat over their elite status far above us poor five-figure-salaried peasants, and their ability--their traditional pride and duty, in fact--to lie, cheat and hustle their rivals for personal marketplace gain as they kill-or-be-killed to keep their jobs.  Okay, Donald Trump, maybe, but that's because he has the same corporate business-CEO background as said studio execs, and doesn't know how our little "democracy" thing works...Where those in charge don't get to indulge themselves, and do have a two-hundred-year-old higher political authority to answer to for their actions.
And in politics as well as business, what you don't know can hurt you:  In the end, democracy gives everyone a voice, and either way, someone's going to get a rude awakening in their ear.  All it takes is for one person to speak up, and enough One Persons to be a People.
If we can stop one politician's dream of building a wall, we can stop a studio's dream of building one movie into an eight-film "Franchise".  If we can stop a war from killing innocent people overseas, we can stop the war Warner is waging to kill innocent Blu-ray disks. If we can change the problems in our country, we can change the problems in our movies and TV shows...How hard could it be?

But while we're here, in the interest of TV/movie/sitcom clichés, I also put that cartoon at the top for a reason--
There are three widespread mainstream pop-culture annoyances that will make me froth at the mouth...Okay, FOUR, if you count people who spell "Santa Clause" like he was the title of a Tim Allen Disney comedy (it was a clause in a contract, people, rent it!):  
The first is kids who still confuse "Loose/Lose" into adulthood, the second is people who put a comma in Shakespeare's "Wherefore art thou Romeo?"  (Hint:  Look up "wherefore" in Webster's.)
And the third is would-be current-ref comics and hootsters who try to joke about quoting "...And I approve this message" from political ads in everyday conversation.  Nothing creeps up my spine like a much-needed Cliché-Buster.


Think we've got the "Ugliest presidential campaign in American history" right now with Trump v. Clinton?  Well, we'll probably make the top three in the next historical revision.
At the moment, the top prize is hotly debated between John Adams v. Thomas Jefferson, 1800, which basically caused the first US two-party political split when a debate over a central Federalist Constitution versus the earlier Democratic-Republican idea for states to mind their own businesses turned too personal and made the two lifelong friends mortal political enemies...And in the other corner, George HW Bush v. Michael Dukakis, 1988, when nervous Republicans, about to lose eight years of Ronald Reagan to the two-term rule, refused to give up their "Republican Camelot" without one ugly schoolyard fight, and were determined to grind the intellectual Democratic Massachusetts governor into paste under their heel.
That was the election that created more slang terms in our campaign culture than any other (quick, who said "Senator, you're no Jack Kennedy"?), most of them referring to aired attack ads, and all aimed by Republican strategist Lee Atwater towards "that little midget":  "Willie Horton".  "Revolving door".  "Boston harbor".  "Vacuum cleaner".  Bring back memories for some of those older folks out there?  For those who weren't there, ask your parents for definitions.  I could explain them for the young folks, but the tastelessness and implied racism of some of them might be unsuitable for this blog.
And, okay, "Vacuum cleaner" wasn't one of theirs--That was the election we saw the rise of the Third-Party Attack Ad, which could be produced by private organizations, like a PAC support group or the Religious Right, and express a little more personally questionable anger than the candidate himself would want his committee to be associated with, but which the candidate could still stay above and say "Not mine, folks, don't look at me."  It started to become just a little too convenient an excuse, particularly when it was hard to determine whether those "crazy loose-cannons" had been acting on under-the-table orders or not.
The use of nameless drive-by attack ads with or without the sponsoring candidate's name, face or participation had reached such proportions in state and federal elections that by 2002, the Bipartisan Campaign Reform Act, designed to close loopholes in campaign funding, also created the Stand By Your Ad provision, which required the candidate to appear onscreen (or on mic) in any TV or radio ad produced by his own campaign committee, and say so.  (The Internet is still exempt from the rule, and frequently exploited.)  The idea was not only to help identify the "real" attack ads from the "fake" ones, but also that no image-conscious candidate would want to be seen next to his own genuine schoolyard attack ad, and thus make fewer of them, or at least keep the tone a little more civil--And if they did personally "approve" (= authorize) their own attack ad against their opponent, that told you everything you needed to know.

Ever wondered?  That's why.
When a candidate in a political ad says "I approve this message", he is not--now, let's repeat that very slowly and clearly so everyone can understand it, he is NOT--trying to annoy you by chirping "I liked it!", nor is he some vain idiot saying "Dang, I'm good!...See how good it makes me look?"
He is complying with the election law and validating his own campaign committee's ad with his legally-required disclaimer stamp of APPROVAL.  Get it?
Adam?  Jamie?  I'd call this cliché "Busted".

Sunday, October 30, 2016

A Flop is Not a Disappointment (and vice versa)


Almost twenty-five years ago to the day (I wanted to save it for the November anniversary, but there were too many other good topics, and it's a quiet week), I was browsing magazines at a Barnes & Noble, flipped through the Entertainment Weekly that week, and stumbled on one of the most elusive, unexpected lightning-bolts of accidental GENIUS I had ever read about movies.  I don't even think the columnist quite knew what he'd hit upon either.
But twenty-five years of quoting and expanding upon the theory later, it was probably one of the first great influences that set me on the road to Movie Activism, and not following the crowd of entertainment-newsthink.  It was like the movie-nut equivalent of the Theory of Relativity.

Video columnist Ty Burr (now critic for the Boston Globe) had been handed the home-theater review for 1991's "Hudson Hawk", only a few months after it'd become one of the biggest theatrical money-losers to date, and since he hadn't already seen it, tried to find a "hook" that would liven up the column.
Trying to put aside the cheap "so bad it's good" angle--which it WASN'T--he asked the simple question, is a Flop a "flop", and if so, how do you know for sure?  Maybe it's just "misunderstood"?
For all those years, until the miracle of Google, the Internet and magazine-website archives, I thought I'd never find it again.  And now you can read it too:
EW.com - Video, "Hudson Hawk", November 22, 1991

A quick summary for the Too Long, Didn't Read crowd:
A "Box office disappointment" is an otherwise reasonably watchable movie that didn't fare well for reasons beyond its control--bad timing, poor marketing, being put up against tough-competition weekend, etc.--and might be rediscovered later on video.
A "Flop" makes its own mistakes through bad creative decisions at the highest production level, and has no one but itself to blame.  And because they have a bad habit of making the same mistakes over and over, you can judge a Flop by testing it against the bad decisions made by other more famous established Flops:
  • "The Howard the Duck test:  Is the movie’s very concept ridiculous?
  • The Heaven’s Gate test:  Was the director given insane license to splurge?
  • The Leonard Part 6 test:  Is it vanity fare from a star no one dared say no to?"
And these were just the flops we knew of from the 80's to 1991, folks.  Look back at the summer of '16 and consider, how innocent were we thirty years ago?

For some reason, I remembered the article as being longer.  Over the years, every time I quoted the article, more 80's "tests" seemed to crop up in my recollections of EW's '91 column, like:

  • The Dune test:  Was the absolute wrong/unsuited director chosen by the studio for the genre?
  • The Annie (1982) test:  Were large portions of the budget spent on opulent set details that would spend little time onscreen?
  • The Ishtar test:  Did the studio put too much faith that star-value alone would rescue a weak script?
I somehow remembered the Heaven's Gate test as the "set detail" one, and the Howard the Duck test as "Was it made by a director who was his own executive-producer, and no studio had higher control over creative decisions?"...Or maybe that was the Willow test.

Moving on from the innocent screenwriter-80's into the corporate-franchise big-studio 90's, 00's and 10's, I was able to apply other more specific tests, like:
  • The Godzilla (1998) test:  Was a familiar property handled in a willfully wrong or inaccurate tone compared to what the core audience expected?  (See also:  Batman & Robin, Dark Shadows, Green Lantern, Jem & the Holograms)
  • The Lone Ranger test:  Was the studio/director so confident that a hit director was reuniting with his star from a previous hit, they tried to change the new film to incongruously copy the earlier one?  (See also:  Battleship, The Wild Wild West)
Looking back at Michael Lehmann's work on Hudson Hawk, we can even today say it chiefly failed:
  • The Green Hornet test:  Was a major big-budget studio genre film for a wide mass audience instead given to a director of small, quirky cult films? (See also:  Fantastic Four (2015), The Dukes of Hazzard)
The very definition should be in the name:  
A Box-Office Disappointment raised your hopes about it, and circumstances disappointed you.  
A Flop, onomatopoeically, trips over its own feet.

Last summer we had a bit of confusion with two of Disney's underperforming movies, which was already news considering they had a monopoly on almost every other hit film that year:
July's "The BFG", directed by Steven Spielberg, had an almost non-opening in fourth place, followed by August's update of "Pete's Dragon" which struggled in third behind two front-loaded cult-films before disappearing without a trace.
In entertainment headlines, that's the stuff that gets reporters to pass the popcorn--The need to validate all good and bad box-office figures as "true", as obviously being the movie's own fault, had analysts dancing around the fires.  And when Spielberg's "BFG" inexplicably did poorly, it was a time for vanity-bonfires and crackpot theories.
Variety and other industry sites jumped on the unexpected headlines with bloodlust claiming "Spielberg can't make a summer hit anymore!  Is his career over?"  (Hey, got a little drool there, might want to...)


It's not a bad film, actually.  In fact, it's rather cute:  Spielberg had wanted to film the script adaptation by "E.T."'s Melissa Mathison for years, and the finished project suggests a labor of love, Mathison having passed away during production.  JK Rowling had once mentioned wanting Spielberg to direct the first Harry Potter movie, and here, she almost gets her wish--Spielberg gives Tom Hanks a rest, puts aside the Jewish/wartime agendas of his past few films, lets Mark Rylance as the title character transform Roald Dahl's nonsense-word jabberwocky into a natural North-country dialect, and turns the keeping-calm and carrying-on of British-fantasy whimsy up to eleven to give us, basically, the Early Harry Potter movie he never made.

So why did the movie do so badly?  Here's where we get into the theory of what makes a Disappointment different from a Flop:
First, it had just about the year's worst release date imaginable--Maybe Disney was modestly not expecting their own Pixar's "Finding Dory" to be the year's biggest box-office hit to date, but it certainly couldn't have helped Spielberg's film to be released two weeks later.  Many who saw BFG felt it would have done better later in August, when less competition in theaters finds it easier to attract parents with back-to-school kids, as well as end-of-summer audiences who've spent out the tentpole blockbusters and are willing to try something different.  Unfortunately, Warner's "Suicide Squad" had the same second idea, and Disney had tried to steer clear...Not to mention, they'd already saved that slot for their own other summer oddity.
Second, the marketing was difficult, as Disney was overestimating the children's-book literacy of the audience--Roald Dahl's book is a staple in England, but over here, not too many Yanks past their fourth-grade reading lists know their Dahl apart from classic Gene Wilder candymakers, giant peaches and smart telekinetic waifs. No one quite knew what a "BFG" was (a Big Friendly Giant, in case you're wondering), and groused that it was the movie's fault for not telling us in the trailer.  Thus, they wondered who the funny-looking character with the big ears was, since they had no idea that Rylance's CGI-enhancement had been crafted to resemble Quentin Blake's well-known illustrations from the book.
Do any of these complaints have to do with the movie itself?  No.  They have to do with the audience, the studio, and the pre-release marketing.  And when a movie unfairly suffers for the crimes of the audience and the studio, that's, well...DISAPPOINTING.

Disney's remake of "Pete's Dragon", although it doesn't quite extend far enough into the "mistake" territory of the Flop (at least not as much as the bizarre downbeat bait-and-switch of Disney's flop "Tomorrowland" did the summer before), did rather puzzle its intended audience:
An old-school singing-and-dancing 70's-Disney musical set in a 1900's Maine town was instead turned into a serious realistic TV-styled contemporary non-musical drama--with gripping action climax--with no resemblance to the sentiments of the previous movie, except for the baby-boomer money-title and central concept of a boy and his invisible dragon.  Parents wooed by childhood Disney nostalgia were disoriented, and kids who hadn't grown up with their parents' DVD's had to take a relatively generic movie at face value...Along with an even stranger all-CGI virtual-character than just an old grandpa with elephant ears.

Neither movie seems to outright fail any of the Flop Tests, and yet we're left with the sense that Dragon was the guiltier party of the summer--
Many Disappointments find amnesty years later on video, with their theatrical numbers long forgotten, while a Flop is when the central reasoning or appeal of the movie itself doesn't make sense, and causes every audience to ask basic questions any normal audience member would ask--Like, "The Lone Ranger rides a freakin' elephant??"
(Or, in the case of Disney's summer movie, "Who the heck ever heard of a FURRY dragon anyway?  Seriously.  That's like...no.  That's even stranger than the 'no musical' thing!")

The point is, every Flop and every Disappointment has to be taken on a case by case basis, and some even manage to have their "criminal records" cleared.  The trick is just in knowing which questions to ask.

Years from now, who knows, experience may provide us with even more new tests, like:
  • The Fantastic Beasts test:  Did merchandising shift and misdirect story focus to minor side franchise characters/details that held no interest for the main audience?  (See also:  X-Men Origins: Wolverine)
...but the theory remains.
The science of forensically testing our "flops" will help determine the Innocent from the Guilty.

Sunday, October 23, 2016

October 23, 2016 - 


I know I'm not the only one who's noticed, and neither are you.  Corporate Hollywood has moved away from the story-encapsulating images of 70's posters, or 50's monsters carrying the girl, toward a sort of generic portrait gallery in the 00's-10's, with the same pose meant to "ritually" identify each particular sub-genre.
It's already become a favorite viral joke among Internet wags on Tumblr and Pinterest, to put together Genre-galleries by poster image--Like the "Eye is the Window of the Soullessness" scifi/horror movie, or the "Back-to-Back 'Pretty Woman' Dueling Rom-Com-Couple":

It's not that complicated an idea--Studios, after all, have a jealous habit of wishing they were Somebody Else's Movie You Liked Better, and dressing up like their favorite role models.

But this blog isn't here to do viral jokes.  We're on a proactive mission.  Anyone can viral-gag by asking a question; the hard part is in trying to answer it.  If we know the cause of the disease, we're that one step closer to finding the cure.  (If any.)

If you want to find some root cause of the disease, think back to the exchange between Geoffrey Rush and Tom Wilkinson in Shakespeare in Love:
"But I have to pay my actors and author!"
"Share of the profits."
"There's never any." 
"Of course not!"
"I think you may have hit upon something..."
Actors today are a little smarter than they were in Shakespeare's time.  After Art Buchwald v. Paramount Pictures--in which Buchwald sued over his stolen story for 1988's "Coming to America", won, and then discovered that the #3 box-office smash of 1988 "hadn't turned a net profit" under Paramount's "Hollywood Accounting", and thus had nothing to pay out--actors know better than to ask their agents for a mere share of movie profits anymore.
What they now want are Points--Points that will establish their "sales ability" as recognizable studio stars, give them a share in the investment, a legal co-producer credit if they do share in the production, a profit off of any marketing of their character outside of the movie, and negotiable value as a Hollywood power-player staple that will allow them to leverage more demands on their next movie contract.
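If you've never seen how a #3 box-office smash can "lose money" on paper, here's a toy sketch of that Hollywood-Accounting arithmetic--with made-up round numbers and percentages, none of them Paramount's actual figures--showing how a studio can charge a movie its own fees until the "net" disappears:

    # A toy model of "Hollywood Accounting" -- hypothetical round numbers,
    # NOT any studio's real books, just the shape of the arithmetic.

    def net_profit(box_office_gross):
        """The 'net profit' a studio ledger might report on a hit film."""
        studio_rentals = box_office_gross * 0.5    # theaters keep roughly half
        distribution_fee = studio_rentals * 0.30   # the studio's fee to itself
        production_cost = 40_000_000               # the film's negative cost
        marketing = 30_000_000                     # prints and advertising
        overhead = production_cost * 0.15          # studio overhead surcharge
        interest = production_cost * 0.25          # interest on the studio's own money
        expenses = (distribution_fee + production_cost + marketing
                    + overhead + interest)
        return studio_rentals - expenses

    # Feed in a $128 million domestic gross, and the ledger still reads negative:
    print(f"Net profit: ${net_profit(128_000_000):,.0f}")

Run those made-up numbers and the "net" comes out tens of millions in the red--which is why a point of the net is a point of nothing, and why the agents moved on to the Points above instead.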

Among other complicated studio entanglements, this now means two things:
1) Actors are chiefly interested in marketing their faces for a living, and
2) Marketing those Faces For Sale now comes at a PRICE.  Talk to their agents, or face the consequences.
Studios, however, are interested in only one aspect of marketing, apart from what agents demand:
1) Audiences already know the movie they're buying a ticket for, walking in.  You're selling them the intimate in-knowledge of an old friend who's come back again, not some total-stranger movie out of nowhere they, like, wouldn't know.

Sometimes, the actor has negotiated his character to be the entire selling point of the marketing.  
Often this can be achieved with just an arresting ultra-closeup of nothing but the actor's face, and the teasing word-economic story ambiguity of the Vertical Tagline:
The teaser poster for the 2013 Carrie remake showed Chloe Moretz's face, with the vertical tagline "You. Will. Know. Her. Name."  The teaser poster for 2016's Jason Bourne showed Matt Damon's face, with the vertical "You. Know. His. Name."  
Well, now that we're all familiar on a name basis, let's have some drinks; who's for a game of Pictionary?

But this brings up the obvious question:  How can you tease an action movie, which is 90% ABOUT the lead actor/character hero, and not contractually show his face for another month or two?
Simple:  He's there, but with his back to us.  He's too busy Facing His Lone Destiny.

Or just carries some instantly recognizable weapon in his hand.
Don't worry though, the hit franchise character is still iconic to his fans even from the back or other isolated body parts, so You'll Know His Name.

But the most crucial Job One of a franchise sequel is not to sell a hand or the nape of a neck, but to sell the LOGO--Like the logos of opera masks and French orphans that now ride atop every NYC taxi on Broadway, a shaped symbol logo must singlehandedly conjure up an entertainment movie-studio franchise.
And with six months to a year to wait for the "You Know Its Logo" sequel to the earlier hit movie, the most arresting image is that the abstract presence of a logo you thought was gone is now beginning to reappear (ooo!), and won't fully take shape until next May, June or November.  (The one thing a teaser poster must sell is not the title or tagline, but the DATE--One look at the logo, and You Know Its Title.)  As with the damage to Starfleet Headquarters in 2013's "Star Trek: Into Darkness": 
And as usual, Chris Pine--or is it Benedict Cumberbatch?--has his back to us.  Not that we would really be able to see his tiny face next to all that big wreckage even if he was allowed to turn around and show it to us from the front.

But once the title has been teased, has taken hold, and production has started, presumably all the major actors have their contracts ironed out, and are now free to show their faces.  Now we have the second problem--WHOSE face gets to be shown?  Who is the true "star" of the movie?  Ask an ensemble movie of several lead characters, and you may find each actor had an agent who told them they were.  And no studio wants to start that argument.
This brings up that Points problem of "Character marketing", as even a major supporting character who was contractually cast now has to assert his presence in the movie.  Standup comic Robert Wuhl once joked about the system after appearing in 1989's Batman:  "I get a percentage of the marketing of my character--I play the wisecracking reporter who disappears halfway through the movie and doesn't get the girl; I'm hoping for those action-figure sales."
Studios solve that problem with something that allows them even more "Lobby space" for promotion--A poster for each character.  (A practice that started during the early comic-book movies, like the '99 Star Wars: Episode I or '00 X-Men, when anal-retentive fans couldn't wait to judge their first peek at what each new character would look like.)  


To take a harmless example, here are upcoming posters for a 2017 Power Rangers reboot--First we have nameless silhouettes, an abstract tagline against a still-abstract lightning-bolt-logo sky, and no title but a date:
Now, five similar posters on a theme, each with the clear, negotiated face of one of the marketable characters in costume/gear, also with no title but the logo, date and cultural-nudge tagline--Which one are you rooting for?  Red? Gold? Pink?  (I don't care, I never watched the show as a kid.)
Yes, like those old third-grade school days...everyone's special enough to get a poster for participating, and no one gets left out.

And, of course, when a movie does finally open with its final marketable sales image, title and cast/credit list, it doesn't have six posters.  It has one.  With a lot of people who were legally told they would be on it.
To understand this concept, you have to understand about medieval art--In the old days, before Italian Renaissance painters like Brunelleschi developed the 3-D realism of perspective, "flat" 2-D medieval religious art had to determine the size of the figures by who was more important in the hierarchy of the scene depicted.  The saint would always be more ginormous than the almost equally big faithful king/noble patron featured, who was bigger than the normal-sized priests and teeny peasants, and so on, in terms of relative flattery.
This brings us, in our modern days of feudal Hollywood, to the final poster, or "Head salad"--It's the studio's last chance before opening to give every actor the poster-representation they negotiated for, and like the class picture, EVERYBODY has to be in the shot.  Just how big we see them in the shot, however, depends on their negotiable role in the movie.

So, as we said at the beginning, it's really not that complicated.  All major-studio movie poster campaigns today boil down to one simple formula--Nobody, Each & Everybody.
Let's take an example from last summer:  "X-Men: Apocalypse", Nobody, Each & Everybody.
Take a look at the Everybody final-poster graduation shot:  James McAvoy/Xavier and Michael Fassbender/Magneto, they're the A-list focus of the story, they're front, center and in your face.  The new breakout supporting-character actors?--You can just spot them somewhere in back, they didn't get as much screen time.  (Insert Sesame-Street Grover voice:  "Near.....Fa-a-a-ar!").  The returning actors from the earlier film?  Don't worry, they made sure they're big enough to make out their faces clearly somewhere in the middle behind the front, or they wouldn't have come back for the sequel.
The villain?  Well, he's this entry's plot, he always has to be looming in back before he can show his face, like Donald Trump looming behind Hillary Clinton talking about healthcare.

Now let's try it with a big upcoming movie we haven't seen yet...Umm, I know:  "Rogue One: A Star Wars Story"!
If the theory holds, we should have the teasingly abstract suggestion of a cool recognized pop-image/logo with no discernible actors and a date, a themed closeup of each supporting character as uniquely market-identifiable dramatic partner of the whole, and one graduation-shot where we get to see Who's Bigger Than Who.  Nobody, Each & Everybody.
There you have it--Told you it was simple.   The answer to the Internet question, "Why do all movie posters look exactly alike?":  Because the studios have no choice.
Because Hollywood Accounting, corporate franchising and agent negotiation have forced them to market every movie exactly alike.  Well...duh.

Tuesday, October 18, 2016

The TV Activist, Pt. 4 - It's Just Like the MOVIES!!



We've heard a lot of praise lately that TV is currently in some "Golden Age", and "Better than ever!"  And by "better", the definition usually given is that we now have more of it.
Budgets are bigger, and shows are determined to spend them, so long as it's not on anything unimportant.  Game of Thrones and The Walking Dead push new limits on violence, sex, and general cinematic candy that broadcast never could.  Viewers now have "a wealth of options" at their fingertips, bold, uncut and immediate, can happily cut the cord from cable systems' overpriced reality shows, and be set for life, never having to watch a vintage twentieth-century movie or rerun again if need be.

Broadcast, cable and streaming-era TV certainly seems to be all those things.
It's certainly Available.  It's certainly Bold.  It's certainly not on those old poopie commercial networks anymore, with their fascistically enforced airtimes and choices of channels, that only showed you one episode a week.  What many are noticing, however, is that in the last few years, what it hasn't been is Fun.


Grimness is now the order of the day.  Series are season-arcs, in which no plot will ever be satisfyingly complete without an unsettling cliffhanger development.  Former A and B-list actors now act serious and important, trying to justify their career moves, while the show keeps them front-and-center to justify their salaries.  Editing is harsh and bullet-point, and cinematography is dank and steely, trying to be "cinematic".  A "pivotal series moment" usually means that a character will be killed off (fan sites regularly gush over predicting who will be killed off by the end of every season, instantly assuming someone will), or a catastrophic 9/11-style disaster will "change the characters' lives forever".  So-called "Glass Ceiling" series now feature female attorneys, female doctors, female investigators, etc., grimly determined to be taken seriously in a male profession, and never daring to betray their sisters with a smile, unless it's a righteously vindictive one against the un-PC.
We're invited to spend some grim, bold, important, unsettling time with characters who, well, seem too determined to be having any FUN.

Why would TV knock itself out to be More Important Than Entertaining?
In an expensive high-profile ad during the 2013 Oscars ceremony, ABC tried promoting their current ratings hit by targeting the obvious movie-fan demographic with the appeal of all those neato CGI effects and wild fantasy premises on "Once Upon a Time".
If you ever needed any explanation about what happened to TV in the 10's, it's one of those things you can't un-see or un-hear:

Yes, ABC knows:  Why do we watch a big-budget effects-heavy show, like Game of Thrones, Walking Dead or Doctor Who?  Because it's just like going to a big-budget effects-heavy movie EVERY WEEK! 

Some right now are hearing crickets after that, where they thought they would be hearing thunderous applause.  Why?  Well...it's kind of a weak sell, when you come right down to it.  We already have movies pretty much at our fingertips now, and usually in the same places we're finding the TV shows.  And having the two together has not only confused us about which one is which, or should be, it's confused themselves as well.
TV now tries to be as Cinematic as the movies, while Movies sell themselves as favorite cult-watch big-budget series, and sell their studio-tentpole franchise entries with post-credits teases to "Tune in next year, for our next exciting episode!"

Things were a bit simpler in the early days:  Television was what television had always been since the days of radio--An hour of drama, or a half hour of comedy, variety or information, so long as you remembered to buy the sponsor's soap.  
Movies, well, those were different.  New movies were things you dressed up and went out of the house to see, and old movies that people used to go out to see were now favorite secret pastimes during the late-night filler and local-advertising hours, and you got your me-time space ready in the dark with recliner and popcorn.
When a big Hollywood-studio movie premiered on TV--right in your own home and you could stay up to watch in pajamas!--it was an event on Sunday night when all America would be in their living rooms to tune in, and hear Ernie Anderson's ABC-voiced "To-night:" pump you up with the trailer of movie-iconic moments:

And if you missed it, Monday's water-cooler conversation would only remind you that you had.  Frustrating, yes, but that was the tradeoff of what you got for nothin' (except a few ad breaks).

Well, we know what happened to that.  HBO premium-cable and the VCR saturated the home in the early/mid-80's, and soon movies on your TV were as common as squirrels at your bird feeder.  Both brought new and recent movies uncut by censorship and without commercial breaks for bathroom and popcorn-refill, and without stretches or condensing for two and three-hour time slots, and networks saw less and less reason to promote an expensive movie that no one would tune in to watch.  Especially if it arrived months after we had already seen it in its original form.  Movies on network-TV were soon seen as an insult, when Milos Forman began suing over seeing his classics condensed, commercial-broken or cropped for 4:3 screens.

When Disney took over ABC, and now wanted to play with its big national-mainstream toy, it tried bringing back the Saturday Night Movie in the early 00's, showing mainstream films usually as an excuse to tack on some promoted commercial or "sneak peek" of some corporate offering as the bait...But the ratings weren't there anymore.  Who needed to watch the first Harry Potter movie, when most families who would want to already owned the complete series on their DVD shelf?


HBO, of course, along with Showtime, could give us one thing that TV couldn't--Nude scenes.  As viewer-funded premium channels with their own satellite linkups, movie networks weren't as beholden to sponsors or the FCC, and could not only air movies uncut, but shows uncut as well.  In the late 70's and early/mid-80's--when cable either made most of its shows in its garage, or imported them from the looser censorship standards of Canadian television--an "HBO Original Program" like The Hitchhiker or Dream On, or the earlier days of Showtime's "Bizarre", was a little back-alley nudge to that late-night 13-yo. that meant "Free boobs".  And before the VCR was in every home, you took a little forbidden fruit where you could get it.
But as HBO and Showtime's fortunes grew, and the networks tried to establish more of an identity with original series and movies, premium channels became the go-to for programming that the networks were afraid to touch!  The forbidden fruit was now more controversial than mere cheap papayas:  HBO offered a string of made-for-cable biopics about too-hot-to-touch news-disputed figures like Jack Kevorkian, Roy Cohn, and the Jay Leno/David Letterman feud; Showtime could air a too-soon miniseries about the Reagans, HBO could air the Tony-winning gay/AIDS drama "Angels in America", and comic Bill Maher was allowed to air whatever talk-host political conspiracy theory or atheist rant struck his fancy with no higher authority to gag him.
To be Bold got attention, and Who Dares Wins Emmys.  The networks were a tad envious.

And then the movies disappeared.  Turner Classic Movies, coming from Warner, who had dominion over most of the catalogue classics from Warner, MGM, United Artists, and RKO, established their own sovereignty, and other past-movie channels were left to fend for themselves.  American Movie Classics and FX got by for a while on Fox and Universal classics, but soon found that if they didn't want to show Jaws and The Omen three nights a week, they'd have to start increasing their original-channel programming.
FX's "The Shield" took advantage of cable's looser envelope for violence and language, and AMC, which had already downgraded itself from a premium to a commercial cable-tier channel, had a string of cult hits whose Boldness hit sweet-spots with an audience ready to be newly addicted to anything.  AMC's depiction of the missing 50's-60's male-mystique and consumerism in "Mad Men" was both forbidden-fruit away from the broadcast networks and teasing serial drama, either one to fill a void with their viewer cult.  The broadcast networks went into full one-upmanship mode, and tried to throw any chauvinistic beehive-hairdos at us they could, with ABC's "Pan Am" and NBC's "The Playboy Club".  One lasted seven episodes, the other didn't do much better.
Soon, former movie-fest channels like AMC, FX and Starz were known more for their Emmy-nominated weekly national cult-addictions, while HBO had America glued to the pottymouthed cowboys of Deadwood and the old-neighborhood violence of The Sopranos.  What, movies?...Did we use to show those things?  Well, do you see any lying around?

And with broadcast, cable and streaming in the mix, "One-upmanship" has been the word ever since.  Like teenagers daring themselves on the schoolyard, the new game is to see what the Biggest Dare is to tackle, and who's going to push the envelope further without ripping it or turning chicken.  (Insert Marty McFly's "Nobody...calls me...chicken!" here.) 
Sitcoms are considered too frivolous unless they tolerantly shock our sense of Today's Society--If ABC brings a mixed-gay family sitcom, Amazon must counter with a transgender sitcom, and a family getting by with cerebral palsy must be countered with a family getting by with autism.  A procedural thriller of government investigators must be topped with a procedural thriller of post-9/11 terrorist profiling.  If Showtime gives us a popular happy serial-killer, the broadcast and cable-tier networks must counter with lovable old Hannibal Lecter and Norman Bates.  If you want mere entertainment, well, you're just not up with today's complex, troubled, multicultural society.
And the rule of the serial season-arc:  never leave them satisfied, but always wanting MORE.

TV has become afraid of its stagebound set roots.  As ABC says, it now must be Epic.
It doesn't want to be the half-hour three-act comedy of a sitcom episode that brought a Broadway stage of ensemble performers into our living room, or a vaudeville of variety, or the encapsulated one-hour magazine-adventure of a story.  It wants to be something BIGGER and more IMPORTANT.  Something that will make us gasp, and silence any dispute of its existence.  Something that courts awards.  Something a corporate empire can be built upon.  Something that no one will dare question or laugh at.


And as CS Lewis once observed, there is no one more desperate to "look grownup" than an insecure child.