Saturday, July 22, 2017

links for the weekend, on modes of comedy and moralism and the moral role of the self-described smart set (plus some Jane Austen stuff)

Here we are, halfway through the year, and there have only been 93 posts at Wenatchee The Hatchet?  Trippy.  Not that I want to get back to the record high of 717 posts in 2012 ... but it sure has felt like this has been a fallow year for blogging.

Yep, it's been kind of a links-for-the-weekend summer at Wenatchee The Hatchet.  Preparatory reading and intermittent blogging have been the order of the summer, though it feels more like the order for the year.  Composing music takes time.  There's been some pretty abstruse material at the blog this year.  Finally getting around to the arts side of things has had some not entirely anticipated consequences. So, links and stuff.

Sometimes it seems like arts journalists and pundits are determined to learn the wrong lessons from box office activity in advance.  Take this ...

 Hollywood’s box-office woes: Is the industry aiming too narrowly at men?

Another weekend, another branded movie struggles.

Hollywood’s so-called franchise fatigue hit for what seemed like the umpteenth time this year, as “Spider-Man: Homecoming” followed up its auspicious domestic debut July 7 with just $45 million in ticket sales (a 61% drop) this past weekend. That paved the way for the movie to lose what many thought would be a tight opening-weekend battle with “War for the Planet of the Apes” — incidentally, an underperformer in its own right. And it raised the question, once again, of why so many supposedly surefire hits can’t seem to stick.

Many big-budget live-action movies over the last few months have been dropping at a dismal rate — at least 55% — in their second weekends. That includes films that had strong starts, such as “Spider-Man,” “Guardians of the Galaxy Vol. 2” and “The Fate of the Furious.” And it includes titles that didn’t, such as “Transformers: The Last Knight,” “Pirates of the Caribbean: Dead Men Tell No Tales” and “Alien: Covenant” (the last one falling a whopping 70%).

Clearly the geriatric age of some of these franchises is a factor — both “Pirates” and “Transformers” are on their fifth installments, and “Alien: Covenant” is even longer in the tooth, depending on how you classify some earlier entries. But it doesn’t explain why “Apes,” at just its third iteration of its current cycle and with some of the best reviews of its life, had such an underwhelming opening of $56 million. Or why “Spider-Man” couldn’t parlay its strong debut into a better second weekend.

Yet, clearly, Wonder Woman and a remake of Beauty and the Beast have done great at the box office.  If Hollywood is really so bad at paying attention to what more than just young males want then why are we getting a feature length My Little Pony movie this October?  I'm probably going to see it, honestly, simply because I have so many nieces it's going to happen and because I honestly think the show, as launched by Lauren Faust back in season 1, is actually pretty good.  I'm partial to Princess Luna myself and the show had a fun Inception homage/spoof using her character.

And did the Captain Underpants movie just not exist? Some of the limitations in punditry about film come from the tacit reality that people who opine on film and television are vastly more likely to be riffing on prestige television and tentpole or arthouse cinema and their stereotypes than to be writing about stuff aimed directly at kids, or all-ages entertainment that isn't confined to the Mouse House.

For instance, it's vastly easier for an American film critic to have just declared the Ghost in the Shell remake "doesn't work" than to discuss either the anxieties Japanese artists brought to bear in the peculiarities of Japan's technocratic responses to the Cold War era or Oshii's singular take on subversive appropriation of biblical literature to make his metaphysical and political points about his time and place.  The former requires a level of cultural history and engagement a lot of American film critics don't want to bother with, and the latter, well, pretty much the same!  If you're steeped in an educational or journalistic milieu that largely presupposes that the Abrahamic religions were the worst thing to befall the entire human race, then how would you be situated to get the significance of Oshii's appropriation of the Babel narrative in Patlabor: the movie?  Now, sure, people can binge seasons of Game of Thrones while still reflexively thinking that reading stories of similar nastiness in Judges is somehow a mark against Judges, but that may just be how Americans are these days ... .

As for named franchises, I never bothered with the Apes reboot.  If the axiom holds that history plays first as tragedy and then a second time as farce, then trying to get lightning to strike twice is a futile effort.  We got our remake of the premise of Planet of the Apes as farce a decade ago when Mike Judge made Idiocracy, didn't we? Now maybe film critics like Dana Stevens can gush that the new Apes film raises the question of whether we humans deserve the wonderful planet we live on, but this is in essence code for how the wrong kinds of humans with the wrong kinds of beliefs about the nature of the human condition and the human being are going to deprive the rest of us of the opportunity to live here.  Dystopian stories inevitably traffic in the kinds of justified stereotypes about the people we dread getting the kind of power and social influence we don't want them to have.  But I'm wondering this year whether the subtextual concern is a transference to "them" of a totalitarian impulse we may find in "us".  Dystopian literature is inevitably a self-exonerating exercise.

Which thematically leads us to a thinkpiece about humor in the age of Trump over at Slate. Now I've written here in the past about how there are ultimately two basic types of humor, laughing with and laughing at.  A lot of mainstream political humor has trafficked in the simple formula of "laugh with me as I laugh at how stupid/evil/immoral/hypocritical/greedy those people are." The proposal in the piece is that Trump and the trolls who support Trump have forced mainstream liberal comedic thought to reassess both the nature of comedy in general and the tropes used by mainstream comedians.

Thus, a lengthy set of excerpts:

https://slate.com/arts/2017/07/trump-and-his-trolls-arent-killing-comedy-theyre-saving-it.html


...
 
This conviction—that humor is “a superior way to tell the truth”—is extremely recent, but it is dearly held. The 2016 election confronted liberals with forms of expression that were indisputably amusing to many people but failed miserably to meet the Nussbaum Criterion: lies, insults, and cruel pranks, emanating from anonymous abusers and presidential candidates alike. Baffled, we started to call such things trolling. The word conjures a kind of evil twin of humor. The trolls had “stolen Washington,” where, as Amanda Hess lamented in the New York Times, they worked their nihilistic magic: “No reason, no principle, just the pure exercise of power.”
 
 
The panic about trolls has little to do with their actual political influence, which is tiny, and a lot to do with our fear that shittiness and humor might be compatible. They are. [emphasis added] And when you think about how constrictive the Nussbaum Criterion is, it starts to become clear why practitioners, consumers, and critics of comedy might be struggling to understand its role in our culture and politics.
 

It's like some of these people have never actually watched shows like Rick & Morty or Archer or any number of shows where the humor of cruelty is the engine of plot. 
...
 
 
That changed around 1957. Here is how Nesteroff describes it: “Eventually men like Lenny Bruce, Mort Sahl, and Jonathan Winters came along and led a revolution by developing their own material, derived from their actual personalities.” Bruce talked about abortion; Sahl made incisive political observations. Winters said, “Just tell the truth and people will laugh.” According to George Carlin, comedy “changed forever for the better.”
 
These stand-ups didn’t see their acts quite the way we see them: They understood themselves to be constructing characters. “Will Rogers used to come out with a newspaper and pretend he was a yokel criticizing the intellectuals who ran the government,” Sahl has said of his vaudevillian predecessor. “I come out with a newspaper and pretend I’m an intellectual making fun of the yokels running the government.” Bruce, too, was initially received as a performatively “sick comedian” and not a portal to authentic being. The force that united Bruce’s jokes was not self-revelation but rather the urge to scorch middle-class tastes. Both men’s acts were experiments in identity, in who you could get away with pretending to be, be he pervert or pundit.
 
The idea that we couldn’t perceive the human soul until Lenny Bruce told jokes about masturbation in 1957 not only warps our understanding of the comedy that followed, it blots out the comic traditions that existed before and alongside him. It has no room for Will Rogers, Bob Hope, and Danny Kaye—former vaudevillians who somehow managed to entertain people before comedy became funny—because they weren’t “real.”  [emphasis added] Neither was Moms Mabley, the immensely popular black stand-up whose act shingled character over social observation over character. (She performed as an old lady for most of her career.) This is how Nesteroff accounts for her significance: “From the 1930s through the 1950s, Mabley was comedy’s primary voice of the Civil Rights Movement,” which started in 1954.
 
Having witnessed the rise and fall of Mars Hill, I'd say the construction of a persona that trades on authenticity goes a long, long way toward explaining a Mark Driscoll in Puget Sound.  Why would mainstream liberal comedians, or leftist comedians, ever for a second imagine that the persona of the authentic truth-teller couldn't be appropriated by anyone, regardless of social or political or religious commitments?  Mark Driscoll was in a sense nothing more than a mirror image of Dan Savage, the two men parlaying their former Catholic upbringings into shock-jock moralizing from ostensibly liberal/left and right perspectives in generic terms.  Now depending on how you parse that, they might both have represented what some scholars might call the relatively new neoliberal center, but that's a topic other people can tackle since this is a weekend and not a school day for me. :)

These performances depend on the acceptance of a lie, the lie that the character is the person, and yet it is a lie audiences seem happy to accept.  There may be value in slicing through the performances of social obligation, and the tensions those performances create for sincere and honest relationship, and that will get us to somebody at length, but first ... .

It might be hard to overstate the observation that a performative character is not the same thing as the actual person.  Whether thanks to a Sylvia Plath in poetry or a John Lennon in songwriting or the emergence of any number of other people, the illusion that the artist expresses his or her real feelings and observations about the world can be pervasive.  We know in the abstract that not everybody works or thinks this way, but the generalization is that you can fairly safely assume that if artist X does Y and Z in artistic media, that's a direct expression of what the artist thinks or feels.  But this can be transformed, imperceptibly, into an interpretive method that goes like this: "I thought or felt A and B while consuming X and Y artistic creations, therefore A and B are what X and Y are actually about."  We can forget that the authenticity we read onto artworks can be merely ourselves.

One of the lazier habits in arts criticism is to mine a work for clues to biographical concerns.  This may be one of many reasons Romantic-era writers preferred Beethoven to the equally brilliant Haydn: Haydn made his works to order and to explicitly please his audience, whereas Beethoven was regarded as authentically expressing his own feelings.  This Romantic trope has been with us for centuries.  I admit to being explicitly anti-Romantic in my overall outlook.  There's an interesting passage that comes up about what the "role" of a contemporary comedian often is in the age of American comedy since Jon Stewart:
...
 
 
The role that Colbert and Fallon are competing to occupy—Your No. 1 Hypothetical Friend—would not exist had comedy not become synonymous with personal authenticity, and personal authenticity with wisdom. Their authority (and the authority of Seth Meyers, John Oliver, Samantha Bee, and Trevor Noah) rests on the illusion that because an audience is laughing, the performers must be channeling some holy spirit, not their partisan loyalties or professional interests.
 
But this breed of comedy is didactic, and things that are didactic are not funny. [emphasis added] Baudelaire, in his treatise on laughter, makes a distinction between “significative comedy,” which you recognize by its carefully expressed “moral idea,” and “absolute comedy,” which you recognize because you are laughing. Our political humor today is certifiably significative. As a Vox video put it, “What makes satire such a powerful antidote to Trumpism isn’t that it’s funny.” (It’s that it’s true.) While absolute comedy affirms that we are all equally ignorant, significative comedy assures you of your superiority over others. For liberals, the experience of late-night comedy is largely one of narcissistic gratification—lectureporn, as Emmet Penney termed it. Before the election, John Oliver, Samantha Bee, and company had presented themselves as jesters in the court of a crazy king. Now they play straight men to him, ceaselessly signaling to the audience, “This guy is a nut! Normal people like you and me can see what a nut he is!” Not only is this shtick monotonous, it seals its audience in a bubble where a smirk is worth more than a joke. [emphasis added]

All that pivot required was for the wrong guy to be in the Oval Office.  The same can also be said for lectureporn from the right. In more churchy terms the phrase would be preaching to the choir. There may be a whole bunch of people who imagine themselves to be employing the wit of an Elizabeth Bennet while behaving and thinking a bit more like a Mr. Collins--using jokes and observations about the way we live now that are partly rehearsed and partly spur of the moment, but at all times signaling that we're the right sort of people, after all. The Collins comparison is not just a point of substance but of communicative mode.

Not necessarily meaning to bash Baudelaire, but in real life I think we can recognize that there is some real overlap between significative and absolute comedy.  We wouldn't still be talking about either Shakespeare or Jane Austen if there were never any real way for these two modes of comedy to overlap.

Of course this week was the bicentennial of the death of Austen. 

https://www.theatlantic.com/magazine/archive/2017/09/jane-austen-is-everything/534186/
...
As Austen’s own Emma Woodhouse put it to her querulous father, “One half of the world cannot understand the pleasures of the other.” But in the case of Austen, that misunderstanding seems to have an urgency that isn’t attached to any other canonized, pre-20th-century literary figure. The disagreement has been amplified as her fame has grown, and her fame may never have been greater. This year sees her unveiling by the Bank of England on a new £10 note, replacing Charles Darwin (and before him, Charles Dickens); she is the first female writer to be so honored. Meanwhile, the scholar Nicole Wright’s revelation that Austen was appearing as an avatar of sexual propriety and racial purity on white-supremacist websites made national news on both sides of the Atlantic. A few years back, her 235th birthday was commemorated with the honor of our times, a Google doodle. The wave of film adaptations that began in the 1990s may have receded, but it left in its wake a truth as peculiar as it seems to be, well, universally acknowledged: Austen has firmly joined Shakespeare not just as a canonical figure but as a symbol of Literature itself, the hazel-eyed woman in the mobcap as iconic now as the balding man in the doublet.
 

...

Iconic as she's become, the reasons for her status often stir up zealous dispute. Is Austen the purveyor of comforting fantasies of gentility and propriety, the nostalgist’s favorite? Or is she the female rebel, the mocking modern spirit, the writer whose wit skewers any misguided or—usually male—pompous way of reading her? (For her supremacist fans, Elizabeth Bennet would have a retort at the ready: “There are such people, but I hope I am not one of them.”) Any hint of taking Austen out of her Regency bubble brings attacks. When the literary theorist Eve Sedgwick delivered a talk in 1989 called “Jane Austen and the Masturbating Girl,” some male social critics brandished the popular term politically correct to denounce Sedgwick and her profession. Six years later, when Terry Castle suggested a homoerotic dimension to the closeness between Austen and her sister, Cassandra, the letters page of the London Review of Books erupted. In other precincts, business gurus can be found online touting “what Jane Austen can teach us about risk management.” Not only is my Austen unlikely to be yours; it seems that anyone’s Austen is very likely to be hostile to everyone else’s.

... that gets me thinking of the Shostakovich wars but if you don't already know what that is I'll spare you even more weekend reading!

A few things from here and there on the state of the educational milieu in which Jane Austen is likely to keep getting debated.  There's concern that colleges may be heading for a crisis.

https://www.insidehighered.com/news/2017/07/19/number-colleges-and-universities-drops-sharply-amid-economic-turmoil

Fredrik deBoer, for instance:
https://fredrikdeboer.com/2017/07/11/the-mass-defunding-of-higher-education-thats-yet-to-come/

thus, the perhaps too obvious pull quote:

I am increasingly convinced that a mass defunding of public higher education is coming to an unprecedented degree and at an unprecedented scale. People enjoy telling me that this has already occurred, as if I am not sufficiently informed about higher education to know that state support of our public universities has declined precipitously. But things can always get worse, much worse. And given the endless controversies on college campuses of conservative speakers getting shut out and conservative students feeling silenced, and given how little the average academic seems to care about appealing to the conservative half of this country, the PR work is being done for the enemies of public education by those within the institutions themselves. And the GOP has already shown a great knack for using claims of bias against academia, particularly given the American yen for austerity.

But let's move a little further to what I took to be the real concern:

In 2010 I wrote of Michael Berube’s What’s Liberal About the Liberal Arts?, “the philosophy of non-coercion and intellectual pluralism that Berube describes and defends so well isn’t just an intellectual curiosity, but an actual ethos that he and other professors live by, and which defends conservative students.” I grew up believing that most professors lived by that ethos. I don’t, anymore. It really has changed. For years we fought tooth and nail to oppose the David Horowitz’s of the world, insisting that their narratives of anti-conservative bias on campus were without proof. Now, when I try to sound the alarm bells to others within the academy that mainstream conservatism is being pushed out of our institutions, I get astonished reactions – you actually think conservatives should feel welcomed on campus? From arguments of denial to arguments of justification, overnight, with no one seeming to grapple with just how profound the consequences must be. We are handing ammunition to some very dangerous people. [emphasis added]

And this next part ties back to the aforementioned observations about lectureporn as "comedy":

David Brooks has a column out today. That means that social media is going through one of its most tired types of in-group performance, where everyone makes the same jokes and the same tired “analysis” of whatever his latest dumb argument is, over and over again. None of the jokes are funny, none of the analysis useful, but this ritual fulfills the very function that Brooks is talking about in his column: making fun of David Brooks is one of the ways that bourgie liberals signal to other bourgie liberals that they are The Right Kind of Person. [emphasis added] Brooks, of course, is incapable of really understanding his own observations, given his addiction to just-so stories about character and gumption and national grit. He does not see, and can’t see, the economic structures that dictate so much of American life, nor is he constitutionally capable of understanding the depths of traditional injustices and inequality. If he did, he wouldn’t have the column.

But his critics can’t see something that, for all of his myopia, he always has: that our political divide is increasingly bound up in a set of class associations and signals that have little to do with conspicuous consumption and everything to do with a style of self-performance that few people ever talk about but everyone understands. It is the ability to give such a performance. [emphasis added]

Sometimes I feel like we're trained to believe that it is in the distinction between social performance and the real self that hypocrisy always exists.  I don't think that's really the case.  There may be a bond between the social performance you do every day in whatever setting you're in and what you think inside your head, and the push and pull of that fluid dynamic is, in ways that are hard to express, who you actually are.  An author like Jane Austen can brilliantly convey this, whereas other, lesser comedic authors might camp out on the idea that the performance is the facade while the inner life is the real person.  Austen gives us characters in whom the performance and the inner life are always bound together.  When the inner life that manifests in action and the performance are twisted in opposing directions, yes, Austen highlights the tension.

That giving the right performance opens and closes doors of opportunity defining what you can and can't do for the rest of your life, and that this peril of performance defines modernity, may be one of many reasons Austen's work is still so relevant to our own era even though she died two centuries ago. Famously, she told stories of people who can appear to be The Right Kind of Person at all the surface levels yet turn out to be the wrong kind of person in how they actually treat others.  It may be a testament to the politics of authenticity and persona that people want to read radicalism into Austen rather than accept her moralism at face value, because her satires of upper-class entitlement and the constraints of middle-class existence, particularly for women, are easier to read as political rather than moral judgments.  But if they were not first and foremost moral judgments with political implications and consequences, would we still be discussing her work centuries later?

I was struck by deBoer's piece on the dogma of educational culture that he published earlier this year:
https://fredrikdeboer.com/2017/04/24/the-official-dogma-of-education-version-1-0/
https://fredrikdeboer.com/2017/07/17/mechanism-agnostic-low-plasticity-educational-realism/

...

The implied policy and philosophical changes for such a viewpoint are open to good-faith debate. As I have written in this space before, I think that recognizing that not all students have the same level of academic ability should agitate towards a) expanding the definition of what it means to be a good student and human being, b) not attempting to push students towards a particular idealized vision of achievement such as the mania for “every student should be prepared to code in Silicon Valley,” and c) a socialist economic system. Some people take this descriptive case and imagine that it implies a just-deserts, free market style of capitalism where differences in ability should be allowed to dictate differences in material wealth and security. I think it implies the opposite – a world of “natural,” unchosen inequalities in ability is a world with far more pressing need to achieve social and economic equality through communal action, as that which is uncontrolled by individuals cannot be morally used to justify their basic material conditions.

Put that way, it reminds me of how in the Torah there's a set of laws condemning the unfair exploitation of people with disabilities.  Yeah, there are laws prescribing the marginalization of people with skin diseases that can produce plagues, too, but there are times when I think that, for all our contemporary contempt for Bronze Age legal codes, they managed to do something we seem to refuse to do in contemporary American contexts: admit that some figuratively and maybe even literally grotesque inequalities come from the nature of one's birth, and that the legal codes need to ameliorate this for the common good rather than just deny that these kinds of things exist.  But obviously I digress.

This may all get to something Michael Lind was writing about on the question of which American Dream we think we should be pursuing. 

https://thesmartset.com/which-american-dream/

Do we go for an American Dream in which you have the opportunity to rise on the social ladder as you make opportunities for yourself?  Or do we go for an American Dream in which material outcomes tend to get better (by however small the observable increments) for everyone regardless of educational or social opportunity?  We could put this, perhaps, in educational terms--do we believe the future of the United States should favor better and increasingly better outcomes for people who go to the trouble of getting advanced degrees and moving on from there, or do we think we should favor pursuing a society with decent outcomes for those people while, say, focusing more on making sure the "unskilled labor market" has better resources available?  Lind doesn't pretend he doesn't prefer the latter path.

But there is a flaw in the standard-of-living-for-everyone American Dream that Lind didn't really get to, which is the potential ecological fallout of such a dream.  There may not be an American Dream that isn't some kind of disaster, which hardly seems a surprise to me at this point since I've reached the conclusion that the United States is most likely the latest iteration of a power called Babylon the Great in the book of Revelation.  Of course people who don't hold any kind of religious views at all might have other convictions about that ... .

Something else Lind riffed on comes to mind, how in utopian and dystopian visions in Western science fiction we keep seeing the presumption of a unified world.
https://thesmartset.com/the-future-of-the-future/

...

Wars, hot or cold, are also missing from standard science fiction versions of the future. Interplanetary wars don’t count, and neither do wars with robots or zombies. I mean wars among nation-states or global alliances or regional blocs. George Orwell’s 1984, inspired in part by James Burnham’s The Managerial Revolution, imagined a world divided among three totalitarian blocs: Oceania, Eurasia, and Eastasia. I can’t think of any other well-known examples of geopolitics in science fiction.

Typically, as noted above, science fiction authors posit a united world under benign or tyrannical world government. How our present divided world came to be united in the future is seldom explained. Science fiction authors are notorious for getting out of plot holes by inventing new technologies like “handwavium.” The political equivalent of handwavium is the World Federation of Handwavia. [emphasis added, and think about Star Trek for a bit. ;) ]

Global political unification is becoming less, not more, likely. In 1900, outside of the U.S. and the independent former colonies of Latin America, most of the human race was ruled by the British, French, and other European empires. If Imperial Germany had conquered Europe and subordinated the European overseas empires, it would have had a shot at world domination.

It was already too late for Hitler to conquer the world by the 1930s. Industrialization had added the Soviet Union and Japan to the ranks of great powers outside of Western Europe, in addition to the U.S., which by then potentially was by far the most powerful country. The Soviet Union was a major threat but it never had a chance at global domination. By the 1970s, its leaders sought the status of an equal superpower, a status which their economy and military-industrial base could not sustain.

...


Great-power rivalry, demographic collapse, mass migration — three of the major forces reshaping the world — have been all but completely absent, both from classic science fiction and newer novels and movies that have shaped public consciousness. Most science fiction is not trend analysis, but a moral or political allegory, as the late Thomas M. Disch pointed out in The Dreams Our Stuff is Made Of. And, as he also pointed out, much of it is children’s literature. Unfortunately, literary and cinematic visions of the future influence the way the public and the policymaking elite think about the future.

This is particularly a problem for the left. Since the 19th century, the Marxist left has expected that at some point in the future an ill-defined revolution would create a global utopia — the World Socialist Republic of Handwavia, as it were. When this vision of the future collided with reality, the Marxist left split into reformist social democrats, who were hardly distinguishable from left-liberals, and communists, who, in the countries in which they came to power, soon abandoned socialist ideology for nationalist Realpolitik and the perks of a new ruling class.

Meanwhile, from the early 20th century to the early 21st, many centrist liberals have put their hopes in international institutions — the League of Nations, the United Nations, or, more recently, projects of trans-national regionalism like the European Union. Great power rivalries marginalized both the League and the UN, and populists in European nations like the British citizens who voted for Brexit now seek to dismantle or limit the powers of the EU.

Today’s national populists are told that they are on the wrong side of history, by elites whose members claim to speak on behalf of an emerging world community. But maybe the populists and nationalists are on the right side of history and the elites have been duped by bad science fiction.

Floating an idea here for the weekend: traditional defenses of capitalism may have worked best with the supposition that capitalism worked at a regional or local level.  The influx of trading possibilities, rather than legal prohibition, was what ultimately ended the slave trade and caste systems in the Pacific Northwest Indian tribes around these parts.  These tribes could be a useful counterexample to the idea that hunter-gatherer societies are, by their very nature, fundamentally inimical to caste systems or slavery.  That idea isn't true, obviously, but academics can be tempted to believe that it "could" be true, perhaps.

But the idea here is that the more genuinely global capitalism becomes, the more cogent the critique of capitalism as a global system could become.  What if it was possible for the Anglo-American view of capitalism to "work" because, as vast as the Anglo-American empires have been, they still had fundamental social, legal, geographic and economic limits?  Every empire crumbles at some point, and every bid for a truly global system will probably end up with a totalitarianism of some kind--it may just be that, for people committed to certain modes of economic life as the ideal for human flourishing, the totalitarian element inherent in how humans historically implement that ideal can be too easy to ignore.

Oh, yes, and here's this piece about the new Spider-Man: Homecoming film that came out recently by ... somebody.

http://www.mbird.com/2017/07/spider-man-homecoming-with-frosting-so-good-you-can-forget-theres-something-off-about-the-cake/

So it's not just WtH and Film Crit Hulk who have some reservations about the odd split between the explicit moralism of responsibility and the implicit moralism of the exemptions for the special.  Richard Brody's got that concern, too.

http://www.newyorker.com/culture/richard-brody/review-spider-man-homecoming-hedges-its-bets

Now, sure, Brody had that review of Love & Friendship where he made the absurd claim that Susan Vernon didn't break any of the "important" rules of ethics and social life.  It would be ... impossible to read Austen's take on that character as anything other than someone who broke the most important rules while keeping up her reputation by dint of fastidious adherence to the niceties of social life.  Austen was pretty good at depicting characters who are at base pretty bad people but who are clever enough at giving the right performance for the public to come off seeming above reproach.  It may be that 21st century adaptations of Austen's work put so much stock in the rightness of horndogs like Wickham or Lydia getting their relationships to work because "that" is what "we" believe in these days.  We pay lip service to a Darcy or a Lizzy Bennet because we know, thanks to a century or so of academic inculcation, that we're "supposed to" admire them, even if in reality we are more like a Lydia or a Wickham or perhaps a Charlotte Lucas or a Mr. Collins.

and we may be blind to our own colonial tendencies while on the lookout for colonialism in others, to get back to deBoer.  I've managed to grab Kyle Gann's book on the Charles Ives Concord Sonata, and it was interesting to read him vent his frustration at his blog about identity studies.  His lament was that even though in theory he's in favor of identity studies regarding gender or sexuality, he found that the peer review process at an earlier publisher fixated on those issues at the expense of his stated goal of discussing the musical work itself.  He also vented that musicologists were doing identity studies at the expense of looking at music manuscripts and published works, to such an extent that it seemed the work previously done by musicologists would have to be done by actual composers.  Point noted!  And this ties into deBoer's proposal that cultural studies in American academia can paradoxically be concerned with the colonial imagination while ... demonstrating the same basic kind of colonialism itself.

https://fredrikdeboer.com/2017/07/21/cultural-studies-ironically-is-something-of-a-colonizer/
...

This results in some awkward tensions between pedagogical responsibility and political theory. Patricia Bizzell exemplified the perspective that the purpose of teaching is to inspire students to resist hegemony, rather than to learn, say, how to write a paper – and that professors have a vested interest in making sure they stay on that path:

…our dilemma is that we want to empower students to succeed in the dominant culture so that they can transform it from within; but we fear that if they do succeed, their thinking will be changed in such a way that they will no longer want to transform it.

This strange, self-contradictory attitude towards students – valorizing them as agents of political change who should rise up and resist authority while simultaneously condescending to them and assuming that it is the business of professors to dictate their political project – remains a common facet of the contemporary humanities.

Perhaps this might be likened to ... to be deliberately provocative about this ... the cheater who is worried that other people get away with cheating; the liar who is incensed that other people get away with lying; the plagiarist who is alert to how he or she has been plagiarized and so on.  The colonialist, regardless of formally professed ideology or practice, may be painfully alert to the colonialism of others. 

At least the old Religious Left and the Religious Right made the war to claim the culture explicit enough that it was easy for people to disagree.  If a comparable enterprise has been taken up in academia, the dog whistles from within academia could be so profuse that it takes generations for this stuff to be recognized for what it is ... which ... depending on how you read this stuff and its real vs. professed aims ... might be the point?

Years ago when I wrote my ... well ... basically bad review of Andrew Durkin's book Decomposition, one of my simple disagreements with his premise was that he seemed set on rejecting authenticity and authorship as concepts because these were emblematic of capitalism and commercialism.  Neither concept is really best explained that way.  If anything, the literary and historical case that fixations on authenticity and authorship are endemic to the Romantic era project seems pretty hard to dispute.  The upshot of arguments against "authenticity" and "authorship" would be to accept that there cannot be an inauthentic style of music, and that corporately produced art cannot be considered illegitimate, not even the corporately funded, algorithm-driven product that may be multi-authored and singularly branded.  Durkin was not necessarily willing to grant this. 

But proposing that Western musical notation is vague and inaccurate is worthless as an argument in itself.  If Ben Johnston could finesse Western musical notation into a system that allows one of his string quartets, with more than a thousand discrete, identifiable pitches, to get performed, then Johnston may have a point in saying the problem isn't the notation system and its limits so much as that we've constrained ourselves in Western music to the dubious assumption that equal tempered tuning of keyboard instruments is the way to think in music.  Durkin, in a back and forth in comments at this blog, asserted that words only mean things in the contexts in which we use them.  Sure, but my counterargument was that the language in which we learn confers on us the cognitive and procedural constraints we keep working with the rest of our lives.  Insisting that words only mean what we use them to mean in context is merely part of the whole, and a miserable distortion of the whole if that's all that gets emphasis.  Without a proper check, it amounts to an assertion that the user of language gets to magically define what language is supposed to mean.  While I appreciated getting an introduction to Nancarrow's music through Durkin's book, on the whole Durkin's arguments were non-starters, and his attempt to re-mystify music notation as a way to somehow de-commodify music was a failure. 
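For readers who haven't waded into the tuning-theory weeds, the gap between equal temperament and the just intervals Johnston notates is easy to show with a little arithmetic. This is just a sketch of the standard cents calculation, nothing drawn from Johnston's actual scores or notation system:

```python
import math

def cents(ratio):
    """Size of a frequency ratio in cents (1200 cents per octave)."""
    return 1200 * math.log2(ratio)

# A few just-intonation intervals and their nearest 12-tone equal
# temperament approximations, to show how far the piano's tuning
# drifts from the pure ratios.
for name, ratio, tet in [("just perfect fifth", 3/2, 700),
                         ("just major third", 5/4, 400),
                         ("harmonic seventh", 7/4, 1000)]:
    deviation = cents(ratio) - tet
    print(f"{name}: {cents(ratio):.1f} cents ({deviation:+.1f} from 12-TET)")
```

The just major third comes out about 14 cents flatter than the keyboard's, and the harmonic seventh about 31 cents flat of the tempered minor seventh, which is roughly the kind of distinction Johnston's extended notation exists to specify.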

We're never going to get to a musicology in which Stevie Wonder and Haydn, or Monk and J. S. Bach, can be considered part of a gorgeous and holistic musical canon if that's the kind of rhetorical angle we take.  As tedious or meaningless as it might seem, comparing the technical ways in which Scriabin and Stevie Wonder could deploy octatonic scales or chains of chromatic mediant relationships seems a more likely path to affirming Wonder's axiom that music is a world within itself, with a language we can all understand, than going on and on about the imprecisions of musical notation systems for a PhD earned in English. 
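For the curious, the two devices named above are simple enough to spell out in pitch-class terms. A minimal sketch (the note spellings and the starting pitch of C are arbitrary choices for illustration, not anything from Scriabin's or Wonder's actual scores):

```python
NAMES = ["C", "Db", "D", "Eb", "E", "F", "Gb", "G", "Ab", "A", "Bb", "B"]

def octatonic(start=0, first_step=1):
    """Alternate half and whole steps (1 and 2 semitones) for 8 pitch classes."""
    scale, pc, step = [], start, first_step
    for _ in range(8):
        scale.append(pc % 12)
        pc += step
        step = 3 - step  # toggle between 1 and 2 semitones
    return scale

def chromatic_mediants(root):
    """Roots a major or minor third above or below the given root (mod 12)."""
    return sorted((root + i) % 12 for i in (3, 4, -3, -4))

print([NAMES[p] for p in octatonic(0, 1)])        # half-whole octatonic on C
print([NAMES[p] for p in chromatic_mediants(0)])  # mediant roots around C
```

So a chord progression chaining roots like C, E, Ab and back is a chain of chromatic mediants, the kind of move a comparative analysis of Scriabin and Wonder could actually document.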

What we could ideally learn, if we engage in academic and theoretical discourse across a panoply of positions, are the ways in which we put ourselves and each other into unexpected double binds.  We seem balkanized to the point where, with some disappointment, I consider the possibility that just being willing to read across the political and ideological spectrum out of curiosity is something that's frowned upon across the spectrum. 

Well, that's probably enough writing for the time being ... at least at the blog. 

At least for now.

Tuesday, July 18, 2017

two centuries ago today, Jane Austen died, a few thoughts on one of Wenatchee The Hatchet's literary heroes

I've written this thought a few times before, but the two literary voices I most leaned on as influences for how this blog works have been Jane Austen and Joan Didion.  If I had to boil my literary heroes down to a mere two figures, I guess they'd be Dostoevsky and Austen. Fortunately nobody really has to distill their literary inspirations down to just two authors, but if I had to pick, those would be the two.

I consider Austen one of the great comedic geniuses of English language literature.  That Austen's characters spend so much time navigating, with success or abject failure, the differences between the formalities of expected public discourse and the private realities of what people really do has been a natural fit for a lot of my literary and regional historical interests. 

Ten years on it actually feels inevitable that Wenatchee The Hatchet would be a Jane Austen fan who has written a lot about the gap between the branding and the interior reality of what used to be called Mars Hill.  I'm also a Christopher Nolan fan and his penchant for telling stories about corrupt and corruptible men who fool themselves into thinking the terrible things they want to do are the right thing to do is another inspiration for the kinds of stuff I love writing about. 

But as a contrast to the dudely dude chest-thumpery of Mark Driscoll and the bros who admire him, it would be difficult to find a literary figure who might be more antithetical in style and guiding ethic than, say, Jane Austen. 

I'd planned to write more for this particular day but there's other writing I've been tackling.  So drawing some inspiration from some famous Christian author who's had no problem shamelessly recycling old content ... we can do something like that here.

Here's an old piece from a few years ago that ... somebody ... wrote about Jane Austen's most famous literary work back in 2013. 

With a few years between publication and the present, it's evident to anyone who has read the book that Austen regarded the mercenary pragmatism of the Collins marriage as less than ideal.  Yet Charlotte's correction to her friend Lizzy was to say that not everyone had the beauty and brains to have the luxury of shooting down a suitor on the assumption that another offer was just around the corner.  Lizzy could afford that; Charlotte could not.  Yet by novel's end it was abundantly clear that we weren't supposed to regard the stupid horndogs Wickham and Lydia as having married for particularly good reasons.  It would be a bit tricky to assert that Jane Austen was a feminist or a romantic/progressive of any of the stripes we see in 21st century American culture, but it was relatively clear she believed a marriage needed to be not "just" about business concerns and also more than just the fire of the loins.  Her ideals regarding romance and companionate marriage may have become so notorious that her satires of the self-aggrandizing and entitled nature of the aristocracy can be all but forgotten.

Anyway ...

https://mereorthodoxy.com/romance-in-pride-and-prejudice-sometimes-we-settle/
Romance in Pride and Prejudice: Sometimes, We Settle
February 18, 2013

It is axiomatic that an artist’s work will be admired and disdained for a single set of qualities. Some admire the breadth and passion of Beethoven while others find his stamina and pathos tedious. Some admire the precision and pacing of Kubrick’s films while others find them pretentious.  Jane Austen is no exception; her longevity is like that of any other significant artist. The defenders and detractors never stop having their arguments about the worth of her work.

It may be worth revisiting Pride & Prejudice, which is two hundred years old this year, to consider what distinguishes her romances from contemporary romances. After all, Elizabeth Bennet is not the kind of character we can imagine being convincingly portrayed by a Meg Ryan or a Kate Hudson, or even a Julia Roberts. Lizzy and Jane are not heroines who lend themselves to being championed by America's sweethearts in just about any generation of film.

Arguably, Noah Berlatsky, writing for the Atlantic, has summed up the paradoxical appeal of Austen’s work: “She has to be one of the least romantic writers ever to write romance.”
Austen’s tales of romance may endure because she put so little stock in romance as we tend to define it. In an Austen novel, career advancement, real estate values, the size of an entailment, and the social and fiscal connections that come with marriage all matter. If that seems unappealing it is because we can’t conceive of a culture in which a marriage could be arranged to benefit clans rather than as the culmination of a quest for a “soulmate.” We also live in a culture which, in some sense, denies the inevitability of death.  And so Austen’s tales of matrimony and negotiation don’t make sense to us because they are often, as Berlatsky put it, as “small as life.”  Americans want life to be bigger and grander in every respect than a life could be in Jane Austen’s time.

But a title like Pride & Prejudice suggests that however domestic the tale, Austen’s themes are hardly small. Just as stories about war are rarely “just” about war, Austen’s tales of romance are not “just” stories of people who marry.  The title tips us off to character flaws before we’ve even opened the book. Though Elizabeth and Darcy are not imbued with a social or symbolic significance as apocalyptic as Dostoevsky’s characters, they do represent ways of living life. That Austen is quotidian where Dostoevsky is apocalyptic, that Austen is mundane where Dostoevsky is grotesque hardly means she was not writing about ideas. Austen had an eye for the mundane details with which philosophies of life must contend on a daily basis. Dostoevsky wrote about the personal and social cataclysms that philosophies create when untempered by other ideals.  But it is the dry domesticity of Austen’s narrative world and the long term decisions made within it that give her characters’ decisions weight.  Irreversible life-altering decisions hinge on a person’s ability or inability to make the right decision after observing mundane details.

The marriages that take place in the novel are made by people following ideals (Elizabeth and Darcy), altruistic affection (Jane and Bingley), pragmatism (Charlotte and Collins), and visceral chemistry (Lydia and Wickham).  While it is obvious that Austen did not endorse the latter pairings, it is equally clear she shows us the latter two couples are not at all disappointed with their respective catches. Charlotte Lucas isn't ruined by settling for Mr. Collins any more than Lydia is unhappy to be married to Wickham.  Charlotte realistically assessed herself, knew marriage to be a sure defense against poverty and loneliness, and pragmatically accepted the best offer she had. Charlotte could tell Lizzy that Lizzy had the luxury of being beautiful enough and clever enough to actually turn down proposals. But Charlotte had neither and so went with her best option. Austen's stories are stories in which social, economic, and sexual capital are all part of a calculation for a plausible pairing as a business decision, not merely a quest for true love. But even Lydia, silly as she is, never seems unhappy with Wickham or the support they get from the Darcys by book's end.  They just continue as they do.

Today we may recognize that the ideal is Lizzy and Darcy, but when our culture advises us to settle we prefer to settle for Lydia and Wickham rather than Charlotte and Collins.  The love that bursts forth like a fire, demolishing property and removing clothing, was the sort that Austen made fun of.  Yet in the 21st century that sort of attraction is so taken for granted that director Joe Wright determined that Austen was too discreet to tell us the real reason Lizzy and Darcy fall for each other. So it turns out that for a contemporary filmmaker to sell himself on Lizzy and Darcy, he has to believe they were drawn to each other like Lydia and Wickham. They are now heroes for sublimating their desire more decorously than others.  But that sort of erotic obsession goes past the point of even Lydia and Wickham to become the obsession of Dmitry Karamazov with Grushenka, only benefiting from the refinement of English manners.

In Austen’s actual novel, Lizzy and Darcy must overcome their own character flaws to discover they love each other.  In Wright’s film they simply need to contain themselves long enough to get social permission to do what they soon realized they wanted to do.  Cinema is full of tales where transgressive love is prized as a philosophical statement. “Theirs was a forbidden love” has been the clarion call to more than one or two made-for-TV-dramas. If the sparks create a fire hot enough then the heat was worth it.  This ethos is so prized it was shoe-horned into a Jane Austen adaptation. Apparently without that spark we won’t believe her story in our time.

Even evangelicals who claim that we should not be like the world still seem to want that devouring spark. When evangelical speakers and writers say that a marriage founded on anything but mutual love and attraction is going to fail, this indicates ignorance not merely of literature but of history. Negotiated lives together have happened in all sorts of ways. Because the spark of mutual sexual attraction inevitably wanes, friendship is important.

So far, so obvious. But great art, music, and literature help us avoid underestimating the obvious. Charlotte and Mr. Collins may not know the rapturous heights of mutual affection that Lizzy and Darcy know, but neither do they feel the imagined betrayals or wounded egos with bitterness and shame.  For Mr. and Mrs. Lucas the whole thing was to make sure the clergyman had his wife and the wife had her home. Having asked for no more than that and having found it, that was that. Though they may seem insultingly insular and provincial to us, they found their happy ending.

That Austen showed us happy endings even for those who settled in marriage is a reason for her greatness. It is because we know the marriages in that time and place could not be simply annulled that Charlotte’s decision bears a hint of tragedy. She settled, and she settled in a way we hope never to do. And by novel’s end Lydia and Wickham also settled, in their own way. In their world there can be no seven-year itch in which they reconsider their choices.  But both couples seem happy, if not wise, and by contemporary American romantic tropes they may be sitting in a church pew near you or me.

Sunday, July 16, 2017

over at Aeon, a piece surveying how usury stopped being thought of as sinful in the Judeo-Christian milieu and became respectable finance

Much of the time pieces sent to Aeon can be unconvincing and even insanely stupid.  But the price of promulgating think-pieces is that sometimes the think pieces have dumb ideas, like the idea that children should be redistributed by the state across all racial lines so as to ensure racism never happens again, as though the totalitarian regime that would be necessary to enforce such a policy over against "genetic narcissism" would only be a benefit to the human race. 

But sometimes there are useful or at least interesting surveys and to such a little survey we turn:

https://aeon.co/essays/how-did-usury-stop-being-a-sin-and-become-respectable-finance



...
 
In Debt: The First 5,000 Years (2011), the anthropologist David Graeber argues that before the advent of money, economic life within a community was a web of mutual debts. People did not behave as self-interested individuals – at least not from the perspective of a single transaction; rather, they would share food, clothes and luxuries, and trust that their peers would repay the favour in return. When we consider these origins of debt and credit – as a system of mutual aid between people who trust each other – it’s no surprise that so many cultures viewed charging interest as morally wrong.
 
...
 
Meanwhile, the Catholic Church played its own part in sowing the seeds of a change of attitude. In the 13th century, it introduced the concept of Purgatory – a place that had no basis in scripture but did offer some reassurance to anyone committing the sin of usury each day. ‘Purgatory was just one of the complicitous winks that Christianity sent the usurer’s way,’ wrote the historian Jacques Le Goff in Your Money or Your Life: Economy and Religion in the Middle Ages (1990). ‘The hope of escaping Hell, thanks to Purgatory, permitted the usurer to propel the economy and society of the 13th century ahead towards capitalism.’

and here's more along those lines.  Indulgences do get a mention. 

over at The Imaginative Conservative, an author rues the day Star Wars ruined arts culture by celebrating distraction, skips over the monomyth and the possibility that Star Wars franchises may be the distillation of the total work of art sought by German and French avant garde ideals

I admit I tend to identify as moderately conservative about religion and politics.  By moderate I mean to say I'm a Presbyterian dour Calvinist who thinks the human condition is fraught by human frailties so stark that I find myself thinking the Frankfurt school authors were too optimistic about the human condition in modern technocratic societies.  And I've been reading arts history/art criticism books by authors who write for Thesis Eleven ... .  My commitment is more to Christian doctrine and teaching than to the left or right on the political spectrum.   My views may be an uneasy grab bag of Edmund Burke, Jacques Ellul and Roger Williams ... . Throw in Dostoevsky, Kafka, Conrad, some Bonhoeffer and Brunner and I guess that's where I'm at. 

All of which sets up the observation that when I see a title like "The Imaginative Conservative" I can't help but wonder if "The Imaginative Reactionary" might not be a synonymous title.  Take this recent Sean Fitzpatrick piece that rues the day George Lucas' franchise exploded into the Cineplex. 

http://www.theimaginativeconservative.org/2017/07/star-wars-sean-fitzpatrick.html

Forty years ago this summer—what seems to many a long time ago in a galaxy far, far away—Star Wars was released, and America was sold into the slavery of pop-culture merchandising. With this era-changing movie, the American cinematic focus shifted away from sophisticated dramas—such as The Godfather, One Flew Over the Cuckoo’s Nest, and Taxi Driver—back to a pre-60s golden-age trope where exhibitionism and carnival capers in motion pictures made money. Some say that George Lucas effected a return to what the movies were meant to be, while others argue that his swashbuckling “space opera” was a backslide from which cinema has never recovered. In either case, Star Wars was the flagship film to sell itself as a franchise, driven and dominated by mass marketing, special effects, action sequences, and cornball dialogue. Gaining the status of highest-grossing film of all time, Star Wars became the epitome of the summer blockbuster, recasting movies as commercial events that cater to the lowest common denominator of the movie-going public. The effects of Star Wars run deep in the entertainment industry and have made explosive, eye-candy spectacle an idol of distraction for many whose lives are so meaningless that distraction is a crucial drug.

Popcorn flicks like Star Wars are central, even integral, to American leisure—which is arresting if Josef Pieper’s notion about the basis of culture is correct. Where would society be without its screens, its celebrities, and its space sagas? It is rare to walk into a home that does not have a television dominating, or even enshrining, its living room. It is almost a matter of principle akin to a religious obligation in the civilian temples of Americanism. The parallels between the television and the tabernacle show how deft the forces of darkness are at leading man from the truth by imitating it. Leaving aside the comparisons that exist between the local church and the local theater, entertainment has become something like a new religion, a ritual for people to fill the voids in their lives—only entertainment is fast becoming nothing more than an addiction to nothingness, a placebo against the emptiness of the times. In these ways, modern entertainment is not simply distorting the elements of religion, but actually commandeering the role of religion in human society. A new idol has risen for the idle neo-pagans, and it is the idolatry of distraction.

***
For an author to take this stance about Star Wars while singing the praises of Dickens or Poe or Arthur Conan Doyle invites a question as to what it is about the pulp fiction of earlier centuries that let it become part of a literary canon in our more recent era.

The Godfather, no less than Jaws, was founded on pulp idioms and popular fiction. 

The process by which comprehensive, multi-media, branded merchandising and marketing took over was not all accomplished in 1977.  The process started then, but the deregulated industry practices that allowed children to be marketed films alongside toys and comics and novelizations and cartoons more properly erupted in the Reagan years.  It's no surprise that a contributor to The Imaginative Conservative would like to think the doom of pop culture enslavement began during the Carter administration, but that seems daft. 

Any accounting of Star Wars that ignores Campbell's monomyth is an accounting that isn't really worth taking seriously. 

It's like a whole bunch of people don't get what European avant garde theorists were proposing centuries ago about the role the arts could play in formulating a new mythological substitute for Christian religion.  It's not that pop culture was somehow "allowed" to commandeer the cultic elements of religions; entertainment figures explicitly set out to create cults around their franchises.  Even an atheist like Joss Whedon can talk about how great it was, twenty years later, that Buffy the Vampire Slayer became a show with the cult following it has. Perhaps he hopes a comparable cult following can let him keep playing with Firefly stuff for a while.

Now, David Roberts has written three books that can be pretty opaque, but he proposed, at length, that the ideal of the total work of art as precursor of and catalyst for the ideal society moved from Germany and France to the United States.  Others have mentioned this too, but the idea I'm mulling over is that where the European avant garde tended to fixate on the utopian past or the utopian future, the American innovation in the later 20th century is more inclusive.  The futurist tech of the Star Wars franchise famously took place a long time ago in a galaxy far, far away.  Ancient future.  Something Roberts discussed at length in his books is how the Germans venerated Athens and the French venerated Sparta, and how theorists and philosophers imagined that the Athenian art religion was a unified celebration in art of a unified society.

Well, okay, let's suppose that the American approach to unified art or the total work of art or ... the brand ... is also a celebration of an idealized status quo.  That could mean the dreams of German philosophers and avant garde artists would have been most realized in American franchises like Star Trek, Star Wars, My Little Pony, Transformers, G. I. Joe and so on.

But that can't be right.  It's supposed to be Wagner's operas and the literature of Mallarme and Schiller and Goethe and ... it's not supposed to be Optimus Prime and Twilight Sparkle or Captain Kirk and Luke Skywalker. 

Just because the religious or cultic elements of pop culture don't adhere to an old conservative nationalistic or ethnic demographic does not necessarily make them any less functionally religious.  In an era where conservatives write about moralistic therapeutic deism, why wouldn't they spot that this is central to a Star Wars spirituality?  A religion of universal humanity, human reason and art doesn't need a deity to be functional.  Consider the cult of Star Trek these last fifty years. What middlebrow arts critics find so loathsome about mass and pop cultural franchises is that they not only make no bones about being explicitly and directly philosophical, their moralizing is front and center.  Superheroes explicitly insist upon telling us who is and isn't a hero and why.  It's not like Woody Allen films where the protagonist is an author stand-in, or other kinds of films open to the interpretations of suitable cognoscenti--no, the Star Wars cinematic universe doesn't give you the luxury of supposing Palpatine is the hero of the story.  You're not supposed to imagine that perhaps the Empire has some worthy goals.  Maybe someone will write a funny piece at The Federalist making such a case, replete with the line "The Empire is back, baby, and they're gonna show these hippies who's boss!" 

Fans of the highbrow from the left and the right will likely never stop wringing their hands that too many people derive too much pleasure from too much pulp fiction. 

Since I'm a moderately conservative Presbyterian rather than a really conservative Catholic I suppose I may never land in the same spot as the sort of person who writes what's quoted above at The Imaginative Conservative.

The idea that a franchise like Transformers could reflect a not-even-latent desire to see the world, in some sense, paradoxically re-enchanted through technology is probably not going to be on the table.  That's probably because the kinds of folks who write for The Imaginative Conservative are going to be soooo busy attacking any idea that could possibly be associated with Marxists as one that no person with traditional or conservative Christian beliefs could agree with.   Actually ... there's a piece at Aeon I want to link to about the long history of how Christendom in the West went from saying usury was straight up evil to defending it ...

over at Vulture a case that Tony Stark is the "real" villain of Spider-Man: Homecoming with the bromide that the Vulture is a Trump voter ... but ...

http://www.vulture.com/2017/07/spider-man-homecoming-is-iron-man-the-real-villain.html?wpsrc=nyma


...
 
Tony Stark’s always been something of a lovable rogue, and he’s accomplished many heroic things in other films. Here, however, his actions seem more sinister when he’s dealing with children — and as it turns out, when he’s running Stark Industries, which, in Homecoming, seems to operate on the shady end of the spectrum. In the beginning of the film, we learn that the business of cleaning up the wreckage from the Avengers’ New York battles has been given over to the Department of Damage Control, which, as Darren Franich pointed out in EW, is co-financed by Tony Stark and seems like a fairly malevolent force, despite the fact that national treasure Tyne Daly is its main spokesperson. DDC forces out local contractors like Michael Keaton’s Adrian Toomes, giving it the monopoly on superhero clean-ups. This might be designed to prevent dangerous alien tech from slipping into the hands of the unready (even though Toomes and his pals manage to steal it anyway), but it also ensures that Tony Stark has a vertical monopoly on superhuman activity: The battles use Stark technology; the clean-up crews are Stark branded; the PR is managed through Pepper Potts. Stark’s superpower, after all, is that he’s smart and rich. He lives in a world with few consequences. Money solves most of his problems; his monopolies prevent him from directly answering to the public. Who is he to teach a 15-year-old personal responsibility?

It’s unclear whether or how Stark Industries turns a profit, but its actions, as Homecoming reveals, have forced Americans out of their jobs. Case in point: Adrian Toomes, who offers the most compelling critique of Stark before he decides to become the evil Vulture. Toomes starts out in salvaging, gets forced out of his job by the Department of Damage Control, and then turns to a life of crime. As he faces off against Spider-Man, Keaton also gives the film a rare jolt of class consciousness as he tells Peter, “The rich and the powerful, like Stark, they don’t care about us.” The movie’s quick to supply examples of Toomes’s hypocrisy; as Vulture’s own Abe Riesman pointed out, he’s something akin to a monstrous vision of a Trump voter, furious at the elites of the world but unable to acknowledge his own relative privilege, as exemplified by a modernist home with way too many windows. [emphasis added]

 
The Vulture wears a bird suit, and goes from murder-curious to murderous after accidentally killing Logan Marshall-Green, but that doesn’t mean we should ignore his ideas. In the long term, Tony Stark’s actions do hurt the little guy. He’s like a Silicon Valley CEO who, after disrupting the economy with one good product, doesn’t acknowledge the evil he’s produced as a consequence. Tony Stark and his compatriots have seized control of a significant portion of the world’s power apparatus, and they are forcing out the ordinary man. Does this make Iron Man the villain? Marvel movies tend to have villains who intend to do harm, while people who cause damage unintentionally are more redeemable. (See Bucky Barnes in Winter Soldier or Civil War.) Surely, there’s enough evidence in Homecoming to see Toomes as at least a complicated figure, operating in something of a moral gray area.

The thing about the stereotypical Trump voter/alt-right voter is white nationalist ideas.  Yet ... for anyone who has actually seen Spiderman: Homecoming, the interracial marriage that led to the existence of Peter's crush Liz is really obvious by act 3.  Perhaps journalists wanting to describe the latest Spiderman villain in political terms should find some other reference point for a white guy married to a black woman, committing all his crimes to provide for his family, in terms that don't deviate from the mainstream script in the press about Trump. 

The problem with Toomes isn't that he's a hypocrite.  No Marvel antagonist so far seems more committed to doing everything under the radar and as quietly as possible.  Toomes ends up killing Shocker 1 after an incident where Shocker 1 insists on showing off high-powered weaponry in a suburb without regard for collateral damage.  Underground arms dealer though he is, this is still an Adrian Toomes who can regard Mac Gargan with contempt as someone he wouldn't even deal with if Spiderman hadn't messed up other business deals.  If people want to cast Toomes in some political role, the idea that this Adrian Toomes is a Trump voter seems a bit much.  Maybe he could be likened to a Reagan Democrat ...

But his criticism of Stark and the Department of Damage Control (subtle name, as always) is that what Stark and company benefit from is the kind of crony monopolistic capitalism in which the haves get to have more and those who don't get completely sidelined.  How do we know that Adrian Toomes, if he were magically a real person, wouldn't have voted for Sanders?  He might even have voted for Clinton, whose record as a hawk doesn't seem in any contradiction to the Vulture's family-driven pragmatism.  Had Trump not won would journalists even think to interpret the Vulture's activities and motives in Trump-voter terms?  Not ... very ... likely.  Last year some tried to describe the antagonist of the Magnificent Seven remake of a remake in Trumpian terms even though the production was under way (i.e. already scripted) before Trump's candidacy was solidified.  But there seems to be this penchant in the entertainment industry for a kind of political punditry recency bias; X or Y is imputed to a pop culture event that may have taken years to come together as though it were somehow consciously anticipating or responding to current events.  That makes sense if we're talking about a show like South Park where Parker and Stone are obviously reacting within a few weeks to current events.

When Parker confronts Toomes at the end, Toomes' objection is that he is, in fact, pretty much doing the same thing that got the Starks their empire of wealth: selling weapons to killers.   Toomes' problem isn't hypocrisy so much as that he refuses to concede that the difference between what the Starks, Tony or Howard, did and what he's been doing is the formal sanction granted by the state.  The state, in the form of the Department of Damage Control, deprived him of his job and his contract to clean up the post-Avengers 1 damage. He, in turn, steals from Damage Control to refurbish alien tech into weapons and tools that he sells on the black market.   He's still a criminal, but with an understandable motive.

If the studios want to even bother with a Sinister Six film they can bring back Keaton as the Vulture and maybe bring back Molina as Doc Ock.  One of the fun things about the classic Spiderman villains is that since they're older guys older actors could step into the roles.  Odds are pretty decent that the Osborn stuff has been too badly played out to be worth continuing.

Saturday, July 15, 2017

links for the weekend--"is classical journalism in decline?" is still the rhetorical question du jour in journalism, which might invite a different rhetorical question, does an arts scene that can't be monetized even exist for arts journalism?

Here's an article that asks whether classical music journalism is fading into silence, one of those rhetorical-question-as-title articles we inevitably see on this subject.

https://www.sfcv.org/article/diminuendo-is-classical-music-journalism-fading-to-silence
...
As arts coverage has shifted from major to minor, the diminution of print coverage of classical music events takes its toll. As in Haydn’s “Farewell” Symphony, the players, snuffing out their candles, slowly exit the stage one by one until two violins play the final pianissimo adagio.

In 2016, reports of newspapers eliminating arts journalists through layoffs and buyouts seem more mind-numbing than shocking. Since the beginning of the millennium, legacy media has shed jobs across the board. In 2007-08, a quarter of all U.S. jobs in arts journalism were eliminated. By 2011, the John S. and James L. Knight Foundation’s then vice president for arts Dennis Scholl estimated that as many as 50 percent of the local arts journalism jobs in America had vanished. In July 2015, The American Society of News Editors reported its first double-digit decline—10.4 percent— in all newsroom jobs since the Great Recession. And anecdotally, the loss of arts journalists, especially critics, outpaces those in other departments. “I can count the number of full-time classical music critics on both hands,” says Douglas McLennan, the editor of ArtsJournal.
...
Online journalism has entered the wild west when it comes to monetization. Traditional broadsheets are forced to compete for clicks with The Daily Beast, Huffington Post, Salon, Slate, and BuzzFeed and with the distillation of newsbites on social media such as Facebook and Twitter. “There used to be a media that was top-down, but that has changed rapidly for the news industry,” says Michael Zwiebach, the senior editor of San Francisco Classical Voice since 2009. “People are not addicted to, nor do they trust, one source of information. For a lot of sources like the San Francisco Chronicle and Boston Globe, there has been a free-for-all competition for eyeballs. They take stories they think will bring in the most hits,” which in the culture category is most likely television, movies, pop music, or an occasional blockbuster like Hamilton.

And culture that isn’t easily monetized gets ignored. “At one time when there were classical recordings, there was a revenue and economic stream for classical music and the opera world that perpetuated media to cover them because it was also a business and industry,” says Peter B. Carzasty, the founder of the arts consulting firm Geah, Ltd. The survival struggle of media institutions was only exacerbated by the Great Recession of 2008–09. [emphasis added]

Increasingly shortened attention spans have driven hunger for quick internet news. The 1982 launch of USA Today, specifically designed for a generation raised on television and whose motto was, “an economy of words, a wealth of information,” predated internet trollers who can’t wade through anything over 140 characters, much less a 1000-word review.

It might be impossible to overstate the significance of the phrase in bold--one might dare to say that one of the problems for the classical music scene is that, in journalistic terms, we could ask whether something that isn't easily monetized even exists to begin with in cultural journalism.  Take one of my pet topics for discussion here at this blog over the last twelve years, polyphonic music composed for the guitar.  Has there been a single headline for that topic?  Nikita Koshkin's 24 preludes and fugues got published this year, at last (!), and yet to date I'm not sure there has been any coverage of this cycle. 

On the one hand, if there were a more robust or healthier arts journalism scene perhaps this cycle could get the coverage I think it should get.  On the other hand, the knowledge of the guitar and its literature, let alone the knowledge necessary to establish how one could assess polyphonic music, seems to be more of a rarity in the classical guitar scene than maybe it should be.  When guitarists so often say the instrument is not really suited to idioms like sonata and fugue, does the fact that Igor Rekhin, Nikita Koshkin and German Dzhaparidze have all composed cycles of preludes and fugues for solo guitar even register?   You could count the cycle yours truly composed ... if you wanted to ... but this gets back to the aforementioned question as to whether a work of art even exists for arts journalists if it isn't presented to the world in a monetizable format.  The Koshkin and Dzhaparidze cycles are both really good, by the way.  The Rekhin cycle is a bit more mixed for reasons I hope to get to later this year. 

So the lament and concerns about the state of classical music journalism are anything but an abstract or theoretical concern to me.  For that matter, coverage of local religious scenes was often wanting.  Had local coverage been more thorough over the last ten years this blog wouldn't have gained notoriety for discussing what was once Mars Hill.  That's another case where a small sea of information, presented in a way that has been stubbornly free of monetization, built up over time while journalism establishments, for the most part, regarded the stuff as not necessarily existing, excepting maybe a stretch between 2013 and 2014. 

When Sousa warned about the rise of what we now know as the music industry, his worry was that amateur musicianship would wither away, and it was the cultures of amateur musicianship he regarded as the lifeblood of musical culture.  We live in an era in which the amateur musical scene is back, and perhaps it could even be as robust as it might have been in the pre-music-industry era.  But ... I wonder if arts critics and arts journalists have stopped to consider that an explosion of genuinely amateur composition and performance might mean that the cultures of monetizable properties their bread and butter has depended on are no longer going to be as much of a thing to be covered and that ... as this article puts it:

There are winners and losers in pop’s attention economy, but most acts fall into the latter category

In an attention market, the haves get more and those who have not might lose even what they have. 

The question of how robust the amateur arts scenes are could be a particularly scary and perhaps unknowable/unanswerable question for more than just arts journalists and critics; it is a question that may need to be considered at the level of education.  Are we sure that the arts will die unless academic establishments keep them around?  Overseas there's some controversy about proposed changes in education and a claim that the loss of arts will be offset by the rise in IT.

‘Arts GCSE decline compensated by rise in IT,’ claims Tory education minister

I don't happen to agree that the arts are the "easy option" for people who couldn't cut it in math or engineering or something like that.  It's not that people who are not disposed toward those things couldn't or wouldn't say that, obviously.  No, I think a concern that has been brewing or erupting here on this side of the Atlantic might provide another reason we can't be entirely sure that an arts education lost will be met as tragic by non-artists--it's the whole canon-wars question as to what should be in the academic canon of the arts that people have to learn and why.  In European contexts where there's potentially more canonical certainty about German or French or English or Irish art that whole array of topics might be moot, but in the United States the question as to why arts instruction (if we're going to have that and keep having that) might favor a Euro-centric canon is a lively one.  Whether from the left or the right, perhaps the blind spot journalists and academics have is that if we no longer have the possibility of a "folk culture" in the age of online videos and digital reproduction, we may still be witnessing a resurgent culture of amateur-driven arts activity.  Or not, that's the thing, it might be hard to gauge. 

Because what journalists and academics often like to rule out is financial success.  Twenty years ago I was eagerly collecting and reading the manga of Rumiko Takahashi, probably best known for Ranma 1/2 and Inu Yasha.  She also made the manga Maison Ikkoku--when a friend suggested her work to me he described her as being a kind of Jane Austen of manga.  I was curious about this claim, though at the time I'd read no Jane Austen.  That would come later.  Takahashi's work may be popular enough that some 200 million copies of her manga or anime adapted from her manga are in circulation but ... have you heard of her work?  Recent headlines about someone involved in the translation of the work were ... disappointing to read, but if you don't already know, the less you know the better for now.

The theme at this point is the linkage between monetization of art and who gets recognition.  This summer's biggest superhero blockbuster may remain an instructive case in point but we don't even have to stick to that ... .

Terry Teachout wrote recently about how it only took him somewhere between five to six minutes to map out a season of theater programs scripted entirely by women.  Teachout writes for a publication that ... at least for a majority of people who live here in Seattle, would not be identifiable as "left".  Commentary magazine is also not particularly "left".  But if Teachout could map out a season of all women playwrights in under ten minutes that's some context for a recurring set of thinkpieces as to how and why plays by women or other artistic projects helmed by women don't get more exposure.

In the summer of Wonder Woman 2017 this would, at the highest-profile level, seem like a summer to at least keep this topic in public attention. Of course for some authors over at The New Republic, Wonder Woman is just Americanist propaganda. Josephine Livingstone could sympathetically regard the big dumb spectacle of Valerian because even if it's inspired by a comic book at least it's a European comic book rather than an American one.  It seems that for folks at The New Republic or The Imaginative Conservative pulp fiction can be forgiven being what it is if it's mid-20th century European comics or inspires Coppola films ... . 

It's like there's some tacit goldilocks deal where the art can't be TOO mass-produced or TOO popular or it must either not be art or must be propaganda ... but at the same time if it exists in a form that can't be monetized or doesn't make its presence known in a market-force-level way then it doesn't even exist.  From the standpoint of arts journalism as the first draft of arts history, a whole lot of the arts never existed.  This will remain the most likely outcome for a lot of arts out there in spite of the fact that, in theory and potentially also in practice, there's more and more stuff you can get access to now in the arts than ever before. You didn't need to get stuff on Prime day, but with a market event like Prime day it would be relatively easy to go buy music and film and books.  Which is the transition for this weekend to ...

Elsewhere at TNR ... a piece with a sidelong commentary about the big river company:

 ... 
 
Amazon did not come to dominate the way we shop because of its technology. It did so because we let it. Over the past three decades, the U.S. government has permitted corporate giants to take over an ever-increasing share of the economy.
 
Back in the perma-temping, no-medical-benefits-jobs era of the 1990s I heard an IT person connected to a slightly larger-than-average company in the Puget Sound area complain that what Amazon did was get a stranglehold on one-click purchasing.  So I have my doubts that "we let it" is an adequate explanation of the rise of Amazon.  It seems like it's just vague leftist boilerplate.  Even for people who tilt conservative or libertarian, the history of crony capital manipulation of legislation relating to intellectual property is, if not easy to look up, feasible to research. Wouldn't an author who contributes to TNR have more time and interest in proposing how massive corporate interests took time to revise and guide legislation regarding intellectual property, trademark, and licensing?  Well, maybe not? 

The older I get the more I get the impression that one of the problems in contemporary popular art is how much it is hamstrung by the reality that the majority of what we have as popular culture is licensed or trademarked and under copyright.  I don't think the "solution" to this problem is to come up with facile and largely unpersuasive arguments against the legitimacy of copyright.  If arts educators wanted to build a case for why arts education is essential, here's an angle that, by and large, I have never seen anyone put forth in a journalistic context--in light of how restrictive copyright, trademark, and licensing practices are in this new and international arts market, the most compelling reason to preserve an artistic canon of some kind is that by teaching an arts canon that is gloriously public domain, and by exploring the ways in which that canon has influenced and inspired more recent under-copyright art, we can give students ways to cultivate an interest in those arts that are genuinely public domain.  This is something conservatives already want to do on other grounds as it is--the Western canon and all that. 

It's not like at this point we can even accept at face value the bromide that the Western literary canon is all dead white males.  We're hitting the bicentennial of the death of one of the greatest comedic geniuses in English language literature.  Yes, of course, I'm referring to Jane Austen. I began reading her work shortly after the start of the millennium and I have written here on a number of occasions that as I formulated the tone and literary voice for this blog tackling the history of Mars Hill I made a point of emulating the literary approaches of Jane Austen and Joan Didion.  It happens that I love the former's novels and the latter's non-fiction; it's also the case that I concluded that if there was going to be a kind of anti-Mark Driscoll aesthetic then the counterpoint to Mark Driscoll's camera-loving stand-up comic emulating stage persona drawing from Chris Rock, John Piper and Douglas Wilson would be a literary style inspired by Jane Austen and Joan Didion, both of whom have a penchant for a kind of bemused, chilly detachment.  If you read the first line of Pride & Prejudice and do not instantly grasp the nature of its joke then it's just not likely to be your thing.

There's a little piece at the New York Times recently about Jane Austen's literary style and the ideas running through her work:

https://www.nytimes.com/2017/07/06/upshot/the-word-choices-that-explain-why-jane-austen-endures.html?_r=0

It is at the heart of Austen’s work: What is going on behind the veneer that politeness demands? [emphasis added] These distinctive words, word clusters and grammatical constructions highlight her writerly preoccupations: states of mind and feeling, her characters’ unceasing efforts to understand themselves and other people
 
Human nature (together with the operation of time) is the true subject of all novels, even those full of ghosts, pirates, plucky orphans or rides to the guillotine. By omitting the fantastical and dramatic elements that fuel the plots of more conventional novels both of her own time and ours, Austen keeps a laser focus.

With Joan Didion's work one of the threads running through her non-fiction is a musing upon how the stories we tell ourselves to identify ourselves can often be deceptive, how the stories we tell ourselves to share with others often have a self-exonerating motive so that we don't have to consider what our real motives are.  Both women have male (and female) detractors who resent their icy, elitist style.  Granted, and yet what's interesting about the comedic woman is that it seems men who despise women who traffic in humor that pours contempt on people can revel in that sort of humor when it's practiced by males.  That might be a topic for another post some time later. 

I've written in the past that there are ultimately only two types of humor: you're either laughing with or laughing at.  I had a blog post about this topic way back on August 17, 2013:
A layman makes a case for less humor from the pulpit

Now a writer can do whatever he or she wants, and laughing with and laughing at are options we can all avail ourselves of.  But I have this proposal that the humorists whose work survives manage to find some kind of balance so that the laughing with and laughing at have an equilibrium.  For as often as Austen revels in laughing at her characters she arrives, in time, at resolutions to her stories in which we can laugh with them that things worked out acceptably enough for most people in the end. 

I'd write more for this blog post but I'm incubating some stuff about the newest Spiderman movie. A teaser of where I'm going with the new MCU Spiderman film runs roughly like this ...

There's this joke in military cultures that if you break the rules and fail you get a court-martial and if you break the rules but succeed you get a medal.  That's pretty much the entire MCU in a nutshell, breaking the rules but succeeding.  It's even a punchline in a subplot in the new Spiderman film where Captain America has done a public service announcement about how breaking the rules never pays off, after a gym instructor has joked that by now Cap is probably a war criminal, but, whatever, the state paid for all these educational videos so we gotta use `em.

FilmCritHulk just went on a tear about the MCU, saying that there's this problem with them, that the gap between what the films SAY they are about and what they REVEAL themselves to be about by how they reward their protagonists in their third acts is now downright disturbing, and that the new Spiderman film illustrates it in ways that FilmCritHulk now finds frustrating. For the record, I think FilmCritHulk wrote the best, bar none, English language overview of Hayao Miyazaki's film The Wind Rises I've read.  Sure, I happen to like mine, too, but FCH is conversant enough in film and theory that even when I disagree with the "what" or "why," FCH writes in a way that spurs further thought and conversation.  Which is to say that for FilmCritHulk to articulate what Hulk regards as a fatal flaw in every single Marvel Cinematic Universe film, that's something to mull over.  And that problem can be summed up in the aforementioned joke about how breaking the rules only gets you court-martialed if you fail.

While FCH "may" not be conversant enough in things military to formulate the objection in terms of military jokes, that's the beef: it seems the MCU films feature heroes who all break the rules of reasonable/ethical conduct in ways that should get them court-martialed under normal terms, but since they always succeed in the third act they keep getting the medals.  What makes FilmCritHulk's recent complaint about the new Spiderman movie (which FCH does, in fact, like) intriguing is that FCH takes time to demonstrate how and why the Nolan Batman films and Raimi Spiderman films DON'T make the same mistake; Nolan and Raimi gave us heroes who made what they thought was the right and best decision to make at the time they had to make it, a decision that turned out to be not just a terrible strategic blunder but also, bluntly, morally wrong.  Whether as the result of fear or cowardice or resentful entitlement, Bruce Wayne and Peter Parker are both motivated by the reality that they made decisions that led to the deaths of people they loved.

So some of that has to be saved for the actual piece I'm meaning to write ... .

Enjoy your weekend.