Saturday, July 22, 2017

links for the weekend, on modes of comedy and moralism and the moral role of the self-described smart set (plus some Jane Austen stuff)

Here we are, halfway through the year, and there have only been 93 posts at Wenatchee The Hatchet?  Trippy.  Not that I want to get back to the record high of 717 posts in 2012 ... but it sure has felt like this has been a fallow year for blogging.

Yep, it's been kind of a links-for-the-weekend summer at Wenatchee The Hatchet.  Preparatory reading and intermittent blogging have been the order of the summer, though it feels more like the order for this year.  Composing music takes time.  There's been some pretty abstruse material at the blog this year.  Finally getting around to the arts side of things has had some not entirely anticipated consequences. So, links and stuff.

Sometimes it seems like arts journalists and pundits are determined, in advance, to learn the wrong lessons from box office activity.  Take this ...

 Hollywood’s box-office woes: Is the industry aiming too narrowly at men?

Another weekend, another branded movie struggles.

Hollywood’s so-called franchise fatigue hit for what seemed like the umpteenth time this year, as “Spider-Man: Homecoming” followed up its auspicious domestic debut July 7 with just $45 million in ticket sales (a 61% drop) this past weekend. That paved the way for the movie to lose what many thought would be a tight opening-weekend battle with “War for the Planet of the Apes” — incidentally, an underperformer in its own right. And it raised the question, once again, of why so many supposedly surefire hits can’t seem to stick.

Many big-budget live-action movies over the last few months have been dropping at a dismal rate — at least 55% — in their second weekends. That includes films that had strong starts, such as “Spider-Man,” “Guardians of the Galaxy Vol. 2” and “The Fate of the Furious.” And it includes titles that didn’t, such as “Transformers: The Last Knight,” “Pirates of the Caribbean: Dead Men Tell No Tales” and “Alien: Covenant” (the last one falling a whopping 70%).

Clearly the geriatric age of some of these franchises is a factor — both “Pirates” and “Transformers” are on their fifth installments, and “Alien: Covenant” is even longer in the tooth, depending on how you classify some earlier entries. But it doesn’t explain why “Apes,” at just its third iteration of its current cycle and with some of the best reviews of its life, had such an underwhelming opening of $56 million. Or why “Spider-Man” couldn’t parlay its strong debut into a better second weekend.

Yet, clearly, Wonder Woman and a remake of Beauty and the Beast have done great at the box office.  If Hollywood is really so bad at paying attention to what anyone other than young males wants, then why are we getting a feature-length My Little Pony movie this October?  I'm probably going to see it, honestly, simply because I have so many nieces it's going to happen, and because I honestly think the show, as launched by Lauren Faust back in season 1, is actually pretty good.  I'm partial to Princess Luna myself, and the show had a fun Inception homage/spoof using her character.

And did the Captain Underpants movie just not exist? So some of the limitations in punditry about film come from this tacit reality: people who opine on film and television are vastly more likely to be riffing on prestige television and tentpole or arthouse cinema and its stereotypes than to be writing about stuff aimed directly at kids, or all-ages entertainment that isn't confined to the Mouse House.

For instance, it's vastly easier for an American film critic to have just declared the Ghost in the Shell remake "doesn't work" than to discuss either the anxieties Japanese artists brought to bear on the peculiarities of Japan's technocratic responses to the Cold War era or Oshii's singular take on subversive appropriation of biblical literature to make his metaphysical and political points about his time and place.  The former requires a level of cultural history and engagement a lot of American film critics don't want to bother with, and the latter, well, pretty much the same!  If you're steeped in an educational or journalistic milieu that largely presupposes that the Abrahamic religions were the worst thing to befall the entire human race, then how would you be situated to get the significance of Oshii's appropriation of the Babel narrative in Patlabor: The Movie?  Now, sure, people can binge seasons of Game of Thrones while still reflexively thinking that reading stories of similar nastiness in Judges is somehow a mark against Judges, but that may just be how Americans are these days ... .

As for named franchises, I never bothered with the Apes reboot.  If the axiom holds that history plays first as tragedy and then a second time as farce, then trying to get lightning to strike twice is a futile effort.  We got our remake of the premise of Planet of the Apes as farce a decade ago when Mike Judge made Idiocracy, didn't we? Now maybe film critics like Dana Stevens can gush that the new Apes film raises the question of whether we humans deserve the wonderful planet we live on, but this is in essence code for how the wrong kinds of humans with the wrong kinds of beliefs about the nature of the human condition and the human being are going to deprive the rest of us of the opportunity to live here.  Dystopian stories inevitably traffic in the kinds of justified stereotypes about the people we dread getting the kind of power and social influence we don't want them to have.  But I'm wondering this year whether the subtextual concern is a transference to "them" of a totalitarian impulse we may find in "us".  Dystopian literature is inevitably a self-exonerating exercise.

Which thematically leads us to a thinkpiece about humor in the age of Trump over at Slate. Now, I've written here in the past about how there are ultimately two basic types of humor, laughing with and laughing at.  A lot of mainstream political humor has trafficked in the simple formula of "laugh with me as I laugh at how stupid/evil/immoral/hypocritical/greedy those people are." The proposal in the piece is that Trump and the trolls who support Trump have forced mainstream liberal comedic thought to reassess both the nature of comedy in general and the tropes used by mainstream comedians.

Thus, a lengthy set of excerpts:

This conviction—that humor is “a superior way to tell the truth”—is extremely recent, but it is dearly held. The 2016 election confronted liberals with forms of expression that were indisputably amusing to many people but failed miserably to meet the Nussbaum Criterion: lies, insults, and cruel pranks, emanating from anonymous abusers and presidential candidates alike. Baffled, we started to call such things trolling. The word conjures a kind of evil twin of humor. The trolls had “stolen Washington,” where, as Amanda Hess lamented in the New York Times, they worked their nihilistic magic: “No reason, no principle, just the pure exercise of power.”
The panic about trolls has little to do with their actual political influence, which is tiny, and a lot to do with our fear that shittiness and humor might be compatible. They are. [emphasis added] And when you think about how constrictive the Nussbaum Criterion is, it starts to become clear why practitioners, consumers, and critics of comedy might be struggling to understand its role in our culture and politics.

It's like some of these people have never actually watched shows like Rick & Morty or Archer or any number of shows where the humor of cruelty is the engine of plot.

That changed around 1957. Here is how Nesteroff describes it: “Eventually men like Lenny Bruce, Mort Sahl, and Jonathan Winters came along and led a revolution by developing their own material, derived from their actual personalities.” Bruce talked about abortion; Sahl made incisive political observations. Winters said, “Just tell the truth and people will laugh.” According to George Carlin, comedy “changed forever for the better.”
These stand-ups didn’t see their acts quite the way we see them: They understood themselves to be constructing characters. “Will Rogers used to come out with a newspaper and pretend he was a yokel criticizing the intellectuals who ran the government,” Sahl has said of his vaudevillian predecessor. “I come out with a newspaper and pretend I’m an intellectual making fun of the yokels running the government.” Bruce, too, was initially received as a performatively “sick comedian” and not a portal to authentic being. The force that united Bruce’s jokes was not self-revelation but rather the urge to scorch middle-class tastes. Both men’s acts were experiments in identity, in who you could get away with pretending to be, be he pervert or pundit.
The idea that we couldn’t perceive the human soul until Lenny Bruce told jokes about masturbation in 1957 not only warps our understanding of the comedy that followed, it blots out the comic traditions that existed before and alongside him. It has no room for Will Rogers, Bob Hope, and Danny Kaye—former vaudevillians who somehow managed to entertain people before comedy became funny—because they weren’t “real.”  [emphasis added] Neither was Moms Mabley, the immensely popular black stand-up whose act shingled character over social observation over character. (She performed as an old lady for most of her career.) This is how Nesteroff accounts for her significance: “From the 1930s through the 1950s, Mabley was comedy’s primary voice of the Civil Rights Movement,” which started in 1954.

Having witnessed the rise and fall of Mars Hill, I'd say the construction of a persona that trades on authenticity goes a long, long way toward explaining a Mark Driscoll in Puget Sound.  Why would mainstream liberal comedians, or leftist comedians, ever for a second imagine that the persona of the authentic truth-teller couldn't be appropriated by anyone, regardless of social or political or religious commitments?  Mark Driscoll was in a sense nothing more than a mirror image of Dan Savage, the two men parlaying their former Catholic upbringings into shock jock moralizing from ostensibly liberal/left and right perspectives in generic terms.  Now, depending on how you parse that, they might both have represented what some scholars might call the relatively new neoliberal center, but that's a topic other people can tackle since this is a weekend and not a school day for me. :)

These performances depend on the acceptance of a lie, the lie that the character is the person, and yet that is a lie audiences seem willing to accept.  There may be value in slicing through the performances of social obligation, and the tensions those performances create for sincere and honest relationships, and that will get us to somebody at length, but first ... .

It might be hard to overstate the observation that a performative character is not the same thing as the actual person.  Whether thanks to a Sylvia Plath in poetry or a John Lennon in songwriting or the emergence of any number of other people, the illusion that the artist expresses his or her real feelings and observations about the world can be pervasive.  We know in the abstract that not everybody works or thinks this way, but the generalization is that you can fairly safely assume that if artist X does Y and Z in artistic media, that's a direct expression of what the artist thinks or feels.  But this can be transformed, imperceptibly, into an interpretive method that goes like this: "I thought or felt A and B while consuming X and Y artistic creations, therefore A and B are what X and Y are actually about."  We can forget that the authenticity we read onto artworks can be merely ourselves.

One of the lazier habits in arts criticism is to mine work for clues to biographical concern.  This may be one of many reasons Romantic era writers preferred Beethoven to the equally brilliant Haydn: Haydn made his works to order and to explicitly please his audience, whereas Beethoven was regarded as authentically expressing his own feelings.  This Romantic trope has been with us for centuries.  I admit to being explicitly anti-Romantic in my overall outlook.  There's an interesting passage about what the "role" of a contemporary comedian often is in American comedy since Jon Stewart:
The role that Colbert and Fallon are competing to occupy—Your No. 1 Hypothetical Friend—would not exist had comedy not become synonymous with personal authenticity, and personal authenticity with wisdom. Their authority (and the authority of Seth Meyers, John Oliver, Samantha Bee, and Trevor Noah) rests on the illusion that because an audience is laughing, the performers must be channeling some holy spirit, not their partisan loyalties or professional interests.
But this breed of comedy is didactic, and things that are didactic are not funny. [emphasis added] Baudelaire, in his treatise on laughter, makes a distinction between “significative comedy,” which you recognize by its carefully expressed “moral idea,” and “absolute comedy,” which you recognize because you are laughing. Our political humor today is certifiably significative. As a Vox video put it, “What makes satire such a powerful antidote to Trumpism isn’t that it’s funny.” (It’s that it’s true.) While absolute comedy affirms that we are all equally ignorant, significative comedy assures you of your superiority over others. For liberals, the experience of late-night comedy is largely one of narcissistic gratification—lectureporn, as Emmet Penney termed it. Before the election, John Oliver, Samantha Bee, and company had presented themselves as jesters in the court of a crazy king. Now they play straight men to him, ceaselessly signaling to the audience, “This guy is a nut! Normal people like you and me can see what a nut he is!” Not only is this shtick monotonous, it seals its audience in a bubble where a smirk is worth more than a joke. [emphasis added]

All that pivot required was for the wrong guy to be in the Oval Office.  The same can also be said for lectureporn from the right. In more churchy terms the phrase would be preaching to the choir. There may be a whole bunch of people who imagine themselves to be employing the wit of an Elizabeth Bennet while behaving and thinking a bit more like a Mr. Collins--using jokes and observations about the way we live now that are partly rehearsed and partly spur of the moment, but at all times signaling that we're the right sort of people, after all. The difference between a Bennet and a Collins is not just one of substance but of communicative mode.

Not necessarily meaning to bash Baudelaire, but in real life I think we can recognize that there is some real overlap between significative and absolute comedy.  We wouldn't still be talking about either Shakespeare or Jane Austen if there were never any real way for these two modes of comedy to overlap.

Of course this week was the bicentennial of the death of Austen.

As Austen’s own Emma Woodhouse put it to her querulous father, “One half of the world cannot understand the pleasures of the other.” But in the case of Austen, that misunderstanding seems to have an urgency that isn’t attached to any other canonized, pre-20th-century literary figure. The disagreement has been amplified as her fame has grown, and her fame may never have been greater. This year sees her unveiling by the Bank of England on a new £10 note, replacing Charles Darwin (and before him, Charles Dickens); she is the first female writer to be so honored. Meanwhile, the scholar Nicole Wright’s revelation that Austen was appearing as an avatar of sexual propriety and racial purity on white-supremacist websites made national news on both sides of the Atlantic. A few years back, her 235th birthday was commemorated with the honor of our times, a Google doodle. The wave of film adaptations that began in the 1990s may have receded, but it left in its wake a truth as peculiar as it seems to be, well, universally acknowledged: Austen has firmly joined Shakespeare not just as a canonical figure but as a symbol of Literature itself, the hazel-eyed woman in the mobcap as iconic now as the balding man in the doublet.


Iconic as she's become, the reasons for her status often stir up zealous dispute. Is Austen the purveyor of comforting fantasies of gentility and propriety, the nostalgist’s favorite? Or is she the female rebel, the mocking modern spirit, the writer whose wit skewers any misguided or—usually male—pompous way of reading her? (For her supremacist fans, Elizabeth Bennet would have a retort at the ready: “There are such people, but I hope I am not one of them.”) Any hint of taking Austen out of her Regency bubble brings attacks. When the literary theorist Eve Sedgwick delivered a talk in 1989 called “Jane Austen and the Masturbating Girl,” some male social critics brandished the popular term politically correct to denounce Sedgwick and her profession. Six years later, when Terry Castle suggested a homoerotic dimension to the closeness between Austen and her sister, Cassandra, the letters page of the London Review of Books erupted. In other precincts, business gurus can be found online touting “what Jane Austen can teach us about risk management.” Not only is my Austen unlikely to be yours; it seems that anyone’s Austen is very likely to be hostile to everyone else’s.

... that gets me thinking of the Shostakovich wars but if you don't already know what that is I'll spare you even more weekend reading!

A few things from here and there on the state of the educational milieu in which Jane Austen is likely to keep getting debated.  There's concern that colleges may be heading for a crisis.

Fredrik deBoer, for instance:

thus, the perhaps too obvious pull quote:

I am increasingly convinced that a mass defunding of public higher education is coming to an unprecedented degree and at an unprecedented scale. People enjoy telling me that this has already occurred, as if I am not sufficiently informed about higher education to know that state support of our public universities has declined precipitously. But things can always get worse, much worse. And given the endless controversies on college campuses of conservative speakers getting shut out and conservative students feeling silenced, and given how little the average academic seems to care about appealing to the conservative half of this country, the PR work is being done for the enemies of public education by those within the institutions themselves. And the GOP has already shown a great knack for using claims of bias against academia, particularly given the American yen for austerity.

But let's move a little further to what I took to be the real concern:

In 2010 I wrote of Michael Berube’s What’s Liberal About the Liberal Arts?, “the philosophy of non-coercion and intellectual pluralism that Berube describes and defends so well isn’t just an intellectual curiosity, but an actual ethos that he and other professors live by, and which defends conservative students.” I grew up believing that most professors lived by that ethos. I don’t, anymore. It really has changed. For years we fought tooth and nail to oppose the David Horowitz’s of the world, insisting that their narratives of anti-conservative bias on campus were without proof. Now, when I try to sound the alarm bells to others within the academy that mainstream conservatism is being pushed out of our institutions, I get astonished reactions – you actually think conservatives should feel welcomed on campus? From arguments of denial to arguments of justification, overnight, with no one seeming to grapple with just how profound the consequences must be. We are handing ammunition to some very dangerous people. [emphasis added]

And this next part ties back to the aforementioned observations about lectureporn as "comedy":

David Brooks has a column out today. That means that social media is going through one of its most tired types of in-group performance, where everyone makes the same jokes and the same tired “analysis” of whatever his latest dumb argument is, over and over again. None of the jokes are funny, none of the analysis useful, but this ritual fulfills the very function that Brooks is talking about in his column: making fun of David Brooks is one of the ways that bourgie liberals signal to other bourgie liberals that they are The Right Kind of Person. [emphasis added] Brooks, of course, is incapable of really understanding his own observations, given his addiction to just-so stories about character and gumption and national grit. He does not see, and can’t see, the economic structures that dictate so much of American life, nor is he constitutionally capable of understanding the depths of traditional injustices and inequality. If he did, he wouldn’t have the column.

But his critics can’t see something that, for all of his myopia, he always has: that our political divide is increasingly bound up in a set of class associations and signals that have little to do with conspicuous consumption and everything to do with a style of self-performance that few people ever talk about but everyone understands. It is the ability to give such a performance. [emphasis added]

Sometimes I feel like we're trained to believe that hypocrisy always lives in the distinction between the social performance and the real self.  I don't think that's really the case.  There may be a bond between the social performance you give every day in whatever setting you're in and what you think inside your head, and the push and pull of that fluid dynamic is, in many respects that are hard to express, who you actually are.  An author like Jane Austen can brilliantly convey this, whereas other, lesser comedic authors might camp out on the idea that the performance is the facade while the inner life is the real person.  Austen gives us characters in whom the performance and the inner life are always bound together.  When the inner life that manifests in action and the performance are twisted into opposing directions, yes, Austen highlights the tension.

That giving the right performance opens and closes doors of opportunity that define what you can and can't do for the rest of your life, and that this peril defines modernity, may be one of many reasons Austen's work is still so relevant to our own era even though she died two centuries ago. Famously, she told stories of people who appear to be The Right Kind of Person at all the surface levels but turn out to be the wrong kind of people in how they actually treat others.  It may be a testament to the politics of authenticity and personas that people want to read radicalism into Austen rather than accept her moralism at face value, because her satires of upper class entitlement and the constraints of middle class existence, particularly for women, are easier to read as political rather than moral judgments.  But if they were not first and foremost moral judgments with political implications and consequences, would we still be discussing her work centuries later?

I was struck by deBoer's piece on the dogma of educational culture that he published earlier this year:


The implied policy and philosophical changes for such a viewpoint are open to good-faith debate. As I have written in this space before, I think that recognizing that not all students have the same level of academic ability should agitate towards a) expanding the definition of what it means to be a good student and human being, b) not attempting to push students towards a particular idealized vision of achievement such as the mania for “every student should be prepared to code in Silicon Valley,” and c) a socialist economic system. Some people take this descriptive case and imagine that it implies a just-deserts, free market style of capitalism where differences in ability should be allowed to dictate differences in material wealth and security. I think it implies the opposite – a world of “natural,” unchosen inequalities in ability is a world with far more pressing need to achieve social and economic equality through communal action, as that which is uncontrolled by individuals cannot be morally used to justify their basic material conditions.

Put that way, it reminds me of how in the Torah there's a set of laws condemning the unfair exploitation of people with disabilities.  Yeah, there are laws prescribing the marginalization of people with skin diseases that can produce plagues, too, but there are times when I think that, for all our contemporary contempt for Bronze Age legal codes, they managed to do something we seem to refuse to do in contemporary American contexts: admit that some figuratively and maybe even literally grotesque inequalities come from the nature of one's birth, and that the legal codes need to ameliorate this for the common good rather than just deny that these kinds of things exist.  But obviously I digress.

This may all get to something Michael Lind was writing about on the question of which American Dream we think we should be pursuing.

Do we go for an American Dream in which you have the opportunity to rise as you make opportunities for yourself on the social ladder?  Or do we go for an American Dream in which material outcomes tend to get better (by however small the observable increments) for everyone regardless of educational or social opportunity?  We could put this, perhaps, in educational terms--do we believe the future of the United States should favor better and increasingly better outcomes for people who go to the trouble of getting advanced degrees and move on from there, or do we think we should pursue a society in which there are decent outcomes for those people while, say, focusing more on making better resources available to the "unskilled labor market"?  Lind doesn't pretend he doesn't prefer the latter path.

But there is a flaw in the standard-of-living-for-everyone American Dream that Lind didn't really get to, which is the potential ecological fallout of such a dream.  There may not be an American Dream that isn't some kind of disaster, which hardly seems a surprise to me at this point since I've reached the conclusion that the United States is most likely the latest iteration of the power called Babylon the Great in the book of Revelation.  Of course, people who don't hold any kind of religious views might have other convictions about that ... .

Something else Lind riffed on comes to mind: how utopian and dystopian visions in Western science fiction keep presuming a unified world.


Wars, hot or cold, are also missing from standard science fiction versions of the future. Interplanetary wars don’t count, and neither do wars with robots or zombies. I mean wars among nation-states or global alliances or regional blocs. George Orwell’s 1984, inspired in part by James Burnham’s The Managerial Revolution, imagined a world divided among three totalitarian blocs: Oceania, Eurasia, and Eastasia. I can’t think of any other well-known examples of geopolitics in science fiction.

Typically, as noted above, science fiction authors posit a united world under benign or tyrannical world government. How our present divided world came to be united in the future is seldom explained. Science fiction authors are notorious for getting out of plot holes by inventing new technologies like “handwavium.” The political equivalent of handwavium is the World Federation of Handwavia. [emphasis added, and think about Star Trek for a bit. ;) ]

Global political unification is becoming less, not more, likely. In 1900, outside of the U.S. and the independent former colonies of Latin America, most of the human race was ruled by the British, French, and other European empires. If Imperial Germany had conquered Europe and subordinated the European overseas empires, it would have had a shot at world domination.

It was already too late for Hitler to conquer the world by the 1930s. Industrialization had added the Soviet Union and Japan to the ranks of great powers outside of Western Europe, in addition to the U.S., which by then potentially was by far the most powerful country. The Soviet Union was a major threat but it never had a chance at global domination. By the 1970s, its leaders sought the status of an equal superpower, a status which their economy and military-industrial base could not sustain.


Great-power rivalry, demographic collapse, mass migration — three of the major forces reshaping the world — have been all but completely absent, both from classic science fiction and newer novels and movies that have shaped public consciousness. Most science fiction is not trend analysis, but a moral or political allegory, as the late Thomas M. Disch pointed out in The Dreams Our Stuff is Made Of. And, as he also pointed out, much of it is children’s literature. Unfortunately, literary and cinematic visions of the future influence the way the public and the policymaking elite think about the future.

This is particularly a problem for the left. Since the 19th century, the Marxist left has expected that at some point in the future an ill-defined revolution would create a global utopia — the World Socialist Republic of Handwavia, as it were. When this vision of the future collided with reality, the Marxist left split into reformist social democrats, who were hardly distinguishable from left-liberals, and communists, who, in the countries in which they came to power, soon abandoned socialist ideology for nationalist Realpolitik and the perks of a new ruling class.

Meanwhile, from the early 20th century to the early 21st, many centrist liberals have put their hopes in international institutions — the League of Nations, the United Nations, or, more recently, projects of trans-national regionalism like the European Union. Great power rivalries marginalized both the League and the UN, and populists in European nations like the British citizens who voted for Brexit now seek to dismantle or limit the powers of the EU.

Today’s national populists are told that they are on the wrong side of history, by elites whose members claim to speak on behalf of an emerging world community. But maybe the populists and nationalists are on the right side of history and the elites have been duped by bad science fiction.

Floating an idea here for the weekend: traditional defenses of capitalism may have worked best with the supposition that it worked at a regional or local level.  The influx of trading possibilities, rather than legal prohibition, was what ultimately ended the slave trade and caste systems among the Pacific Northwest Indian tribes around these parts.  These tribes could be a useful exception to the idea that hunter-gatherer societies are, by their very nature, fundamentally inimical to caste systems or slavery.  That's not true, obviously, but academics can be tempted to believe that it "could" be true, perhaps.

But the idea here is that the more genuinely global capitalism becomes, the more cogent the critique of capitalism as a global system could become.  What if the Anglo-American view of capitalism could "work" because, as vast as the Anglo-American empires have been, they still had fundamental social, legal, geographic and economic limits?  Every empire crumbles at some point, and every bid for a truly global system will probably end up with a totalitarianism of some kind--it may just be that, for people committed to certain modes of economic life as the ideal for human flourishing, the totalitarian element inherent in how humans historically implement that ideal can be too easy to ignore.

Oh, yes, and here's this piece about the new Spider-Man: Homecoming film that came out recently by ... somebody.

So it's not just WtH and Film Crit Hulk who have some reservations about the odd split between the explicit moralism of responsibility and the implicit moralism of the exemptions for the special.  Richard Brody's got that concern, too.

Now, sure, Brody had that review of Love & Friendship in which he made the absurd claim that Susan Vernon didn't break any of the "important" rules of ethics and social life.  It would be ... impossible to read Austen's take on that character as anything but someone who broke the most important rules while keeping up her reputation by dint of fastidious adherence to the niceties of social life.  Austen was very good at depicting characters who are at base pretty bad people but who are clever enough at giving the right performance in public to come off seeming above reproach.  It may be that 21st century adaptations of Austen's work put so much stock in the rightness of horndogs like Wickham or Lydia getting their relationships to work because "that" is what "we" believe in these days; we pay lip service to a Darcy or a Lizzy Bennet because we know, thanks to a century or so of academic inculcation, that we're "supposed to" admire them, even if in reality we are more like a Lydia or a Wickham, or perhaps a Charlotte Lucas or a Mr. Collins.

and we may be blind to our own colonial tendencies while on the lookout for colonialism in others, to get back to deBoer.  I've managed to grab Kyle Gann's book on the Charles Ives Concord Sonata, and it was interesting to read him vent his frustration at his blog about identity studies.  His lament was that even though in theory he's in favor of identity studies regarding gender or sexuality, he found that the peer review process at an earlier publisher fixated on those issues at the expense of his stated goal of discussing the musical work itself.  He also vented that musicologists were doing identity studies at the expense of looking at music manuscripts and published scores, to such an extent that it seemed the work previously done by musicologists would have to be done by actual composers.  Point noted!  And this ties into deBoer's proposal that cultural studies in American academia can paradoxically be concerned with the colonial imagination while ... demonstrating the same basic kind of colonialism itself.

This results in some awkward tensions between pedagogical responsibility and political theory. Patricia Bizzell exemplified the perspective that the purpose of teaching is to inspire students to resist hegemony, rather than to learn, say, how to write a paper – and that professors have a vested interest in making sure they stay on that path:

…our dilemma is that we want to empower students to succeed in the dominant culture so that they can transform it from within; but we fear that if they do succeed, their thinking will be changed in such a way that they will no longer want to transform it.

This strange, self-contradictory attitude towards students – valorizing them as agents of political change who should rise up and resist authority while simultaneously condescending to them and assuming that it is the business of professors to dictate their political project – remains a common facet of the contemporary humanities.

Perhaps this might be likened to ... to be deliberately provocative about this ... the cheater who is worried that other people get away with cheating; the liar who is incensed that other people get away with lying; the plagiarist who is alert to how he or she has been plagiarized and so on.  The colonialist, regardless of formally professed ideology or practice, may be painfully alert to the colonialism of others. 

At least the old Religious Left and the Religious Right made the war to claim the culture explicit enough for people to disagree with it openly.  If a comparable enterprise has been taken up in academia, the dog whistles could be so profuse within academia that it might take generations for this stuff to be recognized for what it is ... which ... depending on how you read this stuff and its real vs. professed aims ... might be the point?

Years ago when I wrote my ... well ... basically bad review of Andrew Durkin's book Decomposition, one of my simple disagreements with his premise was that he seemed set on rejecting authenticity and authorship as concepts because they were emblematic of capitalism and commercialism.  Neither concept is distinctly capitalist; if anything, the literary and historical case that fixations on authenticity and authorship are endemic to the Romantic era project seems pretty hard to dispute.  Durkin also seemed unwilling to concede the upshot of arguments against "authenticity" and "authorship": that there cannot be an inauthentic style of music, and that corporately produced art cannot be considered illegitimate just because it is a corporately funded, algorithm-driven product that may be multi-authored and singularly branded. 

But proposing that Western musical notation is vague and inaccurate is worthless as an argument in itself.  If Ben Johnston could take Western musical notation and finesse it into a system that allows one of his string quartets, which has more than a thousand discrete, identifiable pitches, to get performed, then Johnston may have a point in saying the problem isn't the notation system and its limits so much as that we've constrained ourselves in Western music to the dubious assumption that the equal-tempered tuning of keyboard instruments is the way to think in music.  Durkin, in a back and forth in comments at this blog, asserted that words only mean things in the contexts in which we use them.  Sure, but my counterargument was that the language in which we learn confers on us the cognitive and procedural constraints we keep working with for the rest of our lives.  Insisting that words only mean what we use them to mean in context is merely part of the whole, and a miserable distortion of the whole if that's all that gets emphasis; without a proper check, it amounts to asserting that the user of language gets to magically define what language is supposed to mean.  While I appreciated getting an introduction to Nancarrow's music through Durkin's book, on the whole Durkin's arguments were non-starters, and his attempt to re-mystify music notation as a way to somehow de-commodify music was a failure. 
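To put a rough number on what Johnston was pushing against (my own illustrative sketch, not anything from Johnston or Durkin), interval sizes can be compared in cents, where an octave is 1200 cents: the just major third of 5/4 misses the equal-tempered major third by roughly 14 cents.

```python
import math

def cents(ratio: float) -> float:
    """Size of a frequency ratio in cents (1200 cents = one octave)."""
    return 1200 * math.log2(ratio)

just_third = cents(5 / 4)     # the just major third
tempered_third = 400.0        # four equal-tempered semitones, by definition
print(round(just_third, 1), round(tempered_third - just_third, 1))
# → 386.3 13.7
```

Deviations on exactly this scale are what Johnston's extended accidentals are there to notate, which is why the system can specify so many discrete pitches without abandoning conventional staff notation.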

We're never going to get to a musicology in which Stevie Wonder and Haydn, or Monk and J. S. Bach, can be considered part of a gorgeous and holistic musical canon if that's the kind of rhetorical angle we take.  As tedious or meaningless as it might seem, comparing the technical ways in which Scriabin and Stevie Wonder could deploy octatonic scales or chains of chromatic mediant relationships seems a likelier path to affirming Wonder's axiom that music is a world within itself, with a language we can all understand, than going on and on about the imprecisions of musical notation systems with a PhD earned in English. 
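For what it's worth, that shared vocabulary is concrete enough to compute. A toy sketch (mine alone, not a claim about either composer's actual practice) of the octatonic collection as alternating half and whole steps over pitch classes 0–11:

```python
# Build an octatonic (diminished) scale: alternating half and whole steps.
# The starting pitch class and step order are arbitrary choices for illustration.
def octatonic(start: int, first_step: int = 1) -> list[int]:
    steps = [first_step, 3 - first_step] * 4   # e.g. 1,2,1,2,1,2,1,2 semitones
    notes, pc = [start % 12], start % 12
    for s in steps[:-1]:                       # seven steps yield eight notes
        pc = (pc + s) % 12
        notes.append(pc)
    return notes

print(octatonic(0))  # → [0, 1, 3, 4, 6, 7, 9, 10]
```

Because the collection repeats every three semitones, there are only three distinct octatonic scales, which is part of why root movement by thirds — the chromatic mediant relationships mentioned above — sits so comfortably inside it.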

What we could ideally learn by engaging in academic and theoretical discourse across a panoply of positions are the ways in which we put ourselves and each other into unexpected double binds.  We seem sufficiently balkanized that it is with some disappointment I consider the possibility that just being willing to read across the political and ideological spectrum out of curiosity is frowned upon across that same spectrum. 

Well, that's probably enough writing for the time being ... at least at the blog. 

At least for now.
