Saturday, August 08, 2015

some links for the weekend

Over at Salon (no surprise), a theory that evangelical obsessions with sexual purity aren't about sexual purity or women so much as about white evangelicals fretting about the loss of their hegemonic death grip on culture.
Sexual purity movements, past and present, are not ultimately about promoting a biblical view of sexuality. They are about explaining large-scale culture crises (e.g. Anglo-Saxon decline, the Cold War, changing gender roles and sexual mores) and providing a formula for overcoming those crises.
Today’s movement is laden with a therapeutic rhetoric that presents these choices as the best choices for those who seek to conform their behaviors to God’s will. It promises that those who conform will enjoy spiritual, physical, and emotional satisfaction in their marriage relationships. Other scholars have parsed these claims in more sophisticated ways than I do and many other writers have demonstrated that these expectations are anything but a path to personal well being. What I’m saying is that sexual purity has never been about personal well-being for evangelical adolescents— or anyone.

Each historical example I analyze demonstrates that purity work and rhetoric has emerged at moments when socially conservative evangelicals seek to assert and maintain their political power.

For some reason Coates writing at The Atlantic on organic black conservatism comes to mind ...

Rolling Stone getting a new managing editor.

David Byrne (remember him?) on "Open the Music Industry's Black Box".

This weekend is the opening for yet another Fantastic Four film that is probably just another reason to watch Brad Bird's The Incredibles again.

What was going to be a review of Andrew Durkin's Decomposition has mutated into more of an incubating series of posts inspired by the book, reacting to its shortfalls.  Since I'm taking the book as a conversation-starter rather than either an actual philosophy of music or a manifesto, its failure to be either of those things does not need to be held against it.

And there's some more stuff incubating about Legend of Entitlement (more officially known as Legend of Korra). There's this thing in genre fiction known as retroactive continuity, and seeing as we've hit the tenth anniversary of Avatar: The Last Airbender this year, it seems worth blogging about how the success of the earlier series highlights the travesties of the successor. Way to go transforming Asami Sato from the Q for James Bond (Korra) into the Bond girl, folks. Take away the muscular woman-of-color character design and focus on the scripting and character arc, and you find you get a "Korra" who acts like Tom Cruise in Top Gun: the reward for saving the day is that the hero gets to go on vacation and have sex with the hottest woman on the planet.  Why were people claiming the Korrasami pairing was progressive?  Korra has come across as Tom Cruise's Maverick with ovaries (and, in case you hadn't spotted this earlier, I don't like the character Tom Cruise played).  Korra's more likeable than Maverick in a whole lot of ways, but she embodies the Maverick trope.

If we were to describe Korra in Jedi terms she's easily seduced by quick and obvious results, i.e. the Dark Side of the Force ... but we'll have to get to that some time later.

Alastair Roberts on sacralizing personal narratives; WtH has a suggestion for why a Rachel Held Evans was never going to match a Mark Driscoll at bluntly manipulative personal narrative as argument

The fact that scriptural narrative, in contrast to much preaching upon it, is not typically focused upon the subjective states, inner lives, and autonomous identities of its protagonists is seldom properly recognized. While Scripture speaks of many particular persons, it does not share the type of emphasis that our culture places upon individuality and personal narratives. Where we have elevated ‘personality’, often to the neglect of ‘character’, Scripture presents us with limited clues to the ‘personalities’ of its characters and seems to have little interest in the matter. In God’s eternal wisdom, he did not choose to reveal Jesus’ MBTI personality type.

In Scripture, individuals find much of their significance within the larger stories to which they contribute and in terms of the typological roles that they perform. Biblical characters are pretty ‘flat’, rather than possessing the ‘rich internal life’ that the self-reflection encouraged by such things as widespread diary-writing and the modern novel has accustomed us to. First person autobiographical narratives are not the norm. Rather, biblical narrative situates people within a story that is not their own and speaks of them from a third person perspective that clearly relativizes their self-accounts.


Personal stories can have the most profoundly distorting effect upon our moral judgment. By playing up the ‘luxurious’ details of personality and the ‘depth’ of individual character, we can blind ourselves to the true ethical nature of actions. Žižek’s phraseology is important—‘the story we tell ourselves about ourselves in order to account for what we are doing’—and captures a number of important matters. First, ‘our story’ is not some eternal truth, but an account told by interested and unreliable narrators—ourselves—and should be handled very carefully as a result. Second, not only are we the narrators of our own stories but we are also the primary hearers—it is a story we ‘tell ourselves about ourselves.’ We are the ones most easily and typically deceived (usually willingly) by our own unreliable narration. Third, it is a story told ‘in order to account for what we are doing.’ As such it is a story typically designed to help us live with ourselves and our actions. It is usually a rationalization, an attempt to make sense of our actions retrospectively, in a manner that acts as a defence against the harshness of the ethical or rational judgment that they might otherwise provoke.

In short, we need to be a lot more critical of our own stories and a lot more cautious when it comes to those of others. We have been practicing our wilfully distorting and self-exculpating narrations on ourselves for our entire lives and are past masters at it. [emphasis added]


Recognizing the way that the personal narrative can function, we should appreciate the pernicious way in which it can often be used as a trump card, to close down debate. The personal story, especially if it is a painful one, is immune to challenge and is thus a convenient way to advance positions in a manner that prevents others from calling them into question, for to do so would be cruel and insensitive (I have addressed some of the dynamics of this ...)

While this piece was not necessarily an explicit sequel to "The Ad Man's Gospel", nor was it addressing the situation in Seattle, Roberts' observation opens up the reason why someone like Rachel Held Evans was incapable, at the most basic level, of meaningfully addressing a Mark Driscoll.

By way of a comment:
The Man Who Was . . . says:

I have noted before how a lot of progressive Christians rely heavily on the form of the memoir: Nadia Bolz-Weber, Rachel Held Evans, and Sarah Bessey are all primarily memoirists, and pretty good at it. But their writing tends to fall off a cliff when it comes to explicit theological argument.

In other words, Rachel Held Evans and Mark Driscoll can both be said to leverage personal narrative as something sacred, inviolable.  You aren't allowed to question the narrative, because it makes the personal political, even if the rhetorical flourish would have it the other way around. People who have attempted to leverage personal narrative as a counter to anything Mark Driscoll has said or done have failed to account for the fact that if Mark Driscoll has mastered anything at all, it's leveraging personal narrative as the way to frame any and all potential discussion of what he says and does.  An Evans will never present a personal narrative that can provide a plausible counter-argument to a Driscoll.  Driscoll has clearly been better at leveraging the personal narrative to a base than Evans, even if Evans' star power at leveraging the personal narrative may be on the rise.

Driscoll's reputation did not crumble because of questions about personal narrative, although those came up, too. To the extent that the abrupt shift from the pre-Real Marriage narrative to the post-Real Marriage narrative suggested the pre-2012 public narrative was some kind of spin, it depended on documenting what the narrative was and to what end it was used.  Because Driscoll uses personal narrative as a framing device for propositional declarations about applied ethics, the shift in the narrative frame all but destroyed the credibility of the practical application.  If Mark Driscoll's defenders had, up until 2012 or 2013, conceded that Mark Driscoll was what's colloquially known as an asshole, the defense was that he was an authentic asshole, that he was not putting on a show.

But the narrative shift from pre-2012 to post-2012 invited questions as to whether this was the case.  Ironically, Driscoll's predilection for framing his polemical and practical proclamations on ethics depended so much on the personal narrative frame that the 2012 shift did two things.  The first and most crucial thing the narrative shift did was raise doubts about the veracity and legitimacy of the emotionally charged personal narrative as a public propaganda campaign. If it turned out the Driscoll marriage was kinda miserable for a decade, that made the whole thing seem like a sham.  Not that it was a sham, really, because plenty of Christians can have mediocre or even awkward and unpleasant marriages and get them to work.  Contrary to the ideals of the American popular imagination, people don't have to be happily married to be ethically married.  But Mark Driscoll had spent so much of his public ministry conflating being happily married with being ethically married in his public personal narrative that the fracture introduced to that narrative by Real Marriage was substantial. For longtime attenders or members of Mars Hill it made the whole personal narrative begin to smell like a long con.

But the second was equally important, and it was the plagiarism controversy that erupted with Janet Mefferd's 2013 interview that brought this second element to light: if the personal narrative Driscoll was literally selling raised questions about the continuity of that narrative, what about the content for which the personal narrative was invoked?  The personal story was the "how" of the sales pitch, and previously a reliable one, but what about the "what" of what he was selling, the program of ideas and ideals he was saying we should live by?

The plagiarism controversy opened up the question of what Mark Driscoll was selling.  To the extent that the ideas seemed good, it began to seem suspiciously like those good ideas were gained second-hand. To the extent that ideas formulated by Mark Driscoll were bad, those seemed like the original ideas.  Paradoxically and tragically for the sake of what Mark Driscoll was selling, he never needed a personal narrative to sell marriage advice.  You can go through sermon after sermon by a John Donne or a Martyn Lloyd-Jones and never find personal narratives.  In Mark Driscoll's sell, the personal was part of an image, an authentication process; the authenticity of the bad-boy image needed to be retained, it seemed.  When Driscoll temporarily tried to shift to the father-figure-Doug-Wilson-knock-off image it came across as fake because, for one, it had never been who Mark Driscoll is, and for two, the image management was too abrupt and too obvious.  The facial hair and suits, really?  It would have made more sense if Driscoll had abruptly and suddenly announced he likes the show Archer. No, don't look it up, but if Hollywood wanted to do a movie about the life and times of Mark Driscoll, H. Jon Benjamin would be someone to cast as Mark Driscoll.

Anyway ...

It seems Mark Driscoll's going the charismatic/word-faith/prosperity route.  He's courting the influence and aid of the kinds of people he was denouncing as heretical wingnuts seven years ago. That gets into the other risk of counting on the personal narrative: it means you can and will be measured by the continuity of that narrative.  I.e., watch your life and doctrine closely, because in the case of a Mark Driscoll the galactic contrast between the kind of submitted and loyal church membership he's enjoined "you" to live by for nearly two decades and the kind of "God says I can quit" narrative he's only trotted out in 2015, when talking to more charismatic-than-Reformed conference scenes, is too big to ignore.

Mark Driscoll can't escape the legacy he's been building for himself over the last 18 years in the public sphere.  Like it or not, his daughter Ashley Driscoll lives in a world where with a few clicks of a mouse she could end up reading Mark Driscoll's old "Using Your Penis" ramble.  There can be questions about the 2015 stories Mark Driscoll's been sharing on the road at Thrive or with Brian Houston.  If Mark Driscoll heard a voice saying a trap had been set on a Monday night and wrote a resignation letter Tuesday (his account to Houston), then why (per his Thrive conference account) did the Driscoll kids not learn the news until Wednesday via social media?

The thing about legacies is that if we even have a legacy it is not necessarily ever under our control.  A good name is worth more than riches, though. It is better to be a poor person who walks in integrity than to be wealthy and crooked.

Mark Driscoll can keep leveraging personal narrative as he moves forward, but he must do so in a setting where he has warned, for the record, to be careful of men who recycle their old sermons, even as he's begun to recycle his decades-old approach to Ecclesiastes as if he were doing something in any way new. Driscoll has to live with the reality that once you've published something in mass media you have to consider that it may never go away.  His kids will, probably mostly for worse, live in a world that can remember the bit about "penis homes".

And for those like Rachel Held Evans who thought they could "stand up" to Mark Driscoll: you can't stand up to Mark Driscoll by weaponizing personal narrative in lieu of a consistently worked-out theological position if you want to debate matters of Christian teaching and applied ethics.  The problem with sacralizing the personal narrative is that anybody can do it, whether a Rachel Held Evans or a Mark Driscoll.  One of the reasons Christians have a canonical text is that it lets us debate the meaning of the stories we agree are literally sacred scripture, while people ranging from Driscoll to Evans want us to treat their personal narratives like sacred scripture.  Thankfully we're under no obligation to comply for either of them.

Kyle Gann on the status-mongering dynamic of academic musicology, "the musicology ladder", and a question from Wenatchee The Hatchet
Still, while I haven’t spent much time consorting with musicologists, I have spent enough to learn what a strict composer-based hierarchy the world of musicology is. I was once on a panel with some big names, and highly complimented a famous scholar on his book on Muzio Clementi, which had been a great help to me. He seemed almost irritated that I had brought it up, as though it were some secret from his past that he didn’t want mentioned in front of his colleagues. He had now written a book on Beethoven, which meant he had climbed a couple dozen steps up the musicology ladder. And I have learned in that world that to have written the first book on Nancarrow was a miniscule accomplishment, almost negligible, compared to writing the 67th book on Bach, Beethoven, or Brahms. In the world of music historians, your stature is exponentially proportional, not to the quality of your research and writing, but to the prestige of the composer you can claim to be an expert on.

So this gets me wondering: has anyone, ever, in the history of musicology, published a dedicated monograph on the sonatas and sonata forms composed and published by the early 19th century guitar masters?  Because it's Wenatchee The Hatchet's belief that the path from the themes and styles of the early 19th century guitar composers to early 20th century ragtime is a very simple set of evolutionary steps. As often as the canard has come up that European art music and American popular music are somehow separated by a vast chasm in the imaginations of some musicologists, theorists and philosophers, for any open-minded and practical musician this is nonsense.

Wenatchee The Hatchet would be interested to find a book discussing the sonatas of the early guitar masters.  Barring the existence of such a book, it'd be fun to write one, if it didn't seem that musicology as a whole, and even the guitar world in particular, has had no interest in such a project for decades.

But then that's what blogs can be for.

Friday, August 07, 2015

D. G. Hart "... American believers vacillate between thinking they live in the greatest nation on God's green earth or have been sent to the Soviet Gulag."

HT Jim West, whose post opens with a different pull-quote than the one Wenatchee has selected from the piece, which is:
Perhaps because of an overly realized eschatology — the kingdom of God is breaking out here and now — American believers vacillate between thinking they live in the greatest nation on God’s green earth or have been sent to the Soviet Gulag.

A piece over at Charisma mentions Driscoll: "I don't have enough information to truly speak to the issues on either side ... " Like that's ever stopped Charisma from publishing stuff anyway.

Technically the article is titled "5 Lessons We Can Learn from the Mark Driscoll controversy" but as Driscoll seems set on leaning more charismatic than Reformed and relaunching a career, the following excerpt seems most germane to Charisma coverage in general these days:
... I don't have enough information to truly speak to the issues on either side, but I want to remind all of us that Christians are fallible and make mistakes. We should consider the total portrait of one's life, character and ministry and evaluate on that basis. ...

By all means don't let your ignorance keep you from publishing stuff.

Tuesday, August 04, 2015

John Piper's paradox (1999-2015): natural disasters are God's gentle warnings, Mark Driscoll resigning is a victory for Satan.
August 18, 1999
No earthquakes in the Bible are attributed to Satan. Many are attributed to God ...
September 2, 2005
...God sent Jesus Christ into the world to save sinners. He did not suffer massive shame and pain because Americans are pretty good people. The magnitude of Christ’s suffering is owing to how deeply we deserve Katrina—all of us.

Our guilt in the face of Katrina is not that we can’t see the intelligence in God’s design, but that we can’t see arrogance in our own heart. God will always be guilty of high crimes for those who think they’ve never committed any.

August 1, 2007
...When I sat on her bed and tucked her in and blessed her and sang over her a few minutes ago, I said, “You know, Talitha, that was a good prayer, because when people ‘blame’ God for something, they are angry with him, and they are saying that he has done something wrong. That’s what “blame” means: accuse somebody of wrongdoing. But you and I know that God did not do anything wrong. God always does what is wise. And you and I know that God could have held up that bridge with one hand.” Talitha said, “With his pinky.” “Yes,” I said, “with his pinky. Which means that God had a purpose for not holding up that bridge, knowing all that would happen, and he is infinitely wise in all that he wills.”

 Talitha said, “Maybe he let it fall because he wanted all the people of Minneapolis to fear him.” “Yes, Talitha,” I said, “I am sure that is one of the reasons God let the bridge fall.”
August 19, 2009
...The tornado in Minneapolis was a gentle but firm warning to the ELCA and all of us: Turn from the approval of sin. Turn from the promotion of behaviors that lead to destruction.
March 17, 2011
Earthquakes are ultimately from God. Nature does not have a will of its own. And God owes Satan no freedom. What havoc demons wreak, they wreak with God’s permission.


So should a vast swath of the west coast of the United States crumble along the Cascadia subduction zone when the "big one" hits, that'd be a "gentle reminder", based on John Piper's public theology of catastrophe, that we don't deserve life.  But one guy quits being a pastor at the church he co-founded, after years of controversy surrounding plagiarism and rigging the NYT best-seller list and being an abusive leader who demonizes dissent, and all of a sudden that turns into a triumph for Satan? Didn't Piper tell the world that what havoc demons wreak, they wreak with God's permission? Where's the John Piper who habitually used catastrophes that caused the deaths of tens of thousands over the last fifteen years as a springboard for reminders that none of us deserve life and we should repent?

Monday, August 03, 2015

John Piper on the debacle in Seattle ... but if it was a defeat for Reformed theology why did it seem so many of Driscoll's most persistent public critics have actually been Reformed?
"The debacle in Seattle is a tragedy, from untold angles," said Piper, who is best known for authoring the book 'Desiring God' and his conservative Calvinist theological stance. "It was a defeat for the gospel, it was a defeat for Mark, it was a defeat for evangelicalism, for Reformed Theology, for complementarianism. It was a colossal Satanic victory."

A little tough to say the Driscoll situation was a defeat for Reformed theology.  Evangelicalism?  Well, maybe.  Defeat for Mark?  He's said this year God gave him permission to quit, basically.  Sure, Driscoll never mentioned a word of that last year when he actually resigned, but to go by what he's been saying at charismatic conferences this year, it was all part of the plan.

The thing about the claim of a satanic victory is that spirits of calamity bringing torment or devastation on the reigns of corrupt and self-serving rulers are a trope in Old Testament literature.  Whether for Abimelech or King Saul or King Ahab, in the handful of cases where a spirit of calamity or "an evil spirit" hits a leader among God's people, the affliction can be construed as a divine commission, a judgment against those who arrogated to themselves a type of rule that subjugated rather than served the common good of God's people.

But let's focus a little on the "defeat for Reformed theology" part.  Now maybe Piper doesn't count folks who aren't Calvinist Baptist, but let's consider who some of the more familiar public voices providing a critique of Mark Driscoll's doctrine, biblical interpretation and conduct have been over the years.

Let's start with Janet Mefferd: does she seem not-Reformed to John Piper?
How about R. Scott Clark?
How about D. G. "I don’t know why people are not debating whether Driscoll should even be writing books" Hart?
Or, oh, Carl Trueman?
At the risk of mentioning some others, how about Wendy Alsup?
And finally, Wenatchee The Hatchet.

Notice any patterns there?  Like, say, how many of them have self-identified as Reformed, particularly those with associations with the OPC or PCA?  For those not already familiar with this, Wenatchee The Hatchet is Presbyterian, as is Wendy Alsup.  Trueman is OPC, and Mefferd, too (I think).  Clark is Reformed.

What Piper seems to have utterly missed is that Mark Driscoll's decline is not really a defeat for Reformed theology, because what a number of us in the Reformed camp have been trying to point out for years now is that, if you look at what Driscoll's actual doctrinal approach and practice has been, he's only been "Reformed" in the sense that religion reporters who were too theologically ignorant or lazy to dig deeper took Mark Driscoll at his word (a matter that seems increasingly dubious, to put it politely) when he kept describing himself as Reformed.

Yet there's not a whole ton of evidence that Driscoll's sacramentology or ecclesiology is very Reformed.  He doesn't even register as charismatic or Pentecostal in a mid-20th century sense.  If you pegged him as maybe closer to the Latter Rain or an associated camp then, sure, okay, maybe.

But let the record show that if Mark Driscoll's most vocal and persistent critics in the public sphere predominantly hail from the Reformed camp, the defeat Reformed theology suffered was that Mark Driscoll was ever taken seriously as allegedly a member of "our team" to begin with.

That it took a concerted and conscientious effort on the part of those within the Reformed camp to publicly critique Driscoll before things began to change is telling; after a decade of secular/progressive criticism of Driscoll, not a whole lot happened.  Driscoll was playing to the people who thought he was on their team.  Since secular and progressive coverage tended not to dig deep enough to go beyond a "yuck" reaction, it was apparently left to the Reformed and to evangelicals to provide an insider critique.

That Mark Driscoll's advocates might be tempted to claim "all" critics of Driscoll were liberals or progressives is their own failure of imagination.  It's not like John MacArthur could be accused of being hugely liberal by anybody.  For the majority of a decade Driscoll could and did ignore criticism from anybody he considered to the theological or political right of wherever he decided he was, because they didn't matter and he probably had better numbers than they had anyway.

The foundational failure of secular critique to do anything other than bolster Driscoll's image with his fans could be instructive if secularists and progressives were open to that.  Unfortunately they may not be, and so it's easier to fixate on the misogyny stuff than to consider what the sociological appeal of the dynamics of Mars Hill may have been, so as to understand elements of that appeal and prevent such a dynamic from taking shape again.  Think of all the young guys who have been Driscoll's target demographic as at-risk youth, at risk of getting a sales pitch for legacy.

There's still time; Driscoll could prove to be a great uniter of groups that otherwise might not have a chance to learn from each other.  Maybe secular progressives and Reformed sticks-in-the-mud have an opportunity to do something we could both enjoy: a scholarly survey of the sociological dynamics of how this stuff played out. A whole lotta single guys joined Mars Hill in the hopes they'd land a wife.  Some of them joined because they were idealistic enough to think it could be a positive contribution to the region.  Others were interested in the idea of an evangelical quasi-artists collective.  There's a lot that could be studied by those who have not concluded, as guys like John Piper seem to have, that everything has been discovered.

If we want history to not repeat itself so quickly next time around, we need to not presume that we have nailed down what it was we saw.  It isn't ignorance that is the enemy of wisdom, it's assuming you already know.

If Driscoll in any way signals a defeat for Reformed theology it's that we ever took seriously the assertion that he was meaningfully on our team to begin with. 

Ironically, if there was someone who was given an opportunity to address the situation with Mark Driscoll and the leadership culture of Mars Hill years ago, it was John Piper.
After multiple appeals were continually rejected by Mark and Jamie, we discreetly implored some local and then national leaders, who Mark said he respected, to help us, including John Piper and C.J. Mahaney. No one was willing to get involved. I was shocked and heartbroken again. You’re kidding? The whole Body of Christ and no one is willing to step in, judge the matter, and attempt to make things right? How can Matthew 18 be carried out if not one Christian leader will stand in to bring peace and reconciliation?

Piper can say he wished he had had more of an influence, but one can ask whether he ever had any influence at all; what if what he had was not influence but a halo effect, a way to lend his credibility to Driscoll in exchange for seeming more relevant to younger guys? If the Driscoll situation does signal a defeat, it might signal a defeat for John Piper in having failed (for whatever reasons) to address the situation years ago when he was invited to intervene.

trope alerts in music: just going through the scale like that's an idea (not that it can't work)

You may have read axioms about how music tames the savage beast or how music is some great mystery of expression and soul.

Yeah, well ... kinda.

See, music is also stuff you can build and sometimes a musical work draws such blunt-force attention to the nature of its construction you can't really miss it.

This post was originally going to get called "the downward dog in music" but then at the last moment a counterexample came to mind.  So first we'll just get a couple of things out of the way.

Just outlining the notes of whatever scale you're writing your music in, in a straight line, is something that can only work (if that) when you rely on a steady pulse.  Nobody's getting any sense that you're going anywhere exciting just going down the scale unless there is at least a pittance of rhythmic activity somewhere in that musical presentation of a scale.

Quite possibly the lamest way to announce the scale of things to come is starting at the tonic.

Witness Lita Ford ... in "Kiss Me Deadly".

Yup ... straight down the scale, introduced with an upbeat note.


Lest you snort that this is a weakness so typical of rock and roll, Pachelbel beat Lita by centuries.

Follow that first fiddle

Pachelbel at least had the sense to recognize that you can't just go straight down the scale from the tonic if you've got no initial rhythmic spark.  You at least start at the third or fifth of your tonic chord, which is a strategy also endorsed by ....

Weezer, in "Hash Pipe".

Start on the fifth of the chord, but jump up to the third and then downward dog.
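For the pedants in the room, the two approaches can be sketched in a few lines of code. This is a toy illustration, not anything transcribed from the songs themselves; the function name and the MIDI-note framing are my own assumptions.

```python
# A toy sketch of the "going down the scale" trope, in MIDI note
# numbers (60 = middle C). Purely illustrative; not lifted from
# any of the recordings discussed above.

MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets above the tonic

def descending_scale(tonic_midi, start_degree=1, length=8):
    """Walk down the major scale from a given (1-based) scale degree.

    start_degree=1 begins on the tonic (the Lita Ford move);
    start_degree=3 or 5 begins on the third or fifth of the tonic
    chord (the Pachelbel / Weezer move).
    """
    notes = []
    idx = start_degree - 1
    note = tonic_midi + MAJOR_SCALE[idx]
    for _ in range(length):
        notes.append(note)
        idx -= 1
        if idx < 0:          # wrapped below the tonic: drop an octave
            idx = 6
            tonic_midi -= 12
        note = tonic_midi + MAJOR_SCALE[idx]
    return notes

# Straight down from the tonic in C major:
print(descending_scale(60, start_degree=1))   # [60, 59, 57, 55, 53, 52, 50, 48]
# Starting from the third, per Pachelbel and "Hash Pipe":
print(descending_scale(60, start_degree=3))   # [64, 62, 60, 59, 57, 55, 53, 52]
```

Either way it's the same downward dog; the only choice is which chord tone you announce it from.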
Of course, if you want to be drastically fancier in how you roll down the scale, there are other examples.

Try about 1:55 with this one

The subject of the closing fugue for Beethoven's "Hammerklavier".  Sounds pretty complicated and all, but he is, in a way, just taking an unusually fancy path down a scale.

Of course there's no reason you have to always go DOWN a scale.  You can go up a scale, too.
It can also sound totally awesome.  Take it away Stevie Wonder ...

Thus endeth the musical trope alert post for today.

Atlantic, a few years back, "History is Beautiful Things Made by People with Ugly Ideas", a problem the internet age will keep running into

Something in the age of the internet that can make it feel truly unique, but in a terrible way, is what Wenatchee The Hatchet has called imputing comprehensive guilt by tangential association.  Another way of putting it is to forget that, as someone at The Atlantic put it a couple of years ago, "History is Beautiful Things Made by People with Ugly Ideas".


Imagine medical residents refusing to mimic a surgical technique pioneered by a racist doctor, or English majors declining to recite any poems written by sexists. Imagine people in all fields being made to feel as if opposing racism or sexism requires that sort of boycott. What a waste that would be in a world where there is a perfectly good alternative, one that hardly requires airbrushing history or human pathologies. It is to say that some people, Whitman hardly unique among them, had wrongheaded, offensive beliefs on some subjects, but still managed, through the best of what they produced, to render things so wonderful that generations of people have seen fit to pass them on.

That isn't to say that everyone must appreciate Whitman, or any larger than life figure from the past. It is to say that 25-year-olds like McNair, along with 33-year-olds like myself, would be wise to stay open to the possibility that inhabiting the art of someone whose aesthetics or personal moral beliefs we find abhorrent might nevertheless end in our gleaning something valuable from the experience. The opportunity to learn in that way won't survive, for most students, in a world where rejecting bigotry is thought to require rejecting everything produced by every dead bigot. Let's reject that standard, and the attendant fantasy that it's possible to shun the part of our cultural inheritance contributed by people who held ugly ideas. To really confront the horrific scale of bigotry, and American racism in particular, is to know that is impossible. Too much would have to be shunned. The timber of humanity is too crooked. We'd have nothing left.

But it seems to be a sign of our times ... yesterday's measure of beauty can be today's measure of ugliness.
Baby boomers inherited a world that believed deeply in the value of the Western canon, and now inhabit a world that holds that canon responsible for many of our culture’s ills. One belief for childhood, the opposite for adulthood.

Scott Timberg's invective against the old patronage system skips over a not-so-mundane detail about how and why an absent artist could be jailed: the contract was technically a military one, as H. C. Robbins Landon explained back in the 1980s
Talk to techno-utopians and well-meaning libertarians about the crisis in culture today – the increasing demand for musicians, artists, and scribes to work for free or almost nothing – and you’ll hear a cheery solution. Patronage! If everything else that used to help creatives get paid has fallen through, what about the tribe of noble rich people – especially those groovy, socially progressive folks in Silicon Valley who just love music and culture — dialing it up directly?
The composer Joseph Haydn invented the symphony and the string quartet. The bulk of his career was spent with the House of Esterhazy, where he wore livery and dined with the servants. Monteverdi wrote the most important early opera, “L’Orfeo”; when he left royal service in Lombardy after two decades of labor, he had about enough money to buy himself lunch. Velasquez was responsible for painting the portrait of Spain’s King Philip IV as well as overseeing the royal janitors.

Musicians and artists who left their posts could be thrown into prison.

Now, we could see the moguls in Silicon Valley loving to have an artist or jester on a string. Some of their culture love may be sincere. “Musicians, comedians, writers,” a character in Dave Eggers’s Silicon Valley novel “The Circle” says of one of the founder’s passion project, “to bring them here to get exposure, especially given how rough it is out there for them.” This artsy founder brings artists in to the Google-like campus: He just doesn’t pay them.

Timberg's article would make it seem like it was arbitrary and nasty for a duke or prince to have some artist arrested for deserting a post but let's not skip over the thing Timberg so conspicuously skipped over.  There's this decades old lecture given by H. C. Robbins Landon about Haydn's time within the patronage system of his age.  The lecture starts off with a grim but matter-of-fact observation that artists of various types, but particularly composers and musicians, dying in abject poverty was more the norm than the exception:
In recent years we have become ever more fascinated and horrified by the fate of Mozart, whose earthly remains were buried in an anonymous grave outside the city walls of Vienna. That event, the indignity of which gradually became infamous, took place in 1791, not quite a decade before the eighteenth century came to an end. But Mozart’s death - though particularly appalling in view of his special genius - was by no means unique. Thousands of more modest composers and performers were ignored, anonymous overnight. A roll call chosen at random might include:
Antonio Vivaldi, once the astonishment of settecento Europe, died in abject poverty at Vienna in 1741. His grave is unknown, and for many years it was not even known that he had died in Austria.

Carlos d’Ordoñez, a then well-known composer of Austro-Spanish parents, died in penury at Vienna in 1786.

Carl Ditters von Dittersdorf, once the most popular composer of German-language operas, died at Neuhof in Bohemia in 1799, his desk drawer full of symphonies that no one wanted to perform or publish.

Anton Huberty, once a celebrated Parisian music publisher and string player, who had issued Haydn’s first symphonies in Paris in the 1760s, died at Vienna in 1791. His effects were hardly enough to keep his daughters from starvation.
Luigi Boccherini, the Italian composer once famous throughout Europe, died ‘in dire poverty’ at Madrid in 1805, his music no longer fashionable.
The anonymous, already-forgotten deaths of many musicians and composers are the way the world works under normal circumstances.

It's important to bear in mind what Robbins Landon pointed out in this lecture: Esterhazy musicians were engaged as OFFICERS. Technically, a musician at that court served in a military capacity.  So while Timberg may have wanted to highlight how disastrously bad things could get for a musician who left a post or was absent from it, he didn't give a reason WHY this would happen, and omitting that "why" is a big omission. Think about it for even one second and it becomes abundantly clear: if you're on contract in a military role and you leave your post without prior permission, you can and will be punished on the basis of the contract.  Court martial, arrest, and so on.

The other thing worth noting, for those who haven't already familiarized themselves with the Robbins Landon lecture, is that Haydn wasn't being compensated on the basis of mechanical royalties but for services and labor.  This can help explain why old Lefties like Dwight Macdonald could in all seriousness assert that the old aristocratic patronage system for the arts was more humane and human than the corporate enterprise variations of the 20th century.  You can disagree, of course, but it helps to get a clearer understanding of what the patronage systems entailed before you do.

Timberg's lament rings so hollow because, whether he'll admit it or not, we've had a patronage system in the last century: the corporations that constitute the music publishing and broadcasting industry.

Sunday, August 02, 2015

Hal Foster reviewed Nobrow: pop cultural consumption as identity marker and how "without pop culture to build your identity around, what have you got?"
What are the bearings that Seabrook takes? Even though he is a self-declared ‘hegemonoculous’ (a wonderful-horrible appellation meant as a homage to a traumatic seminar with Raymond Williams at Oxford in the early 1980s), he knows that the old map of oppositions – high and low culture, Modernist and mass art, uptown and downtown – no longer corresponds to the world. So he makes a chart of his own, and devises a lexicon to go along with it: ‘Nobrow’ (where ‘commercial culture is a source of status’, not of disdain); ‘the Buzz’ (‘a shapeless substance into which politics and gossip, art and pornography, virtue and money, the fame of heroes and the celebrity of murderers, all bleed’); ‘Townhouse’ and ‘Megastore’ (‘in the Townhouse there was content and advertising; in the Megastore there was both at once’); ‘Small-Grid’ and ‘Big-Grid’ (‘the America of you and me’ and ‘the America of 200 million’; ‘what lies between is a void’). In the end, as Seabrook sees it, the law of Nobrow is simple: the Arnoldian criterion of ‘the best that is known and thought’ is long gone, and what rules is the Buzz principle of whatever is hot. No more ‘is it good?’ or even ‘is it original?’, only ‘does it work in the demo?’ – ‘demo’ as in ‘demographics’, not to be confused with ‘democracy’, much less ‘demonstration’. Incidentally, for Seabrook Clinton is ‘the perfect steward’ of this ‘numbers and spin construct’ of ‘polls, focus groups and other forms of market research’ – he was, after all, the first President to appear on MTV – though George W. could make Slick Willy look positively unkempt.

Not surprisingly, Seabrook’s findings boil down to hypotheses about identity and class. ‘Once quality is deposed’, he argues, identity is ‘the only shared standard of judgment’. For Seabrook this identity must be ‘authentic’ (somehow authenticity survives as a value), and it can only be made so through a personal sampling of pop goods at the Megastore: ‘without pop culture to build your identity around, what have you got?’ [emphasis added] For an old guard of American highbrows like Dwight Macdonald and Clement Greenberg, this statement would be grotesque: mass culture is the realm of the inauthentic, and there is no more to be said. For Seabrook (and here he has learned from cultural studies since Williams), it is not absurd at all – in large part because he views pop culture not as mass culture but ‘as folk culture: our culture’. Yet this semi-paradoxical turn of phrase doesn’t solve a basic problem: given his account of the Megastore, is the ‘sampling’ of an identity à la hiphop any different from the ‘branding’ of an identity à la George Lucas? British cultural studies gave us the notions of ‘resistance through rituals’ and ‘subversive subcultures’; American cultural studies has given us the Post-Modern subject that is ‘performative’ in its construction. But with the near-instantaneous time to market from margin to Megastore (or from Small to Big Grid), how much resistance or subversion can subcultures offer today? And is the Post-Modern subject so different from the consumerist subject, that ‘perfect hybrid of culture and marketing’, as Seabrook calls it, ‘something to be that was also something to buy’? This approach represents one of several recoupings of critical positions in Nobrow: call it the revenge of the hegemonoculous on the identity-line in cultural studies.

In an eagerness not to stake out any claim that there might be aesthetic criteria in themselves that the arts could endorse, what do we get?  That's another element of the high/mid/low culture question: the role that consumption of cultural artifacts can play in identity.  What a shared aesthetic may lack in being able to unite those who don't share its values, it can make up for by being something that can be agreed upon regardless of social strata.  Thus, for a leftist like John Halle, "Nothing is Too Good for the Working Class" can be a defense of why classical music isn't too good for working-class people.  But if that's the case, what about the high/low divide as an actual class divide?  One of the tropes in high culture is that the learning curve required just to appreciate the stuff can be pretty steep.  That's one of the problems with thinking of the high and low as too fixed, and why there may always be some middle.

Without a set of agreed-upon aesthetic ideals for an arts community, what else would be left but the slipperiest and slimiest of buzzwords, authenticity? Something we'll be playing with a bit in future blog posts.

Richard Taruskin on how the high arts world in America took the gravy train for granted
Especially in New York, then, the period roughly from the founding of Lincoln Center to Black Monday in 1987 was a golden age for art producers. Major organizations used public subsidies to supplement private donations and vastly improve the working conditions of their employees. Tiny groups, including several in which I then participated as a performing musician, proliferated. And I have not even mentioned the corporate foundations, which also mushroomed both in number and in lavishness of largesse. The Ford Foundation, the biggest one pre-Gates, had been founded in 1936 to "strengthen democratic values, reduce poverty and injustice, promote international cooperation and advance human achievement." It took up the cause of the arts, including classical music, in the late 1950s. The Rockefeller Foundation, a much older organization, got on the arts bandwagon in connection with Lincoln Center. In the 1980s, the big name was Citicorp. During this period, as Tindall comments, "most performing arts groups were subsidized by unearned donated income, as well as tax incentives, and therefore did not always have to link revenue to the quantity, quality, or type of product they offered."

As long as this gravy train lasted, the attrition of the audience could be overlooked. The result of living for three decades in a fool's paradise was a vast overpopulation of classical musicians as many more were trained, and briefly employed, than a market economy could bear. The cutbacks that seemed to imply the sudden cruel rejection of classical music were really more in the nature of a market correction, reflecting the present scarcity of patronage and a long-deferred confrontation with the changed realities of demand.

Martha Rosler on the self-ascribed messianic status of artists as a necessary, if imaginary, corrective to their dependence on the big money of patrons.

A great deal has been asked of artists, in every modern age. In previous eras artists were asked to edify society by showing forth the good, the true, and the beautiful. But such expectations have increasingly come to seem quaint as art has lost its firm connections to the powers of church and state. Especially since the romantics, artists have routinely harbored messianic desires, the longing to take a high position in social matters, to play a transformative role in political affairs; this may be finally understood as a necessary—though perhaps only imaginary—corrective to their roles, both uncomfortable and insecure, as handmaidens to wealth and power. Artists working under patronage conditions had produced according to command, which left them to express their personal dimension primarily through the formal elements of the chosen themes. By the nineteenth century, artists, now no longer supported by patronage, were free to devise and follow many different approaches both to form and to content, including realism and direct social commentary. Still, the new middle-class customers, as well as the state, had their own preferences and demands, even if a certain degree of transgression was both anticipated and accepted, however provisionally (the Salon des Refusés was, after all, established by Napoléon III). ...
There are armies of books that could be written on Romanticism as an ideology that sacralized art as an end unto itself, but since Leonard B. Meyer already wrote a fantastic book about that decades ago, we'll just try to get to blogging about it at some point.
One of the percolating ideas I'm hoping to blog about, in what was supposed to be a review of Andrew Durkin's Decomposition, is that Durkin failed to accomplish much by attacking what he identifies as the ideology of authorship and authenticity in music without aiming to dismantle the entire ideological paradigm of Romanticism going back centuries.  This may be because he doesn't want to dismantle the ideology of Romanticism at all, and perhaps even embraces it, but finds it problematic in the sense that it has been co-opted by the market.  What else was going to happen?  The Authenticity Hoax covered that breezily enough, and not without a few pretty good points along the way.

n+1 on the free and the anti-free: "antifree was a small group of interested artisans speaking up for the dignity of being gainfully employed", although the question of whether there "should" be gainful employment in the arts may be its own question
By the mid-aughts, a day job was no longer an inconvenience but an aspiration, and attitudes toward it changed. The work writers could get at corporations—as listings editors or fact-checkers—may have remained secondary to artwork in their minds, but that work, so much less reliably available than before, demanded a new level of effort to find and to keep. Not only one’s position but one’s entire department could, without much warning, disappear.

These writers and copy editors were among the many who, faced with limited resources and their own cultural omnivorousness, came home each night eager to download MP3s, PDFs, and other digital copies of artworks and research they would otherwise be unable to access. Around the reality of these thefts a powerful ideological movement emerged, taking as its inspiration not just facts on the ground but also the libertarian, antigovernment, “hacker” spirit of the earliest personal computing and internet communities. The apostles of the Free Culture movement, as it came to be called, argued that stealing digital content was a progressive politics and should be brought into the open. Some of these apostles were hucksters and profiteers, others were merely hypocrites (who preached the virtues of free from their perches as well-paid magazine editors or college or law school professors), but still others, like the freeware hacker Aaron Swartz, were true believers. Congress had allowed copyright protections to be rewritten by huge corporations (most notably Disney) to become a parody of a law. If what was being illegally downloaded was some of the best that had been thought or said by human beings, and the downloaders were people who couldn’t afford the purchase price of the books or movies (some of which were expensive)—wasn’t that a good thing?

Free Culture ideology appeared to be approaching mainstream consensus when the 2008 recession made users feel, both rightly and perversely, that culture–producing corporations were fragile. In book publishing that year, hundreds of midcareer editors, writers, publicists, and other industry workers were pushed out. In the first week of December alone, the Observer reported a “massive reorganization” with layoffs to follow at Random House; a reorganization and layoffs at Macmillan; layoffs at Simon & Schuster; and an acquisitions freeze and layoffs at Houghton Mifflin. Some of these people eventually found new publishing jobs, but the industry had contracted. Many were the twentysomethings who had sold out in the Nineties and now, a decade later, ran up against the possibility that they no longer had anything to sell.

What could a no-longer-young person do in this situation? Many turned to the digital platforms that, even before the recession, had been putting magazines and newspapers out of business. So it came to happen that postrecession digital startups were helmed not only by young people and risk takers but also out-of-work publishing veterans. ...

The Free movement had a few professorial spokespeople and millions of adherents; antifree was a small group of interested artisans speaking up for the dignity of being gainfully employed. As antifree grew beyond the small world of left-wing blogs, it attracted 25-year-olds who objected to being paid $50 by a corporate website that presumed them lucky to get the experience. It attracted veteran journalists who balked at being asked to write for a large, profitable magazine’s website for chump change. And it attracted unpaid interns, who at profitable media corporations (ranging from Condé Nast to Gawker), actually filed suit for violations of labor laws. These were individual stories, but they added up. The entities that had once supported journalists and writers were now doing their best not to pay them for the simplest of reasons: they could get away with it.

What's been interesting to read is how different strands of the Left have differed on whether the decline of the middlebrow is even a bad thing.  This may be a category mistake, because TV and film and mass media haven't gone anywhere.  We live in an era in which it's easy to highlight how six ostensibly different country songs all kinda sound the same.

Absent financial compensation, artists and writers don't just make work for free; they functionally make it at their own expense and at a loss, unless there is what people call "cultural capital".  Another way of putting it: an artist who isn't paid in money might gain some form of exposure, prestige, or a recognized role within a social unit.

Rock and roll as the real opiate of the masses, an old polemic about it from half a century ago courtesy of The Atlantic

So powerful was the rock beat that all other attributes of the music were presented as secondary, or totally inconsequential. “‘Positive’ lyrics are mostly a sop to minds that do not want to know what they are thinking,” he wrote, before describing a rock-gospel vocalist futilely singing praises to God even as “the music itself rocked on and out away from the words into a new wild night of nihilism.” This nihilism, he said, allowed rock to placate adolescent angst, not by channeling it toward the outer world but by making it a pleasure in itself: [emphases added] “Through exposure to rock ‘n’ roll, teen-agers learn to handle their aggressions and discontents—not through understanding, criticism, and self-conscious social rebellion, but through surrendering them to manufactured purgative.”

“Manufactured” is a key word here. Larner devoted a lot of words to the major-label songwriting machine, the practice of payola, and the trend of white artists making money by covering black songwriters. Rock wasn’t art, it was product, designed to transfix through its brute effect on human physiology. In the most devastating passage, he made the medium sound like aural toothpaste:

What teen-agers need in music is more or less what modern adults need too: not music to be listened to but background music as they hurry through their appointed activities. The background may be throbbing RnR or tinkly Muzak, but it all comes from the same package. On opening the package, the buyer finds a clearly labeled, constant stream of facile stimulant, factory guaranteed to jazz you up, smooth you out, purge your violence, and leave you kissing-sweet and ready for maudlin love.
But it is here that a rift between an Old Left and a New Left can be observed before long.  The Matthew Arnold/Adorno ideal of art as for-betterment, whether you're entertained by it or not, is not the only way people have approached the arts.  What's funny is that whether we're looking at a Kyle Gann for the Left or a Terry Teachout for the Right (bear with me, I'm simplifying based on a few online polemics you may not already know about), there are those who describe themselves as not seeing a necessary divergence between the pleasure art gives as entertainment and its capacity to engage and embrace more substantial metaphysical and epistemological values.

And if I felt like invoking Taruskin's sprawling treatise on Stravinsky I might mention how there are aesthetes and decadents who define the arts in terms of formalism.  But that gets into Leonard B. Meyer territory that would ideally be saved for some other post.

You're going to find that's a pattern here. 

college degrees as status indicators of a new leisure class? Well, okay, maybe (probably)

Your parents’ income may play a large role in the major you select: in “Rich Kids Study English,” Joe Pinsker considers the elite bias toward studying the arts, history, and other less practical majors:
What Pinsker’s research indicates is that only the rich think they can afford to learn something that isn’t useful to modern life’s larger goal (namely, procuring a secure and profitable career).

The part that the AC article didn't quote from the end is what sums up the Atlantic feature most eloquently:

From this angle, college majors and occupations start to look more and more like easily-interpreted, if slightly crude, badges doled out to people based on the wealth and educational levels of the parents they were born to.

The more useless your degree seems to be for any "real world" productivity, the higher status your family probably has in terms of money and education.  So it's not entirely without cause that stereotypes about academics as a kind of leisure class erupt in political diatribes, even though those stereotypes can be wildly inaccurate for a variety of reasons.  What gives the stereotype its emotional appeal is that those who have taken the time to get advanced degrees in things that lead not to vocational work but to academic credentialing can be seen as part of an elite class, the people who can afford to get a college education in the liberal arts at all.

and since lamenting that the arts may become a luxury activity is something Scott Timberg has been fretting about for years, it might not be a bad idea to say that, outside an elite arts culture in which patronage and virtually dynastic skill sets are involved, that's what it has always been.  Let's close with a lengthy excerpt or two from a review of Timberg's book.
Culture Crash points an angry finger at the internet. His prime example is the hollowing out of the music business, especially for indie rock. A lot of thorough analysis has been done in this area, and Timberg has availed himself of The Future of Music Coalition and the work of other concerned journalists. (There’s a good bibliography in his notes.) The music recording business has lost two thirds of its value in just over a decade, from $14.6 billion in revenues in 1999 to $5.35 billion in 2012. He notes that now, musicians earn on average 6 percent of their annual income from recordings.

He does not, however, go back to the devil’s deal that musicians made with recording labels early in the history of the industry, in which the money from record sales went to the studios and they were expected to make their living touring. Indie self-promotion turns out not to be any better.
But for anyone who still believes in the redeeming monetization of social media contacts Timberg relates the sad anecdote of cellist Zoe Keating, who followed every rule of the new music economy. Keating self-releases her music, has 1.2 million Twitter followers, and in 2013, between two million YouTube views and 400,000 Spotify streams earned a combined total of about $3,000 from both services. That’s before adding in the problems of unpaid or pirated downloads.
"If we’re not careful, culture work will become a luxury, like a vacation home,” Timberg writes. It’s a good line, and one that anyone who values a diverse cultural ecology would want to affirm. What he doesn’t want to admit is that, absent direct patronage, professional culture workers have often depended on outside sources of income. For some it was the second job (in the post-war period, that job was primarily teaching, a job indirectly subsidized by the government in the form of the G.I. Bill fostering a new population of students). For others, it was something unrelated (meet pediatrician William Carlos Williams). For many (more than we have usually acknowledged and certainly more than today’s BFA and MFA students are aware) it was a trust fund, family member, or a spouse of means. That cushion made it possible for a talented person to work on a novel or a painting until the work could earn respect, if not a proportionate wage for the work the artist put into it. Maybe the market would respond, and maybe it wouldn’t, but at least the creative person had a chance to find out.

That’s one of the reasons that pop culture exhortations to follow one’s bliss are so maddening. [WtH: for a rather remarkably long rant on that bromide, see the following article at Jacobin.] They imply a kind of privilege at the very heart of the class structures Americans are eager to say don’t exist. The fraying of the middle class is not just something that has happened to creatives. It’s just that Timberg never thought that what had happened to unionized manufacturing workers could happen to the educated type of knowledge workers who worked at the LA Times.
We haven't even gotten back to Dwight Macdonald's "Masscult and Midcult" yet.  Seeing how folks on the left-ish have seen fit to gut Timberg's outrage has been an interesting reading project this year.

on that nebulous middlebrow courtesy of the NYT.

The word crept into English, in class-ridden Britain, between the wars. It was deployed to memorable effect by Virginia Woolf in a letter to The New Statesman, in response to a review. “If any human being, man, woman, dog, cat or half-crushed worm dares call me ‘middlebrow,’ ” she wrote, “I will take my pen and stab him dead.”
The reasons for her rage are spelled out in vivid, good-humored detail. Woolf is proud to call herself a highbrow, which she defines as a “man or woman of thoroughbred intelligence who rides his mind at a gallop across country in pursuit of an idea.” This designation places her in the company of writers from Shakespeare to the Brontës, and also carries an unmistakable, not entirely metaphorical trace of class distinction. Highbrow status is a matter of breeding and belonging. But the highbrow, though an aristocrat, is not a snob.
“I honor and respect lowbrows,” Woolf asserts, “and I have never known a highbrow who did not.” (Lowbrows are defined as those who are as committed to living as highbrows are to thinking.) This is because high and low are in alliance against the middle. “I myself have known duchesses who were highbrows, also charwomen, and they have both told me with that vigor of language which so often unites the aristocracy with the working classes that they would rather sit in the coal cellar together than in the drawing room with middlebrows and pour out tea.”
What makes the middlebrows so contemptible? Woolf’s tautological response is their very middleness, their inability to be either one thing or another, and their habit of “indistinguishably and rather nastily” mixing up art and life (the pure, complementary pursuits of the high and the low) with things like “money, fame, power or prestige.”
The natural affinity of the high and low, and their mutual suspicion of the middle, has been a remarkably durable idea, though it has never proven to be anything more than an idea, a nostalgic vision of ideal order. At heart it is a fantasy of aesthetic authenticity secured by static and hierarchical social distinctions. A world of landlords and peasants, of masters and servants, of patrons and workers is one in which art and life harmonize. In such a world, the middle will always be a place of vulgarity and ostentation, of the kind of money-grubbing, backslapping, self-conscious display Woolf (or at least her notional duchess) would flee to the basement to avoid.
A name for that place, in the postwar years, would be America, which emerged as a kind of Promised Land — or nightmarish dystopia, depending on whom you asked — of middlebrow culture.
The middlebrow is robustly represented in “difficult” cable television shows, some of which, curiously enough, fetishize such classic postwar middlebrow pursuits as sex research and advertising. It also thrives in a self-conscious foodie culture in which a taste for folkloric authenticity commingles with a commitment to virtue and refinement.

But in literature and film we hear a perpetual lament for the midlist and the midsize movie, as the businesses slip into a topsy-turvy high-low economy of blockbusters and niches. The art world spins in an orbit of pure money. Museums chase dollars with crude commercialism aimed at the masses and the slavish cultivation of wealthy patrons. Symphonies and operas chase donors and squeeze workers (that is, artists) as the public drifts away.

Universities and colleges, the seedbeds of a cultural ideal consecrated to both excellence and democracy, to citizenship and to knowledge for its own sake, are becoming either hothouses for the new dynastic elite or training centers for the technocratic debt peons of the digital future.
In the hectic heyday of the middlebrow, intellectuals gazed back longingly at earlier dispensations when masterpieces were forged in conditions of inequality by lucky or well-born artists favored by rich or titled patrons.

Social inequality may be returning, but that doesn’t mean that the masterpieces will follow. ...

If you're curious where "Masscult and Midcult" might fit in (or especially if you're not) ...
" ... There's something way down deep that's eternal about every human being." The last sentence is an eleven-word summary, in form and content, of Midcult. I agree with everything Mr. Wilder says but I will fight to the death against his right to say it in this way.

Thus Macdonald on the play known as Our Town.  In the lexicon of trade axioms, maybe the problem of the middlebrow is that it self-consciously traffics in the proverbial in a way that the high and low avoid, whether through dread of cliché or through ignorance of its promulgation. :)  Perhaps only a middlebrow can fashion a work of art in which the message is the point?

Scott Timberg's lament about the loss of the middlebrow evokes some "so what?" reactions from folks who aren't into middlebrow; 'twas inevitable.
Another institution that failed here was the White House. I consider Obama a vast improvement over George W. Bush, and this supposed African socialist has been far better for capitalism than his CEO predecessor. But his housing policy – and lack of guts dealing with the banks which held mortgages and helped create the crisis – was pathetic and lame. People like me, who’d worked their whole lives, made frugal decisions, had maintained good credit, were left to twist in the wind. The banks got bailed out with our tax dollars, their execs got bonuses, and for me and many other musicians, artists and writers who lost their homes, all of our wealth was destroyed.
Middlebrow has always been a complicated/ambiguous concept – or set of concepts – and I may have done it no favors here.

When I lament the loss of middlebrow, I’m not saying I want nothing but overplayed warhorses at symphony orchestras, nothing but Matisse shows at the museum, etc. What I miss is the notion that art is somehow clarifying or restorative, and that a broad public education and media push is worth investing in. Middlebrow means Leonard Bernstein on TV, Thelonious Monk on the cover of Time, Anne Sexton learning how to write a sonnet on public television, Lionel Trilling and Auden leading a book club for non-scholarly readers, public school art classes, etc. It says there’s something valuable about culture that goes beyond money (what the neoliberal or capitalist values) or shock value (what much of the cultural left values.)
Middlebrow, whatever its faults and blind spots and earnest pieties, values literature and the arts as aspects of human achievement.
While there have been too many reviews to try to sum them all up, it's tough to find one that more readily sums up a "so what?" reaction than the following:

A title that includes "middlebrow tantrum" might tell you just about all you need to know before you even read the review proper.  But here's an excerpt:
Thinly rejecting Dwight MacDonald’s evisceration of the middlebrow in 1960’s “Masscult and Midcult,” Timberg celebrates the democratic middlebrow, less valuable for generating meritorious art or innovative discourse than for establishing a sort of cultural public square. But not online! Timberg is more interested in the lost role of newspaper arts sections than he is in the health of elite culture, though his analysis of Los Angeles, for example, seems to focus exclusively on the latter, pointing to the closure of Ferus Gallery (1966), Artforum’s departure for New York (1967), and the collector-dealer Virginia Dwan’s same move (1968) as portents of Los Angeleno decline. Despite his cheerleading for the middlebrow, Timberg never fully reconciles how the high relates to the middle. Given the relative strength of the former — a surge of little magazines (e.g. McSweeney’s, Guernica, n+1) greeted the new millennium while big print dwindled — his doomsday assessment of culture writ large seems compromised. But Timberg never strays far from the middle, at one point dismissing poetry wholesale for “being inhaled into academia and [losing] its connection to the literary and intellectual mainstream.”
 As a media critic, Timberg also falls short of the mark. For one, the hysterical retelling of the gutting of the Gannett newspaper empire could benefit from some historical perspective. Here’s A.J. Liebling, writing in the New Yorker in the same year as “Masscult and Midcult,” on the precarity of the journalistic profession: “If a journalist is working in a town where there are two ownerships, he is even money to become unemployed any minute, and if there are three, has two chances out of three of being in the public relations business before his children get through school.” Timberg cites Bureau of Labor Statistics figures on declines in print publishing employment from 2002 to 2012, yet no comparative effort is made to assess growth in other forms of media. The losses of old lions are bemoaned, like the Village Voice’s firing of J. Hoberman, but Timberg fails to note that the film critic’s byline remained a mainstay, and appears frequently today in the New York Review of Books, just as Michael Musto, another Village Voice layoff, resurfaced at Gawker. (Hoberman has also contributed to ARTINFO.) That the media industry underwent a traumatic shift is not in dispute, it’s just that personal hardship does not inherently imply broad-based cultural impoverishment.
 Moreover, even if we are to assume that Timberg’s media theories are correct, it is not obvious to what extent, dispassionately speaking, the fate of a predominantly white and middle-class cadre of culture-section newspaper writers weighs on the public culture of a pluralistic democracy.

Never heard of "Masscult and Midcult", one of the shots fired against middlebrow culture from the mid-20th century?  Well, here ya go.

Macdonald's evisceration of Hemingway as a short story author who made the mistake of writing novels, and who at his best honed baby talk into a literary art form, was pretty funny!  Now virtually no one, even on the Left, would agree that rock and popular styles are incapable of being art, just as it would actually be difficult to sustain a case that jazz was really "folk art," since it was promoted as a musical commodity just as much as rock would be after it.  Still, one of the debates that constantly burbles within arts criticism and art history, it seems, is the question of what "folk" art is compared to "high art" and how the two relate to each other.  The quasi-Marxist prelapsarian ideal of a high and a low that knew their respective places before capitalism destroyed everything with the emergence of the middlebrow would require books I'm not sure I feel like writing and that, rest assured, others have already written.

But the question of what folk art is, of what lowbrow art is in socio-economic terms, might be interesting to see get some more discussion.  If high art has existed within a patronage system where the wealthy pay the skilled to create art as a signifier of status, and low art is what the masses produce for their pleasure in their spare time, then maybe a semi-Marxist way of describing folk art in relation to the means of production is this: folk art is stuff that's largely anonymously made at the expense of its creators, who work at a loss of their time, money and resources simply to contribute to their cultural moment by making something they consider pleasurable and beautiful.  The high arts are things procured as a way to establish socio-economic ranking and legacy.  But what a person could playfully propose about both kinds of art is that the folk art and high art that emerge will be indicative of the empire in which they were created, and that there is no art that is not, in some basic sense, an imperial relic.  What were the Dark Ages if not merely an era in European history in which empires collapsed and regional feudalism emerged without significant and efficient consolidations of power?  The Roman empire fell and not a whole lot really replaced it.

But that might be a whole set of other posts ... suffice it to say, when Scott Timberg laments that the production of the arts may be a luxury, one can fairly ask how on earth he ever got the idea it was ever otherwise.