Saturday, August 15, 2015

more piggy-backing on ideas from ribbon farm, the Locust Economy as a possible way to interpret the dynamics of Mars Hill growth and decline

http://www.scientificamerican.com/article/when-grasshoppers-go-bibl/
...
 
It took just two to three hours for timid grasshoppers in a lab to morph into gregarious locusts after they were injected with serotonin. Conversely, if they were given serotonin blockers, they stayed solitary even in swarm-inducing conditions. [emphasis added]
...
When these insects go into swarm mode, they don't just get super social, they also completely change physically, becoming stronger, darker and much more mobile, says study co-author Swidbert Ott, a research fellow at Cambridge. In fact, he says, the before-and-after bugs look so different that, until the 1920s, they were assumed to be two unique species.

In the wild, swarms usually appear after a rainy period followed by a time of drought. After rains, populations of grasshoppers explode, Burrows says, because there is food aplenty. But when the land becomes parched and grass scarce, the populations get pushed into smaller and smaller areas, becoming more packed as desirable pasture diminishes, he says. At a certain point of density, the swarm-inducing serotonin gets triggered and the locusts set off en masse to find greener pastures. After that, few things — other than an end to the food supply or an ocean — can stop them.

Burrows says that locusts can switch out of swarm mode, though it takes days rather than hours. He notes, however, that the about-face rarely happens in the wild, because the offspring of locusts that breed while swarming are born swarmers.
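For any readers who think in code, the dynamic Burrows describes is basically a threshold switch with hysteresis: a fast flip into swarm mode once crowding passes one density, and a much slower flip back out that only happens well below it. Here is a toy sketch of that switching logic in Python (entirely this blog's illustration, with made-up numbers, not anything from the article or its researchers):

# Toy model of the density-triggered phase switch described above.
# Thresholds are illustrative, not empirical, and the time asymmetry
# (hours to swarm, days to revert) is noted in comments, not modeled.

def update_phase(phase, density, swarm_threshold=50.0, revert_threshold=5.0):
    """Return the insect's phase after one observation of local density."""
    if phase == "solitary" and density >= swarm_threshold:
        return "gregarious"  # the fast, serotonin-driven switch
    if phase == "gregarious" and density <= revert_threshold:
        return "solitary"    # the slow about-face, rarely seen in the wild
    return phase

# Rising density flips the phase, and it stays flipped until density
# collapses well below the original trigger point (hysteresis).
phase = "solitary"
for density in [10, 30, 60, 40, 20, 4]:
    phase = update_phase(phase, density)
    print(f"density={density:>2}  phase={phase}")

The point of the sketch is just the asymmetry: the density that starts a swarm is much higher than the density at which it dissolves, which is why, per Burrows, few things stop a swarm once it starts.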
 
 http://www.ribbonfarm.com/2013/04/03/the-locust-economy/
Locust swarms are also among the rare nomadic biological entities besides humans (nomadism is not a predictable pattern of movement, unlike migration, which is typically a sustainable pattern of movement through a resource landscape that’s not being devastated by the movement). They are nature’s rioting mobs, moving opportunistically from one store of food to another, without much concern for sustainability.
 
Whatever the biological details, the key point is that locusts devastate their foraging base.

Locust swarms don’t create new value. At a systemic level, the most charitable thing that can be said about them is that they efficiently strip mine value in a tyranny-of-the-biomass-majority way. [emphasis added]


They out-compete other species through sheer numbers, and leave others to pick up the pieces as they return to their solitary, non-swarming grasshopper phase. In this case, human farmers. The collapse of locust swarms completes the cycle in a way we’ll get to.

Locust economies are built around 3-way markets: a swarming platform “organizer” player who efficiently disseminates information about transient, local resource surpluses, a locust species in dormant grasshopper mode, and a base for predation that exhibits a scarcity-abundance cycle. [italics original, bold emphasis added]

So long as different locations are not synchronized, a locust market will usually have a surplus somewhere, even if it is a zero-sum or negative-sum market overall. Where that surplus comes from varies. In human farming, it is a natural consequence of the plant-harvest model.
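That last claim, that a desynchronized landscape usually has a surplus somewhere even when the whole is zero-sum, is easy to check with a toy calculation. Here's a sketch (this blog's illustration, not ribbon farm's math) with three locations cycling between scarcity and abundance, evenly out of phase:

import math

# Each location's resource level oscillates around zero with the same
# period but a different phase offset, so the system is zero-sum overall.
def resource_level(phase_offset, t, period=12):
    return math.sin(2 * math.pi * (t + phase_offset) / period)

phases = [0, 4, 8]  # three locations, evenly offset through the cycle
for t in range(6):
    levels = [resource_level(p, t) for p in phases]
    print(f"t={t}: total={sum(levels):+.2f}, best surplus={max(levels):+.2f}")
# The totals are zero up to rounding (zero-sum), but the best-off location
# is always in surplus -- exactly the transient, local abundance a
# swarming platform exists to point the dormant locusts toward.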
 
There's plenty more where that came from, because ribbon farm pieces tend toward length. The working idea of human behavior resembling locust activity is this: while Venkat is addressing the "sharing economy" as a predatory reaction within some groups to resource shortage and the illusion of surplus, Wenatchee The Hatchet is floating the idea that the emergence of Mars Hill as a movement could be construed, rather vaguely, in locust economy terms. The swarming platform "organizer" player who efficiently disseminated information about transient local resource surplus? Well, we "could" say that was Mark Driscoll but that gives him vastly too much credit. 
 
What makes more sense is to suggest that with the various tech innovations in the Clinton years and the mainstreaming of the internet, there was a way for those with evangelical interests in doctrine but broad-minded cultural interests in production to sorta, I dunno, crowdsource themselves.  Mars Hill was able to serve as a catalyst because it was an information culture.  It was the "swarming platform" in the sense that it provided a central informational hub through which assortative alignment was possible.  So Mars Hill as an information culture could be construed as the swarming catalyst, but what was going to swarm?  It is here that mileage will vary and people with political and social agendas may jump the gun.  For what little it may be worth, let Wenatchee The Hatchet proceed with some anecdotal consideration.
 
When I first heard of Mars Hill there was no mention of that Driscoll guy at all; it was described as a kind of church/community where people who were doctrinally pretty conservative and traditional but culturally very wide-ranging could find similarly minded people.  There you have the locust species in dormant grasshopper mode.  Since by Driscoll's account his aim was to get the young guys who could theoretically be tomorrow's establishment and culture-makers, then, at the risk of putting it this way, Wenatchee The Hatchet was just one of thousands of people who were locusts in dormant grasshopper mode.  To the extent that this blog has talked about Mars Hill, maybe you can think of those posts and discussions as one long, interminable examination of how one guy got from grasshopper mode to locust mode and has spent a few years trying to get back to grasshopper mode.  So the dormant locusts were, basically, the people who at any point in their lives, for any length of time, decided they were on board with what Mars Hill was doing. 
 
As for the base for predation that exhibited an abundance-scarcity cycle?  Driscoll explained that by saying how few evangelical churches there were in the Puget Sound area, you know, REALLY evangelical churches.  There may have been few evangelical churches whose aim was maximizing growth through attractional dynamics; maybe there were few evangelical churches that sorta billed themselves as something like a "life together" quasi-arts commune, but the claim that there were few evangelical churches in Puget Sound is something that might merit more concrete numbers.  In any case, Driscoll and company made the case for the scarcity, and announcing that scarcity may have been sufficient to signal the start of a swarm dynamic for all the folks who may not have realized they were grasshoppers who were dormant locusts after all.
 
We're all capable of being locusts.  Jonathan Haidt's The Righteous Mind unpacks this idea at some length when he runs with the idea that we're 90% chimp and 10% bee.  We have a capacity for hiving and swarm behavior that has accounted for our most powerful innovations and collaborations, our most remarkable collaborative goods as well as evils. 

Where this applies to Mars Hill as a historical movement that may or may not survive in its spin-off churches is that we have an opportunity to examine what happened.  If possible let's try to remember that even if we liken people in the Mars Hill scene to locusts we cannot do this without seeing the locust within ourselves.  The problem with seeing "them" as locusts while "we" are not is that we may not know what our "serotonin" is and what corresponding serotonin blockers may need to be in place to keep us from joining the swarm. 
 
One guy might join Mars Hill for the social life.  Memorably, seven or eight years ago a guy told me, as he transitioned bitterly out of Mars Hill, that for all the time and effort he put into Mars Hill he felt like he should have gotten something out of it.  It was impossible for me to not construe "I should have gotten something out of it" to mean "wife".  Some guys stayed on and even tried to fit into the leadership culture because they noticed that the guys who became community group leaders were more likely to get girlfriends through that and, by extension, get married.  The mating game can't be ignored as an attraction for men and women who came and went through Mars Hill, not least because for a few years Mark Driscoll and other prominent people in the church insisted on talking about that stuff from the pulpit and within the social scene.  All the exigencies of how you could manage to get yourself off the open market and into a licensed marriage were no small attraction.
 
But for others it was the possibility of networking widely and readily with artists and writers and musicians and people who shared a common set of beliefs (Christianity in a generally evangelical and later Reformed sense) and might or might not have a shared set of aesthetic interests.  Wenatchee The Hatchet has definitely been in that particular camp.  If Mars Hill was a cult it was a cult because it appealed to the possibility of meeting an emotional, social, economic or physical need that was not being met in some other fashion.  "You can find what you're looking for here, folks" was the sales pitch.  The closest thing to a "serotonin blocker" for the prevention of swarming activity, if we're sure we want to avoid that, might be diffusion of identity. 
 
Back in his Who Do You Think You Are? phase Mark Driscoll seemed to hammer on the question of identity.  It's possible to find all sorts of ways to build your identity in Christ and also to weave that sense of identity in a way that integrates all the subsets of identity into a larger whole.  How this could play out in a place like Mars Hill is that if a majority of your friends or family or business connections and general social life were refracted through the church, if you met your wife or husband there, then everything is, if you will, maybe "born in the swarm".  The people who managed most successfully to transition out were those who had a wide-ranging set of foundations for social identity.  If in sociological terms the "diffusion of responsibility" is why bystanders famously failed to intervene when someone was murdered one night decades ago, the diffusion of identity might be a way to ensure that at a social or political level all the eggs of a person's identity are not hastily put into just one ever-encompassing basket such as "I'm part of Mars Hill".
 
But you can't seriously begin to address what might catalyze the swarming dynamic that transforms some other grasshopper into a locust if you've never thought about what it might take for you, grasshopper, to discover yourself to be a locust, too. There may be people out there who are doubling down on the kinds of ideologies and ideas they were embracing before they joined Mars Hill and became locusts.  Doubling down on the ideals you embraced before you joined up is "probably" not how you'll avoid becoming a locust in the future, grasshopper. 
 
If I could not look back on the last ten to fifteen years and thoroughly see in myself the behavior patterns of a locust in the swarm I wouldn't even attempt to propose that we who were at Mars Hill for any length of time stop to consider how we might all have been locusts in the swarm. I'd like to think that we would like to return from locust mode to grasshopper mode.

POSTSCRIPT

Obviously analogies and metaphors are imperfect, but perhaps the controversies about abusive leadership style, the plagiarism stuff, the Result Source scandal, and the penchant for historical revisionism combined to form "serotonin blockers" for all the people who left.  It is improbable that any one of those scandals was enough to change the way things happened, but cumulatively they had some measurable impact: Driscoll resigned and, crucially, Mars Hill was hemorrhaging members at a remarkable rate between 2013 and 2014.  So there may already be a lot of people who went from locust form to grasshopper form in the last few years.  Wenatchee The Hatchet has contributed here and there toward what I "hope" has been a process of letting people disentangle convictions about Christian life and practice from the narrative of Mars Hill. The beliefs do not inherently require brand loyalty.

Conversely, if you're still displaying swarm mode/locust behavior, merely switching swarms probably helps no one.  This is why there are those against Mars Hill or Mark Driscoll who are at least as toxic and harmful as, well, whoever they were when they were still in the swarm.  There are people who have left the swarm of Mars Hill but are still in swarm mode for an ideological or confessional team.  If you are not a Calvinist but are in swarm mode against Calvinism, then you share swarm mode with the Calvinist who is in swarm mode against Arminians. What I hope we're shooting for here is to shift from locust back to grasshopper, not merely to become a locust who's moved from one swarm to another.

Tokumitsu of Jacobin's "In the Name of Love" talks to the Atlantic about work mythology, cf Ribbon Farm on the difference between inconspicuous vs conspicuous production in the contemporary workforce (i.e. chimneysweep vs bard)

...
 
Tokumitsu: So in my book I have my theory about where it came from. I really feel like it comes out of post-World War II prosperity. The Protestant work ethic is work, work, work—work is a calling, work is virtuous. I felt like that was with us for a long time, but pleasure never factored into that much.
 
But then come the Baby Boomer generation—you have the wars seemingly over and there’s a lot of prosperity, though it’s been spread pretty broadly throughout society. And that gave people the opportunity to indulge themselves a little bit. And within the U.S. particularly, there arose a culture of self: thinking about what makes me happy and how to improve myself. [I argue that the] virtue strain of work and the self strain of work combined in the late 1970s and 1980s, and in a way pleasure-seeking became the virtue.

Tokumitsu: ... I feel like this whole culture of feeling good too is just really kind of hedonistic. And I also feel like it’s a little bit dark. There’s almost something in it to me that speaks of like addiction or something. We can never be at just baseline contentment. [emphasis added] We always have to be relentlessly seeking these “good feelings.”
 
 
http://www.ribbonfarm.com/2013/07/10/you-are-not-an-artisan/

The future of work looks bleaker than it needs to for one simple reason: we bring consumption sensibilities to production behavior choices. Even our language reflects this: we “shop around” for careers. We  look for prestigious brands to work for. We look for “fulfillment” at work. Sometimes we even accept pay cuts to be associated with famous names.  This is work as fashion accessory and conversation fodder. [italics original, bold emphasis added]

We can think of this as conspicuous production, by analogy to conspicuous consumption. First-world artisan tendencies take this to a logical extreme.

When you subconsciously think of work as something you consume for pleasure, you end up with a possibly irrational (economically speaking) attraction to artisan work. Even those who don’t actually end up as artisans choose work the way they choose cars, jewelry or handbags, over-valuing things like resume-value and exposure-value.

The result is a misguided analysis of the impact of computers and automation that makes us think the future of work is much darker than it is.

What’s the difference between a tradesman and  an artisan?  Think chimney-sweep versus bard as the extremes of the spectrum. Both are archetypes that mostly disappeared with late industrialization in the early twentieth century, thanks in part to automation, but there the similarities end.

One fulfilled a critical economic function by engaging in unpleasant and inconspicuous production. The other fulfilled a non-critical economic function in the economy by engaging in pleasurable and conspicuous production.

One generated a higher, less volatile income, but with little potential for upward mobility, the other generated a lower, more volatile income, but with more potential for upward mobility.

The median chimney sweep did better than the median bard, but the best bards did better than the best chimney-sweeps (by finding favor with a king for instance). Since this was before mass media, bard reward distribution was not as skewed as it would become, but it was still skewed.

The emerging future of work does resemble pre-modern patterns of labor organization in a few key ways, but most of us are going to turn into digital-era chimney sweeps rather than bards. And this is a good thing.

The difference between bard work and chimney-sweep work is that it is far easier to convince yourself that a relaxing hobby is actually real work. It is a kind of gollumizing effect: behavior that makes you atrophy psychologically.

What makes it worse is that in an economy based on a fiat currency, shareholder value maximization and deficit spending, the capacity to generate an income does not necessarily imply that meaningful work is being done, either in a subjective psychological sense (it helps you evolve rather than atrophy) or economic sense (net wealth is being created rather than consumed or harvested). You might even end up having to pay to do real work.
 
For those who read this far and note the tag at the end of the post, all that preamble is a way to frame the above quotes as an observation about Mark Driscoll's taxonomy of manhood.  Driscoll spent years warning guys to not get "joe jobs" without necessarily defining what those joe jobs were.  Driscoll also spent a few years joking about how much he loved his "job". The implication in that attempt at humor might have been that he loved his job so much it was as though it wasn't really work for him.
 
... and then we got the Result Source controversy and the plagiarism controversy and that revelation about the extent to which ghostwriters could ghostwrite at Mars Hill and research assistants helped and ...
 
it did begin to seem as though Mark Driscoll could represent a production mill of occasionally recycled work rather than the production of anything particularly new or useful, and at times even anything particularly interesting.  There was, to borrow Venkat's phrasing from ribbon farm, this conspicuous production thing with Driscoll.  The irony of this could be best framed by invoking Mark Driscoll's old standby for those who might object to what he said from the pulpit: he was just delivering the mail, you know.  Don't shoot the mailman.  Now, sure, he invoked being a postal worker, but it's arguable the average postal worker, even the average ineffectual postal worker, conveys more directly relevant and useful information to the average person in a day than Mark Driscoll ever did.  If you got your electric bill in a timely manner, that was information you could literally act upon as soon as you found out about it, if, you know, you get your utility bills by mail and all that.
 
It seemed like Driscoll was sometimes against jobs that were inconspicuously productive, while his own "job" was conspicuous production that turned out to have some problems with it.  First, we couldn't be sure how much of it was really his; second, we couldn't be sure to what degree the people who were actually getting the work done were acknowledged for it. 
 
Then again if people will buy Star Wars movies over and over again Mark Driscoll can recycle sermon ideas, too.  What the market will bear, perhaps.
 
Given that Mark Driscoll is returning to Ecclesiastes (again), it doesn't seem he's doing any real work at all so much as reconsolidating his branding.  He is, to borrow his old analogy about postal work, aiming to deliver the same mail he delivered ten and twenty years ago.  Driscoll now comes across like a bard who has spent his public career telling chimneysweeps they need to be bards. What may benefit Driscoll most is to go back and be that bread delivery truck guy, to be what Venkat at ribbon farm has dubbed the chimney sweep, who does the thoroughly unsexy scut work that produces something people can use.
 
Now maybe after nearly two decades doing the public speaking/motivational speaking/entertainment/ministry thing Mark Driscoll doesn't know how to do any real-world work, but that seems unlikely.  It's more probable that he knows how to do real work but may not WANT to do work that has no prestige. He said last year that if he had to choose between being a pastor and being a celebrity he'd choose being a pastor.  Then he quit being a pastor in 2014, and only in 2015, on the charismatic leadership conference circuit, has he seen fit to explain that, oh, yeah, God totally gave me audible verbal permission to go against everything I told everyone they ought to do in general and that I intended to do in particular.  So Mark Driscoll is no longer any kind of pastor but he is still a celebrity, a celebrity recycling his old material.  The transformation of Mark Driscoll into almost everything he warned Mars Hill against has seemed alarmingly complete. The tragedy and comedy of it is that he seems completely ignorant of this transformation over the last two decades. In the lexicon of work types presented by ribbon farm, Driscoll has clearly chosen to be a bard rather than a chimney sweep, and time will tell whether the song remains the same as it ever was (seems like a fairly safe bet there).

an afterthought on the recurring problem of Reed Richards' hubris in a Fantastic Four origin story

It has been noted in a few places that there's this problem with Fantastic Four movies where the heroes are dealing with a menace they explicitly created.  It might be said in objection by some fans of the franchise that critics don't get the exigencies of the 1960s or genre tropes.

Well, no, maybe they do.  Critics have to deal with broad swaths of artistic creations that get pigeon-holed into genres and types.  You might have heard of this thing called "drama", and that there are things called "tragedy" and "comedy". In the Fantastic Four franchise we're expected to root for heroes who make decisions that in some sense mess up their lives.  Now it's not that there aren't tragic heroes but, you know ... Oedipus. 

Whereas in comedies it's not the same thing.  We can accept that within the realm of comedy a disaster can be caused by the protagonist and the protagonist can still in some sense be sympathetic because we're going to be both laughing WITH them and laughing AT them in a comedy where things will come back to some kind of normal.

In other words, we aren't required by the conventions of drama and comedy and tragedy to automatically view Reed Richards' over-confidence as a heroic virtue.  We could ask why on earth he didn't think anything could go wrong.  It'd be fair.  By contrast, take this exchange from "Sea Tunt":

Lana Kane:  Do you think maybe we're walking into a trap?
Sterling Archer: No! .... but then I never do ... and it very often is.

See, we're not exactly startled if the majority of the disaster and mayhem Sterling Archer and his not-quite-friends deal with is self-inflicted by way of their incompetence, corruption, ignorance and nastiness.  A central engine of the comedy as a comedy is that they stubbornly refuse to see themselves as the worthy victims of their own stupidity and vice. They get to be comedic heroes because no matter how badly they botch things they return their world to some semblance of "normal".  When the Fantastic Four imperil the world as we know it and then save the day, it's not played for laughs overall, and so the "rules" are a bit different. 

Andrew Durkin's Decomposition--neither a philosophy of music nor a manifesto, but a potential conversation starter for those who slog through it

http://knopfdoubleday.com/book/237102/decomposition/
http://www.musicandliterature.org/reviews/2015/1/5/andrew-durkins-decomposition-a-music-manifesto

I kinda wanted to like this book but it's not really a manifesto because a manifesto would get straight to the point, actually say something, and say something in a simple way.

But it's not a philosophy of music either, or even at all.  So neither of the working titles seems accurate.  Then again, publishers can make decisions, so it seems unfair to try to judge a book by its failure to live up to either published title.  The term "decomposition" is still there and it gets something like a definition.

There are some core ideas in this book I could totally endorse without reservation if those had been the ideas Durkin had spent the majority of his book actually dealing with.  Instead the book takes aim at the ideologies (Durkin's term) of authenticity and authorship. Durkin tries to show how dicey these two ideologies are without being successful.  The reason he's not successful is that he is attacking these concepts as ways of writing about music rather than attacking the legitimacy of the concepts as a whole.  And the trouble is that even if he had attacked the legitimacy of the concepts of authorship and authenticity as concepts he could have just had everybody read Andrew Potter's The Authenticity Hoax, which directly attacked the notion of authenticity across the board since the dawn of the Romantic era. 

And in any case attacking authorship and authenticity would not do a single thing to change the proliferation of Taylor Swift songs or One Direction songs, or all country songs sounding vaguely the same.  Leonard Meyer pointed out half a century ago that works of art have long been team-built products.  Durkin's half a century late to the party if he wanted to point out that authorship is a myth because much of what passes for solitary invention is really more like a social process. 

Earlier reviews have suggested Durkin could have engaged more of the musicology writing of the last thirty years.  Actually, having read the book, I've come to think the problem is that Durkin could have been more thorough in absorbing Leonard Meyer's writings.  It seems particularly unfortunate, given that Durkin writes about authenticity and authorship as problems, that he doesn't seem to have read Style & Music: Theory, History and Ideology, in which Leonard B. Meyer explicitly and at length deals with ideology as an engine for concepts about originality and authorship from the end of the high Classic period through the end of the Romantic era.

Nor does Durkin's reading seem to have included Meyer's impressive 1967 book Music, the Arts, and Ideas, in which Meyer predicted the emergence of a stable steady-state of polystylistic options across the arts. Meyer also noted that the products of artistic activity in the 20th century were more collaborative and committee-based. 

Let's get to a few particular bits in Durkin's book.  Durkin's attack on authorship features some ruminations on the collaborative nature of Duke Ellington's compositional approach.  That got discussed at some length in Terry Teachout's biography Duke from a few years ago. It's fun but not necessarily a meaningful counter to an ideology of authorship since not many people celebrate Ellington's music as a brand.  We still think of it as Duke's music. 

When Durkin later turns to Beethoven he doesn't really say anything much more than that Beethoven's popularity coincided with a shift in European taste in which instrumental music became popular and symbolic of European high art aspirations.  The trouble is that Durkin never even starts to address how or why that ideological/aesthetic change took place, and this was precisely what Leonard B. Meyer discussed at length in Style & Music!  As nationalistic impulses began to emerge in the 19th century there was a push to get away from what was believed to be too international and cosmopolitan a style in music toward the roots music of cultures.  But, as Meyer put it, even though divorcing music from the constraints of language could ensure music evoked the "universal" of feeling and intuition, nobody could easily devise a new musical syntax to replace the forms of the 18th century, whether the Baroque forms or the forms of the Classic era.  So Romantic composers spent a century finding ways to disguise their reliance on the conventions of the eras they were trying to distance themselves from.

But Durkin's book never starts moving in that direction. What he does instead is take aim at the uncertainties and exigencies of notational systems, which is altogether a waste of time.  Once again going back to things written about half a century ago: Paul Hindemith remarked that the Western notational system has plenty of problems and that a composer can only use it to convey to a performer an approximation of what the composer has in mind. Meyer, in Music, the Arts, and Ideas, mentioned (I forget exactly where now) that one of the things a written notational system permits for music is the development of complex forms.  In Durkin's conclusion he makes a plea for complexity in music, not for the sake of complexity but for music being, well, anyway ... the irony of Durkin's attacking authorship and authenticity by going after notational systems is that Meyer pointed out that without notational systems human musical activity is tethered to what people can remember long enough to continuously perform, and when we're anchored to the limits of human cognitive bandwidth to THAT degree, humanity has proven time and again that we stop short at about the three-minute mark.  Paging Taylor Swift again ... .

If Durkin stays committed to attacking notational systems as a case against the ideologies of authorship and authenticity he might find that he's making an argument against the slippery and often inadequate systems that, nonetheless, have permitted the complexity he'd like to hear more of. 

All that said, reading Durkin's book was a gateway to listening to some interesting stuff, like Conlon Nancarrow's player-piano studies. Where Durkin had a chance to formulate a real manifesto could have been in this material, where he details that no matter how solitary a genius people might want to think Nancarrow was, his creative process was still essentially collaborative.  It's a shame Durkin relies on what seems to be ad hoc and idiosyncratic jargon.  It's also a shame he doesn't seem to have cast his net wide enough to show that, as ideologies go, an insistence on the "right" edition of the perfect score is relatively new.  Sure, he gets to Bach editions and how accepted performance practice informed them, but that's not quite what needed to be articulated.  No, the fact that Mozart and Haydn, or even later composers, were happy to revise and rewrite their works to suit the needs and strengths of the musicians at hand could have bolstered Durkin's arguments for what he calls contextual and direct collaboration.

Actually, it's a shame Durkin didn't use the most obvious case for a collaborative process in Beethoven available, the Diabelli variations.  Diabelli tends to get dismissed as a mediocre composer, and not without some cause, but Diabelli's Op. 29 guitar sonatas have their merits.  If you could hear a Marcin Dylla playing the F major you'd hear that at his best Diabelli had some promising material.  It took a Beethoven to coax that greatness from the music, since Diabelli, whatever his gifts, left some ideas under-developed, but that's a blog post for some other time.  Durkin's not a guitarist, and he's a jazz musician rather than somebody who composes chamber music, so I'm trying not to be unfair here when I say that Durkin needed to strengthen his readings on Beethoven a little more before writing a chapter that purported to demythologize the lone-genius narrative of Beethoven without actually doing so.  It didn't have to be a door-stopping double-tome like Taruskin's treatise on Stravinsky's appropriation of Russian folk music, but if Durkin wants to take aim at the biggest names in the concert music canon he needed to brush up on the critical literature a bit more.

However, Durkin's proposal, if I can dare to distill it into an actual axiom for a manifesto, goes something like this:

* All artistic activity, no matter how physically isolated, is invariably a social activity
* All artistic activity, as an act of communication, either aspires toward or must risk cliché
* Therefore originality and alleged social authenticity must be viewed with some skepticism

Had Durkin read more widely in Meyer he could have benefited from Meyer's axiomatic observation that the Romantics' ideological insistence on individual expression created a paradox: individual expression and artistic individuation paradoxically depend upon the very norms that are supposed to be contravened for the sake of individuality.  Style as an indicator of individuality got abandoned by those Meyer called empiricists and transcendentalists (like John Cage), and so in the 20th century there were artists moving away from the idea that the aim of art was to be an expression of the artist's self. 

Quaintly enough, some of those 20th century composers in some sense returned to an 18th century doctrine of affect.  Meyer described the distinction as being one in which the 18th century doctrine of affect proposed that music represented emotional states, while 19th century ideology claimed that music expressed the feelings of the individual artist who created the work.  I think the 18th century formulation makes more sense.  You can work to represent an emotional state in a musical work as a matter of convention, understanding that not everyone will perceive the music in the same way, whereas the Romantic variation can come across pretty quickly as narcissistic solipsism.

Oh, have I forgotten to mention I actually don't like a majority of Romantic era music?  :) 

Durkin's attempt to cast doubt on whether we all hear the same music (this is not an organized review so much as a year's rumination) is also frustrating.  Durkin spent 300-some pages to get at a point that Paul Hindemith knocked out swiftly in the first 40 pages of A Composer's World: Horizons and Limitations.  That really was half a century ago, give or take a year, and Hindemith wrote that what happens when we listen to music is a form of parallel mental construction in which we compare what we're hearing to what we think is going on in the musical work.  Hindemith credited the idea as going as far back as Saint Augustine, and stated that one performance of a musical work could elicit a different response from each member of the audience because each member of the audience will mentally comprehend and interpret the music in different ways. 

Durkin's attempt to question authenticity and authorship based on asking "do you hear what I hear?" is a self-defeating notion because even a thoroughly conservative and traditionalist type like Paul Hindemith pointed out decades ago that, yeah, we all mentally interact with what we hear.  If we're hearing music based on a musical syntax and vocabulary we don't understand, however, we WON'T HEAR IT AS MUSIC.  John Cage's innovation was to introduce the possibility that once you perceive music as a mental process or disposition toward hearing music in the sounds around you then human agency in music can be altogether removed.  Questions of the authenticity of the composer or performer on the one hand or the centrality of the author as a Romantic ideological talking point vanish. 

But that's not the only movement within the arts to head in the direction of questioning the Romantic ideological insistence on innovation, personal expression and all that.  When Stravinsky insisted that music was powerless to express anything beyond itself, this can be construed as a reaction to Romantic ideology, which freighted music with so much necessity to express the soul of the artist and to be emotionally and culturally "authentic" that the whole enterprise became overwrought.  Stravinsky famously jumped from style to style and appropriated ideas and methods from a variety of musical eras.  You could suggest that Stravinsky tilted back away from a Romantic ideology of art-as-expression-of-the-artist's-true-self toward something more like an 18th century doctrine of affects, in which music can abstractly depict a series of emotional states or concepts within a shared understanding, but the artist as artist has no inherent claim to dictating what the language may mean.

Durkin's attack on what he calls the ideologies of authorship and authenticity rings hollow because, while he undertakes many generally ineffectual labors to attack the concepts as rhetorical devices within writing about music, he never even begins to attack the legitimacy of the concepts as a way to understand the arts, or even his own approach to the arts.

If he had cast about beyond music his ideas and questions could have been fascinating.

Let's take film: what is the "real" version of Star Wars?  Hasn't Lucas spent decades insisting that everyone who enjoyed the 1977 version of Star Wars fixated on an unfinished movie? Well, Herman Melville insisted Moby Dick was a draft of a draft and it's still classic American literature whether you enjoyed it or not. That there's a "de-specialized" version of Star Wars out there suggests that within cinema history audiences can be persuaded that the director/screenwriter's authority has limits.  Once you put something out there as a completed/released work of art, the audience's interaction with that work takes on a life of its own.  Durkin's blog shows he's clearly familiar with all the ways in which Lucas got ideas from other places. 

Maybe Durkin's even read The Secret History of Star Wars, too. If Durkin wanted a case study in which ideological insistence on authenticity and authorship is potentially toxic and flies in the face of audience affection, it would be impossible to find a better one than George Lucas! The more we have at hand to learn about how the films developed, the more thoroughly it becomes clear that the films were a sprawling collaborative process. Yet if there's any self-aggrandizing, solipsistic myth-making process in the contemporary arts about the arts, George Lucas' decades of "I meant to do that" tinkering with films he made decades ago could have been exhibit A for Durkin's best points about the shortcomings of a straitjacketed ideological insistence on a certain understanding of what authorship and authenticity are.

But we keep mythologizing anyway.  It doesn't matter how many Beatles there were, we pick the individuals who are allegedly most responsible for the final product.  Why we keep gravitating toward individuals as emblematic of collaborative processes is almost too obvious to mention.  As Daniel Kahneman has put it, our brains are organisms designed to jump to conclusions.  Most of the time that works.  In fact, 90% of the time it works just great, which is why, when the 10% shows up where it doesn't work great, we get a disaster of cognitive shortcuts.

If Durkin had written his whole book around a critique of contemporary copyright practice then it would have been an entirely different (and probably better) book.  Durkin points out that there's a big difference between urging that copyright needs reform and the two binary positions of "copyright is evil" and "copyright as it is is great".  Alas, this is the part of the book that seems tacked on and incomplete.  Nothing about, say, a Supreme Court case in which publishers wanted to reject the right of first sale!? 

The concept of "proprietary totalism" as the default position of corporations is an interesting one, one that could be fleshed out more.  I would suggest, since, you know, this blog tends to get known as that blog covering the life and times of Mars Hill Church and Mark Driscoll as a public figure, that Durkin's writing invites a question.  Consider the "Blurred Lines" case and, by contrast, consider that in the midst of a year or two of controversy about whether or not Mark Driscoll was a plagiarist, not a single author or publisher opted to take legal action.  Was this really because there was no evidence in the first print editions for plagiarism?  Well, no, that's not likely, because as Warren Throckmorton documented, Driscoll's publishers retroactively fixed many a passage by adding footnotes and attribution that had not previously existed in first print editions.  So THAT would seem like evidence that authors and publishers "could" have made a case for plagiarism.  Nobody did, whereas Marvin Gaye's estate did sue after they were sued, if memory serves. 

Durkin has proposed that technology informs how we understand the arts.  Arguably our reliance on machines to both create and hear music has done more to inform our cultural insistence on authenticity and authorship than traditional musical notation has.  Our expectation that a musical piece sound the same every time isn't a "universal", but it's an expectation that can be reinforced by the ways that we listen.  Sousa warned that letting our musical experience be mediated by machines would introduce a brutal stratification between producers and consumers of music; that it would lead to a proliferation of music via machine that would obliterate regional musical dialect and flavor; and that it would lead to a culture in which the amateur musician all but vanished, and in which the need for music teachers and the infrastructure of cultural preservation would be harmed. 

Now the funny thing here is that about half a century later the émigré Paul Hindemith complained that American musical culture, particularly in music education, had this idiot egalitarian streak that bore no resemblance to the real world. Tell every kid that he or she could be a future Beethoven and you're lying to the kid but American aspirational ideals, as Hindemith perceived them, ran with the idea that you tell a kid to reach for the sky and assume the best.  He sourly asserted that all the American musical educational system seemed to be good for was not producing balanced all-around musicians but music teachers who would produce more music teachers and specialists who would produce more specialists.  Not unlike Sousa, Hindemith believed the role of the amateur musician in fomenting musical culture was a necessity.

And in a way, one of the boons of the technology that may have divorced the production of artistic work from any monetizable way of profiting from it could be the emergence of a new amateur class.  But I wonder if this is the part that Durkin and others may not have considered in classic left/progressive terms.  Let's get back to "Masscult and Midcult" and old Left attacks on the middlebrow.  If the middle class is shrinking, does the middlebrow go away?  No, corporate funding can jump in and save the day.  But that's not what is interesting here.  The question that comes up is this: if the old Left talked about high art and low art, about high culture and folk art, what could we say describes those two realms?  High art had the patronage system, but folk art, what was that?  Couldn't we get quasi-Marxist here and suggest that folk art was made at the time and expense of the common person, without any necessary expectation or reality of financial compensation? 

A life in the arts, if you're not in an aristocratic leisure class, might be one where you do it entirely at your own expense, with no actual pay, just because you love making music or art or whatever your hobby is.  If Durkin wanted to really attack the ideologies of authenticity and authorship he could have attacked the legitimacy of the ideological insistence on the superiority of the vocational artist over the hobbyist.  He could have done this, too, by way of Charles Ives.  Durkin's working definitions of authorship and authenticity were too narrow, and so he ended up stopping short of getting into some stuff that could have been fascinating. 

Decomposition doesn't read like either a manifesto or a philosophy of music. It reads like a series of dead ends on tertiary issues that don't seem important to what Durkin's worried about in his book and at his blog.  It reads like a series of false starts and rabbit trails into things that don't address the real ideological stuff going on behind the buzzwords he spent a few hundred pages trying to neutralize to no effect.

Still, fumbling attempts to address how we think about the arts are better than not thinking about how we think about the arts.  I doubt Durkin would agree with Sousa's century-old warning about the dangers of ceding so much of our musical life to machines.  I don't know that he'd even agree with Leonard B Meyer's warning in the 1990s that the danger in our listening is how inattentive and partially focused it is now that it's mediated by machines through which we listen to music while we walk to the grocery store instead of listening in the devoted setting of a concert or a recital.  Really, I do think Durkin could have benefited in writing his book from being more deeply engaged with Meyer's work beyond the book with "emotions" in the title.  I also, obviously, think Durkin could have benefited from spending less time attacking the straw man of the vagaries of music notation and more time on the ideologies behind his ideologies of authenticity and authorship.

Meyer wrote that the problem of value is inescapable.  We can pretend it doesn't matter, we can bury the debates about value in technical jargon, we can find ways around it, but we can't escape the question of what we value and why in the arts.  In some sense Durkin's ultimate failure is that he tries to dismantle the ways we talk about values without questioning the values themselves.  He's not really interested in attacking authorship or authenticity as customs of history or even as ideologies, but as buzzwords he perceives in music journalism.  But the problem is that if we're not talking about music in the aesthetic and technical terms of the construction process, what do we have?  We will tend to fall back on the lifestyle reporting Ted Gioia complained about.  But the Scylla to that Charybdis is that classical music writing became irrelevant, as Richard Taruskin has put it, for being obsessively into shoptalk, talking about the technicalities of the construction process without getting around to what these things mean.

Leonard Meyer wrote in Style & Music that it is impossible to assess a work of art on its own terms.  Our very capacity to recognize something as a work of art is culturally and ideologically mediated.  The paradoxes of Romanticism are that the individuality it prized depended on the norms the individual was expected to subvert.  Meyer described Romanticism as the ideology of elite egalitarians.  The egalitarian impulse was strong in Romanticism, but the trouble is that all art as communication depends on the kinds of conventions the ideology repudiated on the one hand, and on the other hand the capacity to deviate from norms depended on a knowledge base that generally ended up being elitist.  It was ultimately impossible for the 19th century composers to repudiate the syntax and vocabulary of the 18th century.  By the 20th century the norms had been abandoned to degrees that tested the limits of human cognitive bandwidth, and let's just say we don't all listen to Schoenberg in the elevator ... .

Which is to say that Durkin can attack authenticity as much as he wants, but the problem is that in lieu of explicitly formal/formalist values in assessing what music is and whether we call it "good", the default definition for artistic greatness becomes ... authenticity.  Thus old debates about whether whites could really play jazz, or whether black composers could contribute to classical music in the same way whites have, because it's music of the Man or something kind of like that.  Paul Desmond was a great saxophonist and George Walker has written some fine piano sonatas, so there.  That gets to my misgivings about Durkin's project: he doesn't want to affirm aesthetic criteria through which authenticity based on ethnic or class signifiers defines participation in the arts, and I think I totally get the reasons he doesn't want to "go there".  But the trouble is that if we don't go "there" and we don't turn to the proposal that there are aesthetic values we can construe as aspirational absolutes, then we're kind of at a burger joint saying we don't want to buy a bacon cheeseburger when the place doesn't sell chicken.

this week's Salon defense of elective abortion and an old David Livingstone Smith piece on dehumanization as a psychological prerequisite for killing

http://www.salon.com/2015/08/10/i_had_an_abortion_at_planned_parenthood_and_im_not_ashamed/
...

I didn’t have an abortion because I was raped, or because my life was in danger, or because the fetus was the product of incest. I had an abortion because I had recreational sex, got unintentionally pregnant, and wasn’t ready or willing to be a mother. This is something I haven’t written about before, but there comes a point when staying silent begins to look like shame — and I am not ashamed.
...

Back in 2012, when I first discovered I was pregnant, the man I was seeing at the time was so flustered and without resource that he typed “abortion.com” into the search bar of the browser. I don’t blame him, because that was a scary moment for both of us. Fortunately for me, I knew better. I had been going to Planned Parenthood since 16, when a doctor recommended I get on the pill to regulate my menstrual cycle, and I trusted them explicitly with my health because I knew they trusted me explicitly with my body and my future. I didn’t have to tell them I was a 24-year-old working in a restaurant, living in a studio apartment in New York City. I didn’t have to explain that although I think I do want children someday, this man is not the person I wanted them with. I didn’t have to convince them I deserved their respect and kindness; they gave it willingly. And most important, I didn’t have to apologize. I still don’t.

A while back David Livingstone Smith published a piece about wartime acts of mass killing through the usual but important case studies: the firebombing of Tokyo, the Holocaust, the Rwandan genocide. But the author zeroes in not on the atrocities as atrocities but on the rhetoric and narrative that featured in the rationalizations:

http://aeon.co/magazine/society/how-does-dehumanisation-work/
...
What is the common element in all these stories? It is, of course, the phenomenon of dehumanisation. But this is neither recent nor peculiar to Western civilisation. We find it in the writings from the ancient civilisations of Egypt, Mesopotamia, Greece and China, and in indigenous cultures all over the planet. At all these times and in all these places, it has promoted violence and oppression. And so it would seem to be a matter of considerable urgency to understand exactly what goes on when people dehumanise one another. Yet we still know remarkably little about it.

...

My focus is on a different conception of dehumanisation – a deeper one that typically underpins all the others. We dehumanise other people when we conceive of them as subhuman creatures. Dehumanisers do not think of their victims as subhuman in some merely metaphorical or analogical sense. They think of them as actually subhuman. The Nazis didn’t just call Jews vermin. They quite literally conceived of them as vermin in human form.

Look at how European settlers thought about the Africans whom they enslaved. As the US historian of slavery David Brion Davis remarks: ‘It was this extreme form of dehumanisation – a process mostly confined to the treatment of slaves and the perceptions of whites – that severed ties of human identity and empathy and made slavery possible.’ The writings of Morgan Godwyn, a 17th-century Anglican clergyman who campaigned relentlessly for the civil rights of Africans and Native Americans, throw considerable light on how English colonists thought about their putatively subhuman slaves. In The Negro’s and Indians Advocate (1680), he wrote that he had been told ‘privately (and as it were in the dark)… That the Negros, though in their Figure they carry some resemblances of Manhood, yet are indeed no Men.’ ‘They are,’ he continued, ‘Unman’d and Unsoul’d; accounted and even ranked with Brutes’ – ‘Creatures destitute of Souls, to be ranked among Brute Beasts, and treated accordingly.’

The internet being what it is, that invites a brief, dry joke about how Godwyn's observation that humans share common dignity might prevent people from rushing to make the kinds of statements that led to the other Godwin's law.

Anyway ... it would seem that the question of the ethics and methodology of dehumanization merits further thought.  The abortion wars have shown that pro-abortion advocates explicitly and emphatically deny the basic personhood of the fetus.  The fetus isn't a human, it is a clump of cells.  Given that throughout the history of humanity people have dehumanized what they wish to kill in combat (note the "what" there in place of "who"), it's striking and depressing that a magazine like Salon might not see the parallel between an individual celebrating the liberty to terminate a clump of cells on the one hand and, on the other, objections from authors at Salon to pre-emptive military activity meant to stop things going on in another country that might impinge upon American consumer activity.

But then the paradox of social conservatives who don't want a baby to be aborted yet also don't want a social welfare system in place to provide for the child if it isn't born within marriage may literally be the other side of the same coin: we humans selectively humanize and dehumanize.

David Livingstone Smith's simple observation is that you dehumanize the person you want to justify killing in order to secure for yourself the thing you want. What seems a hugely depressing but necessary consideration is that humans always have dehumanized, currently do, and always will dehumanize in order to rationalize killing.  The reason we need to study how and why we do this is not because we will ever stop doing it, but so that we can understand why a conservative might dehumanize people considered enemy combatants to justify continuing military activity in a foreign country as part of a pre-emptive war on the one hand, and why, on the other, a progressive might dehumanize a fetus so that it cannot be considered human and may be killed, permitting the pregnant woman to end the pregnancy rather than deal with the financial and social burdens of raising an unwanted child.

The dehumanization gambit seems precisely the same in both cases, but each version is regarded as morally suspect only by the opposing political/ideological position.  Nobody's going to change their minds about this, but if we consider that there is a dehumanizing move in both kinds of cases, it may give us a chance to ask ourselves not "if" we dehumanize each other but why we might find it beneficial.  The author quoted above has proposed that dehumanization is a necessary psychological/social step toward rationalizing a killing we would otherwise not be comfortable doing.

Thursday, August 13, 2015

so, uh, will Game of Thrones help subsidize Elmo now? Atlantic Monthly feature on HBO/Sesame Workshop partnership

http://www.theatlantic.com/entertainment/archive/2015/08/sesame-street-and-the-achievement-gap/401255/

Sesame’s migration to cable begs to be understood as a failure in public funding, and it is in part. In a kinder society, PBS would have more funding, and it could rush in to support a struggling flagship. But what changed Sesame Workshop’s financial situation wasn’t a PBS funding cut but the media environment itself. The same economics that have hurt musicians—the transition from physical ownership to digital ownership to streaming—are what threatened Sesame Workshop’s budget and sent it running to HBO. In a world with less media ownership, even widely beloved, publicly funded media need a premium patron.
...
Yet the problem remains. Sesame Street was made to give poorer children a leg up, but by virtue of being a popular TV show, it’s helped richer children too. So Thursday’s news, that affluent HBO viewers will get new Sesame episodes for a full nine months before non-premium-cable subscribers get them, doesn’t so much create an unfortunate tension as ratify one. Think of all those DVD and t-shirt sales, and how crucial they were to Sesame Workshop’s bottom-line: Sesame Street has long relied on appealing to richer homes in order to subsidize helping poorer ones.

Now, that relationship will dictate the creation of the show itself.

Maybe this could be construed as: meet the new patronage system ... kinda remarkably similar to the old patronage system?

Oh, and, why not ... another link for the sake of it
http://spinoff.comicbookresources.com/2015/08/13/sesame-street-moves-to-hbo/

Anyone else waiting for a cross-over episode in which The Count is explaining to Tyrion Lannister how fiat currency works?

ribbon farm on leadering which is " ... the art of creating a self-serving account of whatever is already happening and inserting yourself into it in a prominent role."

http://www.ribbonfarm.com/2015/03/12/the-art-of-agile-leadership/
...

See, the difference between leading and leadering is that leading is an extraordinarily rare event: one person getting it right for 20 seconds instead of 5 seconds. And in those 20 seconds, getting enough right, and getting it right enough, that the precious, gooey rightness can be shared with others. When some of this precious, gooey shared rightness  gives an entire group a bit of an edge for a while, we call it leading.

Given the default randomness of the human condition, and the extreme power of compound interest, a little bit of leading goes a long way. Many thriving corporations, for instance, live out their entire lifespans fueled by about five minutes of actual leadership. Sometimes those five minutes can even be attributed to the person who later graduates to full-time leadering.

Episodes of actual leading are rare enough that they do not constitute pervasive, persistent and effective behavior patterns. So we do not in fact need a noun like leadership. Most of what passes for leadership is in fact systematic and self-serving misunderstanding of the pervasive, persistent and ineffective (but mostly harmless) behavior patterns corresponding to the verb leadering. 

Leadering is the art of creating a self-serving account of whatever is already happening, and inserting yourself into it in a prominent role. This requires doing things that don’t mess with success (and the baseline for success is continued survival), but allow you to take credit for it.[emphasis added] Successful companies might have only about five minutes of actual leading in their stories, but they have hour after endless hour of leadering.


And that, unfortunately, could probably sum up the entirety of Mark Driscoll's public career in ministry at Mars Hill.  Sure, formally he may have founded the corporation that came to be known alternately as Mars Hill Fellowship and Mars Hill Church, but so long as he didn't botch things up so badly that the movement died, the group dynamic could keep the momentum going.

Now, well, it remains to be seen whether any of the spin-off churches go the distance.

an idea from ribbon farm's venkat on the inadequacy of faddish collectivism as a replacement for competent leadership.

An idea popular with people who have moved on from Mars Hill is that institutions have been the problem.  Well ... the institution that was Mars Hill certainly was problematic, but a faux-anarchist collectivist utopian ideal wasn't THAT far off from what a lot of people imagined for Mars Hill.  The idea that the remedy for the inevitable corruption in institutions is to forsake them sounds better in theory and on paper than in practice.  When I have needed serious medical attention I have not relied on informal networks to get done what institutions are vastly better qualified to handle.

Over at ribbon farm, there's a little excerpt on corporate cultures and the failure to systematically study how leadership cultures and executives behave.  It comes off as a bit dense and snobby but, hey, that's ribbon farm.  You get used to that part.
 http://www.ribbonfarm.com/2015/08/13/executive-engagement/
...
Almost nobody studies executive engagement. Especially not the people who should  be studying it: employees and stockholders. Most rank-and-file employees lack basic literacy in executive engagement assessment and have no idea how to evaluate their leaders. They swing between mindless idolatory of charismatic executives, faddish collectivist religions that serve as an alternative to trusting leaders, and stubborn resistance to the more hapless executives.

when a Vogue cover is construed as a political statement at The Atlantic ...

http://www.theatlantic.com/entertainment/archive/2015/08/beyonce-hair-on-vogue-september-issue-cover/401246/
...
And here is the most powerful female celebrity on the planet, on the cover of the biggest issue of what is arguably the world’s most important fashion magazine, seeming to push back against all that. Bey and Vogue are not necessarily recommending that the Normals of the world start rocking stringy hair. What they are doing, though, is what all high fashion will, in the end: They’re setting a new benchmark. They’re suggesting that unkempt hair, Cerulean sweater-style, can and maybe even should trickle down to the habits of Vogue’s readers and admirers and newsstand-passersby. They’re making a political statement disguised as an aesthetic one. Here is Beyoncé, whose brand is strong enough to withstand being photographed with stringy hair, suggesting that, for the rest of us, the best hairdos might be the ones that don’t require all the doing
 
But that might be the strength of the brand: it permits an alpha female in contemporary society to get away with stringy hair because everything else looks picture-perfect.
 
To be sure, there's probably an equivalent of this kind of thing for guys. Fantasy football, maybe?  Debating whether Captain America or Wolverine would win a fight?  The political statement seems to be more about Beyoncé's brand than about what may ever be true for any other woman ... but maybe that's just a guy's take on this.

Then again, let's revisit what Amanda Hess wrote a while back at Slate:

Beyoncé is the living embodiment of diversifying beauty standards for women in America, but in many ways, she now is the standard, and it’s still an unattainable one.

Slate features musical examples of modal mutation, horror themes moved from minor to parallel major keys

http://www.slate.com/blogs/browbeat/2015/08/13/horror_theme_songs_in_major_key_are_chillingly_hauntingly_dorky_video.html?wpisrc=burger_bar

The video employs modal mutation (shifting a theme from its minor key into the parallel major) on five famous themes from horror/sci-fi shows; a minimal sketch of how that shift works follows the list below.

The X-Files in parallel major sounds like an out-take for the triumphant strains to Chariots of Fire.
Halloween sounds like bumper music for an NPR segment.
Saw, heh, might as well be an episode of House. Same basic premise, right?
The Exorcist, more like NBC Nightly News bumper music.
Nightmare on Elm Street ... they compare it to Polar Express but I'm inexplicably getting more of an Air Supply vibe.
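
For anyone curious about the mechanics, mutation into the parallel major amounts to raising the lowered third, sixth, and seventh scale degrees of the minor key a semitone while everything else stays put. Here's a minimal Python sketch of that mapping; the note numbers, melody, and function name are my own illustration, not anything taken from the Slate video:

def to_parallel_major(notes, tonic=60):
    # notes are MIDI numbers; 60 = middle C (chosen here for illustration)
    # pitch-class offsets 3, 8, and 10 above the tonic are the lowered
    # 3rd, 6th, and 7th degrees of the minor mode
    lowered = {3, 8, 10}
    return [n + 1 if (n - tonic) % 12 in lowered else n for n in notes]

# a hypothetical fragment in C minor: C, Eb, G, Ab, Bb, G
print(to_parallel_major([60, 63, 67, 68, 70, 67]))
# [60, 64, 67, 69, 71, 67], i.e. C, E, G, A, B, G in C major

Real themes are messier than this, of course (chromatic notes don't map so neatly), but the gist of the gag is that one-semitone nudge.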

Wednesday, August 12, 2015

revisiting an old idea from an old guest post in light of a newly discovered book by Leonard B. Meyer called Music, the Arts, and Ideas

Some of you regular readers may recall this guest post over at Internet Monk back from 2012.
“There Is neither Art nor Pop, neither Indie or Mainstream…”

Well, in that post I proposed from Galatians 3:28 and Colossians 1:19-20 that there was no high or low, rock or pop, classical or non-classical, indie or mainstream.  Now, of course, it's not that these stylistic distinctions don't exist; they clearly do!  But for a Christian who is also a musician, following Christ suggests that if in and through Christ Jew and Gentile are reconciled to God and to each other, and if in and through Christ it was God the Father's pleasure to reconcile all things to Himself, then a fairly natural application is this: if that reconciliation is true of peoples, it can also be true of the music those peoples create and share.

Which gets me to the 1967 book by Leonard B. Meyer called Music, the Arts, and Ideas.  There is a lot that could be written about this book but, basically, Wenatchee The Hatchet thinks you should read it.  It's heady and at times opaque in literary style, but Meyer's prediction back in 1967 was that there would no longer be a musical "mainstream" for what we call classical music; there wouldn't be any mainstream at all, and there would no longer even be an avant garde.  That particular point can be explicated a bit later; the kicker for this paragraph is that Meyer predicted what he called a dynamic steady state, a polystylistic stability that would become the new norm.

One of the practical bits of advice he had for composers and musicologists of the 1960s in academic settings was to not imagine that total serialism and atonality would ever become the new dominant style.  There would be many, many reasons for this but the simplest one had to do with "redundancy".  In terms of the information that has to be perceived for there to be a musical experience, Meyer proposed that the shortcoming of atonality, and particularly of total serialism, was that it foisted too much information too fast on an audience that was not privy to the precompositional constraints and methods the composers of this music brought to the table.  In other words, there were composers writing music that was not so much music as what Meyer and Adorno described as a set of relationships to be studied.  Any question about why on earth anyone would want to LISTEN to total serialism could be met with a defense of the chart, establishing the shrewdness of the precompositional process that led to the musical result.

In other words, it looks pretty kick-ass on paper there but you won't hear why that is in the finished product.  Meyer anticipated that over time that sort of music wouldn't last because people literally would not be able to remember it.  People don't have the mental bandwidth to remember music where every split second must be heard as it is in order for the music to be understood and appreciated.
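
Meyer was borrowing "redundancy" from information theory, and a rough gloss (my notation, not a formula quoted from Meyer's book) makes the argument concrete: if H is the actual entropy per musical event and H_max is the maximum entropy if every continuation were equally likely, then

R = 1 - H / H_max, with 0 <= R <= 1.

Tonal idioms keep R comfortably high, since most continuations are somewhat predictable, which is what lets listeners chunk and remember what they hear; total serialism drives R toward zero, making every event maximally informative and, on Meyer's account, impossible to retain.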

There was another reason there would be no avant garde: not just because what had become known as the avant garde had become a crew demanding the impossible of its soon-to-be dwindling audience, but also because Meyer proposed we had reached the point Fukuyama would later call the "end of history".  Francis Schaeffer's variation was to say the Christian worldview was no longer endorsed.  What Meyer actually described was history becoming non-teleological: it no longer even had a goal, and the avant garde, as a progressive movement or dynamic within the arts, is only ideologically feasible in a history in which a goal is presupposed.  No goal for human history?  No possibility of the avant garde.  That's fleshed out a bit in a chapter called "The Renaissance is over", where Meyer proposed that an avant garde in the arts could emerge from a Protestant Christianity that considered self-improvement and self-refinement important, precisely because of its teleological approach to both personal and collective history.  That, he argued, no longer applies.

Meyer, however, noted that the desire for self-perfection didn't go away; the aspiration simply shifted from sainthood to self-improvement.  More or less, the self-help industry became the secular iteration of aspiring to the perfection of the fully understood self, in place of a Christian contemplative/ethical tradition.  But Meyer also noted that it's not that traditionalists and traditions would cease to exist; it's that they would exist in a realm of arts and ideas more formalist in conception and more wide-ranging in options.

So, where does that go with this blog?  It's been fun to discover that what Wenatchee The Hatchet has taken as a given over the last twenty years, that we have a wonderful poly-stylistic present to enjoy and share, was anticipated by a musicologist half a century ago.  What it can mean at a practical level may be more easily explained by someone like the Cuban guitarist and composer Leo Brouwer, who has said that the future of music is probably in stylistic fusions, but that the academy has not even bothered considering this musical route because, well, they're academics, and so they're busy studying the styles that do exist rather than exploring how fusions across styles may be possible or practical, let alone actual.

For Christian musicians there is a theological (Meyer might say ideological) incentive to explore what fusions are possible.  Each style can be its style but if followers of Christ are trusting in the effective reconciliation of previously inimical groups this can be applied to music.  "Classical" and "popular" music haven't even been at odds in many a culture.  Sure, the high Classic era sound might not seem anchored in Polish and Austrian and German folk music but that doesn't mean these things have not interacted. 

Sherman Alexie and others can say no good art comes from assimilation, but let's play with that idea a little.  Have good cultures come from obsession with cultural/ethnic purity?  It's not as though Richard Wagner's ideas about the inability of Jews to have musical or any other culture with "soul" couldn't fit into the idea that no good art comes from assimilation and that all really good art is tribal.  But what if really good art that is tribal comes from tribes that kill other people?  And here's a point Sherman Alexie may be able to appreciate: when so many American Indians were massacred that very few now even know how to speak their own languages, how will truly tribal American Indian arts and literature thrive?  Alexie has written some fine stuff in the English language, but in that sense his art was assimilated via language before he was even a cult figure.  Tribalism and assimilationism may just be dynamics along a continuum, and neither is necessarily bad or good.

To formulate this in terms of the Christian canonical documents, if we take Galatians and Colossians to apply to the music and not just the peoples who are reconciled to God through Christ, we may get to have some fun exploring how each culture can be as pure as it wishes to be while those who are interested in how their at times contrasting aesthetics can be reconciled through a fusion can do that, too.

So that's sort of a rambling, roundabout teaser for why Leonard B. Meyer's Music, the Arts, and Ideas was a very, very fun read.  It'll also get to why Andrew Durkin's Decomposition was such a disappointment.  He needed at least two more Meyer books under his belt before he undertook a ramble of 300 pages that covered things Meyer knocked out in 100 pages almost half a century ago and stuff that Paul Hindemith covered in the first 40 pages of A Composer's World. :(

at the Atlantic Lukianoff and Haidt on "The Coddling of the American Mind" and the emergence in academia of "vindictive protectiveness" and how it differs from the old political correctness

 http://www.theatlantic.com/magazine/archive/2015/09/the-coddling-of-the-american-mind/399356/
...
The press has typically described these developments as a resurgence of political correctness. That’s partly right, although there are important differences between what’s happening now and what happened in the 1980s and ’90s. That movement sought to restrict speech (specifically hate speech aimed at marginalized groups), but it also challenged the literary, philosophical, and historical canon, seeking to widen it by including more-diverse perspectives. The current movement is largely about emotional well-being. More than the last, it presumes an extraordinary fragility of the collegiate psyche, and therefore elevates the goal of protecting students from psychological harm. The ultimate aim, it seems, is to turn campuses into “safe spaces” where young adults are shielded from words and ideas that make some uncomfortable. And more than the last, this movement seeks to punish anyone who interferes with that aim, even accidentally. You might call this impulse vindictive protectiveness. It is creating a culture in which everyone must think twice before speaking up, lest they face charges of insensitivity, aggression, or worse.

Apropos of a theme the Atlantic has been exploring plenty: not only can it be said there is an environment of "vindictive protectiveness", education in the United States has also become goal-focused to the point where love of learning has been supplanted by a push to reach targeted goals.

 http://www.theatlantic.com/education/archive/2015/08/when-success-leads-to-failure/400925/
...
The truth—for this parent and so many others—is this: Her child has sacrificed her natural curiosity and love of learning at the altar of achievement, and it’s our fault. Marianna’s parents, her teachers, society at large—we are all implicated in this crime against learning. From her first day of school, we pointed her toward that altar and trained her to measure her progress by means of points, scores, and awards. We taught Marianna that her potential is tied to her intellect, and that her intellect is more important than her character. We taught her to come home proudly bearing As, championship trophies, and college acceptances, and we inadvertently taught her that we don’t really care how she obtains them. We taught her to protect her academic and extracurricular perfection at all costs and that it’s better to quit when things get challenging rather than risk marring that perfect record. Above all else, we taught her to fear failure. That fear is what has destroyed her love of learning.

These days it's apparently not good enough for anybody to be average, or even below average, in academics.  Yet decades ago, in one of the more bracing introductions to a college course I ever took, a professor said: "I don't want us to misunderstand each other. Most of you are average writers. Most of you will get average grades.  There is nothing wrong with getting a C."
http://wenatcheethehatchet.blogspot.com/2010/10/liberation-of-being-average.html
http://wenatcheethehatchet.blogspot.com/2010/11/liberation-of-being-average-part-2.html

But then, given the trajectories of academic life and publishing these days, it seems like a life that demands much and rewards little.  Twenty-some years ago getting into an academic setting to do research seemed like a fun idea.  Now, given the shifts in the culture, it seems more like a dodged bullet, and it's hard to commend higher education with its attendant debt.

Kyle Gann quoting Morton Feldman on the end of multi-movement form ... during the heyday of the rock and roll concept album

http://www.artsjournal.com/postclassic/2015/06/rethinking-multimovement-form-2.html
I remember Morton Feldman saying in the ’70s that if there was one musical idea that was finally dead, it was multimovement form.

But weren't the 1970s arguably the apotheosis of the concept album in progressive rock and pop music?  It might have seemed to Feldman that multimovement form was dead in concert music, but the idea of the album picked up in popular music where multimovement form merely seemed like a dead letter somewhere else.  It's not as though Dark Side of the Moon is multimovement form in the same way Beethoven's Hammerklavier is, but who says it has to be?  Ideas can be elastic.

Tuesday, August 11, 2015

HT Phoenix Preacher, Carl Trueman not positively impressed with recent podcast by Tchividjian--WtH on his growing appreciation of the Book of Judges as a canonized confession of atrocities

...
Tchividjian and Driscoll are both products of the way American showbiz aesthetics and values drive so much of the evangelical subculture. Style and swagger and soundbites -- and little else. And they both benefited from the fact that nothing immunizes one to accountability in America like success.  As long as you are successful, no-one calls you on your behaviour, no-one makes you answer the hard questions, and plenty of people are happy to use your name to sell tickets to their gig.  The tragedy is that good men are then allowed to go bad, and outright charlatans are allowed to continue with influence, with groups like TGC backing them until the public relations problems, not the obvious theological and accountability issues, render them too hot to handle -- long after others have been pointing out the obvious ...

there's more beyond the link but that's the most memorable excerpt for WtH. At the risk of linking back to an earlier post here ...

a general observation, let's put the "no true scotsman" defense away, especially if it's for "our" team

Whether we're talking about a Mark Driscoll or a Tony Jones or a John Howard Yoder or more recently a Tullian Tchividjian.

Pick your team, there's not only going to be rampant moral failure, there's probably also going to be at least one atrocity.

It's just a matter of whether you've looked at your team's history long enough and honestly enough. 

In the last few years I've immersed myself in the book of Judges.  It's one of those books of the Bible people don't rush to read, but it's fascinating.  It's too easy for lazy readers to take it either as "it's in the Bible so God must have approved of it all" or as "here's what all those sinners in days of yore did that we wouldn't do."

That's not necessarily why the book of Judges is in the canon.  Whatever your team, your team has atrocities to its name.  Think of Judges as a confession of God's people: "We did this and it was horrid."  You can't read the book of Judges as a believer and not remind yourself that this book is about your team and mine, whichever teams we think we're on.  Levites, the ones who should have been pioneers of faith, were pioneers in wickedness, whether in the mercenary promotion of idolatrous cults or in throwing a concubine out to be gang-raped and then, when she was not necessarily even dead yet, slicing her up into pieces and mailing her remains to the tribes to instigate a civil war that nearly ended in the genocide of a tribe of Israel.

If you think we couldn't and wouldn't do that today, then remember what internet outrage is. :) We're just tempted to do it with reputations rather than physical bodies.  Crimes and atrocities get done ... but if we don't read and appreciate the book of Judges and its place in the canon, the great illusion we'll sell ourselves, and that every day, is that whatever atrocities we see in that book, WE would never do them.  But we certainly would ...

Matthew 23
29 “Woe to you, teachers of the law and Pharisees, you hypocrites! You build tombs for the prophets and decorate the graves of the righteous. 30 And you say, ‘If we had lived in the days of our ancestors, we would not have taken part with them in shedding the blood of the prophets.’ 31 So you testify against yourselves that you are the descendants of those who murdered the prophets. 32 Go ahead, then, and complete what your ancestors started!

To those who comforted themselves with the idea that they wouldn't have harmed the prophets, Jesus offered the rebuke that they were going to keep harming the prophets and finish the job.  If you think your team isn't answerable for any destruction of lives, just wait; your turn may soon come, if it hasn't already.

The older I get, the less I believe the old evangelical saw that "ideas have consequences."  It could be true, but to go by the consequences playing out across much of the Christian scene over the last forty years (and even among some non-Christians), if ideas have consequences then a whole lot of people seem to be having the same ideas, even the ones who have made careers out of thinking, and telling others, that this isn't the case.

James Parker at the Atlantic reviews a book and draws an interesting parallel between the Inklings and the Beats

http://www.theatlantic.com/magazine/archive/2015/09/j-r-r-tolkien-and-c-s-lewis-revived-myth-telling/399347/
Who can compare with these writers? In the intensity of their communion, their accelerating effect upon one another, and their impact on posterity, their only real 20th-century rivals are the Beats. And the Inklings would have detested the Beats. Nonetheless, the two core groups can be mapped onto each other with weird precision: Tolkien would be Kerouac, sensitive maker of legends; Lewis, the broad-shouldered preacher-communicator, would be Allen Ginsberg; Charles Williams, kinky magus, would have to be William Burroughs; and the sagacious and durable Owen Barfield, Gary Snyder. (The Inklings had no Neal Cassady, no rogue inspirational sex idol—they were all too grown-up for that.)

But the Beats, bless them, consumed the greater portion of their own energies, with the result that their influence went mainly into rock and roll and advertising, and stayed there. The Inklings, on the other hand, are still gathering steam. Tolkien revived in us an appetite for myth, for the earth-tremor of Deep Story. (See: Game of Thrones, and the pancultural howls of pain at the death of Jon Snow.) Lewis invented Narnia—though the exacting Tolkien regarded it as an incoherent mythology—and he may be, write the Zaleskis, “the bestselling Christian writer since John Bunyan.” As for Williams and Barfield, they hang in the tingling future: for the former I prophesy an H. P. Lovecraft–style cult (with creepy folk music), and for the latter, cosmic vindication. And Warnie serves another round of drinks, and the Inklings, huffing and puffing and hurtling through time and space in their armchairs, have their victory.

Noah Berlatsky on why, in spite of Alan Moore's intentions, Rorschach's the actual hero in Watchmen and why the superhero is the supervillain

http://www.splicetoday.com/politics-and-media/rorschach-for-president
...
Rorschach is the character who seeks out the truth, opposes the supervillain Ozymandias, and dies because he is horrified by the deaths of ordinary New Yorkers. "…despite Moore's intent, Rorschach becomes not figure of satire but moral center of book. And ironically reaffirms ideal of superhero," Heer insists.
...
Rorschach is one of my favorite superheroes in comics. I don't know why Cruz likes him, but I think Polo is wrong in seeing the character as unsympathetic, and Heer is wrong in seeing him as affirming superheroism. Instead, I think Alan Moore and David Gibbons set up Rorschach so that he’s sympathetic because he fails to embody the vigilante archetype that comic book readers, and Rorschach himself, set up as an ideal.

Rorschach sees himself as brutal and without emotion; he wants to be brutal and without emotion. But there are hints throughout the comics that he's not nearly as hard-hearted as he appears. In fact, his entire super-hero career is built on empathy. He decides to become a costumed vigilante after hearing about the Kitty Genovese murder, in which (at least legendarily) a woman was raped and murdered in Queens while apathetic neighbors looked on. He becomes, psychologically, Rorschach, discarding his Kovacs self, after investigating the disappearance of a young girl and finding that she was murdered.
Rorschach wants to be hard; he wants to be unmoved by pain; he wants to be the avenging angel. But he's a softy. And ultimately, when he's faced with the apocalypse for which he prayed at the beginning of the book, he quails. Ozymandias, the liberal one-worlder, is the guy who can look down and whisper "no." Rorschach is horrified at mass murder. He's not the judging uber-father: he's just Kovacs, the kid, who is motivated by empathy for others' pain. And so he takes off his mask and dies, not as Rorschach the superhero, but as ordinary Walter Kovacs.  Contra Heer, Rorschach doesn't validate the awesomeness of superheroes. Instead, he shows that the real superhero (Ozymandias) would be a supervillain, and that empathy, and decency, are only possible when you stop trying to be some sort of psychotic avenger, and take off the mask.

Slate--the reason the Fantastic Four movies have been lame is their leader is an arrogant elitist tool, and a brief consideration on his basic story

 http://www.slate.com/articles/arts/culturebox/2015/08/the_fantastic_four_movie_flop_the_marvel_superheroes_are_gigantic_jerks.html

While they survive, the rays profoundly transform them, turning them all into victims of Richards’ profound lack of scientific ethics—and somehow making them into superheroes entirely by accident. Faced with the abject failure of his work, he is not humble. “I’ll call myself … Mr. Fantastic,” he exclaims to the friends he has just permanently mutilated. Presumably still addled by translunar radiation, they do not object to his unearned, self-congratulatory hubris. In another sort of story, he would be the villain. Here, for some reason, he is a hero.

It's not an unfair point at all, really.  An ambitious, overeager scientist too confident in his ability to foresee and control the outcome of meddling with things beyond his ken, with an unexpected transformation as the result: that doesn't just sound like Reed Richards. There's ...

Curt Connors
Otto Octavius
Norman Osborn (there's this weird pattern with Spider-man villains ... )
how about that guy who became Morbius the Living Vampire?

Yeah ... Reed Richards may ostensibly be the hero in the tale, but the backbone of his origin story ALSO reads like a who's who of Marvel villains' origin stories, doesn't it?

Monday, August 10, 2015

a ramble from ribbon farm, on Frontierland and how Disney is America, precisely because it gives us a narrative and a frame that makes no authenticity claims.

Ultimately, Disneyland and video games have been more successful than back-to-the-land primitivism in filling the vacancy left by the closed frontier. This is precisely because they make no authenticity claims. Freed from the preoccupation with sincerity and faithful representation, they allow the creation of really new worlds of narrative and shared values.

Atlantic: as retiring academics collect pensions, that operating cost will get passed along to students in higher tuition

http://www.theatlantic.com/education/archive/2015/08/entitlements-for-education-pension-universities/400820/
...
“We’re no longer really funding students,” said Jane Wellman, a university-financing expert and senior advisor to the College Futures Foundation, a California-based advocacy group aimed at removing barriers to higher education. “We’re funding benefits.” ...

There's a bit more where that came from but there's the pull quote to whet your appetite, just in case.

Slate piece on academia and tenure tracks, on the fantasy that " ... there is some fantasy space for intellectual work that operates outside of the real economy. Intellectual work has to be supported with actual money."

http://www.slate.com/articles/life/education/2015/08/the_professor_is_in_karen_kelsky_creates_a_delusion_free_job_search_for.html
Karen Kelsky: That there is some fantasy space for intellectual work that operates outside of the real economy. Intellectual work has to be supported with actual money. In the Renaissance, it was aristocratic patrons. In the high-growth postwar period in the U.S., the government made this investment, and that is when the current system of graduate training was established. We all forget this history and believe that the option of doing scholarly work is available to anyone with the talent, and that it’s above mundane concerns of money. It is neither. Refusing to foreground the actual monetary costs of academic labor in the current economy is a kind of grad-student gaslighting, and a form of abuse.

It isn't just the evaporation of tenure-track lines and the scandal of adjunctification. It’s the systematic debt that is now part of the graduate school experience. Graduate school debt is the fastest-growing form of student debt. According to the National Science Foundation, the average grad student debt is almost $60,000, and 20 percent of graduate students owe over $100,000. This is not in law and medicine. This is in the humanities, where there isn’t the faintest hope of a salary sufficient to pay off those amounts, even in the unlikely event that the student gets a tenure-track offer.

Sunday, August 09, 2015

DG Hart on the unscrupulous and dishonest pragmatism of Daleiden, a few thoughts on the lack of a real need for secretly obtained content in many cases of journalistic research

http://oldlife.org/2015/08/journalists-and-saints-together/
...
But here’s the worst part of the journalistic-ethics defense of Daleiden. If a journalist went to a Roman Catholic archbishop and presented himself as a member of the church and in need of sacramental grace as part of a way of doing an expose of clerical sexual misconduct, what would the social conservatives say? Is that the way journalists behave? How loud would the outcry be over such dishonesty?
 
Or how about a reporter who while doing an interview with Mitt Romney to gain better access to insider information, what if that reporter presented himself as a fellow Mormon (when he wasn’t) and a regular donor to the GOP (which he didn’t)? Would anyone possibly take that “reporter” seriously as a journalist? Would Romney or his staff?

None of this means that Daleiden doesn’t deserve some credit for exposing a truly despicable aspect of American society. But if he is going to claim either the mantle of journalistic ethics or Christian morality, can’t we/I question that?

While this could inspire some ruminations on what some call watchblogging, that's for some other time ... maybe.