Saturday, February 24, 2018

some links for the weekend on a few things musical from both sides of the brain, worldviewisms, and other stuff

https://newrepublic.com/article/114221/orchestras-crisis-outreach-ruining-them

https://www.wqxr.org/story/317987-timid-programming-classical-musics-biggest-threat/

https://van-us.atavist.com/surveying-the-orchestra

From the inquisitive “Is Timid Programming Classical Music’s Biggest Threat?” in WQXR to the damning “America’s Orchestras are in Crisis” in New Republic, discussions of programming repeat an alarming diagnosis: performing groups choose repertoire from a rapidly shrinking list. The New Republic’s Philip Kennicott thinks that managements cater to a caricature of an elderly audience who only wants to hear Beethoven’s Fifth Symphony. “Many in the managerial class… care deeply about the rich, variegated, and complex history of classical music, but can find no practical way to offer that history to like-minded patrons,” he writes. “There are fewer and fewer safe pieces,” Opera America’s president Marc Scorca said.
What qualifies as safe? Beethoven’s Fifth will do, but what about Symphony No. 2? Can a Nielsen symphony take the place of Sibelius 5? Discussions of programming can quickly become charged with unsupported claims, so, in order to form a more detailed picture of one performance culture, I examined the mainstage concerts of 16 orchestras in the San Francisco Bay Area during the 2016–17 season, tallying more than 300 entries in a spreadsheet. The results? Whether ranked by number of unique works performed or by number of appearances, the most-represented composers were Beethoven (27 appearances of 14 unique works) and Mozart (22 appearances of 20 unique works). Trailing slightly behind were Verdi and Brahms, Shostakovich and Mahler, and a smattering of Sibelius, Rachmaninoff, Tchaikovsky, Prokofiev, Copland, Stravinsky, and Haydn.

...
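As an aside, the kind of tally the VAN piece describes is simple enough that anyone could replicate it for their own region's orchestras. Here's a minimal sketch of the counting logic in Python; the file name and the "composer" and "work" column names are my hypothetical stand-ins, not anything specified in the article:

    import csv
    from collections import Counter, defaultdict

    # Tally appearances and unique works per composer, assuming one CSV row
    # per programmed performance of a work (hypothetical file and columns).
    appearances = Counter()
    unique_works = defaultdict(set)

    with open("season_2016_17.csv", newline="") as f:
        for row in csv.DictReader(f):
            composer = row["composer"].strip()
            appearances[composer] += 1
            unique_works[composer].add(row["work"].strip())

    # Rank by total appearances and report unique-work counts,
    # e.g. "Beethoven: 27 appearances of 14 unique works".
    for composer, n in appearances.most_common(10):
        print(f"{composer}: {n} appearances of {len(unique_works[composer])} unique works")
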

Now, as a guitarist, I suppose I might as well put my cards on the table and say I think Kyle Gann undersold things when he proposed that we "make way for the guitar era," because the 20th century already was the era of the guitar, especially if we stop looking at the Western canon in terms of the symphony or the string quartet and look at the Western musical world in terms of popular music and vernacular styles.  It was the guitar era, and it stopped being the guitar era within the first decade of the 21st century.  Gann later published another post about how the guitar era he thought he was seeing came about: GAMA.

http://www.artsjournal.com/postclassic/2003/12/guitar_mystery_solved_gama_did.html

Long-time electronic composer and general Downtown raconteur Tom Hamilton sends me an interesting fact in response to my perceptions of the guitar’s takeover of the composing world:

In 1995, an industry group called the Guitar and Accessories Marketing Association (GAMA), along with the NAMM and MENC, launched a program to train teachers to start guitar programs in middle and high schools. That group estimated that by 2001, over 200,000 students had learned guitar in school, and over 38,000 students had bought their own guitar. They project a trend that by 2010 will have over 1.5 million students learning guitar in school programs, and over 300,000 students purchasing guitars. And that’s just through one school-based program! My observation is that most guitarists learn through woodshedding and private lessons without any institutional structure at all.
So no wonder young guitarists seem to be coming out of the woodwork: it was a calculated industry initiative! Tom also notes that when he was in school (and he and I are roughly the same antediluvian age, struggling together to figure out these youngsters), guitarists had to major in piano and take guitar lessons on the side. Bard, I might note, and to brag about my own institution for a moment, allegedly boasts the country’s oldest college guitar program, begun around 1968 by our cellist/guitarist Luis Garcia-Renart. Perhaps that’s why, to this day, a good half of my students are guitarists.
...

I skipped the "Mostly Mozart" programs because I'm honestly not much of a fan of Mozart.  I adore the music of Haydn, for instance, and even though I like some later Mozart I've found I have more fun with some later Clementi opus numbers than with the usual Mozart pieces.  It's not that Mozart wrote bad music or anything; it's just that I'm vaguely indifferent to Mozart and his cult compared to Haydn, much as I'm vaguely indifferent to the perfection of Palestrina compared to the music of William Byrd.  But I admit to having anti-Romantic convictions and sympathies, so I'm not on board with the art-genius cults that got formulated from the 19th century onward, a cultish mentality that has spread all over the Western arts.  I'd probably enjoy Kubrick films half again as much if Kubrick fans didn't venerate him the way Wagner fans venerate Wagner.

In a time when people are concerned about toxic populism (and at this point we don't really need to rehearse the script for that here in the United States), I don't think an endorsement of the liberal arts should be construed, at any level, as some kind of healthier or saner alternative.  Arts funding in the conventional sense has been subject to a variety of crises: there's been concern the NEA might get gutted, and concern that even among the traditional arts organizations and subcultures the distribution of funds is unequal and unfair.  Debates about just how white, male, and dead so many of the canons are don't have to be seen as merely an outgrowth of some kind of cultural Marxism.  If arts funding as it has been tends to just re-entrench a status quo of an arts canon that will leave no room for work from you or me or living artists we know, then how beneficial is the arts funding culture to anyone who is not already a star?  That's not a denial of the value of arts funding; it's a question about whether some of the crises associated with the arts are coming at a time when, across the Anglo-American world, there's been room to ask just how elitist arts institutions can become, and have become in a few cases.

https://www.artsprofessional.co.uk/news/specialist-arts-colleges-among-most-elitist-country

Given how recently he died, that Shostakovich has become canonical within the symphonic literature is instructive.  Considering that he's 20th century and that, in Western cultural terms, he was a Soviet composer who was on the "wrong" side in the Cold War, the adoption of a legend that Shostakovich was a secret anti-totalitarian dissident might tell us less that this was the case (not to rehash the Shostakovich wars all over again) than that Shostakovich's music was popular enough in the West that the story of secretive dissidence became a powerful narrative through which to legitimize enjoying his music in Western cultural terms.

But what that baptism of the music of Shostakovich as the work of a secret dissident suggests is that even a realm of scholarship as seemingly disconnected from political histories as musicology (sarcasm warning) is full of attempts to praise or damn the work of a composer in terms of what is known in Christian punditry as ... worldview.

That gets to a little piece at The Davenant Institute that's handy for this particular question.

https://davenantinstitute.org/whats-bad-worldview/

Yes, there's something bad about worldview formulations and questions when they get handled in a routine way.  It's why, on the whole, evangelicals need to break free from an often slavish invocation of categories introduced in the Francis Schaeffer and Van Til way of ... engaging culture.  I've written about this in the past by comparing the condemnation Francis Schaeffer made of the music and thought of John Cage to the condemnation of the same made by Cornelius Cardew and John Tilbury--the Maoist criticism of Cage and his music ended up reading, in essence, exactly the same as that of a conservative Presbyterian: that man as man cannot live by the ideas and ideals espoused in the art.

Or, to invoke David Martyn Lloyd-Jones a bit, the communist and the capitalist are equally sinners before God, so there's no good point in deciding that only one of these groups of people needs to hear the message of Christ preached to them.  Merely being anticommunist does not make a person a Christian, even if Western cultural inertia might lead people to imagine this is the case in a Cold War context.  But, obviously, we're no longer in the midst of the Cold War.  I've got a variety of issues with critical theory as the darling of a privileged elite who can afford to wax eloquent about how all art is political as a bromide for getting more cultural support or funding for their own artistic ventures, while forgetting that the history of elite and vocational art is essentially a history of vocational artists serving empires; but not everything I've read from people associated with the Frankfurt school is all bad.  That's for later.  But ... since Quincy Jones did that bananas interview I can touch on one of those ideas in a sideways way.

...
 
Do you hear the spirit of jazz in pop today?
 
 
No. People gave it up to chase money. When you go after Cîroc vodka and Phat Farm and all that shit, God walks out of the room. I have never in my life made music for money or fame. Not even Thriller. No way. God walks out of the room when you’re thinking about money. You could spend a million dollars on a piano part and it won’t make you a million dollars back. That’s just not how it works.
  
Is there innovation happening in modern pop music? 
 
Hell no. It’s just loops, beats, rhymes and hooks. What is there for me to learn from that? There ain’t no fucking songs. The song is the power; the singer is the messenger. The greatest singer in the world cannot save a bad song. I learned that 50 years ago, and it’s the single greatest lesson I ever learned as a producer. If you don’t have a great song, it doesn’t matter what else you put around it.
 
 What was your greatest musical innovation?
 
Everything I’ve done.
 
 Everything you’ve done was innovative?
 
Everything was something to be proud of — absolutely. It’s been an amazing contrast of genres. Since I was very young, I’ve played all kinds of music: bar mitzvah music, Sousa marches, strip-club music, jazz, pop. Everything. I didn’t have to learn a thing to do Michael Jackson.
 
...
 
Musical principles exist, man. Musicians today can’t go all the way with the music because they haven’t done their homework with the left brain. Music is emotion and science. You don’t have to practice emotion because that comes naturally. Technique is different. If you can’t get your finger between three and four and seven and eight on a piano, you can’t play. You can only get so far without technique. People limit themselves musically, man. Do these musicians know tango? Macumba? Yoruba music? Samba? Bossa nova? Salsa? Cha-cha?
...

The irony of Quincy Jones saying that the art of music depends on what we colloquially know as the left side of the brain, even within pop music, is a savory one: Adorno (he of the Frankfurt school) damned all popular music across the board as not even being art, as a thing that was (besides not even being art) only able to peddle what someone like Girard might have called mimetic desires.  Not that high art was much more than a manifestation of the oppressive bourgeois ruling caste.  The trouble with Adorno's criticism of high and low alike is that, to tersely summarize the Roger Scruton case against Adorno, Adorno's damnation of Western civilization had only a will-never-exist utopia as an alternative.  A criticism made by David Roberts in Thesis Eleven is that Adorno and Horkheimer were so sweeping in their condemnation of the dehumanizing trends in the modernity of the Enlightenment that they damned the concept of civilization itself in the process, and revealed that they were paradoxically as trapped in the mentalities of the long 19th century as the reactionaries they set themselves against.  Even from within the left, the fact that a good deal of Adorno's condemnation of jazz and popular music comes across in the 21st century as both elitist and racist can't actually be evaded by just insisting that Adorno hadn't heard any of the good jazz yet.  Even in "On Jazz" Adorno's dismissal of even Ellington was swift and sweeping.  Adorno's biggest fans have to contend with Adorno appearing to believe that the tonal vocabulary of Romanticism was so dead that not even black guys could salvage it by reformulating it into jazz.

And yet Jones could say that the problem with popular music today is that there aren't even any songs any more, that people forget the left side of the brain has to be involved.  Adorno damned Hindemith as a reactionary composer, but even Adorno could grant that Hindemith, reactionary and regressive though he was, was at least a competent musician, and that someone like Bartok came closest to charting some kind of middle path between Schoenberg and Stravinsky.  Adorno was concerned, besides with the idea that capitalism ruined all of society, that the Romantic-era mythology of the seat-of-the-pants genius destroyed a capacity to think about art as a mental discipline and craft.  That Jones, as someone who worked in popular musical styles for decades, could make criticisms of contemporary popular music that resemble the core of some of Adorno's critiques swims in irony: Jones echoes a critique Adorno made of popular music, but in a way that grants a capacity for art within popular idioms that Adorno was unable to grant was even possible.

There's been stuff incubating that I want to write about, but as with so many projects I try to tackle in my writing, things are complex.  I haven't been as prolific a writer in the last few years as I was earlier, and some of that is just that things in the offline world happen.  I'm also composing a couple of musical projects and incubating a couple of writing projects while all that real-world stuff happens from time to time.

I haven't felt like writing much about the passing of Billy Graham.  I don't subscribe to altar calls, and I reject revivalism as a form of nationalist civic religion masquerading as a concern for genuine Christian life and practice.  I am not much for the syncretism that treats Second Great Awakening methodologies, filtered through integrated multimedia propagandistic techniques, as a form of evangelism--and I can't altogether shake the sense that Graham's legacy was preaching to the nominal and the strayed rather than making bona fide never-before-been-a-Christian converts.  That's not to say he wasn't a believer in Christ or anything like that.  I'm suggesting that by making Graham a spiritual hero we may have praised the potency of his public methodology at the price of considering whether that methodology could be construed as the best manifestation of Christian thought and practice.

If Graham's legacy has its bad points in terms of his being a public figure, there's a sense in which a Jon Stewart or an Amy Schumer--public figures who agitate for a point of view while presenting themselves as informing the public--have legacies closer to a Billy Graham than to a Walter Cronkite.  If an editorialist at Teen Vogue thinks it's great to tweet that Graham will or should burn in Hell, then she's showing that William Wallace II antics are not solely the domain of stunt jobs by preachers.  But thanks to worldviewism of the progressive or reactionary kind, these sorts of stunts can be justified in the minds of those who deploy them as a proof of righteousness.  I've had my differences with all sorts of people in writing at this blog, but I try to be careful not to dehumanize them.  I also try to make it clear that in the world as it is we need not only fewer guys like Mark Driscoll from the proverbial religious right doing William Wallace II stunts, we need fewer guys like Dan Savage in his tenure at The Stranger, too.

Tuesday, February 20, 2018

Douglas Shadle--Systemic Discrimination: the Burden of Sameness in American Orchestras--on how the symphonic establishment keeps catering to dead white males from the 19th century

https://www.icareifyoulisten.com/2018/02/systemic-discrimination-burden-sameness-american-orchestras/

I read Orchestrating the Nation a few years back, so I am aware he's got a book he's plugging. I'm also more or less up to speed on the Future Symphony Institute's counter-argument in defense of the time-tested, popular, and established symphonic canon.

The various canons throughout the arts are like crowdsourcing. These works are the best of the best and actual people over the ages keep voting for them without ideological intent. Seems pretty democratic.

It's not that that isn't a compelling argument, actually, it's that I wonder if the FSI crew would consider it as compelling if the massive and persistent popularity argument were applied to someone like Michael Jackson, Stevie Wonder, Elvis Presley, David Bowie, the Beatles or the Rolling Stones.

Not so sure the "these are popular time-tested classics" argument would be as readily accepted. 

I'm more into classical guitar (obviously) and chamber music myself, so I would be curious to hear Florence Price's string quartets if those ever end up on a CD. 

This kind of thing still interests me, though, because we had the centennial of Scott Joplin's death last year and ... even though Joplin's music has obviously had an influence on culture there wasn't much of a centennial observance for it.  Ethan Iverson had a centennial observance for Lou Harrison (hey, why not, Oregon composers need some love) and Thelonious Monk (hooray!) but ... not Scott Joplin.  Is Scott Joplin too lightweight for serious musicologists and musicians to pay homage to the beauty of his work?  One of the problems may simply be that ever since the long 19th century only those who can be taken "seriously" are apt to be given serious consideration.

Which in several ways gets back to whether the musicians noted above, in the not-symphonic department, would ever pass that test.  As I get older I begin to appreciate the artistry of musicians and bands who in my twenties I not only couldn't get into but actively disliked.  There's nothing like hearing a slew of Katy Perry songs to make me feel like, you know, Whitney Houston was just all-around better than this stuff.  I'm also way more appreciative of Hall & Oates in my forties than I was in my twenties.  What changed was the nature of the output of the song machine, and as I've gotten older I suppose it wasn't only going to be Xenakis and Messiaen I came to enjoy more.  "How Will I Know?" is a tightly constructed and elegantly straightforward song.  It sure beats the puling of Vance Joy.

Bringing this back around to being a guitarist, given how expensive and entrenched the symphonic repertoire is, I would propose that it may be easier to add women to the canon of Western music in other contexts.  Joan Tower's string quartets, for instance, are well-made pieces.  I was waiting a decade for someone to finally record Incandescence.

But it's on the guitar that I can think of a few composers whose work has really stuck with me.  Annette Kruisbrink's chamber music for double bass and guitar should be the cornerstone of anything resembling a canon for those two instruments.  Nadia Borislova's Butterfly Suite is a great piece, and she's written some music for clarinet and guitar I've thought about writing about over the years.  If on the one hand the symphony institutions (pun intended) are not so open to women and people of color being added to the canon willy-nilly, on the other hand we guitarists need more than just the same old transcriptions of Bach and Albeniz to play decade after decade.

Monday, February 19, 2018

Gibson guitar company facing down possibility of bankruptcy after 116 years in business

It's gotten a little coverage here and there, but the main thing is that, as a guitarist, this is the kind of news guitarists would want to hear about.

http://www.ajc.com/news/national/gibson-guitar-company-maker-the-les-paul-facing-bankruptcy-after-116-years-business/OlaIEYdtLEv1rP92RvZFrM/

Not really thinking I'll rehash the last four to five years of the history of drama associated with Gibson; guitarists should already know the basics about that stuff.  This is not exactly the kind of blog that, if known about, is known for gently getting newbies caught up to speed on anything.  Either you already know or you don't (and that's not even necessarily a bad thing).

So, anyway, that's a headline to bear in mind.

Saturday, February 17, 2018

a brief thought on a failure of Francis Schaeffer in The God Who is There and the rest of his trilogy

As I get older I can't help thinking that Schaeffer did ... okay in the visual art/plastic art overview, but he was going to be okay on that front by way of Rookmaaker.  His take on philosophy seems ... slapdash.  It strikes me that he was going for the big-name highbrow philosophers, as a pastor, without having the acumen to tackle them.  That fellow Christians consider Schaeffer to have butchered Kierkegaard and others is not something I exactly want to tackle in a short post.

No, I think Schaeffer missed the boat by trying, and failing, to adequately address highbrow culture in the 1960s when he could have more gainfully engaged what was going on at a more middlebrow and even lowbrow level.

Let's play a game where we imagine Francis Schaeffer choosing to publicly tackle Joseph Campbell's The Hero With a Thousand Faces rather than getting sloppy attempting a thumbnail sketch of Heidegger.  What if Schaeffer had spelled out a case to Christians that Campbell's monomyth was a bad joke as comparative anthropology, a distillation of American narcissism that, if suffused throughout popular culture, could lead to a cultural monomyth in which everything is about "me," the hero?

Now, obviously, Schaeffer didn't do that.  But besides not addressing Campbell's monomyth there was the cosmogonic cycle, and ... well ... even a conservative like Roger Scruton could say in The Ring of Truth that what Wagner did was tell the story about our stories.  The idea of an art work about our works of art, a myth about our myths--that was something Scruton says Richard Wagner was attempting in the Ring cycle.  Scruton thinks Wagner succeeded at conveying the sacred in the absence of the reality of gods.

I think Schaeffer made a giant mistake in arts history by ignoring Wagner altogether.  The Romantic era didn't die, it hasn't died.  We're still living with the total work of art as a utopian vision of a better society and humanity that doubles up as a critique of contemporary society.  We can see it most clearly not in the highbrow circles where in a post World War II world it's shameful to have such a work as an objectively observable work--no, for the highbrow the total work of art has been transubstantiated into an ideology like post-Marxist thought or neoliberalism.  The cults of art in the lowbrow are where Wagner's ideal of the total work of art reflecting the Folk thrives. 

Any competent pastor could inveigh against the cult of the Superbowl as an alternative Sunday gathering of course. 

But Schaeffer could have addressed a mixture of Wagner's legacy and Campbell's legacy and probably have done more good than coming across to highbrow scholars as if he were nothing more than a reactionary fundamentalist pedant. 

Of course ... that wasn't the Francis Schaeffer we had in the legacy of Anglo-American Christian thought.  Schaeffer had some ideas that, were they taken as a starting point, could have been fun and exciting as a catalyst for explorations in the arts and literary scholarship.  But, alas, Schaeffer's worldviewism tends to be taken as a conversation-stopper by people who will invoke worldview talk to say this or that artist doesn't have a Christian worldview, or just has a "postmodern worldview," and that's the end.  No need to even get into how or why such a set of ideas (which are themselves not even always explained) is manifest in the art work potentially under discussion.  Nope, just say X created Y, which is a reflection of the Z worldview that X has, which in general terms is implicitly or explicitly not Christian, and the Christian school report is done!

But the most striking reason I've come to doubt Schaeffer's approach of assessing everything in light of a Christian worldview is not "just" that it tends not to define what a real Christian worldview is--often tacitly defaulting to a white Anglo-Saxon Protestant mode that favors Americanist cultural ideals no Christian needs to feel hugely obligated to--it's also that, as I've demonstrated elsewhere at this blog, a Francis Schaeffer condemnation of a John Cage can read pretty much the same as a condemnation of John Cage and his music leveled by a Maoist Marxist like John Tilbury or Cornelius Cardew.  With the end of the Cold War it's possible, and I suggest also necessary, to regard one of the shortcomings of Schaeffer's whole approach as this: he was so busy fretting about America being a post-Christian nation (as if Christendom and its salvage were the only path forward for Christians) that he could not imagine a form of Christianity that is not wielded toward the end of American revivalist ideals.

A postmillennialist theonomist and a Marxist do not seem different to me in terms of their overall teleological conception of history.  Since the Presbyterian wings of American Christian thought still have some folks committed to some variant of this stuff, it's a reason I've considered writing a few things to demonstrate that, with the passing of the Cold War, you can turn to a Schaeffer or a Cardew and find that they can both blast Cage for the same reason despite formally seeming to be on opposing sides within the context of the Cold War.

Meanwhile, Wagner's legacy lives on so vibrantly within Hollywood that I can see a trailer for Infinity War and trace the lineage: the 19th century had Wagner's Wotan wielding the Ring of the Nibelung to govern a world that eventually falls due to the curse of Alberich upon the Ring; the 20th century had Frodo and Sam taking the One Ring to destroy it to save the world; now in our century we've got Thanos seeking the completion of the Infinity Gauntlet so he can remake the cosmos according to his own whims.

At no point did Schaeffer, in his various writings, tackle what would turn out in the last forty years to have been the most potent and pervasive art religions.

Nor does it seem he anticipated that these art religions would become so numerous that evangelistic efforts could transform or translate popular cultural tropes and brands as if they were in some sense dim pointers in the mode of the altar to the unknown god in Athens described in Acts. 

semi-incubation part 2

Some of the projects I wanted to tackle are more time intensive than I'd hoped.

The Justin Dean book review now needs to be informed by his podcast promotional activities, for instance.

Still mean to discuss PR Matters this year, but the history of Mars Hill Church public relations controversies is ... a little formidable.  Some of the more memorable issues predate Justin Dean, of course.

Atlantic--post Bill Clinton it's like we all stopped pretending we cared about the fidelity of politicians compared to their results

https://www.theatlantic.com/politics/archive/2018/02/presidential-infidelity-shame/553559/

Right up until 2016 or so, there was a clean narrative about political infidelity. Back in the day, the story went, politicians had affairs with abandon—John Kennedy, of course, but also Franklin Roosevelt, Dwight Eisenhower, Lyndon Johnson, and plenty others. (It’s a curiosity that Richard Nixon, the most famously unethical president, is one of the few without serious allegations of infidelity.)
...

Here we can pause a moment to consider that Richard Nixon's scruples could be questioned without questioning his fidelity to his wife; by extension, looking back on the controversies that surrounded a former local megachurch leader, it was possible to regard him as having had a number of ethical shortcomings and misuses of power and influence despite having never cheated on his wife.  Suggesting that Mark Driscoll could be thought of as a Richard Nixon of megachurch pastors is unflattering but, since a journalist mentioned that Nixon was never credibly charged with cheating on his wife, it's an instructive point evangelicals and conservative Protestants could benefit from considering--you can be a very, very bad leader even if you're a faithful husband and family man.

Moving along:
....

“If Nixon’s resignation created the character culture in American politics, then Hart’s undoing marked the moment when political reporters ceased to care about almost anything else,” Matt Bai argued in a 2014 book on Hart. “By the 1990s, the cardinal objective of all political journalism had shifted from a focus on agendas to a focus on narrow notions of character, from illuminating worldviews to exposing falsehoods.”

Or so the story went.

But this narrative looks dubious these days. Friday morning, The New Yorker’s Ronan Farrow published a long, detailed account of how Trump’s friend David Pecker, the head of the tabloid empire that includes The National Enquirer, killed the story of Trump’s affair with former Playboy model Karen McDougal by buying the rights. The Wall Street Journal previously reported on Pecker’s move to suppress the story, but Farrow adds a great deal of detail, and obtained a contemporary written account by McDougal of her relationship with Trump, who was early in his marriage to his third wife, Melania.

And Farrow’s story comes the same week that Michael Cohen, Trump’s attorney, admitted he “facilitat[ed]” a payment to Stormy Daniels, a porn star who also alleged an affair with Trump, in exchange for her silence. Cohen had previously denied this; his vague statement did not really rule out Trump having been the source of the $130,000 payout, though it was clearly intended to give that impression. Friends of Daniels promptly told a celebrity news site that she felt this disclosure sprung her from her agreement to be silent.

There’s simply no plausible deniability that Trump is a serial philanderer—each of these stories has contemporaneous evidence and hush-money agreements, to say nothing of Trump’s history of infidelities. There’s also no reason to believe that the latest story will change much. In the old era, voters didn’t know about infidelity and what they didn’t know didn’t hurt them. In the interim, they knew, and it drove lots of politicians from office. And in the new era, voters know and they just don’t care.
 
If this is true, however, it didn’t start with Trump—he simply represents the apotheosis. Instead, it began with Clinton, who previously appeared to be the high-water mark of the middle period. Clinton was caught with his pants down (not quite literally, but close) having an affair with a White House intern. He lied about it, including to his closest friends and cabinet, but most consequentially to a grand jury. That led to Clinton’s impeachment in the House.

But a strange thing happened. Clinton wasn’t convicted by the Senate, and he didn’t resign. He didn’t show much shame at all. Oh sure, he apologized for lying, he bit his lower lip, the whole nine yards, but he more or less forged ahead. It worked. Voters knew—and it turned out they didn’t care. The highest approval rating of his presidency came around the time of his impeachment, and it stayed high, around 60 percent, for the rest of his term.

....

But even if Bill Clinton has, by 2018, been deemed too toxic to be thought of as an asset, his legacy is inseparable from Hillary Clinton's.  If, in the wake of Bill Clinton's impeachment, infidelity and even deceit were not deal-breakers, then what might be left by which to assess a political figure's success?  Perhaps something like blunt policy implementation.  If Trump were to keep even a third of the things he said he'd do, let alone half, then anyone who had concluded that the post-World War II status quo was not working for the working class or for middle-class whites might just throw in with Trump--not so much because he was considered a person of good character, or even as someone who might necessarily fulfill campaign promises, but because if the last twenty years of political dynasties such as Bush or Gore or Kennedy or Clinton or whomever got things to where they were in 2016, then to vote for Trump was to vote for someone and something different.

Democrats could hardly show Al Franken the door and still keep Clinton around as if he weren't a liability in a #MeToo moment.  But twenty years later presents its own difficulties for the Clinton legacy and, more pointedly for me, what is not in question is that the Clintonian legacy is being shifted to the side; the question is whether that shift comes with any fundamental reassessment of policy.  If Trump has had no observable shame about not being a faithful husband, the precedent for this not sinking a public figure began decades ago with Bill Clinton, who managed to go through an impeachment process and come out the other side still popular.

Take this ...
https://www.politico.com/story/2018/02/14/bill-clinton-metoo-backlash-campaign-407280

“Bill Clinton’s a former president of the United States, and in his administration, we took an economy that was in the tank and built an economic engine that had been unparalleled. Did he make significant mistakes? Of course he did,” Perez said. “People will make judgments race by race about who are the best surrogates to come down and advocate.”

So maybe Clinton did some bad stuff.  Maybe he wasn't a faithful husband.  Maybe on his watch aerial bombing in the former Yugoslavia kept happening when anyone with any passing knowledge of military history could suggest that bombing the daylights out of a nation never does anything in strategic terms (it's one of a few things about Clinton's legacy I considered ghastly and evil at the time, even while I still considered myself both a Republican and a hawk on national defense; bombing the former Yugoslavia was an idiotic and immoral policy, but I digress a bit).  But for the people who felt like Bill Clinton's years fixed the economy it was all good.  Sure, maybe we could look back on a decade of the internet boom and perma-temp and contract work with no medical coverage and weird hours in the Puget Sound but, hey, those years were awesome for some people, I have to assume.  Just not me.  For those people who just want to throw up a graphic about how under the Clinton years more jobs were created and we didn't have any officially announced military actions--if that's your thing, I won't be able to change your minds.

John Halle had a blog post not so long ago where he discussed how Democrats reacted to the Paula Jones allegations twenty years ago, and how that sinks the foundation for moral outrage on the part of Democrats who might have backed Hillary Clinton with respect to the character of Trump.  Not everyone thinks that how Clinton or Trump handled married life is worthy of emulation, but within the confines of the big two parties it seems clearer and clearer that, in spite of some observable differences about implementation of certain goals relevant to a pax Americana, the modes of operating and the basic ends in mind are shared.  And, to that end, it seems we may be at a point where the two parties have shown what levels of pragmatism they have been willing to endorse in the last thirty years (and further back, obviously) to get their goals achieved.  Even an attempt by the DNC to distance itself from Bill might be as much pragmatism as principle.

Not feeling particularly trusting of either of the two parties at this point, though. 
...

from The Atlantic, on college debt but without the degree

https://www.theatlantic.com/education/archive/2018/02/college-debt-without-the-degree/553037/

Amid all the coverage about people with student debt who do get the degree, there's not as much coverage about the people who get all the college debt but don't manage to finish the degree.

thinking back on authors at Slate saying we shouldn't pay down the national debt (2000), or default on it (2011) and on a crisis in higher ed to do with exploitive lending

About 18 years ago someone at Slate argued that 2000 was not the time to start paying down the national debt.

http://www.slate.com/articles/arts/everyday_economics/2000/09/dont_pay_down_the_national_debt.html

The entire thing was formulated in terms of you, the taxpayer, not in terms of the national debt itself.  The only question in consideration from a national fiscal policy perspective was whether the tax cuts would happen "now" or "later," not whether no tax cuts might be made at all, still less whether taxes might be raised and spending cut to do something like pay down the national debt.

What a difference a decade makes ...

http://www.slate.com/articles/news_and_politics/explainer/2011/04/could_we_ever_default_on_the_national_debt.html

The question of the United States simply defaulting on the national debt was considered in 2011 by Brian Palmer.

...
So what would happen if the government did default on its debt? Well, there are two kinds of default. In the first scenario, the government simply wouldn't be able to cover its interest payments—in other words, what ordinary people mean when they use the word default. The results of this would be catastrophic. When creditors suspected that things might play out this way in Argentina in 2001, that nation's interest rates rose 5 percent in a matter of months. A similar spike in Uncle Sam's average interest rate would increase the federal deficit by 30 percent in the first year, with a snowball effect going forward. The good news is that we're a long way from reaching that kind of crisis. Last year, the government paid $213 billion in interest on its publicly held debt. That accounts for just one-tenth of government revenue.

The second default scenario is more likely. In that case, the government would have enough money to pay interest to its creditors, but not enough to issue Social Security checks or pay soldiers' salaries. There's no analogue to this kind of default—if default is the right word—in the private sphere. Economists disagree on its significance. Secretary Geithner insists that it's "default by another name," since it would indicate the country's willingness to walk away from financial commitments. Others have argued that prioritizing debt-service payments, and walking away from entitlements or discretionary spending, would actually make us seem more reliable (and deserving of low interest rates), since it would establish just how seriously the U.S. takes its debt payments. At this point, it's not clear how, exactly, this would play out.


...

Neither seemed likely to happen then or now.  As for the first type of default--well, guys like Rod Dreher say we're Weimar America as it is.  No point in belaboring that observation.

The second default scenario, paying the debt but gutting Social Security or military salaries, sounds pretty bad, too.  What could possibly be the risk of deciding not to pay soldiers' salaries or Social Security checks?  Would American soldiers and the elderly be so philosophical as to not take action?  But since either type of default seems moot as an action taken by us, the first ultimately seems more likely than the second.  It might take longer for the United States to reach such a point, but it's not necessarily going to be a decision for the United States to make as if it controls that outcome.  Wouldn't a moment of default arrive not merely because the borrower decides to default but because the lender takes an action to get paid back?  I.e., there could be such a thing as a default default on debt, the kind of defaulting on a debt that could lead to ... say ... a default decision.
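For what it's worth, the first scenario's arithmetic can be roughed out. The $213 billion interest figure, the one-tenth-of-revenue point, and the Argentina-style 5-point spike all come from the Slate piece; the ballpark figures for publicly held debt (~$9 trillion) and the deficit (~$1.3 trillion) circa 2010 are my own assumptions for the sketch:

    # Back-of-the-envelope check on Slate's first default scenario.
    # Interest and the 5-point spike are from the article; the debt and
    # deficit figures are my own rough assumptions for 2010.
    interest_paid = 213e9          # interest on publicly held debt (article)
    revenue = interest_paid * 10   # "one-tenth of government revenue"
    debt_held = 9e12               # assumed publicly held debt
    deficit = 1.3e12               # assumed federal deficit

    extra_interest = 0.05 * debt_held  # 5-point spike applied to all debt
    print(f"implied revenue: ${revenue / 1e12:.2f} trillion")
    print(f"extra interest: ${extra_interest / 1e9:.0f} billion")
    # ~35 percent here; only maturing debt would actually reprice at first,
    # which is why Slate's first-year figure is a lower 30 percent.
    print(f"deficit increase: {extra_interest / deficit:.0%}")
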

You can read how this thing can happen with stuff like student loans. Taking on a large amount of debt for an educational certification of dubious value on the job market has been considered a crisis for years, even by some other people who write for Slate.

http://www.slate.com/articles/business/moneybox/2016/01/student_loan_crisis_at_its_ugliest_i_graduated_and_found_out_i_owe_200_000.html

...
I don’t question the importance of higher education. But the detrimental effects of crushing debt shouldn’t be the shared experience of millions of young people and their families. Currently, about 40 million Americans owe $1.2 trillion in student loan debt, and it continues to grow. According to the Institute for College Access and Success, students who borrow graduate with an average debt of $29,000 for a bachelor’s degree. In 2014, 69 percent of graduates had student loan debt, and from 2004 to 2014, the average college debt grew at more than double the rate of inflation. Even with smaller amounts of debt than mine, starting a life quickly becomes very hard. So how do people get to this point? We’ve debated student debt for decades, but our understanding of how it shapes a young person’s experience—from naïve teenager to indebted young adult—is still limited. Here’s what happened to me.

...
If the proposed solution is that the government pays for all education Americans need to consider that in those lands where the state pays for all your education they frequently take an interest in telling you what your professional career is going to be.  That is the part that I suspect Americans will not tolerate.  They may love the idea that Uncle Sam foots the bill for you to study whatever you want, but to be a cog in the national industry machine?  That seems oppressive!  What if a standardized career test says that you should be a horticulturalist when you want to go into something more literary?  I ran into that in high school taking one of those standardized career tests. 

Reading about a law under proposal that would require students to apply to at least one college ... I'm admittedly a pessimist about American higher education at a couple of levels.  Telling American high schoolers they HAVE to apply to a college to make sure enrollments don't drop at a state school, when the higher-education debt situation seems as bad as it is, sounds unscrupulous.

If so many who get advanced degrees struggle to find jobs that can pay back that debt, then wouldn't insisting that students apply do nothing to fix that?  So what if a student applies?  If they don't get accepted, they're made to apply anyway, and that could glut admissions assessment.  If they get accepted after they have applied, that does not really mean they have to actually enroll, does it?  You could apply and get accepted and then just decide not to go because there's not enough financial aid, or you never had the money to begin with; a requirement for a student to apply takes no consideration of capacity to pay in advance, does it?

Overall it seems that the nature of the American national debt is such that the moment of default is probably not going to be the moment that we decide it's going to be, but the moment the creditors decide they want their due.  Precisely to whom this national debt is owed doesn't even seem to ever come up in a venue like Slate.

Friday, February 16, 2018

Sadists with heartstrings--Richard Brody on the gleeful sadism of the new Peter Rabbit film and its conflict with the life-lesson moralizing of the character arcs, the day of a shooting in Florida

Richard Brody's piece about the new Peter Rabbit film was published the same day as the shooting in Florida, so the shooting may have been in the news to some degree as Brody was writing.  Since he doesn't mention the shooting it's not possible to be sure, but the juxtaposition of Brody publishing his short rumination on Peter Rabbit the day of the shooting was an interesting coincidence.

The "rascal, rebel, rabbit" marketing lost me at the first two trite words.  Seeing that this was written and directed by Americans had me not-sold from the start.  Every American protagonist is a rascal and a rebel, male or female, in so very many films. 

Brody's piece is worth considering as a whole.  Anyone who could regard Susan Vernon as anything other than the villain of Love & Friendship is someone I'm apt to disagree with a lot of the time but, as the old saying has it, a broken clock is right twice a day.  Because, perhaps, Brody has children--children with food allergies, to boot--he couldn't completely remove the parent perspective in considering this recent film.  For a film critic who has lamented moralism in mainstream film it could seem that ... well ... sometimes he feels a bit moralizing himself.

 
The Real Problem with “Peter Rabbit” ’s Allergy Scene
 By Richard Brody
February 14, 2018
 
Last Saturday, a day after the opening of “Peter Rabbit,” Will Gluck’s new and free adaptation of Beatrix Potter’s stories, Kenneth Mendez, the president and C.E.O. of the Asthma and Allergy Foundation of America, issued both a statement on Facebook and an open letter criticizing the film’s makers and its studio, Sony, for one particular scene. In that scene, Peter and the four other rabbits, who are being threatened and pursued by Tom McGregor (the heir to the venerable Mr. McGregor’s garden), adopt a new strategy to fight him: knowing that he’s allergic to blackberries, they use a slingshot to shoot blackberries at him, and one goes directly into his open mouth. He begins to choke, feels an anaphylactic episode coming on, reaches into his pocket for his EpiPen, injects himself with it, and keels over in exhaustion. [emphasis added]
 
The Asthma and Allergy Foundation criticized the filmmakers for making light of a life-threatening allergy and for depicting the use of an allergen as a weapon against a gravely allergic person. The statement warned that the movie could be “disturbing” to children with serious allergies; some people advocated a boycott. In response, Sony offered an apology. As a parent of children with severe food allergies, I wish I’d seen the movie before the controversy broke out, because I’d be curious to see whether I would have reacted strongly to the scene without having been alerted to it beforehand. Under the given circumstances, I found that I agree that the scene spotlights an unpleasant insensitivity, even an ugly obliviousness, on the part of the filmmakers. Yet, even more, it throws into sharp relief the over-all tone and import of the film, and, in the process, reveals other peculiarities that make “Peter Rabbit” exemplary of recent movies and of the times.
 
“Peter Rabbit” is a boisterous comedy in which live action (human characters in realistic homes, landscapes, and towns) is blended with C.G.I. as seamlessly and as persuasively as in “Paddington 2.” The film was made by a comedy director (Gluck directed “Easy A” and “Friends with Benefits”) who, in the script, which he co-wrote with Rob Lieber, has taken extreme liberties with Potter’s stories. Peter and his family live in a hollow beneath a tree in rural Windermere, England, and gleefully filch produce from the garden of their nemesis, the elderly Mr. McGregor. When Mr. McGregor suddenly dies, the house and garden are inherited by his great-nephew Tom (Domhnall Gleeson), a Londoner and a neat freak who is even more hostile to the rabbits than Mr. McGregor was. But his battles against them are inhibited when he makes the acquaintance of his neighbor Bea (Rose Byrne), an artist who is the rabbits’ defender and protector (and also their portraitist). Bea and Tom fall in love; knowing that Bea also loves the rabbits—and, especially, their ringleader and brightest personality, Peter—Tom has to do his rabbit hunting on the sly.
 
Peter and the other rabbits take advantage of Tom’s self-enforced restraint to run rampage through his garden and make his life miserable; Tom, for his part, stealthily takes increasingly forceful action against them. That’s when, facing real danger, the rabbits prepare to unleash the blackberry attack, knowing full well its potential consequences. Peter calls it “the endgame.” For that matter, a bit earlier, as they plan the attack, the other rabbits are hesitant; Peter’s mild-mannered cousin Benjamin says that “allergies are serious” and adds, “I don’t want to get any letters.” (The line wasn’t inserted into the movie after the controversy arose; it was always there.) [emphasis added]
 
What’s peculiar about “Peter Rabbit” is that, along with its quippy, often self-referential humor and plentiful (often clever) visual gags, it features an unusual quantity and degree of violence, which link it to classic-era Looney Tunes cartoons and Three Stooges shorts. When the elderly Mr. McGregor keels over, Peter examines him by poking his eyeball—and, after declaring him dead, gleefully takes credit for killing him. (Mr. McGregor actually died of a heart attack.) Tom comes slamming at the rabbits with rakes, hoes, and other garden tools. He installs an electric fence against the animal intruders, only to have the rabbits rewire it, electrifying his doorknobs with shocks that blast him, cannonball-like, against hard stone walls. The rabbits plant snapping traps and rakes around Tom’s bed, leading to pinchings and clobberings; they leave various fruits on staircase landings, sending Tom tumbling down. There’s a repeated gag in which one of the sisters enjoys taking a hard fall and breaking one rib after another, and a climactic bit, involving dynamite, that’s nearly apocalyptic.
 
In another sense, though, the story owes nothing to the action-heavy, character-thin antics of Bugs Bunny or Daffy Duck, Elmer Fudd or Road Runner and Wile E. Coyote. Rather, Gluck’s “Peter Rabbit” is thoroughly composed and intricately characterized; the rabbits, no less than the humans, are given elaborate backstories and large emotional arcs that the plot is devoted, at length, to illustrating, explicating, and resolving. Peter and his sisters—Flopsy, Mopsy, and Cottontail—are orphans; their father was killed and eaten by Mr. McGregor, and Peter’s familiar blue jacket is actually his father’s. Their mother died, too, making Bea the closest thing to a parent that the rabbits have.
 
Meanwhile, Peter is a mischievous, temperamental, vain, proud hothead, who, in a quiet moment, acknowledges that it’s his “character flaw” to do “stupid and reckless” things. [emphases added] (Oddly enough—or perhaps not oddly at all, given that the movie is written by two men—Bea is given the least backstory.) When romance blooms between Bea and Tom, Peter’s response is partly one of a practical worry for the rabbits’ safety. But, as the violence ramps up between Tom and Peter, even Benjamin wonders whether Peter has an ulterior motive—jealousy. In other words, with Bea as Peter’s virtual mother, “Peter Rabbit” is something of a story about Peter trying to come to terms with a stepfather; the comedic drama links Peter’s mean streak to his emotional deprivation and trauma, and it takes him carefully through the paces of his rise to self-recognition and maturity.
 
It is precisely this strain of emotional realism that makes the allergy subplot, slight though it is, so repellent. The movie’s other varieties of violence are exaggerated, cartoonish, not just in depiction but in substance. Few kids have experience with electrical engineering or have dynamite at home; most kids know other kids with severe allergies. (Despite its explosive extremes and intricate, Rube Goldberg-esque calculation, there are no guns and no knives; Gluck clearly knows that certain things aren’t to be trifled with.) Meanwhile, the same emotional realism turns “Peter Rabbit” didactic, dutiful, tedious. Its mechanistic moralism, seemingly distilled from screenwriting classes and studio notes, is the sort that marks so many movies now—ones for adults as well as those for children—imparting values in the form of equation-like talking points, which prepare viewers not for life but for more, and similarly narrow, viewing. [emphases added]
 
Gluck clearly relishes the slapstick action that the characters incite, the situations inspire, and the technology enables, and he invests it with his own sense of exuberant discovery, which is minor but authentic. When it comes to life lessons, however, he dons his official hat and, far from doing any learning in the course of the action, merely dispenses the official line. That’s why the scene involving a life-threatening allergy is all the more conspicuous: while the rest of the movie marches in lockstep with its edifying narrative, that scene is out of place. It doesn’t follow the script.
 
The shooting occurred the day this was published and I wonder if Brody couldn't have gone further.
 
The pat moralism of cinema that celebrates and takes delight in sadism and the possibility of murder even in a children's film that is an adaptation of Beatrix Potter's Peter Rabbit may say something about American culture.  If Peter's "father wound" just gets addressed and he has enough empathy and emotional support then his capacity for premeditated murder can be brushed aside. 
 
I'm not going to say that Brody should have opted to be more direct and more sweeping.  He tends to be sweeping in describing cinema trends rather than society, but his complaint about the pat moralism stops short of explaining what could be so repellent about it.  The emotional realism may have made the allergy subplot repellent to Brody, but we could step back and consider that emotional realism does not necessarily require that characters be "good."  An emotionally realistic story arc could, depending on how things were written, have had Peter and associates escalating their activity to remorseless murder.  The man dies of a food allergy, the rabbits are safe, and things are fine.  End of story.  Brody is probably right to highlight a tension between the glee with which American film revels in violence and spectacle and the moralism he finds rote and unconvincing.
 
But the mechanistic moralism distilled from screenwriting classes can't really be completely separated from a film industry that gives us stuff like The Walking Dead or Game of Thrones, can it?

It could be Brody's seen enough films aimed at adults that when he sees the mechanical moralism of children's movies--which, as he puts it, prepares a child only to view more of the same kind of movie rather than preparing them for life--he's aware that what Hollywood does is mechanical moralism, but where its heart truly is, is in sadism, cruel humor, and stuff like that.  Thing is, even when the moralism is completely serious or sincere, it's not altogether clear that someone like Brody would be on board with it.  Film critics may have the weakness of having to watch too many films.  My own pondering in the last few years is the idea that if you feel there's nothing new going on in movies or in music, that may signal not so much that there's nothing interesting going on as that you're consuming too much and need to scale back consumption.  The gap between creative artistic activity and artistic consumption may sometimes be too great for people.
 
Now we could (but probably won't) ban or regulate access to stuff like AR-15s, but there are other things at play.  We could talk about how violence in video games is not likely to manifest in actually violent behavior in a majority of cases.  I wouldn't say violent video games are things people should seek out because desensitization seems like a real possibility.
 
But I wonder if the problem is the disconnect between the rote moralism that Brody complains about in films and the lively sadism in those same films.  What if we Americans want to believe that "we are not like this" when a good chunk of us are?  The pat moralizing is the mask we want to wear over our cruelty and mockery, a mask that can neutralize it, if not to the point of keeping others from being harmed, then to the point where we can look in the mirror and convince ourselves that we're good people, even if we've told a few innocent jokes now and then and made some sport of people.
 
Perhaps I could, as a blogger might, suggest that the sadism that briefly created a stir around the Peter Rabbit movie is a better tell as to who "we" really are than the expected screenwriterly moralism.  Maybe this American-style Peter Rabbit isn't dealing with some father wound and needing to make peace with his surrogate mother bringing a stepfather into his life; maybe he's an evil, vindictive asshole who represents what Americans really are, and the bid for what Brody calls emotional realism is just the rationalization for why humor derived from cruelty and premeditated murder can be excused, because nobody we care about or may see regularly dies on screen.
 
Because if we don't see it on what some call the "second screen" these days, did it happen?
 
 
 

Thursday, February 15, 2018

The Bard as Borrower, research and arguments continue on the matter of just how much Shakespeare refined and appropriated the ideas of others

I tend to think of myself as ... maybe not cheerfully anti-Romantic in my aesthetics and convictions (I mean ... I admire a couple of the Puritans, am a Calvinist, and love the string quartets of Shostakovich), but I dislike a lot of the era's art and music and, most of all, the ideology of art as a replacement for religion that took shape in the 19th century in Europe and the United States, basically the West.

The idea of the lonely misunderstood genius seems to me neither an accurate account of how creative vitality works nor a sustainable myth, the more we learn about the ways ideas persist and get transformed in the stream of human creative activity.  What seems revolutionary in one time and place could simply be the recovery of elements that had fallen into disuse, or a creative presentation of relatively rare combinations, in this or that work, of things that, bracketed out into their respective elements, could be considered even pedestrian.  The body may be new, you see, while the capillaries and nerve endings and skin color and skin type might all be fairly "normal". 

So if it turns out Shakespeare made use of, and significantly reworked, ideas and works from his time and place, that fits my understanding of the creative process as a kind of collaboration, even when it seems to be just one person surveying an art form and its idioms and traditions and developing something. 

So, there's continuing research suggesting more details about some relatively overlooked sources of appropriation for the Bard.

https://slate.com/culture/2018/02/shakespeare-was-no-plagiarist-but-genius-isnt-born-in-a-vacuum.html


 
This week, scholars Dennis McCarthy and June Schlueter announced that they had discovered a new major source of Shakespeare’s plays. Using plagiarism software and literary analysis, McCarthy and Schlueter are preparing a new book in which they argue that the forgotten A Brief Discourse of Rebellion and Rebels by the even-more-forgotten George North was a key point of inspiration for 11 of his major works. As reported by the New York Times:
The book contends that Shakespeare not only uses the same words as North, but often uses them in scenes about similar themes, and even the same historical characters. In [a] passage, North uses six terms for dogs, from the noble mastiff to the lowly cur and “trundle-tail,” to argue that just as dogs exist in a natural hierarchy, so do humans. Shakespeare uses essentially the same list of dogs to make similar points in “King Lear” and “Macbeth.”
 
The article, and the book, have many more examples of places where the words of Shakespeare and North intersect. Even though plagiarism-detecting software was used to make this discovery, McCarthy and Schlueter want to make clear that they are not accusing Shakespeare of plagiarism. Instead, they’re simply arguing that North’s writings were an inspiration for him.
They needn’t bother. By our standards, Shakespeare, who lived before modern ideas of authorship, plagiarized constantly. The discovery of North’s influence on Shakespeare is a welcome opportunity to remember how the Bard of Avon’s genius actually worked, and how much his methods are at odds with our own ideas of artistic greatness.

Shakespeare is not Western literature’s great inventor but rather its great inheritor. The Bard borrowed plots, ideas, characters, themes, philosophies, and occasional passages from sources ranging from Plutarch’s Lives and Holinshed’s Chronicles to Montaigne’s Essays and plays by his contemporaries. He returned again and again to ancient Rome, finding inspiration in Ovid, Seneca, Plautus, and others.

His inheritance also goes beyond the textual. When he began working in the London theater scene, its component parts were there waiting for him. There were already professional theater companies, outdoor amphitheaters, plays in five acts, iambic pentameter, and conventions surrounding comedies, tragedies, and history plays.


None of this should make us think less of Shakespeare’s achievements and neither should the increasing evidence that he sometimes used uncredited collaborators and occasionally served as one himself. Shakespeare didn’t just faithfully reproduce his sources—he argued with and subverted them, he combined them in unconventional ways, and he made substantial changes to them. King Leir, the anonymous source text for Shakespeare’s King Lear, ends with Leir restored to the throne and everyone still alive. Shakespeare frequently expands roles for women in his plays and removes many passages where characters share their motivations. Shakespeare also often makes his plays more complicated than his sources, both ethically and logistically. He even went so far as to add an extra set of identical twins to The Comedy of Errors.


This is, generally speaking, not how we think about Shakespeare or, given limited classroom time and the emphasis on close textual reading, how he is taught. Many of his predecessors’ plays are lost, and his peers among Elizabethan playwrights aren’t read, taught, or produced nearly enough, making it harder to see the connections between his work and his contemporaries. Even when Shakespeare’s sources are mentioned, rarely is much time spent on reading them to see the influence clearly.

Thus, we look at Julius Caesar and marvel at the incredible rhetoric but don’t see it as in dialogue with plays about Rome by other Elizabethans such as Thomas Lodge’s The Wounds of Civil War, and we don’t look at Plutarch’s accounts of Brutus and Mark Antony’s lives, which served as the source for both Caesar and Antony and Cleopatra. The result is that our understanding of both Caesar and Shakespeare is impoverished. By looking at his sources, we can see what he kept and cut and changed. By looking at his context, we can see the debates and cultural moments that he was responding to.
 
What emerges when you do this is a richer appreciation of the plays and a more down-to-earth view of their writer. Shakespeare wasn’t a God, and he wasn’t unique, even if he was the best. He was an artist responding to his time the way artists actually do, through opening themselves up to influence and creating out of the materials around them. There’s a practical side to his work as well. He wrote for a company, which means he wrote to the particular skills and limitations of his actors. He wrote prolifically, which necessitated recycling ideas, themes, and bits of dramatic business. As a part owner of his company, he also had to respond to practical matters like trends, government censorship, and the need to fill up to 3,000 seats a night.

A grubby businessman furiously writing plays and ripping off whatever he could get his hands on hardly fits our model of artistic genius. We think of geniuses as tormented rock stars, breaking new ground with sui generis innovations that spring from their minds like Athena from the brow of Zeus. In movies like Amadeus, this myth of artistic genius makes for delicious art in its own right, but it’s not how artistic creation really works. Creating a work of art is part of an endless dialogue that reaches both back thousands of years and out into the world around us. This is what Shakespeare did, and he didn’t do it alone. If it worked so well for him, perhaps we should stop being attached to ideas of originality that have no bearing on how art is actually made.
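
For what it's worth, the "plagiarism software" angle is less exotic than it sounds.  Tools of that kind mostly look for uncommon words and phrases that two texts share, close together and in similar order, and then leave it to a human scholar to judge what the overlap means.  Here is a minimal sketch of the core idea in Python; the first passage is invented for illustration, the second is from Edgar's dog catalogue in King Lear, and the McCarthy-Schlueter tool reportedly did considerably more than this, weighting words by their rarity across a large corpus.

import re

def tokenize(text):
    # crude tokenizer: lowercase runs of letters (hyphens split words)
    return re.findall(r"[a-z]+", text.lower())

def ngrams(words, n):
    # the set of all contiguous n-word sequences in a list of words
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_phrases(text_a, text_b, n=2):
    # n-word phrases that occur in both texts
    return ngrams(tokenize(text_a), n) & ngrams(tokenize(text_b), n)

# an invented stand-in for a North-like passage, not the real text
north_ish = "the mastiff and the cur and the trundle-tail all know their places"
# Edgar in King Lear, act 3, scene 6
lear = "Mastiff, greyhound, mongrel grim, hound or spaniel, brach or lym, or bobtail tike or trundle-tail"

print(shared_phrases(north_ish, lear, n=2))
# prints {('trundle', 'tail')}: an uncommon shared phrase is exactly
# the kind of overlap such software flags for further scrutiny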
 
 

and so T. S. Eliot wrote that immature poets imitate and mature poets steal ... but transform whatever they've taken into something better than, or at least different from, the original meaning or implication of the thing they took. 

We may still be stuck with a number of powerful vestiges of 19th-century Romantic ideological commitments to a certain conception of genius.  Certainly in the pedagogy and historiography of 19th- and 20th-century music, favor goes to whoever is considered daring and innovative, while those considered conservative or traditionalist tend to get bypassed.  This can happen even in cases where a fairly clear counter-argument could be made.  Richard Taruskin, if memory serves, described Anton Reicha as having a conservative, schoolteacherly approach to defining musical forms like the sonata.  But Reicha's theoretical writings, as Kyle Gann has discussed them, speculated on the viability of quarter tones, and Reicha wrote a fugue in 5/8 in which a subject in A major is given an answer in E flat major, the sort of weirdness Beethoven found a bit too far afield to consider such works fugues.  As Gann has put it, the 18th century was, for what we call classical music, a much weirder and more experimental period than traditional music history tends to give it credit for.

We could approach the history of an art form by focusing on innovators, but as our era proliferates in recorded music, preserved scores, and stuff like, oh, suits over songs like "Blurred Lines", one way to remedy what may be an overemphasis on originality is to remind ourselves that there are only so many variations on "I love you" or "I don't like that".  There's something to be said for an arts history of innovators, but there's something to be said for an arts history of consolidators and refiners, too.  J. S. Bach didn't exactly introduce any new forms or styles or ways of composing; what he did was refine and comprehensively explore what was possible within the styles and idioms he lived with. 

As more scholarship gets done on where Shakespeare got his ideas, it might be worth noting that he, too, can be understood as a consolidator and a refiner rather than some revolutionary. 

suit against Taylor Swift dismissed on grounds that phrases that appear in the Swift lyric are too banal to be considered infringeable material

For anyone who has heard "Shake It Off", the suit that was recently dismissed had to do with the following:

http://variety.com/2018/music/news/taylor-swift-player-hater-dismiss-1202697663/

A federal judge dismissed a lawsuit Tuesday that accused Taylor Swift of copyright infringement on her hit song “Shake It Off.”
 
Songwriters Sean Hall and Nathan Butler brought the suit last fall, arguing that the chorus of the song borrowed from their 2001 composition, “Playas Gon’ Play.”
 
In his ruling, Judge Michael W. Fitzgerald held that combining the phrases, “Playas gonna play” and “haters gonna hate,” does not entail sufficient originality to warrant copyright protection.
 
 
“By 2001, American popular culture was heavily steeped in the concepts of players, haters, and player haters,” Fitzgerald wrote. “The concept of actors acting in accordance with their essential nature is not at all creative; it is banal.”
 
The plaintiffs’ song includes the following line in the chorus: “Playas, they gonna play, and haters, they gonna hate.” “Shake It Off” includes the line, “Players gonna play, play, play, play, play, and haters gonna hate, hate, hate, hate, hate.”
 
 
Though short phrases are generally immune from copyright infringement claims, the plaintiffs argued that combining the two thoughts was sufficiently original to claim copyright protection. Fitzgerald disagreed.
 
“It is hardly surprising that Plaintiffs, hoping to convey the notion that one should persist regardless of others’ thoughts or actions, focused on both players playing and haters hating when numerous recent popular songs had each addressed the subjects of players, haters, and player haters,” he wrote. “In short, combining two truisms about playas and haters, both well-worn notions as of 2001, is simply not enough.”
 
“In sum, the lyrics at issue – the only thing that Plaintiffs allege Defendants copied – are too brief, unoriginal, and uncreative to warrant protection under the Copyright Act,” Fitzgerald concluded.
 
The case was dismissed with leave to amend, but Fitzgerald advised the plaintiffs not to refile the suit unless there are as-yet-undiscovered similarities between the two songs.
 
It doesn't really seem like there's a clear similarity beyond a couple of repeated phrases.
 
there's this ... (Hall and Butler's "Playas Gon' Play")
and then this ... (Swift's "Shake It Off")
 
 
...
 
"The verdict in this case threatens to punish songwriters for creating new music that is inspired by prior works," states the artists' brief authored by Ed McPherson. "All music shares inspiration from prior musical works, especially within a particular musical genre. By eliminating any meaningful standard for drawing the line between permissible inspiration and unlawful copying, the judgment is certain to stifle creativity and impede the creative process. The law should provide clearer rules so that songwriters can know when the line is crossed, or at least where the line is."

The amici say this case is "unique" because the two works at issue "do not have similar melodies; the two songs do not even share a single melodic phrase."

Instead, they suspect that the jury perceived similarity in the overall "feel" or "groove," which harks back to the very first filing in the lawsuit. They point out that Gaye himself was heavily influenced by Frank Sinatra, Smokey Robinson, Nat "King" Cole, James Brown and others. They tell the 9th Circuit that there's a "bright line" in film, television and book copyright cases, but that the realm of music hasn't produced any legal clarity about what are "ideas" free to be used by anyone and what's "expression" that's off-limits to be misappropriated.

...

One of the things I hadn't read from the side opposed to the "Blurred Lines" verdict was how the case got catalyzed in the first place.

With the caveat that lawyers are lawyers ...

https://www.billboard.com/articles/business/7989223/richard-busch-lawsuits-spotify-blurred-lines-appeal-music-industry

...

How did the “Blurred Lines” case come to you and why did you decide to take it?

It was referred to me by Mark Levinsohn, the transactional lawyer for the Gaye family. If you remember correctly, Pharrell Williams and Robin Thicke sued the Gaye family [seeking a declaratory judgment that “Blurred Lines” didn’t infringe Gaye’s “Got to Give It Up.”] We had a very strong musicology report, and I felt it was a strong claim.

Some of the media coverage focused on the idea that a win for your side would open up a can of worms, so more current songwriters could be sued by the owners of old compositions.

I could not disagree more. You have to check the source and realize that those who say that in an article may have an agenda. That was their pitch at trial and it has been the entire story of their legal team. It’s just not true. It is based on standards that have been in place for decades. When the Isley Brothers sued Michael Bolton [for infringing their copyright to the song “Love Is a Wonderful Thing” on his song of the same name], there was the same outcry: "This is going to stop the original creation of music." It didn’t happen then, and it’s not going to happen now.

Where’s the line between influence and infringement?

There are real standards. I can’t tell you how many people have come to me saying someone copied their song and I sent it to a musicologist and they said it wasn’t original or it wasn’t compositionally similar -- I turn down 80 to 90 percent of the cases that come to me. But we have a case involving Lil Jon and DJ Snake [who are being sued over "Turn Down For What" by Golden Crown Publishing for infringing the Freddie GZ song of the same name]. And we just settled a case involving Ed Sheeran [that involved a lawsuit brought by songwriters Martin Harrington and Thomas Leonard over his song "Photograph"]. There are standards you have to meet -- and the “Blurred Lines” case met them.

One of the conundrums of our era is that so much popular culture is under copyright or trademark in some fashion.  Rather than arguing, as some people do, that intellectual property itself is the problem, better education on what is in the public domain, and what the public domain is for, seems like a good idea.  But then, as a classical guitarist and a composer, I guess I'm already steeped in a style of music that goes back a century or two.  I'm not sure, I'm afraid, that many people want to immerse themselves in styles of music for which nearly every identifiable element is in the public domain. 

Theodore Gracyk once made a distinction between music that is "ontologically thick" and music that is "ontologically thin".  It's an academic distinction couched in academic jargon, but the easiest examples would be classical music and pop music in the 20th century.  To say a music is "ontologically thin" is to say it is transmitted, preserved, and presented through a communication system that is not hugely dependent on any one performer or any one style, and that can be retained over a long period.  A score for a string quartet would be considered "ontologically thin". 

"Ontologically thick" music would be basically any Beatles song or a pop song, music that is concretely tied to specific sounds, specific people, particular ways of generating sounds and that there is some "definitive" version.  You can identify that Prince himself as distinct from anyone else is performing a Prince song.  Similar specificity applies with any other pop star.  This is rather broadly what Gracyk's definition of "ontologically thick" music is.

Well, one of the biggest legal pitfalls with ontologically thick music is exactly this kind of licensing and copyright dispute.  If popular music could gain (or retain, really) connections to music ontologically thinner than itself, and if, in turn, the concert music or "classical" or post-classic idioms retained some connection to ontologically "thicker" performance idioms, both styles of music would seem to have a better chance of retaining some vitality.  That's been one of my soapbox concerns for anyone who's read this blog over the last twelve years. 
 
So, anyway, a bit of musical news