Saturday, November 14, 2015

Kyle Gann quotes what he considers the distillation of Richard Taruskin's work: that studying the humanities is ultimately no guard against being inhuman

The grim history of the twentieth century – something Brahms or Franck could never have foreseen, to say nothing of Matthew Arnold or Charles O’Connell – played its part as well both in discrediting the idea of redemptive culture and in undermining the authority of its adherents. The literary critic George Steiner, one such adherent, after a lifetime devoted (in his words) to “the worship – the word is hardly exaggerated – of the classic,” and to the propagation of the faith, found himself baffled by the example of the culture-loving Germans of the mid-twentieth century, “who sang Schubert in the evening and tortured in the morning.” “I’m going to the end of my life,” he confessed unhappily, “haunted more and more by the question, ‘Why did the humanities not humanize?’ I don’t have an answer.” But that is because the question – being the product of Arnoldian art religion – turned out to be wrong. It is all too obvious by now that teaching people that their love of Schubert makes them better people teaches them little more than self-regard. There are better reasons to cherish art.

                  – Richard Taruskin, Music in the Nineteenth Century, p. 783

What makes Hayao Miyazaki's film The Wind Rises remarkable and challenging is hidden in plain sight in a little question: would you rather have a world with or without the pyramids?  Miyazaki has snuck in the idea that may be most unsettling to those who would believe the humanities humanize and that the arts are a path to speaking truth to power: the pyramids are works of art and engineering, surely, but they are also the monuments of empire.  Miyazaki gently tossed out for our consideration the gloomy observation that all art, no matter how iconoclastic we might want to believe it is, is in some sense a reflection of an imperial aspiration, even if that imperial aspiration is as simple as one guy saying he just wants to make something beautiful. 

a few links about the birth and theoretical death of jazz

Thoughts on the potential end of jazz as a popular art form (as opposed to a high art form) have been circulating for quite some time.


John Lewis, the music director of the Modern Jazz Quartet, noted: “Jazz developed while the great popular music was being turned out. It was a golden age for songs. They had a classic quality in length and shape and form and flexibility of harmony. The jazz musicians were drawn to this music as a source of material.” The Songbook, a product of a fleeting set of cultural circumstances when popular, sophisticated music was aimed at musically knowledgeable adults, was the crucial wellspring of jazz. Both jazz and its progenitor are worthy of radical—indeed, reactionary—efforts to preserve them. But despite Gioia’s ardency, there is no reason to believe that jazz can be a living, evolving art form decades after its major source—and the source that linked it to the main currents of popular culture and sentiment—has dried up. Jazz, like the Songbook, is a relic—and as such, in 2012 it cannot have, as Gioia wishes for it, an “expansive and adaptive repertoire.”

So it's not surprising to see at Books and Culture a proposal that the increasing disconnect between the practice of jazz and the vocabulary of popular musical styles and forms has led to a decline in the popularity of jazz over the last half century.
Marc Myers' Why Jazz Happened sheds light on a question like that, seeking to show how jazz has adapted to popular tastes to survive. "For the past ninety-five years jazz's survival has been based on the ability of musicians to interpret their times without relinquishing the characteristics that define the art form," Myers notes. His intent is to show how the happenstances of American social history have crucially shaped the evolution of jazz since its official beginnings with the Original Dixieland Jazz Band's seminal 1917 recording.

What Myers actually chronicles, however, is less a successful series of transformations than a long decline in influence, of a music that lit up ordinary Americans in the two decades between the two world wars but then became steadily less central to the culture. Jazz's artistic development, like that of most art forms, has inevitably put it beyond the reach of the ordinary evaluator. Jazz fans tend to bristle at such verdicts. So too, in response to the equivalent argument about their music, classical music representatives insist the problem will be solved via presentation strategies and lowering ticket prices. They assume that, with those issues finally resolved, Sisyphean though the task ever appears to be, surely the music's passion, the "soul"—a term often heard from classical fans as well as from jazz buffs—will work its magic.

However, just as fish don't know they're wet, fans of refined musical forms like classical and jazz have a hard time putting themselves into the heads of ordinary listeners, especially ones nurtured in the pop-saturated musical culture of the past 60 years, focused on volume and histrionic performer charisma. Jazz in the form of candy-flavored pop tunes put across in danceable fashion—which was how most experienced the music in the Twenties and Thirties—was catnip to younger Americans of the time. However, what this kind of jazz later became—musically dense, focused on individual improvisation, and intended for quiet listening—has always been more like absinthe, now and forever a specialty taste. Myers' book neatly demonstrates this, despite his intent to reveal an art form dynamically responding to popular tastes.

And as we should all know by now, the ascent of rock and roll and its evolution out of things like jump bands and blues meant that the torch of direct, simple expression got passed from jazz to rock and pop, whether jazz musicians and fans have wanted to recognize it or not.
In 1968, when Patti Smith was twenty-one and working in a Manhattan bookstore, she went to a Doors concert at the old Fillmore East. She loved the Doors. As she described the concert in her memoir “Just Kids,” everyone was transfixed by Jim Morrison, except for her. She found herself making a cold appraisal of his performance. “I felt,” she concluded, “that I could do that.” For many people, that response is the essence of rock and roll.

To this way of thinking, rock and roll—the music associated with performers like Chuck Berry, Little Richard, Buddy Holly, and the early Beatles—is music that anyone can play (or can imagine playing) and everyone can dance to. The learning curve for performing the stuff is short; the learning curve for appreciating it is nonexistent. The instrumentation and the arrangements are usually simple: three or four instruments and, frequently, about the same number of chords. You can add horns and strings and backup singers, and you can add a lot more chords, but the important thing is the feeling. Rock and roll feels uninhibited, spontaneous, and fun. There’s no show-biz fakery coming between you and the music. As with any musical genre, it boils down to a certain sound. Coming up with that sound, the sound of unrehearsed exuberance, took a lot of work, a lot of rehearsing. No one contributed more to the job than Sam Phillips, the founder of Sun Records, in Memphis, and the man who discovered Elvis Presley.

What springs to mind lately is that finding some kind of successful synthesis between scholastic, formalist approaches to music on the one hand and an awareness of popular or folk material on the other is an endless process of experimentation.  Charles Rosen, in The Classical Style, observed that in the works of Haydn and Mozart, and to some extent Beethoven, there was this sort of synthesis. 

A person could almost get the sense that both jazz and classical music faced a decline in audience size and mass appeal the more their advocates and agents locked themselves into the dogma of art for art's sake.

We're fast approaching the centennial of what many people consider the earliest blues and jazz recordings.  That would seem like a case for including jazz as a high art form at a place like, oh, Yale.  But some folks would prefer not to teach jazz as if it were part of the Western art music canon, even though at a popular level the distance between jazz as it has been practiced as an extension of a popular set of idioms and the popular idioms of our current era seems to have grown over the last half century.

Some believe that the last thing jazz should try to do is double down on being a high art form and some have a particular person in mind ...

As far as technical, formal understanding and a willingness to experiment go, a continual fusion of classical and jazz traditions could be fantastic.  But if the classical and jazz communities band together out of a disdain for pop, that's not a particularly healthy response.  It may be that the "death" of jazz and classical music as "popular" idioms has come precisely because of a commitment to a set canon and to the idea that there is a concrete "sound" rather than a flexible conceptual approach. There's simply no reason we can't have a 12-bar blues become the first theme in a sonata allegro form, any more than there's a reason we can't use 18th century contrapuntal procedures to play around with a blues riff.  This stuff has been done in the last century, after all.

Kyle Gann on atonality in music as an expression of anxiety--

As Philomel, Sinfonia, Gruppen, and Piccola Musica Notturna show, even 12-tone organization is not the issue. It strikes me that the deciding factor is whether or not the listener senses that there is some organizational factor that you’re supposed to be hearing that can’t be located by ear, whether the meaning of the piece is buried somewhere underneath the surface. That quality seems to be more what Holland objects to about Perle than the mere lack of tonality. I was dumbfounded by the quotation Alex Ross in his book unearthed from Boulez; asked why the serial pieces of the ’50s never became standard repertoire, the meister admitted, “Perhaps we didn’t pay enough attention to how people listen.” In general, and as evinced by a thousand film scores, atonality tends to express anxiety, and much of the music, like Sun-Treader, that freely acquiesces to that is extremely effective. But Wolpe’s output is Exhibit A that music can be relentlessly atonal and also whimsical, jaunty, and attractive.

The way Leonard Meyer put it half a century ago was that the problem total serialism faced was that understanding the rules of the precompositional process was not the same thing as understanding the end result.  You can't appreciate a piece of serial music simply because you have grasped the tone row any more than you can appreciate Beethoven's Eroica simply because you know what the key of E flat is.

Gann's pointed out the obvious, but as a teacher of mine used to put it, don't underestimate the obvious.  Atonality has been great at expressing emotions like dread and anxiety and fear.  Erwartung isn't exactly camping out in the emotional/social realms of My Little Pony: Friendship is Magic.

As polemics go, though, there's another one that can be made, and has been made, about when atonality does and doesn't stick, but we may save that for some other post.

an e-book that illustrates a small point, William Perkins on the Art of Prophesying

Said book is about preaching, and conflates prophecy with the preacher's activity.  For years I'd hear Driscoll say that prophecy was preaching as if that were the plainest possible sense of the term.  No exegetical or historical case was presented, just the bald assertion. 

Nothing from Zwingli or Bullinger or any Reformer.

Thing is, anyone who bothers to read the Bible at all will eventually see that prophecy, whatever it is, can include preaching but cannot be reduced to it. Anyone who takes a look at the Torah and the narratives of the earlier Israelite history (i.e. Joshua, Judges and Samuel, as distinct from debates and discussions about when those books were edited and compiled) will see that the understanding of prophetic activity did not necessarily take the role of the prophet to be what a preacher would do now, or even half a millennium ago.

I'm floating this theory I may get to later this year that what happened in the 1520s to 1540s was a set of polemics within the nascent Reformation in which anti-Catholic and anti-priestly polemics tended to shift the metaphorical understanding of pastoral activity away from its most obvious precedent in the OT law and narrative literature, the priesthood, toward the activity of the prophet.

Certainly people have complained about N. T. Wright and New Perspective types on Paul.  A valid point worth raising, however right or wrong those scholars may be about Paul, is that we need to consider the biblical texts beyond the fifteenth- and sixteenth-century polemics about doctrines.  This may be particularly useful in the case of what the Reformers had to say about prophecy and prophets. The rhetorical and ideological incentives for the Reformers to see themselves as prophets, without conceding the essentially ad hoc nature of that identification, can lead people to uncritically take up conclusions the Reformers drew at a particular time, for particular reasons, with particular caveats, without understanding, perhaps, what's been done.

Thursday, November 12, 2015

Driscoll still on the conference circuit, The Most Excellent Way to Lead, March 2016, with the likes of Perry Noble and Steven Furtick and others

Pastor Mark will be speaking at the Lead Conference hosted by Perry Noble. This one-day conference will be held at NewSpring Church in Anderson, South Carolina on March 3rd, 2016. This event is uniquely designed for leaders who want their teams and their organizations to succeed beyond their expectations.

Last week we revisited the stories not shared in the accounts of ministries that Mark Driscoll Ministries has said Driscoll helped to found.

The ministries Driscoll founded or co-founded at this point either don't really exist in any functional sense as ministries that interact with the public, or do exist but have no use for Driscoll himself or have publicly distanced themselves from any connection to him.

It might be a little too soon to feature Driscoll, even in 2016, as someone who can speak at a conference whose goal is helping leaders who want their teams and their organizations to succeed beyond their expectations.  By his own account he's been the unemployed guy this year.

Wednesday, November 11, 2015

over at Mark Driscoll Ministries, a guest post featuring content from Ashley ... Mark Driscoll keeps dragging his kids into the spotlight two years after "The Hardest Part"?

We've discussed before the disparity in how Mark Driscoll in 2013 talked about his kids being exposed to stuff via social media

Back in 2013 there was "The Hardest Part", where Driscoll regaled us with the travails of his kids and wife because of his public ministry.

At Thrive he shared how the kids found out about the resignation through social media before he or Grace told them, apparently, even though based on the Brian Houston interview there had to have been at least 17 hours or so in which to tell the kids "Poppa Daddy quit".

Be all that as it may, this year Mark Driscoll Ministries has not just featured Grace Driscoll on friendship and noted Alexie's birthday; this week Mark Driscoll Ministries brings up something that had a short life over at Pastor Mark TV, blog content from Ashley.

There's this old axiom about eating your cake and having it, too.  This has been discussed by others before but it bears repeating: Mark Driscoll seems unable or unwilling to piece together that, to the extent that he keeps featuring his children on all manner of social and/or broadcast media, he's keeping them in the spotlight.

If the spotlight is where trouble has come from, isn't it okay to let the kids have normal lives?  Or is that possible?  A parent is certainly entitled to be proud of what his/her child is doing and achieving (unless all pride is utterly satanic, which may or may not be how Driscoll thinks about pride these days).

Still, for a guy who took such pains in 2013 to share how scary things were for his kids, how sure can Driscoll be that things are different in Phoenix?

Well, though featuring the Driscoll kids at the website seems counterintuitive, it's possible Ashley has learned from the mistakes of her dad, who once sounded off on women as penis homes.  Here's to the hope that if she's going to keep ending up on social media anyway, thanks to Poppa Daddy, Ashley Driscoll proves 20,000 times more prudent than her dad about thinking through the internet-is-a-long-time dynamic of social media.

Tuesday, November 10, 2015

another Atlantic link, Don't romanticize rejection--"Focusing on how individual artists should persist in the face of rejection obscures how the system is set up to reward only a chosen few, often in a fundamentally unmeritocratic way"
Time and time again, the literary establishment seizes on the story of a writer who meets inordinate obstacles, including financial struggles, crippling self-doubt, and rejection across the board, only to finally achieve the recognition and success they deserve. The halls of the literary establishment echo with tales of now-revered writers who initially faced failure, from Stephen King (whose early novel Carrie was rejected 30 times before being published), to Alex Haley (whose epic Roots was rejected 200 times in eight years). This arc is the literary equivalent of the American Dream, but like the Dream itself, the romantic narrative hides a more sinister one. Focusing on how individual artists should persist in the face of rejection obscures how the system is set up to reward only a chosen few, often in a fundamentally unmeritocratic way

I.e. you could say the winner-take-all game is even more brutally lop-sided in the realm of literature than other arts. Of course white guys obviously don't feel like the game is rigged in their favor "that" much or we wouldn't have had the year's earlier controversy about the white guy passing himself off as Asian American to get published in a volume that Sherman Alexie edited. From Alexie's standpoint, as he so eloquently put it, he was trying to see to it that the volume wasn't completely dominated by poetry teachers. 

a study indicates religious children less altruistic than non-religious children ... though the sample size seems doubtful even at more than 1k

It's hard not to think of Daniel Kahneman's Thinking, Fast and Slow, in which he explained why many a social-scientific finding is essentially worthless because of sample size. If the sample size is too small it's not representative of the population, and if a study can't be replicated multiple times it's impossible to take the finding as all that reliable. 

There's been a slow-cooker crisis related to the credibility of social scientific findings over the last decade or so.
THERE'S NOTHING NEW ABOUT scientists who fudge, or even fabricate, their results. Whole books have been written about whether Gregor Mendel tweaked the measurements of his plants to make them better fit with his theory. When attempting to fit the irregular polygon of Nature into the square hole of Theory, all researchers face a strong temptation to lop off the messy corners. Imagine that you’re going along, accumulating data points that fall into a beautiful line across the graph, and all of a sudden some dog stands there like a dummy, refusing to salivate. You ring the bell again, louder—nothing. Is he mentally defective? Is he deaf? What will you trust: the theory you have spent years developing, or the dog? (This is not to cast aspersions on one of the great pioneers of experimental psychology. As far as we know, Pavlov’s dogs really did do what he said they did.)
Then again, maybe there is no dog at all. For scientists in a real hurry to establish themselves, the quickest way to go from arresting hypothesis to eye-catching publication is to skip the research altogether and just make up results.

OUTRIGHT FAKERY IS CLEARLY more common in psychology and other sciences than we’d like to believe. But it may not be the biggest threat to their credibility. As the journalist Michael Kinsley once said of wrongdoing in Washington, so too in the lab: “The scandal is what’s legal.” The kind of manipulation that went into the “When I’m Sixty-Four” paper, for instance, is “nearly universally common,” Simonsohn says. It is called “p-hacking,” or, more colorfully, “torturing the data until it confesses.”

Sample size is a touchy topic in psychology, because undergraduate subjects often expect to be compensated, and researchers must pay them from grants that are overseen by the tight-fisted guardians of research funding. But because larger sample sizes increase the predictive power of results, Simmons now tries for at least 50 subjects in his own research. “The brutal truth is that reality is indifferent to your difficulty in finding enough subjects,” he says. “It’s like astronomy: To study things that are small and distant in the sky you need a huge telescope. If you only have access to a few subjects, you need to study bigger effects, and maybe that wouldn’t be such a bad thing.”
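Since the quoted passage turns on the claim that larger samples give more power to detect an effect, here's a minimal, self-contained simulation of that point. This is my own toy illustration with made-up numbers, not anything from Simmons or the studies under discussion: the same modest true effect (0.3 standard deviations) goes from rarely detected to almost always detected as the sample grows.

```python
import random
import statistics

def detection_rate(effect, n, trials=2000, seed=42):
    """Fraction of simulated experiments whose sample mean ends up
    more than 2 standard errors above zero (a crude one-sided test)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Draw n observations with the given true effect and unit variance.
        sample = [rng.gauss(effect, 1.0) for _ in range(n)]
        mean = statistics.fmean(sample)
        se = statistics.stdev(sample) / (n ** 0.5)
        if mean > 2 * se:
            hits += 1
    return hits / trials

# Same modest true effect, different sample sizes: the small study
# usually misses it; the large one almost never does.
for n in (10, 50, 200):
    print(f"n={n:4d}  detection rate = {detection_rate(0.3, n):.2f}")
```

The flip side, which is the p-hacking worry in the quote above, is that a researcher running many small, underpowered studies and publishing only the ones that cross the threshold will fill the literature with inflated, unreplicable effects.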

Of course it's no small thing when the samples turn out to be bunk.  Torturing the data until it confesses may be the more common and worrisome risk than outright fraud, but Slate had a lengthy feature dealing with fraud earlier this year.

That gay marriage study that faked its data, for those who remember it, was used as a basis to distribute money for campaign projects.

There are people who doubt scientific research because they have doubts about particular pieces of research, even if it's tempting to propose that those sorts of people doubt the efficacy of science in general.  It's not hard to come across people who take to social media to condemn pharmaceutical companies in favor of this or that naturopathic approach.  On the other hand, to tweak a line attributed to C. S. Lewis about the sciences, there may be cause even within the realm of science to worry that, within certain limits, people get the kind of scientific findings foundations are willing to pay for.

links with an educational theme

Cheating in Online Classes Is Now Big Business

As has been noted over the years, no field of study is really "safe" from rigging things.

Elite Liberal-Arts Colleges Aren't Producing the Highest-Earning Elites

Ignore Prestige: The Colleges That Provide the Biggest Earnings Boost


In the new world of rankings where money is king, Washington and Lee outperforms Harvard thanks to graduates with median earnings of $77,600, more than $22,000 above where expected. Harvard falls to number four with median wages of about $87,000, around $12,000 more than the estimated median salary. Villanova, Babson, Bentley, Otis College of Art and Design, Alderson Broaddus University, Lehigh, Texas A & M International University, and California State University-Bakersfield round out the top 10. Princeton and Yale fall to 772 and 1,270, respectively, with graduates who earn less than the researchers’ models would have anticipated.

So what has been going on at Yale?  Well ... there are different takes on one of the recent things.

Friedersdorf is clearly unhappy with what he considers a shift within student bodies at elite schools (i.e. American colleges) toward totalitarian impulses.  Nora Caplan-Bricker over at Double X on Slate considers the Yale case to be different from others and an example to follow.


But the student protests at Yale are different. They are not primarily about censorship; they are about students who feel disenfranchised and vulnerable using the language of “safe spaces” to claim a very basic right—the right not to face discrimination in their own homes. They are a call for the university to treat racism (and sexism) on campus the way it would treat most any other overt threat to the mental health and well-being of its students.

Vulnerability can be anywhere, certainly ... wondering a little about disenfranchisement at Yale, though.  Could not "check your privilege" apply to literally anyone who's able to attend Yale to begin with?  Or has placing at 1,270 changed things?