Thursday, August 28, 2014

Throwback Thursday: Extinction of a species

Woman's Death Marks Extinction of "Cub" Species 
Last person alive when Cubs won World Series, she's the end of an era

(Chicago, IL, July 18, 2026) -- Ludmilla Sverovla never saw the Chicago Cubs win a World Series. In fact, she never saw a baseball game of any kind. But when the lifelong resident of Sumy in Northeastern Ukraine died Friday at the age of 117, a baseball milestone came to an end. Not only was she the world's oldest living human, but by virtue of having been born on September 28, 1908, she was also the last person on the planet who was alive when the Cubs last won the World Series in 1908.

The next oldest living human, Roger Sklyver of Switzerland, was born on October 21, 1908, just one week after the Cubs defeated the Detroit Tigers 2-0 to win their second - and, to this date, last - World Series championship.

The impending cloud of extinction casts an apocalyptic pall over watching a Cubs game
To fully appreciate the magnitude of this event, one would have to go back to October 11, 1907 - the day before the Cubs defeated the Tigers 2-1 to win the first of their back-to-back World Series - to find a date on which not one person on planet Earth could be said to have lived to see the Cubs triumphant.

"This is truly staggering," said Trevor Sagacious of the Sagan Institute in California. "What we're witnessing here is the human equivalent of an extinct species. The idea that a professional sports team could be so inept that every single person on the planet would have died between championships is almost unfathomable."

Experts have scrambled to find some biological event to compare to the failure of the Cubs, but have so far come up short. "There are turtles still alive that were born shortly after the signing of the Declaration of Independence," said historian Bruce Brauer of the History Channel. "Even when we look at the Passenger Pigeon, probably the most famous extinct species, we fall short on comparisons to the extinct Cubs World Series survivor."

Brauer continued, shaking his head several times in apparent disbelief. "You'd expect that at some point in history the last Civil War veteran, the last World War I veteran, the last signer of the Declaration of Independence, would die. That makes sense. These were once-in-a-lifetime events, not to be repeated. The World Series is different. There a team has a chance to win each year. Granted, with the number of teams in baseball today, even if a different team won each year, you'd have a team that had gone at least 29 years without a title, then 28 and so forth. But to go 117 years without winning? To do so for so long, in fact, that there's not one person alive on the entire planet who was around the last time you won? To have an entire race of people die out without seeing the team take the Series? What are the odds?"

At least 15 to the eighth power against it happening, according to mathematician Skip Loover of the University of Chicago. "We had a hard time developing a program to calculate the odds, frankly," Loover said. "Every time we tried to do it the computer would come back with a statement that we were asking for a mathematical absurdity. It was like trying to calculate the final digit of pi. Finally we had to develop a program that disabled the logic inhibitor, and that's how we arrived at the number. Even then, the computer included a comment at the end that said, 'Why Bother?' I guess that's how a lot of Cubs fans feel."

Sagacious said that government intervention was the only possibility of regenerating the rare species, and even that was a long shot. "Entire generations had come and gone without witnessing a Cubs victory, but this is ridiculous," Sagacious said. "Imagine, if you can, that babies could be born with the gift of speech and intellect, and that for the last hundred or so years a baby, emerging from the womb, said, 'Before I die, I just want one thing - to see a Cubs victory.' At the moment of birth, with that baby's entire life ahead of him or her, in essence you're telling that child, 'You're out of luck, kid.'

"A whole race of people have become extinct - those who were alive when the Cubs won. It's nothing short of a tragedy. The government has to do something - but, to be honest, I'm not sure what. Even if you tried to federally mandate a Cubs victory, they'd probably find some way to screw it up."

Those Cubs fans who hoped this 2026 season would be the year seem to be coming up short once again. This year's edition of the Cubs started the year with a ten-game losing streak, and already finds itself 17.5 games behind the three-time defending champion Pittsburgh Pirates in the National League's Central Division. But, as one Cubs fan told us on Rush Street today, there's always hope.

"Wait 'til next year," 87-year-old Max Driver of Arlington Heights said. "There's still my unborn great-great grandson to think about. I just hope I live long enough to pass this great love of losing baseball down to him, to continue this time-honored tradition. Go Cubbies!"

Originally published October 16, 2007

Monday, August 25, 2014

Wish I'd written that - pigskin edition

"What's all the stink over the Redskin name? It's so much [expletive] it's incredible. We're going to let the liberals of the world run this world. [...] It's all the political correct idiots in America, that's all it is. It's got nothing to do with anything else. We're going to change something because we can. Hey listen, I went through it in the '60s, too. I mean, come on. Everybody lined up, did this. It's fine to protest. That's your right, if you don't like it, protest. You have a right to do that, but to change the name, that's ridiculous. Change the Constitution - we've got people trying to do that, too, and they're doing a pretty good job."

Mike Ditka, on pressure to change the name of the Washington Redskins because it's supposedly derogatory to Native Americans.

Put that in your peace pipe, Keith Olbermann, and smoke it.

Friday, August 22, 2014

Policing the police

Fox News's John Stossel recently had some interesting things to say about how the police have handled the violence in Ferguson. I think Bobby's right about the irresponsibility of the young man killed by police, but Stossel sounds a cautionary note that the police are hardly blameless, either, suggesting that the increasing militarization of police forces everywhere tells us something about what's happening in America, and that ain't good:

[The Cato Institute’s Walter Olson] notes that a man identifying himself as a veteran from the Army’s 82nd Airborne Division reacted to video of police in Ferguson by tweeting, “We rolled lighter than that in an actual war zone.”

If authorities arm cops like soldiers, they may begin to think like soldiers -- and see the public as the enemy. That makes violent confrontations more likely.

Again, this doesn't excuse the violence of that segment of protestors who are looking for trouble (cough-cough-Al Sharpton-cough), and Stossel makes clear that lawlessness is never acceptable.  But this whole issue with the police is incredibly troubling, something that should have been addressed quite some time ago; but better now than later.  It's further evidence of why the Founders thought we needed a Second Amendment, and proof of their wisdom in understanding that the government, no matter who's in charge of it, should never be completely trusted.  As Benjamin Franklin once wrote, "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."  We know that this quote has been misappropriated many times, for many different reasons, but there's still something to it, don't you think?

Where do we stand on that today?

Thursday, August 21, 2014

Throwback Thursday: Hitting the wrong (?) note

A couple of weeks ago, Joe Queenan (full disclosure: one of my favorite writers) published a very funny, very nasty piece in The Guardian in which he attacked modern classical music. (I particularly liked one of his lines in which he describes concert-goers who "have learned to stay awake and applaud politely at compositions by Christopher Rouse and Tan Dun. But they do this only because these works tend to be short and not terribly atonal; because they know this is the last time in their lives they'll have to listen to them."  On the other hand, I'll admit six years after the fact that I actually rather liked Satyagraha.)

Needless to say, Queenan's piece has caused something of a stir among classical music blogs. It's quite possible, dim bulbs that some of them are, that they didn't realize Queenan's role as a resident cultural curmudgeon. But the debate has been nothing if not spirited. Terry Teachout, while taking Queenan to task on his total dismissal of modern music, agrees that "I don't go in for crunch-and-thump music, nor do I care for the over-and-over-and-over-again minimalism of John Adams and Philip Glass, which puts me to sleep." (I agree with a lot of what Teachout says about the modern music he likes.) Teachout's friend Ethan Iverson, who champions "frequently fiercely dissonant and somewhat tuneless" music, disagrees, as Teachout attests. (By the way, there's some terrific writing going on from these bloggers, so don't think that the links I've chosen here are anything more than the tip of the iceberg in this discussion. And check out the writers they link to as well - you might not agree with them, as I didn't, but you'll be informed about the discussion by reading them.)

I suppose the way you feel about classical music in general will dictate your answer to this question. If you feel that it's a window to God's creation, for example, you probably tend to side more with Jay Nordlinger, who wrote that

Music critics and other such types like to say that what the public really wants is modern music — Cage, Birtwistle, Stockhausen. None of this Classical Top 40 stuff. But this is wishful thinking, of course. If you give 'em Tchaikovsky, Beethoven, and "Finlandia" — not to mention "Carmen" or "William Tell" — boy, do they come.

As is often the case, I gotta go with Jay on this. (Even though there's plenty more good stuff out there than the "top 40" that you hear on classical radio nowadays - hey program directors, you ever heard of "deep cuts"?) Yes, there are twentieth-century composers that I'm very fond of - Britten, Stravinsky, Copland, Rorem, Barber, Menotti, even Ligeti and Webern. Now, there are a lot of critics who would complain that this music isn't "modern" enough for them (except perhaps for the last two). But, as we've written many times at this site, there is an undeniable relationship between truth and beauty, particularly the natural beauty in the tones that mirror the rhythms of the human body. (And we touched on modern music as well, as this link to our four-part roundtable from last year on "Art and Politics" will attest - though since this initial writing I've come around on several pieces, including Nixon in China, which I now agree is one of the seminal 20th Century American operas.)

And so we are left with this question to ponder: is there, in fact, a relationship between the harshness of modern music and the harshness of our modern culture? For we have become a harsh people, in our words, our opinions, our very way of life. Some call it edgy, but others might suggest it's merely nasty. Is that a good thing? Is our culture really better off now than it was fifty years ago?

I'm just asking.

Originally published July 29, 2008

Tuesday, August 19, 2014

Where have all the bloggers gone?

As I've been known to say from time to time, I get my ideas from unusual places.  This time it's from World Soccer Talk, one of the places I frequent for my soccer fix, where an interesting article notes that 55% of the 100 best soccer blogs, as chosen in 2011 by the English newspaper The Guardian, are now gone.  For those of you who are mathematically-challenged, that means 55 blogs that have either disappeared altogether or have ceased to exist for all intents and purposes,* and another 12 are considered endangered.

*The definition used in the article is that no new articles have been published since March, a period of time that would have included the recent World Cup.

As for why so many highly-regarded blogs have disappeared, the author suggests a number of reasons ranging from social media (many of them are still alive via Twitter feed) to media consolidation (the best bloggers leave the independent blog field to take up with larger sites such as Grantland).  The 30+ sites that continue have carved out their own niche, filling the need with content that remains unique and valuable.  It's an analysis that makes a great deal of sense.

Blogging does seem to ebb and flow.  I don't think there's much question that there aren't nearly as many blogs around today as there used to be; I know a good many that I've had to remove from the sidebar over the years, many of which were around long before either of mine started.  For many, it's a case of real life intruding - if you want a good blog, you have to devote some time to it, and that isn't always the easiest thing to continue, especially when you're doing it for little or no money.  Others have said what they have to say, and have nothing more to add.  Still others probably get bored a few months after the novelty wears off, and leave the site deserted.

My ideal writing forum always has been, and remains, the novel.  I've written two so far, and I'm at work on a third.  The latest one is probably going to be the first one published; I'm about a third of the way through it, and the subject matter is time-sensitive enough that I'd like to see it in print sometime next year, probably through my own publishing efforts.  My second book was almost self-published a few years ago, before I started changing jobs and moving across country; it needs no little revision, but it has good bones and might make it next year as well.  The first book I finished likely will never see the light of day, at least not without major changes.  It served its purpose, though - it taught me that I could write a story from beginning to end, and fill all the pages in between.  Add to this a couple of non-fiction books I hope to get from It's About TV, and you can see how the blogs can be something of a challenge.

In fact, I probably could get a great deal more writing done if it weren't for these two blogs.  I don't get paid for them, and every word I type for free takes away from one I could get paid for.  So why do I continue?

With It's About TV, there's really no secret.  I get a great deal of pleasure writing about television, and I think the blogging has helped me not only to refine my writing style, but organize my ideas for the inevitable books that will grow out of the blog.  It may take away some of the time I'd spend on my novels, but in no way do I think it is a waste.  It is, as I've said before, one of life's simpler pleasures.

As for this site, honesty commands me to disclose that I've seriously considered closing shop at least twice.  The first ending was averted when I rebooted the site from one with a focus on religion to that of a general cultural awareness, which included religion but also politics, television, movies, books, and sports.  When I spun the TV portion off into its own blog, I considered wrapping this one up again, but then I looked at the outstanding contributors that keep it running - Bobby, Steve and Drew - and there were so many pieces that each one had written that I loved, I couldn't see the sense in letting them fade away.  Even when It's About TV became the primary blog and I could have abandoned this one, I decided instead to give it a face lift, a new name, and redouble my efforts to keep the content coming.  Indeed, I don't know if you've noticed, but we've already surpassed last year's number of posts, with over four months still remaining.  The two blogs have, in fact, an equal number of posts, though this one can make that claim only because there are four of us working on it.

Eventually the time will come when this blog stops being a useful form of communication, when Bobby and Steve and Drew decide they've got other things to do, and when that happens I might pull the plug, or keep it going on a much more limited basis.  Likewise, there may be a point when I feel It's About TV has run its course, when it's completed its mission of demonstrating the link between television and American culture and it becomes time to move the discussion to the book format.  But until then, I enjoy the challenge of succeeding where others have given up, and as we come up on the tenth anniversary of In Other Words (!), it's my goal to continue writing here for as long as I can.  Every time another blog fades away, I become more determined than ever to keep this alive.   Besides, we'll never stop having opinions, and as long as that's the case, we'll keep expressing them here.

Don Pardo, R.I.P.

Was there a better voice for television than Don Pardo's?  He was able to project warmth, authority, and credibility all at the same time.  He could announce a game show like the original Jeopardy!, while he could also break the news of JFK's shooting, and be absolutely the right voice for both.

Many of us, of course, know him best as the announcer and occasional foil on Saturday Night Live, and there was more than one time when his introduction of the cast was the best thing about the show.  It was then that his voice called not for authority, but humor - and the little tremor he put into it was just right.

Was he the most famous television personality that nobody would have recognized?  He did appear on-camera occasionally on SNL, but most of what he did was behind the scenes.  Nonetheless, he was elected to the Television Hall of Fame, and rightfully so.  The fine obit in The New York Times mentions that new SNL cast members “couldn’t wait to hear their name said by him,” according to Lorne Michaels.  It must have been like a kid growing up dreaming of playing baseball in Yankee Stadium and having his name announced by Bob Sheppard.

That Times bit also tells us something of what early radio and television was like.  Pardo, of course, got his start on radio, as a staff announcer.  But if you think that was a simple job, waiting around to give the time and station ID, you're wrong.  "As a staff announcer, he did more than introduce shows and read commercials. The announcer also played the role of engineer, getting the radio programs going and cuing up the right bits at the right time. If you could not do those chores, he said, you would not last as a radio announcer."  I wonder how many of our radio and television personalities could do that today?

His voice will continue in reruns of SNL, but much of his other work is lost, either literally in the sense that the shows (like Jeopardy!) no longer exist, or lost because his voice can't be used again - there's no occasion to play a pre-recorded Pardo introducing a show like SNL, because the cast names are all different.  I haven't watched SNL for years - in fact, though I think of myself as having a pretty good memory, I literally can't recall the last time I saw it.  Probably when Dennis Miller was doing Weekend Update.  But Michaels says the show will present a tribute to him this fall, and I'll probably tune in for that.  For SNL is one of TV's longest-running shows, and Don Pardo was its longest-running cast member.

Here's a wonderful appearance he made on Weird Al's video I Lost on Jeopardy (along with original host Art Fleming), which gives you a pretty good feel for the original show.  I love his line, "You don't even get a lousy copy of the home game!"  As is always the case with parody, it's the little details that make the difference.

Here is his voiceover on NBC television bringing the first news bulletin on JFK:

And here he is talking about getting the SNL gig:

Don Pardo died at the ripe old age of 96, one of only two people (the other being Bob Hope) to have a lifetime contract with NBC.  He had a great career, working right through the end of this season's SNL.  And best of all, he sounds like he was a good man.  R.I.P., Don Pardo - we'll miss your voice, and we'll miss you.

Cross-posted to It's About TV!

Monday, August 18, 2014

The lack of responsibility led to the Ward and Brown cases

“For wrath killeth the foolish man, and envy slayeth the evil one.” – Job 5:2

Sifting through my thoughts on the week's news, I looked at notes in my Ryrie Study Bible, purchased while at a college retreat and now falling apart from a decade and a half of consistent use, and thought of this verse while working through the news in the Canandaigua, New York and Ferguson, Missouri cases. In each case, a man died from wrath, and the nation's outrage has become the issue. Upon review of each story, it was evident that wrath played a role in both cases.

In the Canandaigua incident, in which some want the “assailant,” Anthony W. Stewart, 43, of Columbus, Indiana, charged with manslaughter, the disturbing video evidence proved negligence not on the part of Mr. Stewart, the 1997 Verizon IndyCar Series1 and 2002, 2005, and 2011 NASCAR Sprint Cup Series champion driver. Instead, it showed that the driver allegedly struck by Mr. Stewart, Kevin Ward Jnr, 20, had first lunged towards a second driver (the #45 car) during the 360 sprint car event (360 cubic inch displacement engines, a lower-cost series versus the 410 cubic inch cars), then vented his anger at Mr. Stewart by lunging in front of the #14 car, in dim lighting and a dark firesuit and helmet, leaving little (if any) time to react to his actions. The incident has forced Mr. Stewart into hiding, while his operations (Stewart-Haas Racing, Tony Stewart Racing, Eldora Speedway, True Speed Communications) are run by associates in the interim. Some want him found guilty when it is evident that Mr. Ward, lunging in road rage, crossed the line, forcing motorsport authorities to tighten rules regarding driver behaviour when exiting the car (which, in theory, will lead to more fights in the pit area). In motorsport, a driver involved in an incident is often restrained by circuit officials if he attempts to charge at another competitor. The Canandaigua Motorsports Park officials did not restrain Mr. Ward in any manner, which is disturbing considering most circuits would have officials do their utmost to prevent competitors from charging at other cars.

In the Ferguson incident, reports are that the teenage assailant and a friend entered a convenience store (where the most popular items sold are tobacco, alcohol, and gambling tickets) and committed a strong-arm robbery in which a full cigar case, valued at $50, was stolen. The clerk in the store called police and likely called it an “armed robbery.” When questioned, the suspects attacked the policeman (the cigars were stolen, and the suspect may have looked too young to possess tobacco), with reports that the shoplifter and his accomplice resisted arrest, part of the struggle involving the policeman's gun. A policeman who could be held defenseless by an assailant has a right to defend his weapon, which he did, keeping it away from these assailants who were going after his gun; the policeman being shot with it was a real possibility.

Think about a motorist who travels US Highway 21, which I travel often, with an 88 KPH speed limit on the two-lane road. If you are traveling at 130 KPH, more than 41 KPH over the limit, that's a six-point penalty (0-16 over is two points, 16-40 is four points, 41 and up is six points; twelve points is a suspension). If you resist arrest and go on a high-speed chase, you will face additional police tactics to stop your antics. Reckless driving will result in police using severe tactics to stop the dangerous driving, similar to what happened when the policeman had to stop the assailants from attempting to use his gun to attack him.
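(For those who like to see the arithmetic laid out, the point tiers in that analogy can be sketched as a small function. This is purely an illustration of the tiers as described above; the function name, the 88 KPH default, and the exact boundary handling are my own assumptions, not any state's actual statute.)

```python
# Hypothetical sketch of the demerit-point tiers described in the post.
# Tier boundaries follow the post's description: 0-16 over = two points,
# 16-40 over = four points (read here as 17-40), 41 and up = six points.
def speeding_points(speed_kph, limit_kph=88):
    """Return demerit points for a given speed over the limit."""
    over = speed_kph - limit_kph
    if over <= 0:
        return 0       # at or under the limit: no points
    if over <= 16:
        return 2       # first tier
    if over <= 40:
        return 4       # second tier
    return 6           # third tier: 41 KPH and up over the limit

SUSPENSION_THRESHOLD = 12  # twelve points brings a suspension

# The post's example: 130 KPH in an 88 KPH zone is 42 over - six points.
print(speeding_points(130))  # 6
```

Half of a suspension in one stop, on the tiers as described - which is the point of the analogy: the further past the line you go, the harder the response.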

Someone who looks under 18 years of age carrying a full box of cigars, let alone having stolen the tobacco, automatically arouses suspicion, since tobacco sales are prohibited to those under 18. And with the actions that took place, as proven by cameras, it was a serious case of stealing, which even our Ten Commandments prohibit. However, in today's humanist teachings, there are no standards. Students are taught there is no right or wrong, that everything is based on feelings. The looters forget the most important part of the Ferguson story: Mr. Brown and his accomplice first engaged in the strong-arm robbery, which in effect was shoplifting. That's stealing, and it violates Exodus 20:15. But once again, the Bible is not taught in schools today. Popular culture has turned away from God's Word in favour of relativism.

This leads to an article this past week from the Southern Baptist Convention's Ethics and Religious Liberty Commission by Collin Garbarino of Houston Baptist University2. Mr. Garbarino, a professor, assigned his students a paper based on Thucydides' account of the war between Sparta and Athens, in which Athens' attack on Melos, after the city refused to join its side, led to the annihilation of the island. The professor found the students unable to see how Athens' moral failure was rooted in its self-centred arrogance, and how democracies can only stand on a moral fabric. Such is why we have urban arrogance in an attempt to push a new humanist religion as the state religion, and attempts by activists to eliminate the moral fabric by wiping out standards based on the Bible and instead imposing standards based on man.

Consider, then, the moral relativism of those who ransacked Ferguson in the name of “social justice”: they want what they want, and refuse to be responsible for the behaviour that led to the death of the robber.

In both the Michael Brown and Kevin Ward Jnr cases, both men were irresponsible. In Mr. Ward's case, he charged in anger at race cars traveling at safety car speed, which led to his own death for not following correct protocol after a crash; he wanted to take up the situation with Mr. Stewart immediately rather than wait to see what had happened. In Mr. Brown's case, he committed robbery, and when suspicion arose that he was involved in illegal behaviour, he showed no responsibility, resisting arrest and even threatening the policeman.

Whatever happened to responsibility? Neither man was responsible after his incident, and each gave in to a rage that led to his own death. We need to be responsible now. And moral relativism is allowing a lack of responsibility to dominate. Think about it.


1INDYCAR attempted to run the season so that it ended at Indianapolis; that plan, which had been proposed for the 1996-97 season, was to open with Loudon in 1996, run a five-race schedule that continued through Las Vegas and the 1997 races at Disney World and Phoenix, and finish at Indianapolis with the 500. During the change to a calendar-year schedule, INDYCAR instead decided to finish the season at the end of the calendar year, meaning new races in Fort Worth, Fountain (Colorado), and Charlotte, along with the two established races at Loudon and Las Vegas, would end the 1996-97 season; INDYCAR-sanctioned seasons since have been calendar year only. Mr. Stewart's title is written per the style guidelines established by INDYCAR. Please note that “INDYCAR” in all capital letters specifically refers to the sanctioning body, formerly known as the Indy Racing League until the end of the 2013 season; the name change was formalised in the 2014 INDYCAR rule book.

2The article refers to a movie that is now on DVD; when Mr. Chang attended the movie as part of a church trip in March, he discovered it was a feature-length advertisement for Vivendi Universal Music built around its storyline. Obviously, in the twelve years since taking his first voice lesson, learning more of church music from friends such as Ingrid Schlueter, and reading theologians on church music history, which effectively separates church music into antebellum, postbellum ('gospel song'), and modern emotionalism eras (OCP, GIA, and the Michael Jackson Family Trust that control modern church music), he cannot justify a movie that promotes the secularised music too often played in churches today.

Friday, August 15, 2014

Bad radio

The "homers, screamers, and shouters" on radio have long been a target of this blog. As the collegiate gridiron season starts, last November's Iron Bowl derby (Alabama vs Auburn) showed, oddly, the difference between a professional who has worked at the national level and a homer. Auburn's Rod Bramblett was an example of the bad homer, while Alabama's Eli Gold went national with his call, which was appropriate considering he has worked college and pro games (and other events) at a national level. (Mr. Gold was on the CBS roster from 1997-2000, serving mainly as the #2 announcer for NASCAR Truck Series races on broadcast television while being their #1 for Cup races on cable.)

That's part of the reason why I've criticised the lack of national-grade radio broadcasters remaining in sport today. It shows how far broadcasting has deteriorated when homers replace national-calibre announcers, and yet most television broadcasters of the past came from a radio background, and still keep that radio background in their calls. The big irony is that Mr. Gold used the derby's name in his call, while Mr. Bramblett did not even reference the derby's well-known title.

Thursday, August 14, 2014

Throwback Thursday: The moral responsibility of the writer

The May 2007 issue of First Things has an intriguing article by Ross Douthat entitled “Lost and Saved on Television.” Douthat writes about the underlying questions of religion, morality and salvation (some obvious, others allegorical) that appear in several of today's most successful TV shows, such as Lost (obviously, judging from the title of the article), Battlestar Galactica, and The Sopranos (full disclosure: these aren't shows that I watch, although any good cultural archaeologist would certainly be familiar with them). It's a good piece, one that should be read on its own merits.

However, of particular interest, especially to the aspiring artist, is the following section:

The question, of course, is whether the audience gets the point, or whether The Sopranos’ faithful viewers are in it for the same reasons the mobsters are: the adrenaline rush that comes with any violent or sexual encounter, no matter how degrading it may be. This is the problem for any artist who seeks to show sin as it is. Does depicting an act make you complicit in it, even when you stand in judgment? Last Tango in Paris makes loveless sex look like hell on earth, for instance, but there are still people who watch it for titillation, just as there must be some segment of The Sopranos’ audience – young men, in particular – who spend their time cheering on the killers, identifying with the mobsters instead of profiting from their hell-bound example.


[I]s it the chance to see the story of Christ’s Passion as Mel Gibson reimagined it – blood-drenched and harrowing and brilliant – worth giving the same R-rated carte blanche to Quentin Tarantino, or worse, the makers of torture-porn thrillers like Hostel and The Hills Have Eyes?

I don't know how far Douthat intended to go down this particular avenue, but here we have an issue that works on many levels, radiating from one central question which Douthat asks: Are you glamorizing sin? We've talked often in these pages about the relationship between art and the artist, and the moral responsibility thrust upon the artist by his art. This gets dangerously close to Paul Drew’s territory (Nazi artists and whatnot), so we'll defer to him for the most part on the historical analysis.

But one cannot look at this without thinking of art as a creation, and the artist as creator. And while the idea of art for art's sake is an old one, it would seem that at least a secondary effect of art is the depiction - the revelation, if you will - of the artist himself. Art doesn't create itself, and it seems as if separating the art from the artist, even if one could do so, would leave the creation incomplete, lacking in some fundamental way. For example, we cannot know that God is good simply by looking at His creation, but we can know that His creation is good by looking at Him.

And therefore, one must read into the creation itself the personality of the creator, which in turn will tell you not only about the creation, but the creator as well.

Taking this back into the world of art, specifically the medium of the written word, it seems safe to say that much of what a reader knows about an author comes from the author's own words. The conclusions they draw about the author are to a great extent based on what they read, and that judgment of the author's character in turn helps to determine the weight to which they give those words.

So the writer returns to the question posited by Douthat - are you responsible for how people interpret your art? Can you plead innocence, even in cases where you ought to know better, as to what that interpretation is? Can you be held responsible for drawing your readers into, say, the proximate cause of sin? As Douthat asks, "Does depicting an act make you complicit in it, even when you stand in judgment?" For the author who attempts to portray man's rise from sin to salvation, what kind of risks does he assume when he takes on the mantle of sin itself?

This is an issue that confronts me directly in my (as-yet unpublished) fiction, one story of which features as its heroine a stripper, another with a professional assassin as the protagonist. What can be gained, despite the literary quality of these stories, by delving into such territory? Can it be justified by invoking the name of Art Itself? And does the author assume the responsibility for everything the reader takes from his work, even if it runs contrary to the author's own desire?

Is there a psychological or sociological justification which can be cited, for example, the desire to explore the Big Question? For one with a fertile, inquiring mind it is a road that beckons invitingly. Certainly I think such a case exists, or else I wouldn't be investing time in it myself. I think that in terms both of self-expression and the desire to lead the reader into areas of the mind that might not previously have been considered, the author has a responsibility to honestly confront these issues as best he can.

It's possible, of course, that this could also simply be some kind of self-justification wrapped in denial. I wouldn't dismiss it.

But there can be no denying that the work will rub off in some way on the writer, and will color the impression of said writer in the eyes of the reader. Which is why, once again, it is so important for the writer to assume responsibility - good or bad - for what he writes. If the writer wants to be taken seriously, if the writer seeks to influence or inspire, if the writer intends to pose serious questions to which he requests serious answers - all this will depend on how he conducts himself, both in public and in private. His writing may appear to apply only to the public arena, but surely the reader will interpolate its contents into the writer's private life as well.

So do I worry about glamorizing sin, about making sex too sexual and violence too violent? Absolutely. It presents a constant struggle within the creative process. A good many writers whose work I admire appear to go through similar struggles, with varying outcomes - some of which I question, some with which I disagree totally. I can never know completely how they arrived at that process (although in the confessional world of the blogosphere I can come closer), and so I must content myself with my own personal struggles.

But this is a question worth posing, not only for the moral theorist but for the creative writer. In choosing the subjects one pursues and the words one sets down, the consideration of the effect these words have on those who read them can never be put far from one's mind.

Or, in other, simpler terms, think before you write.

It's a lesson those in the blogosphere could mull over more often.

Originally published April 25, 2007

Tuesday, August 12, 2014

How "elite" higher education is destroying our society

I got this link from Mitchell over the weekend; he knew this was bait for another "Scam Alert" on higher education.

It comes from the website of the Dallas Morning News, where William Deresiewicz pulls back the curtain on the scam that is an Ivy League education.  It's a damning indictment, and you really ought to read the whole thing, but this excerpt should give you an idea of what Deresiewicz, who used to be on the faculty at Yale (and thus, knows of what he speaks), sees in today's "elite" higher education:

Our system of elite education manufactures young people who are smart and talented and driven, yes, but also anxious, timid and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.

Now, we used to think of college as the place wherein young people prepared to face the world, and in an ironic way the Ivy League has done exactly that.  For Deresiewicz describes perfectly what kind of a world we now live in.  Look at U.S. foreign policy, for example - don't the words "anxious, timid and lost" describe it to a T?  Of course, what else would you expect from a society that displays these very traits?  One has only to look at the breakdown in religious belief to see a people with no sense of purpose, little to no curiosity, no idea why they do what they do.  Remove belief from life, and that's what you get.


When I speak of elite education, I mean prestigious institutions like Harvard or Stanford or Williams as well as the larger universe of second-tier selective schools, but I also mean everything that leads up to and away from them — the private and affluent public high schools; the ever-growing industry of tutors and consultants and test-prep courses; the admissions process itself, squatting like a dragon at the entrance to adulthood; the brand-name graduate schools and employment opportunities that come after the B.A.; and the parents and communities, largely upper-middle class, who push their children into the maw of this machine. In short, our entire system of elite education.

Deresiewicz speaks of the students he taught at Yale, and once again it's hard not to see the cause-and-effect, in that the system is producing exactly the kind of society one might expect when it's made up of young people who bring these experiences into the world.  There were many wonderful young people,

But most of them seemed content to color within the lines that their education had marked out for them. Very few were passionate about ideas. Very few saw college as part of a larger project of intellectual discovery and development. Everyone dressed as if they were ready to be interviewed at a moment’s notice. [...] Look beneath the facade of seamless well-adjustment, and what you often find are toxic levels of fear, anxiety and depression, of emptiness and aimlessness and isolation. A large-scale survey of college freshmen recently found that self-reports of emotional well-being have fallen to their lowest level in the study’s 25-year history.

Well, it's hard to argue with any of this, especially when we don't even know what college is meant to achieve:  "Is it just about earning more money? Is the only purpose of an education to enable you to get a job?"

In the midst of the gloom, Deresiewicz offers the start of the solution: "I’ve come to see that what we really need is to create one where you don’t have to go to the Ivy League, or any private college, to get a first-rate education."  You may or may not agree with his conclusions, but be sure to go here and read them.

Monday, August 11, 2014

War as a metaphor for war

This extraordinary photo, of gas-mask-wearing soccer players (likely soldiers playing the game during some training or down time), accompanies Brian Phillips' equally extraordinary account of soccer during World War I, "Soccer in Oblivion," at Grantland. For me, World War I has always held more fascination than any war other than our own Revolution, because the cultural implications are so distinct.

While it's important to acknowledge, as Spengler does in this Asia Times piece, that World War I wasn't necessarily any more horrific than other wars of the past, at least in pro-rated manpower, it's also true that the Great War inflicted a kind of cynical somberness that the world likely will never recover from.  The thought of God looking down on His creation, all of it, doing its damnedest to tear itself apart, is a sad one, perhaps one of the saddest that a religious person can imagine.

The money quote from Phillips' story, which should tell us everything we want to know about the war, and about ourselves, then and now:

Never such innocence again. But we still make the same mistakes, because we still understand war through analogy and our analogies still fail. Now we see it as a video game, or we see it as a component of the NFL’s set of minor paraphernalia, jet flyovers part of the same combo pack that includes beer commercials and classic-rock riffs. We’re still trying to make the metaphor work, only now we’re doing it in reverse, endlessly describing games in terms of who conquered/eviscerated/bombed/slaughtered whom. It’s the same old trick, though. It’s a way to hide the horror under one layer of spectacle and another layer of moral virtue — a way to pretend that war is like a game, that there are rules, that there is safety. A way not to look into oblivion. We missed the cruel irony in all those soccer balls that show up in World War I photos. Nothing is a metaphor for war. War is a metaphor for nothing.

Make no mistake - war, no matter how horrible, is sometimes necessary.  But it should happen sparingly, and without celebration.  As Robert E. Lee once said (perhaps apocryphally), “It is well that war is so terrible, or we would grow too fond of it.”

Friday, August 8, 2014

Flashback Friday: The cult of celebrity

The cult of celebrity is something that we've alluded to here from time to time, most often in the idea that celebrities, like everyone else, are role models whether they like it or not. And then there's what we've called the "Oprafication" of life, the wear-your-heart-on-your-sleeve mentality that substitutes feeling for thinking.

Nowhere was this more apparent than in the death of Diana, Princess of Wales. In "The Dianification of Modern Life," Theodore Dalrymple of The New Criterion sums up what a commenter refers to as the "vapidity of celebrity culture." It strikes at one of those things that seems so very wrong about our society today:

Her death provoked a reaction of sociological and psychopathological interest. Her combination of inaccessible glamour and utter banality (on her own admission, she was not very intelligent, and it was evident that she had no taste for threateningly elitist intellectual or artistic pursuits) appealed to millions of people. Apart from the fact that she was extremely rich and married to the heir to the British throne, she was just like us. Her personal tribulations were just like ours: at base, rather petty and egotistical. She was the perfect character for a soap opera, in fact, and those who ‘grieved’ after her death were really protesting at the deprivation of a large part of the soap opera’s interest.

He also makes an excellent point about how this very tendency was leapt upon by Tony Blair, eager to find the hook for his prime ministership. It was Blair who made popular the phrase, "The People's Princess," and as Dalrymple shows, the impact on Britain (and Western culture, for that matter) has been extensive:

In the orgy of demonstrative pseudo-grief that followed her death, Mr Blair said that the people had found a new way of being British. Indeed so: they had become emotionally incontinent and inclined to blubber in public when not being menacingly discourteous. They had come to believe that holding nothing back was the way to mental health, and their deepest emotional expression was the teddy bear that they were increasingly liable to leave at the site of a fatal accident or at the tomb of someone who had died in early adulthood.
A wonderful phrase, that: "holding nothing back was the way to mental health, and their deepest emotional expression was the teddy bear." Whether or not you agree with that, you have to like the way it rolls off the tongue. Say it a few times, and you'll probably begin to see the truth of it as well.

Dalrymple begins his essay with a quote from the (virulently) atheist Sam Harris, who writes in his book The End of Faith, "Three million souls can be starved and murdered in the Congo, and our Argus-eyed media scarcely blink. When a princess dies in a car accident, however, a quarter of the earth’s population falls prostrate with grief."

That is the scandal of modern life, the scandal of the cult of celebrity, the scandal for the believer. And it is a scandal, because we can do better than that, we are capable of far more than we are showing. If we fall into these traps, if we sanctify the dead simply for being dead (as Dalrymple puts it, "How could anyone who personally hugged people suffering from AIDS and was against the planting of landmines not be a force for good?"), then we give those like Sam Harris no reason to look further into the eternal truths of Christianity. If the only object is to feel good - well, you can get that anywhere, can't you? Hardly seems worth needing a Redeemer, what?

Life is a challenge, and oftentimes the only way to meet that challenge is through reason, combined with faith in the power of the mind. Sadly, thinking is something that seems passé all too often now.

Dalrymple's essay comes from a fascinating forum at Britannica Blog, "Diana and the Cult of Celebrity." It's a discussion well worth checking out.

Originally published on August 29, 2007

Thursday, August 7, 2014

Classic Sports Thursday - remembering 1984

It's only fitting Classic Sports Thursday goes back 30 years to the Olympics . . .

BREAKING . . . Pauley Pavilion, the venue in question for today's article, was flooded by a water main break in California. The main floor is damaged and out of commission; a temporary floor will likely be used on campus for the time being.

With the Soviets boycotting the Olympics, Edwin W. Pauley Pavilion (the gymnastics venue) became an East Bloc showcase for Romania, the sole survivor of the boycott. What we didn't know then was that the Romanians were falsifying ages just to compete; it took 18 years, the fall of the Iron Curtain, and an innocent marriage license application in Cobb County, Georgia to confirm the truth. (Daniela Silivas entered senior gymnastics competitions at 13, when the minimum age had just been raised to 15; after her engagement to Scott Harper, and in the run-up to her wedding, she admitted the age falsification to Cobb County authorities - she was 31, not 33 as her international competition passport said.)

Yet when Romanian coach Bela Karolyi defected during a 1981 tour of the United States, he and his wife Marta settled in Texas. Little did they know that a 13-year-old rising star from West Virginia - the daughter of Jerry West's teammate on the 1959 West Virginia basketball team, and a Hershey's National Track and Field Games participant - would become their most celebrated project. Recovering from arthroscopic knee surgery just six weeks beforehand, she made a superhuman comeback in time for the Games. Little did we know the legacy that sixteen-year-old would leave thirty years ago, on August 3, 1984, when she defeated Ekaterina Szabo by the slimmest of margins with a perfect ten on the vault, earning the nickname of matriarch of American gymnastics.

Who would have known that Mary Lou Retton, now 46, would be the mother of rising American elite gymnast McKenna Kelley, who tied for the win at an elite meet in February?

We celebrate her victory at Pauley Pavilion this week.

Wednesday, August 6, 2014

What a tangled web we weave...

In the pages of the Asia Times, the columnist Spengler doesn't mince words in describing the mess that is American foreign policy:

The United States has misunderstood everyone in the world outside its borders and mismanaged everything. It has done so with a bipartisan consensus so broad and deep that it has no opposition except simple-minded isolationism. America gets unwanted results — most recently in Iraq - because it wants the wrong things in the first place. And there seems to be no way to persuade Americans otherwise. The crumbling of the Iraqi state will provide yet another pretext for mutual recriminations among political parties. The trouble is that both parties wanted the wrong thing to begin with.

Not pretty words, but sometimes the truth hurts.  Find out what both parties wanted, and the inevitable denouement to which it may all lead - it's well worth spending a few minutes - here.

Tuesday, August 5, 2014

Good news in the old "Our Word" family

If any of you are long-term enough readers to go back several years, you might remember some very funny and very good posts written by one of our old contributors, Kristin. (Check this one out, for example.)

Sadly for us, Kristin's stay at the blog was too short, and she went on to bigger things.  I wouldn't have thought they were necessarily better, though - after all, what's better than this blog?  Well, Kristin found something a lot better than that.  A month or so ago, our friend tied the knot, and she's now happily married!  I knew this some time ago, but sadly I've neglected to broadcast it to our readers until today.

I'm so happy for you, Kristin.  Best of everything for the future.  And if you ever want to make a cameo appearance or two, you'd be welcomed back with open arms!