The Delta Snake Review

Saturday, June 14, 2014

If music reviews are purely opinion, do people take reviews seriously?

The answer to that question is yes... and no.

For a new artist who has a new release out, it can be a matter of life or death to have that release reviewed. Not because an aspiring star wants a good review, though they and the label prefer that, but that the release is mentioned at all.

People can't buy what they don't know is out there.

So simple as it seems, that's the primary purpose of a review. It lets the buying public know this particular release is available. It's part of the buzz.

Of course, the record label knows that when the consumer sees that a new release is out, the first thing they think is: is it worth my money?

Notice that I didn't say: how good is it?

The fact is most people don't really care what a critic thinks, and the problem with a lot of critics is that they think that's their job: to tell the public how good the music is.

I'll use the blues genre as my primary example of why the latter type of critic is useless.

The reason, speaking as the former editor and publisher of the Delta Snake Blues, is that when someone sent me a letter agreeing or disagreeing with my review, it was obvious they had already bought the record.

I started the newsletter for the same reason many people create music newsletters; I was an avid fan of the music, was a serious record collector, and I wanted free records. To get those I had to be a reviewer.

When I started the newsletter in the early 80s, there weren't that many blues publications, so the blues labels gave me excellent support. They bought advertising and sent me plenty of records. In fact, so many that I eventually had to form a reviewing staff and farm them out.

One of the first things I noticed is everybody reviews records differently.

I tended to follow the early Rolling Stone model. That is to say, there was a formula. First you wrote a paragraph or two about the group, then briefly described their earlier recordings if they had any, then went on to describe the record being reviewed and how it differed from, or resembled, the others.

There was a section where you described if you liked the record or not, but you always tried to make sure that the record was described in the context of the group's history or discography.

There are good reasons for this type of review. The most important is that the consumer could tell you weren't an industry shill who simply reworded the press release.

The practice of using press releases and pre-written reviews was pretty common in early teen magazines and it's made a huge comeback on the Internet.

If you Google a product and read the various reviews that come up, you'll find that a lot of them are fairly similar, and in many cases some don't even bother to change the wording of the company's press release. This is particularly true of guitar reviews.

Many of the records and CDs I got were accompanied by nice press photos, and almost always a sheet of review excerpts, such as "his finest work in many years" or "songs that will become classic in the coming years," and so on.

If your review had a phrase that was really nice, the label would ask if they could use it too. Some of my blues reviews are still up on the Internet because they were positive, which is a perfectly fine practice, but more often than not you got critics who would write reviews with the intent of being quotable and worthy of inclusion in the press release.

The reason was simple: critics like being famous too, just like everybody else. They want their opinion to be respected, and America is one of those countries with a public that's absolutely addicted to, and respects, "expert" opinion.

When I began farming out the reviews, like I said, I found people reviewed releases differently. Some flat-out told me they didn't want to write a negative review, so in their minds they would only "emphasize the positive." That approach can ignore faults that would almost certainly bother a consumer, like a bad stereo mix, out-of-tune instruments, or a singer who's tone deaf.

Others wanted to make a mark, so they often went heavily negative, while still others simply read the liner notes on the back of the LP cover and more or less wrote their own version of it.

There were a dozen other variations, but suffice it to say, when I read some of the reviews as an editor, it became obvious that most of them were not very useful for the consumer.

For one thing, the reviewer always has to be aware that they're writing for a market. If you're writing a review of a blues record, it's not really your job to evangelize and get people to like the blues. The review is mostly going to be read by somebody who is already into that music.

What the blues listener really wants to know is: is it worth spending my money on, and what does it "sound" like? In other words, is it Chicago-style blues, folk blues, rock-influenced blues, and so on.

After 16 years of writing blues reviews, I still can't say I ever got a real handle on what a perfect review was. The closest is still the 60s Rolling Stone model.

The reason is most of the people who buy music will be fans, either of the group or the genre. What I found is they don't want to know whether it's good or not, but what it "sounds like" and whether the group is competent. Does it sound like the group's last record, or have they changed direction?

Details like audio quality weren't as important in the blues, as many of us blues fans still love the reissues of the old 78s and are used to scratchy noise and lousy sound. Though whether that sound quality is truly bad would easily be the subject of a full blog post on its own.

When I say the consumer is interested in what the record sounds like, it also means that the blues fan wants to make sure it's a blues record and not music by a rock band that decided to do blues, or by a group that was simply calling it a blues record. The latter happened a lot in the 80s.

In other words, the kind of review I found the consumer appreciated most was similar to a QA inspection report.

Is the record what it says it is?

Which makes sense. For example, as a teen I was a fanatical Hot Tuna fan. I was aware that a lot of people didn't like the group and didn't understand that it was a form of country blues mixed with psychedelia, and if a reviewer thought the upcoming album was lousy, I'd have bought it anyway. I might agree with the reviewer later on if I was disappointed in that particular record, but it wasn't going to stop me from buying it.

There are actually two very insightful observations about reviews made by two very different rock artists.

Todd Rundgren once said that new rock records should be reviewed by young reviewers, not by older critics who had heard it all before. The reason being that rock 'n' roll really is a basic sensibility that stays the same but is expressed a little differently with each generation. The new generation hears it as an exciting new sound and energy, but the older critic has heard it before and tends to be cynical and critical about it.

That was one thing I always had to guard against when reviewing a blues record. I was familiar with blues records all the way back to the 20s, so some group in the 80s doing probably the three-thousandth version of Dust My Broom could easily come off as boring, even if they did succeed in making it sound a little different or infused with fresh energy. You had to try to listen with a set of "new ears," so to speak.

Frank Zappa had the other incisive comment, which is also basically true, especially for genre music. He said that people often picked music as part of a range of choices that made up a lifestyle.

So if someone was really into the idea of being out of the mainstream, they made sure that their fashion sense was very different, and the music that they liked fit the lifestyle. It didn't matter if the music was good or not.

For example, being a hard-core punk meant ratty, ripped-up clothes and being into the correct type of superfast, dissonant music.

That whole concept wasn't just Frank Zappa being cynical. I was in a punk band for a couple years, and what Frank said was right on target.

Same with the blues. Particularly in the 70s and early 80s, the younger bands worked very hard to be good at the genre as opposed to revolutionizing it. That's nothing to criticize, actually; it was just how you wanted to play the blues.

Though I will say that during that period, the predominant sound among the younger bands was West Coast-style blues, which had swing elements. That was often a turn-off to those who liked the harder-core Chicago blues, so it was important in my reviews that I identified the type of blues being played.

So the most important point in my review wasn't whether the record was good, but making sure that the consumer knew what style of blues it was.

So as far as reviewing music goes, one has to be aware that you're writing to a market, but that the audience has a wide variety of motives for wanting to hear or buy the music. That's why every review draws both praise and criticism: many fans see it either as a validation or a criticism of their taste. Whether or not it's a bad record unfortunately tends to come out after purchase.

So in the reviews that I wrote, at least the ones I had the time to really do a good job on, I went for the early Rolling Stone type.

The reason it was a great formula was that it gave the fan of the group an idea of where the group was heading, or a warning that the group had changed the sound they liked.

For the buyer who had never heard of the band, it gave a clear idea of where the music was coming from, what it sounded like, and whether there was a historical context to it.

Making sure the review included plenty of context made it informative, so at a minimum it was entertaining to read.

The last point was my bottom line. The reviews were only part of the main business at hand, which was getting more readers and advertising (to be honest). If the reader enjoyed the review and found it informative, I figured that's as far as I could take it, and even if I did say it was good or bad, I knew that in the vast majority of people's minds, that was something they were going to decide, not me.

People like to hear from another human being what they think about something. The important thing to remember is that it's like any other advice you offer a friend or stranger.

In the digital age it's almost unnecessary to have critics. On iTunes you can hear a minute of a song, which is about as much time as a record company A&R guy spends listening to a cut on a new demo, and there's no better critic than you as far as what you want to spend your money on.

Even if there have been cases when the critics were right, that statement holds true. The consumer is still the best critic.

One of the most famous examples is when the Stones released Exile On Main Street. It was widely panned and hated by critics.

The public disagreed, of course, and several of the cuts became hits (or at least FM hits). These days the general consensus is that it's probably the Stones' finest work, but the record simply was what it was; it's just that the critics didn't like it at first, and now they do.

So that example tells you what a record review can or cannot be in a nutshell.

If you read a review on my blog, I'm going to assume that you're your own best critic, and concentrate on telling you the other details and what I personally think of the record. Especially since you're almost certainly going to be able to go to the music sites and hear samples of the record yourself.

In the digital age there's really no point anymore in telling a consumer whether a song is good or bad. The digital age has brought back the old listening booths, where a person could listen to a single before buying it.

But then, you might hear the cut, and sort of want a second opinion. 

In which case this critic stands ready to help.