Reviewing Science Fiction Books With Statistics

Before the Internet, if you wanted a rip-roaring science fiction novel to read, you'd flip through the books at your favorite bookstore and hope to stumble upon the next mind-blowing story to change your life.  Or you'd ask your best buds which books knocked them into orbit.  True fans subscribed to science fiction magazines and fanzines, reading all the reviews to maintain their status as the sci-fi guru at the local Slan shack.

Back in the 1980s I wondered if there was a better way, and developed a statistical system that I wrote about for the fanzine Lan's Lantern.  I describe the process at my Classics of Science Fiction website.  When the World Wide Web came along I put my lists online, and eventually revised them three times.  The latest list, Classics of Science Fiction by Rank, is now several years old.

Today I discovered SFFMeta.com, a site created by Eric Bouchard that also applies statistics to the task of finding a great science fiction read.  Think of it as Rotten Tomatoes for science fiction, fantasy, and horror books.  SFFMeta is the newest in a succession of websites that use statistics to identify the best science fiction books.  An early endeavor was Tristrom Cooke's The Internet Top 100 SF/Fantasy List, which is now maintained by a new list maker.  Years later came Sci-Fi Lists Top Science Fiction, an excellent polling-type site from the land down under.  I wish their creators would take credit and write about developing their systems.

Each statistician of reading has come up with a different method for identifying good reads.  All of us look for ways to cash in on the wisdom-of-crowds theory.  Bouchard's site is built on the idea that, collectively, a group of current book reviewers will spot the best reads.  I love his simple and elegant web design, and it's one that should grow wiser over time.

Bouchard assembles lists of reviewed books from online reviewing sites.  This produces worthy information now, but it isn't deep enough to show real wisdom just yet.  In other words, his samples are too small.  Rotten Tomatoes gets over a hundred reviewers for each film, but SFFMeta is limited to surveying a much smaller industry, and many books on his list have just one review.

SFFMeta's 90-day lists are a helpful indicator now, but their all-time best books are iffy.  It might take SFFMeta 5-15 years of gathering data with their methods to show genuine wisdom in identifying all-time classics, because they have to wait for old books to be reviewed in new editions, and to be reviewed in numbers as significant as new books.

Their best lists now are the 2008 and 2007 summary lists.  Statistically, it would be wonderful if we could compare them to sales figures and other annual best lists, because that would further reinforce the wisdom-of-crowds concept.  I'm looking forward to the 2009 list.

I made my lists before the Internet was well known, so I had to combine the wisdom of fan polls with the wisdom of cross-tabbing critics' recommended reading lists, along with award lists and other criteria.  We came up with 28 lists, and to get on the final list, a book had to be on at least 7 of those 28 lists.
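The cross-tabbing method described above — tallying how many source lists each book appears on, then keeping the books above a cutoff — can be sketched in a few lines of Python. The titles and source lists here are hypothetical placeholders, not the actual 28 lists:

```python
from collections import Counter

# Hypothetical source lists: fan polls, critics' picks, award lists, etc.
# The real system combined 28 such lists.
source_lists = {
    "fan_poll_a": ["Dune", "Foundation", "Ringworld"],
    "critics_b": ["Dune", "Ubik", "Foundation"],
    "awards_c": ["Dune", "Foundation"],
}

# Count how many source lists mention each title.
tally = Counter(title for titles in source_lists.values() for title in titles)

# A book makes the final list only if it appears on at least
# `cutoff` source lists (7 of 28 in the real system).
cutoff = 2
classics = sorted(title for title, n in tally.items() if n >= cutoff)
print(classics)  # → ['Dune', 'Foundation']
```

With three toy lists and a cutoff of two, only the books appearing on multiple lists survive, which is the whole point of the method: a single enthusiastic list-maker can't push a book onto the final list alone.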

If SFFMeta could find more reviewers and raise their green cutoff to 5 reviewers, I think their accuracy would improve dramatically.  It would also help if they could factor in other indicators besides reviews: sales numbers, awards and nominations, Google citation counts, critical articles, foreign editions, and audiobook editions, for instance.

SFFMeta also faces the problem that most of their cited reviewers are either overly kind and generous, or just plain hate to trash a book.  One positive review can get a book on the list, but it takes three reviews to earn a highlighted green score.  Because the site is new, its 90-day list has only two green-highlighted titles.  Its all-time list covers 100 books, all with green ratings, and one book has 14 reviews.  Statistically that's better than the 90-day list, but not good enough for identifying true classics.
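SFFMeta's exact scoring isn't documented here, but the "green" threshold idea — highlight a book only once it has a minimum number of reviews, regardless of how glowing a single review is — can be sketched like this. The book data and scores are invented for illustration:

```python
# Hypothetical review data: title mapped to a list of review scores.
books = {
    "Book A": [95],              # one glowing review, but only one
    "Book B": [85, 80, 88],      # three reviews
    "Book C": [70, 75, 72, 68],  # four middling reviews
}

# The article notes it takes three reviews to earn a green score.
GREEN_MIN_REVIEWS = 3

for title, scores in books.items():
    average = sum(scores) / len(scores)
    green = len(scores) >= GREEN_MIN_REVIEWS
    print(f"{title}: average {average:.1f}, {len(scores)} reviews, green={green}")
```

Note that Book A has the highest average but doesn't go green, while Book C does despite lower scores. That's the trade-off a minimum-review threshold makes: it filters out single-review flukes at the cost of hiding under-reviewed gems.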

As their database of reviewed books grows, I'd like to see SFFMeta let the viewer manipulate the lists: for example, to see a 90-day list made of books with more than 3 reviews, or more than 5, and so on.  You can eyeball this now, but their programmer is obviously talented enough to do it for us.  I hope SFFMeta can find many more review sites too.  Here is their current list.  Print reviewers, I encourage you to reprint your reviews on the web if possible.

Bookmarks Magazine collects statistics on books via reviewers too, but uses print reviewers in its annual best-of-the-year grid.  Its standout books have 7 or more reviews, and the best of the best have 12 or more, from the reviewers it uses.  This illustrates why writers lust after reviews: any kind of attention helps.  So many books are published that it's hard for most books to get noticed at all, and for some to get noticed by several reviewers is a triumph.

SFFMeta is a dream come true for genre writers because its results further emphasize the best-reviewed books.  SFFMeta is also a positive force for book reviewers.  Be sure to click the titles to drill down to where you can read the reviews.  A major amount of work has gone into this site, and I hope it becomes a huge success.  Hopefully SFFMeta will bring more readers to the reviewers too, and that should help educate the audience for SF/F/H.  It should promote the value of reviewers and maybe bring more into the field.

I've always dreamed of doing something more with my lists, but it's so much damn work.  The latest books on my site are three from 1992.  I'd love to find enough lists to make it practical to identify books through 2000, but that will be hard.  I have 28 lists now.  If I could find 5-7 newer lists, they would catch a lot of new books, but if I left the cutoff at 7 the final list would be far too long.  I'd need to make the cutoff 8-10 lists, and that makes it even harder for new titles to get listed.  My system has its limits.  It tends to recognize the very best of the very best among older books, dropping older titles that are being forgotten and making it very hard for newer titles to be recognized.

If I used a 10-list cutoff, my current list would be 116 books.  If I used 11 lists, I'd get 94 books.  With 7 lists, I get 193 books, far too many to be a real Top 100 SF Books, but look at what gets left off (scroll down to #94 and see).  The Top 100 Sci-Fi Books site has a great overlap with my list, and it does have a few newer titles.  When SFFMeta collects enough reviews and starts matching those two lists, it will be a powerful system with a lot of built-in wisdom.
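The cutoff trade-off above — raise the threshold and the final list shrinks sharply — is easy to explore once the appearance counts exist. A sketch with invented counts (the real numbers would come from cross-tabbing the actual 28 lists):

```python
# Hypothetical tally: how many source lists each book appears on.
appearance_counts = [12, 11, 10, 10, 9, 8, 8, 7, 7, 7, 6, 5, 4]
appearances = {f"Book {i}": n for i, n in enumerate(appearance_counts)}

def list_size(cutoff):
    """Number of books appearing on at least `cutoff` source lists."""
    return sum(1 for n in appearances.values() if n >= cutoff)

for cutoff in (7, 10, 11):
    print(f"cutoff {cutoff}: {list_size(cutoff)} books make the list")
```

Each increment of the cutoff strips another tier of books, which mirrors the 193/116/94 counts reported above: the higher the bar, the more conservative the classics list, and the harder it is for newer titles with fewer list appearances to qualify.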

SFFMeta, if it becomes popular, should help sell books.  I've watched way more little movies since the advent of Rotten Tomatoes.  I was overwhelmingly surprised by how many authors unknown to me I saw listed on SFFMeta.  For old SF farts stuck in the 1950s science fiction world of the Heinlein-Clarke-Asimov triumvirate, it's quite a revelation.  Using the wisdom of crowds should push book reviewing into a new paradigm, but it will make it even harder for a new writer to break in.  One book review will sell books, but now buyers will expect books to be positively reviewed across a database of reviewers.  This could become a dangerous trend.

Books have always competed in a survival-of-the-fittest competition, but now the Internet will push that competition to new heights.  My Classics of Science Fiction website gets 92 hits a day on average, not many, but the traffic builds up over time from people looking for a list of SF books to read.  I've gotten lots of emails over the years from people telling me they use my list to find new books to try.  This helps maintain fans for these older books.  SFFMeta will also create momentum for popular new titles, and hopefully it will help find new readers for the genre by helping them discover exciting books.

Will the wisdom of crowds increase the overall number of readers, though?  The Harry Potter books certainly got more kids to read for pleasure, but I've often heard kids say they couldn't find anything exciting after HP and gave up reading.  It's hard to find books to love, so systems that identify top reads should create new bookworms.  Let's hope so.  Be sure to add SFFMeta to your blogroll.

JWH – 12/20/9