Every so often another "top X" list hits the blogosphere in general, or the edublogosphere in particular. But are these lists even worth bothering to read?
I would argue that the majority of lists aren't worth the paper they're not written on. As far as I'm concerned, a list has to meet the following criteria for it to warrant spending any time on it beyond the cursory 5 seconds it takes to decide whether it's a decent list or not. And the criteria should be addressed explicitly.
1. There should be a good reason for it in the first place. Note the two words there: "reason", and "good". Really, the list should seek to answer a question, which would provide a reason; and the question should be one that is worth asking. That would supply the "good" part.
2. It has to enable you to make judgements without having to do further work yourself. Let me explain what I mean. If someone produces a list called, say, "The top 10 word processors" and then proceeds to list 10 applications, all it does is provide a list. Its only value is in collating the names of 10 applications, some of which you may never have heard of, in one place. You will still have to look at each of the applications listed in order to see whether it is likely to meet your needs.
Paradoxically, therefore, the longer such a list is, the less useful it is. A more useful approach would be to write a comment about each one, perhaps giving its unique selling point, or even a personal opinion.
3. The target audience should be fairly obvious. This point is partly, though not wholly, covered by criterion 1. For example, the reason for the list may be to provide a list of good word processing applications. But the list might be different, or ordered differently, if the target audience was professional writers as opposed to teachers. Believe me, some of the word processing applications written especially with writers in mind are very different from Microsoft Word and its imitators, because they have a different underlying purpose.
4. There should be a rationale underpinning what's included in the list, and the rank order of the items. Even if the rationale is the list creator's personal preferences, I can live with that. What I do find hard to tolerate is the sort of list that's put together by people nominating items to go into the list, but without any indication of who did the nominating, and how many people nominated the same item.
Put bluntly, how do you know that an item, in this case a word processor, wasn't nominated by the person whose company is selling it?
How many of the lists you see promoted actually meet those criteria?
A rather disturbing aspect of all this is that if you look at the standards for functional skills in ICT, many lists published and publicised on the web would not gain their authors a pass at higher than a Level 1 in communications, and even that is assuming a degree of leniency.
In case you haven't come across functional skills before, they are the skills in English, Mathematics and Information & Communications Technology that have been deemed in the UK to be the ones that people need in order to play a full part at work and in everyday life.
It seems to me that a well-written list is potentially a brilliant source of information, and a wonderful example of efficiency in action, whereas a poorly-constructed list is simply a time-waster. One of the unfortunate effects of fast communications is that rubbish is circulated just as quickly as good stuff.
So what would I regard as examples of good lists? I would suggest the following:
- My own lists. Yes, I know that sounds rather egotistical, but I think my lists are good because they meet all the criteria I've listed above.
Of course, that's all a bit self-referential: I devised the criteria and I wrote the articles, so it would be odd if the two didn't match up! You will need to decide for yourself whether (a) you agree with those criteria and (b) if my lists satisfy them.
My published lists aren't quite the same as the example I gave, being more of the "10 things you can do to achieve X" variety, but I don't think that affects my argument. You'll find some examples of what I'm talking about here.
- Here's an example of a good list, written by Larry Ferlazzo, on the subject of collaborative tools. It's clear why the list was compiled, who compiled it in terms of populating it, and there is a sentence or two about each item on the list.
Bottom line: if I'm looking for a collaborative tool then this list gives me enough info to not need to waste time looking at applications which clearly don't suit my needs.
- Here's a list which partly meets my criteria. It's a list of the 50 most influential female bloggers. I don't care for the subject matter much, in the sense that I think drawing up lists of bloggers of a particular gender, race or whatever is divisive, and that people should be judged on their merits.
Nevertheless, that does not fall foul of my criteria. What does is the following set of facts: I don't know who compiled the list, or how he or she is judging how influential these people are. I don't even know if they are in rank order.
Still, even though the comments about each one positively glow, there is enough factual information about each person listed to help me decide which, if any, I will bother to explore further.
Can I give you examples of what I regard as poor lists? Yes, of course. But I don't see the point. What you have here is the following (if I may be permitted to make a list):
- A list of criteria for what makes a good list.
- Three examples of lists which I think are good.
I think it would be interesting to discuss with colleagues and students what they think makes a good list. My main reason for raising this issue in the first place is that I think that the creation of poor lists, and their uncritical acceptance and promulgation by educators, do not provide good role models for students.
I would hope that lists compiled by people in education would be a cut above those you might find in a pub quiz. Alas, I am often disappointed.