marcus westbury

my life. on the internets.


The problem with censorship: it’s not working

November 9th, 2009 by marcus


How can we make reasonable judgments about what is acceptable when censorship laws can’t keep up?

Australia’s censorship and classification regime has a major problem. It is breaking, if not broken. It needs a serious national debate, not the simplistic solutions or opportunistic point-scoring that tend to characterise debates around censorship. The problem is obvious but complex and rarely discussed: the classification system no longer works as it was intended to.

Australia’s whole system of censorship and classification is premised on an assumption that is simply no longer true: that by regulating the points of sale, distribution, importation and exhibition of books, images and films, Australia can control what people are able to view, read and experience. That is how censorship used to work: by physically stopping people from seeing bad things and by blocking their distribution.

Today we can no longer do that. There are simply too many points of distribution for this approach to work. Books, films, images and games can be ordered from anywhere in the world and sent by courier or mail order or downloaded directly on to a personal computer or any number of other devices.

This presents a very practical problem. Every day, either deliberately or unintentionally, Australians bypass the Australian classification system by ordering content through iTunes, Amazon, eBay or millions of other sites and services that have never been classified here. Many ostensibly legal services exist in a grey area: they are classified under the system of their country of origin but have never been classified in Australia. That is to say nothing of the availability of unclassified, pirated, illicit and illegal material that has deliberately avoided classification.

As artists know well, censorship issues are never far from the news, and yet the frenzy that surrounds them often works against any actual debate, analysis or reflection. There is nothing like a censorship scandal to get something out of the arts ghetto and on to talkback, into the tabloids, on current affairs shows and into the news pages. The hyperventilating, run-for-the-moral-high-ground panics that have accompanied the Henson case, the proposal for a national internet filter, sedition laws and the debate around Australia’s ban on R-rated computer games have all recently demonstrated that noise can often squeeze out actual discussion of process and priorities.

Our attitudes to censorship are evolving all the time. We have largely forgotten that we once banned books such as Lady Chatterley’s Lover and The Catcher in the Rye. We look back almost fondly at the Oz trials and quizzically at the scandal that befell Eugene Goossens or that accompanied Graham Kennedy’s infamous Crow Calls, while simultaneously whipping up a frenzy around skits on Hey Hey It’s Saturday that in another era seemed innocuous.

Yet our capacity for censorship is eroding much faster than our attitudes are changing. Simplistic responses such as the proposed national internet filtering scheme are often more an excuse to avoid the problem than a solution. The genie is well and truly out of the bottle, and attempts to reassert the old paradigm over the new reality are destined to fail. Such a scheme, which will almost certainly never actually happen, takes little energy to bypass and rests on the flawed assumption that everything on the internet is identifiable, catalogued and classifiable.

The real debate we need to have is much more grounded in reality. What are our options and priorities when censorship and classification are detached from control of distribution? Censorship is becoming increasingly about symbols and messages and not about regulation. What are the community standards in such an environment? How do people make reasonable judgements about what is and isn’t acceptable as the reality moves ahead of the law? How do we educate parents about the complex realities rather than falsely reassure them with simple rhetoric?

If anything, the extreme cases are actually the easiest to deal with. We have rightly sent a message that child pornography is unacceptable by infiltrating networks and arresting and prosecuting those involved in its dissemination and collection, but such an approach can only work for the most extreme of cases. Pursuing every unclassified piece of media, every foreign cut of a film, every R-rated video game in a similar way would be practically impossible and highly undesirable — a recipe for selective enforcement and abuse of power.

There are no simple answers, only complex ones. We need to have a bit more of a real conversation and a little less high-minded rhetoric sooner rather than later.



3 responses so far ↓

  • 1 Mark Newton Nov 9, 2009 at 10:03 am

    Hi Marcus.

    Oddly enough, I started thinking about this from exactly the same premise (that censorship is increasingly impossible) as a response to Donald McDonald’s unspeakably hilarious proposal to classify iPhone apps last month.

    I jotted some thoughts there:

    I know it’s a pretty simplistic and hopeful proposal (“the politics would disappear”), but at the end of the day any national debate about censorship which includes a suggestion that censorship is even possible anymore is intellectually bankrupt, so all we’re left with is classification. And since the CB has made such a terrible hash of that in the 20 years since the Internet went mainstream, I can’t see why they should hold a monopoly over it.

    – mark

  • 2 Rebecca Cannon Nov 23, 2009 at 6:26 pm

    I think the solution is self-classification, as we’re doing over at

    Sure, it’s an illegal, trust-based system at the moment, but it works.
