Wearing my EFA Board Member hat, I spoke today at an event at Parliament House hosted by the Menzies Research Centre in a debate with Tony McLellan of the Australian Christian Lobby. The audience was primarily members of the Australian Liberal Students Federation: young Liberals destined for jobs as political staffers and politicians.

Below is the text of my part of the debate.

Let me begin with a short anecdote.

On Monday night as we watched Four Corners and Q&A, my not-quite-13-year-old daughter, Hannah, made a particularly interesting observation. “Gee, Dad,” she said, “I think I’ve just seen more rude pictures in that story than I’ve ever seen on the Internet.”

Hannah has been using the Internet since she was four.

Certainly, much of that time it has been under our supervision, but increasingly it’s not. When Hannah uses the Internet, she uses a connection at home that is completely unfiltered: we filter neither at the router nor through the fairly comprehensive parental controls that come as a standard part of modern operating systems. She has administrator access to the machine she uses, and she also knows and understands how to access and manage the home network.

Knowing I was coming here today, I conducted something of a straw poll of that observation amongst friends and acquaintances with kids of a similar age. I deliberately avoided asking only “’Net savvy” parents.

Universally, the experience was the same: none of our children had ever inadvertently encountered pornographic or other offensive material on the Internet, let alone material of the kind that falls under the umbrella the National Classification Code defines as Refused Classification. None of the children had filtered or managed Internet connections. All of them used computers placed in public spaces in their homes and several had their own computers in their rooms.

The most recent research into public opinion on the filter, carried out by the Safer Internet Group, which consists of Google, the Internet Industry Association, iiNet, the Australian Council of State School Organisations, the Australian Library and Information Association and others, shows a marked increase in doubts about the filter amongst parents.

There is significant opposition to the government’s filter as proposed. Parents instead want greater education options and at-home filtering first and, as a next-best option, an opt-in filter. Mandatory filtering runs a long last.

So too, our friends internationally have come out publicly against the filter as it stands, most notably the US Ambassador to Australia, Jeff Bleich, speaking on Q&A. Ambassador Bleich, an internationally recognised authority on human rights, was particularly clear when he said:

“We have been able to accomplish the goals that Australia has described, which is to capture and prosecute child pornographers … without having to use internet filters. We have other means and we are willing to share our efforts with [the Australian government].”

The arguments of the government and its supporters in favor of the filter regularly hang on the matter of RC material. On this, I’d like to first highlight two matters of interest that seem to cause some real confusion.

First is the myth that all RC material is illegal. This is simply not true.

The fact is that of all material classified RC, it is only material depicting the sexual abuse of children that is illegal to own. For good reason. No reasonable person in today’s society believes that such material is suitable for adults to access, let alone children.

Material that falls under the RC umbrella is unquestionably sometimes distasteful or controversial, or contains or depicts concepts of an adult nature: drug abuse, explicit material about abortion, guides to assisted suicide, violence. Whether you personally approve of such things or not, none of this material is illegal to possess in this country; it’s perfectly legal for me or you to own a copy of Baise Moi or The Peaceful Pill, just not to make it available for sale.

Yet the filter seeks to change this. Our classification system in Australia is something that largely works and is designed to empower adults and minors alike to make appropriate, relevant choices. When implemented (and make no mistake, the government’s plans for the filter are far from abandoned) it will take away adults’ ability to decide for themselves whether or not to access material that is, by and large, legal in this country.

Second is the fantasy that stumbling across RC material on the public web is something that occurs with frightening regularity. It’s not even easy to stumble across R- or X-rated material, not all of which is pornographic in nature and none of which will be targeted by the filter. You have to go looking for these things very deliberately. Looking for material that is RC is harder still.

The material the government proposes to filter is, in some cases, completely appropriate to access. For that which is not, child sexual abuse material, it is well known that the criminals who trade in this material do so using tools and protocols that will not be touched by this or any other filter. Rather, criminals trade their materials in private networks.

Additional dollars and human resources for law enforcement by the Australian Federal Police ought to be supported. It is only through the diligent and successful efforts of the AFP and its overseas collaborators that those people purveying child sexual abuse material are apprehended and put in jail where they belong.

Let’s look in turn at a number of the other issues around the proposed filter.

First, the matters of cyber-safety, education, self-determination and digital citizenship.

There is no question that as adults and particularly as parents, we wish to protect our society and children from danger and from exposure to deeply offensive or inappropriate material. Certainly, as a father, this is paramount in my concerns.

In order to do this, I have a responsibility. As a parent and a member of society, it is incumbent on me to educate myself, my child and those I come into contact with about issues such as good digital citizenship and appropriate online behaviors. Doing so helps us, particularly, to protect ourselves from threats the filter will not even address: cyber-bullying (and bullying in the flesh-and-blood world), online predators and identity theft.

These issues are certainly much higher in the minds of the parents, teachers and students I speak to regularly as a part of my work than are matters like RC.

Despite the marked increase in this country of policy that erodes our freedoms, pushing back against personal determination and our ability to make decisions for ourselves, the fact is that the vast majority of Australians are not complete dullards who need the Nanny State to tell them how to run their lives. Rather, they are perfectly normal, intelligent people who are capable of self-determination, of critical thinking and decision-making.

Australian parents are largely not irresponsible and incompetent at bringing up their kids. Most of them are entirely the opposite, doing a fine job of parenting and making appropriate decisions about child rearing. They are perfectly able, as parents and adults, to decide what is and isn’t appropriate for their children to see online and elsewhere. Equally, they are able to teach their children, with help from educators, law enforcement and others, how to behave as reasonable digital citizens.

The millions of dollars the government proposes to spend on the filter, a technology that will not actually work as advertised and will be easily circumvented, would be far better spent on law enforcement and on thorough programs for teachers and parents: programs that educate them about risks, teach them how to manage their own and their children’s access to the Internet and about appropriate online behavior, and, where they wish to, show them how to filter their own computers directly and by choice. Filtering at home is provably the most effective form of filtering, and it places the power firmly in the hands of individual people rather than in the hands of a government.

More than one research study, both here and overseas, provides strong evidence that the risks to minors of exposure to unwanted material, by which I do not mean only illegal material, are considerably overblown. Children are not irreparably damaged by seeing things that may be distasteful or inappropriate online, particularly if they are surrounded by a framework of parents, mentors, educators and other support services that can help them make sense of these things.

Even if some form of filter is ultimately introduced, it would be far better if it were opt-in rather than mandatory, as it was in Labor’s original pre-election policy. This leaves the decision-making in the hands of parents, where it belongs. Indeed, many opponents of the current filter scheme have stated that their objections would largely be mitigated if opt-in were the choice.

I don’t want to spend a great deal of time on the technology, as the concepts here have been argued at length and in detail by others. Suffice it to say that, in spite of Senator Conroy’s arguments to the contrary, there are major technical issues with the filter that remain unanswered, or answered in too little detail to be satisfying.

All of these issues require evidence-based, thorough answers.

The blacklist itself is problematic on a number of fronts. These too have been discussed at length, but let’s look at them briefly.

The list is secret. In a world where open government in modern democracies is receiving significant attention, this is, at the very least, interesting. We hear arguments that a secret list protects us from exposure to the URLs that contain the offensive material. However, if the URLs are filtered, in what way do we risk exposure? The argument fails its own logic. Beyond that, it’s simply offensive to me to think that any government believes I am incapable of enough independent thought to determine which URLs I do and do not visit.

By its very secrecy, if my website ends up on the blacklist, I am unable to know how and why it got there. It’s also unclear how I get off the list if I’m there unjustifiably. What happens if someone opposed to your political views or faith manages to get your site onto the list?

Secret things have a tendency to leak through the cracks. The blacklist has already been leaked once. It’s not inconceivable that it will happen again. And again. And again.

The list is tiny. In a world where the public web now runs to trillions of pages, a list of around 10,000 URLs barely scratches the surface of any pool of offensive, let alone illegal, content that may exist.

Which brings us to criminal networks distributing child sexual abuse material — I’ve already mentioned this, but it bears repeating — these networks do not use the public web to distribute their wares. The technologies they do use — private networks and peer-to-peer — will not be filtered.

The only effective way the distribution of this illegal material can be stopped is through active law enforcement. The AFP has a highly competent cybercrime unit that could be even more effective if it were the beneficiary of additional funding and resources.

Last, to matters of filtering and free speech.

Senator Conroy, on Monday night’s Four Corners, stated clearly that for the purposes of the filter, his government’s policy was to filter RC content only and that he would be amongst the many voices raised in protest should some subsequent government decide to broaden the scope of the filter.

The filter covers material that is legal in other forms and media. It lacks accountability and any avenue of appeal, which puts it at odds with our open democracy and in marked contrast to equivalent classification decisions in other media, which are open to scrutiny.

While the Senator’s and the government’s hearts may certainly be in the right place, we cannot be so certain about unknown future governments and their thoughts on what could and should be subject to filtering. It is entirely possible that over the long term not only RC material will be filtered; dissenting political voices, matters of taste or the voices of certain faiths may be censored too.

So, here’s a summary of the issues as I see them:

This is far from a simple issue.

I’d like to close with a few words from Will Briggs, an Anglican priest from my wife’s home town of Somerset, Tasmania. Will is a strong voice in the discourse on the filter. He said:

“[This issue] is best [addressed] through clear information, balanced argument, reasoned debate…[on the] multiplicity of issues… [it is] a debate which is not simply about sexual ethics but about freedom of speech, the reductionism of morality, and the role of government in society… by… simplifications in this case [we] look like simpletons.”