Alt_Right White Lite: trolling, hate speech and cyber racism on social media

Cosmopolitan Civil Societies: an Interdisciplinary Journal, Vol. 9, No. 3, 2017
ISSN 1837-5391 | Published by UTS ePRESS



Andrew Jakubowicz

University of Technology Sydney, Australia

Corresponding author: Andrew Jakubowicz, Social and Political Sciences, Faculty of Arts and Social Sciences, University of Technology Sydney, 15 Broadway, Ultimo, NSW 2007, Australia.


© 2017 Andrew Jakubowicz. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International (CC BY 4.0) License, allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Citation: Jakubowicz, A. 2017. Alt_Right White Lite: trolling, hate speech and cyber racism on social media. Cosmopolitan Civil Societies: an Interdisciplinary Journal. 9(3), 41-60.


The rapid growth of race hate speech on the Internet seems to have overwhelmed the capacity of states, corporations or civil society to limit its spread and impact. Yet by understanding how the political economy of the Internet facilitates racism it is possible to chart strategies that might push back on its negative social effects. Only by involving the state, economy and civil society at both the global level and locally can such a process begin to develop an effective ‘civilising’ dynamic. However, neo-liberalism and democratic license may find such an exercise ultimately overwhelmingly challenging, especially if the fundamental logical drivers that underpin the business model of the Internet cannot be transformed. This article charts the most recent rise and confusion of the Internet under the impact of the Alt_Right and other racist groups, focusing on an Australian example that demonstrates the way in which a group could manipulate the contradictions of the Internet with some success. Using an analytical model developed to understand the political economy and sociology of mass media power in the later stages of modernity, before the Internet, the author offers a series of proposals on how to address racism on the Internet.


Keywords: Internet; cyber racism; antisemitism; Australia; regulation; state; economy; civil society; social movements; trolls; Alt_Right; trolling

DECLARATION OF CONFLICTING INTEREST
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
FUNDING
This project was supported by an ARC Linkage Grant (LP 120200115).

Introduction: three online connections

The spread of cyber racism has become an increasingly prominent issue, from Myanmar to India, from the USA to Africa, and throughout Europe. An Australian study, ‘Cyber Racism and Community Resilience’[1] (Jakubowicz et al., 2017) has recently been completed by an interdisciplinary research team from six Australian universities, in partnership with the Australian Research Council, the Australian Human Rights Commission, VicHealth and the Federation of Ethnic Communities Councils of Australia, and in conjunction with the Online Hate Prevention Institute (OHPI). This article draws on that study to reflect on one aspect of online racism, namely antisemitism, and the rise of online neo-nazism. While the full study covers many ethnic, racial and religious groups, antisemitism has been chosen here because the author came across a strong antisemitic attack directed towards him online, and discovered in pursuing its origins some of the real world complexities of fighting racism online.

In the first half of 2017 the global social media industry was struck by an explosive breakdown in its underlying business models, a predictable outcome of its underlying logic. The two behemoths that had charted trajectories of exponential expansion for the previous decade started to freeze. Both Facebook and Alphabet (the conglomerate covering Google, YouTube and many other cutting-edge apps and platforms) were confronted by advertising boycotts (Mostrous, 2017) and huge financial losses, as major global brands found their marketing materials served to sites closely associated with terrorist, racist, homophobic and sexist messages (Alba, 2017). Companies caught out on Google’s platforms included PepsiCo and Wal-Mart. Over a period of a few weeks, the algorithms that had produced billions of dollars in profit for the brands, by finding consumers whose social media profiles best fitted their products, turned on themselves and began destroying the brands’ value. While the platforms reworked their practices to some extent, much of the correction was done by staff over-riding the deeper algorithms. The Verge’s Nick Statt reflected on this position when he wrote, in the midst of the crisis in March 2017, ‘YouTube is now in the position of being structurally incapable of policing its platform and perhaps culturally hesitant even to do so with more heavy-handed moderation methods’ (Statt, 2017).

The deep dynamic that created these misaligned ad placements was compounded by payments going to sites based on the number of hits they attracted. When these sites tripped a minimum-visits threshold, advertisements were served to them, and the site owners received payment from the advertisers through the platforms’ e-business model. For the Alt_Right (‘Alt’ standing for ‘alternative’, a handle developed in the early days of the Internet to describe non-mainstream newsgroups and bulletin boards), the flow of payments from Facebook and Alphabet had become both a motivation for and an unexpected benefit of the way the Internet had developed. While the Alt_Right began in the USA it soon spread globally, including to Australia, where old Right-wing groups found new followers and new impact as social media opportunities emerged that the platforms had not envisaged and could not easily control.

Inspired by Breitbart’s pro-Trump intervention in the US presidential election (Amend and Morgan, 2017, Cadwalladr, 2017), a group of Australian Internet trolls adopted the name ‘TheDingoes’ in mid-2016, and set up a social media network to test how far an Alt_Right strategy could penetrate the Australian online world through a multiple-platform attack on Jews, Aborigines, Muslims, Africans and ‘multiculturalism’. Their fairly marginal presence in Australia compares to the central role that the Alt_Right has achieved under the Trump presidency – thus the title of this article includes the descriptor ‘Lite’, a reference both to their copying of the American model and to the comparatively low level of their political influence. Even though they were small, they achieved over a short period of time front-page coverage in the mainstream Fairfax press, established an increasingly popular and virulently hate-filled podcast presence, and saw a string of their tweets screened on Q&A on ABC TV (@ndy, 2017, Begley and Maley, 2017, Di Stefano, 2017, Di Stefano and Esposito, 2016).

Also early in 2017, following many months of damage to his own reputation while the platform had been facilitating hate speech and racism (Maier, 2016), Facebook head Mark Zuckerberg announced that after a decade of connecting people to friends and family, Facebook would now expand into the business of building communities, while interfering with freedom of speech within communities as little as possible (Zuckerberg, 2017). He extended this viewpoint in an interview on the CNN network in June 2017 (Kelly, 2017), where he reiterated the value of community and his preference that communities set their own standards and that Facebook not be required to police their morality. On this view, individuals would defend their personal spaces from exposure to views they detested by blocking content they did not want to see, and online communities would crowd-source their boundary settings for civility.

The three instances used here – the advertisers’ boycott of Facebook and Alphabet, the emergence of the Alt_Right and its Australian mimics, and Facebook’s attempt to create a new discourse of responsibility to relieve the pressures it was facing – reveal global and local fractures in the Internet. These fractures tie back into the heartland of the Internet, a place now so poisoned by hate speech that even the originator of the World Wide Web, Tim Berners-Lee, has admitted that his dream of freedom has produced a monster of hate, and that his Foundation should now set out somehow to put things right (Berners-Lee, 2017, World Wide Web Foundation, 2017).

How the Internet does its racism thing

These were not unconnected events; together they reveal the layers that contribute to the experience of racism online. The Internet, which reached mass public use in the mid-1990s, was designed as a multinodal communications system that would be difficult to control (or destroy) from any central point. Its technological infrastructure depends on a series of negotiated agreements that facilitate the rapid transfer of data through multiple redundant pathways, employing compatible methods. The infrastructure, and those who manage it through the International Telecommunications Union (ITU), remain agnostic as to what messages are carried through the system (International Telecommunications Union, 2017). The main globally active interventions by states relate to child sexual abuse and violence, and even in this realm there are many spaces that exist outside regulatory effectiveness. More recently there has been a growing focus on the use of the Internet for the recruitment and promotion of terrorist violence, but here too interventions have not been overwhelmingly successful. Racism has moved more to the foreground since the migration crisis in Europe, and the role of race hate speech and its proponents in the 2016 US presidential election revealed the multiple layers of conflict around race and the value placed on the Internet by national and transnational social movements (Brookings and Singer, 2016, Amend and Morgan, 2017, Simon Wiesenthal Center, 2017).

Political economy and cultural analysis of online racism

A 1994 Australian study of Racism and the Media (Jakubowicz et al., 1994), completed in the period just before the advent of the Internet, drew on John Thompson’s model of media power in modernity (Thompson, 1994) to structure its analysis through a layered exposition of the relationship between politics, economics and culture in television, cinema, newspapers, magazines and radio. While the research at that time differentiated between producers and consumers of media, it explored the political economy of the spaces in which meaning is created, that is, where the intention of the producer and the attention of the consumer link the two together. One of the most significant changes in media in the digital age has been the emergence of the prosumer: members of audiences who at the same time produce content, ranging from material that is highly professional through to the billions of often incoherent or sub-culturally coded posts on Facebook and Twitter.

The media operate on a global, national and local level, both reflecting and shaping wider social perceptions, values and understandings. They operate to tie together the state (as regulator and the centre of power), the economy (huge commercial enterprises and their multitudes of satellites) and civil society (as the source of prosumers, the locus of civil interactions, and the space where social movements form). An understanding of racism in cyberspace needs to address how each of these elements magnifies the opportunities that proponents of racist hate speech have discovered.

The single element that most distinguishes the Internet from previous media lies in the location where value is produced (Fuchs, 2017, Curran et al., 2012). While value is produced at the point of production, and much of the industry still manages this through wage relations, a significant part of the value that the Internet economy generates comes from the unpaid labour of Internet users. Much of the content that draws users to platforms such as Facebook has been generated by the users themselves, through their social networking, their commentaries, their images, and their offering of interest profiles (Fuchs, 2009); the resulting clumping of users is what Facebook then sells to advertisers, a key element in theories of modern media (Mattelart et al., 1984).

Our research (Jakubowicz et al., 2017) has described the clustering of Internet users around points of attraction (we label these points ‘sticky spots’) as ‘swarming’. Any process that enables such clustering into online communities (the goal so recently announced by Facebook’s Zuckerberg (Zuckerberg, 2017)) becomes effectively sealed into the underpinnings of the Internet. Any actor or motivator who can create a sticky spot that enables a swarm to form and to stay attached over time becomes a value creator of significant importance in the vast universe of value nodes. While the overall capital value of the Internet is probably incalculable, it remains important to understand that the Internet has become a framework for the creation and circulation of capital, with the multiplicity of value nodes serving a similar function in the digital realm as the factory served in early periods of industrial capital.

In one sense the ‘owners’ of Internet capital comprise those with any capacity to produce sustainable swarms of users whose presence can be on-sold and therefore monetised, including, as it turns out, racists, terrorists, pornographers, and child abusers. The most manifestly successful constellations of clusters are Google (and its associated Alphabet entities such as YouTube and CapitalG), with its linking capacity between things, and Facebook, with its linking capacity between people. Any major attempt to control the spread of racism that requires an adjustment of this political economy may well fail at the outset.

The opportunities that have opened for value expressions on the Internet include space for significant numbers of ultra-conservative and radically supremacist groups and individuals. Nagle (2017) has argued that the new Internet culture of the Alt_Right also includes significant ‘learning’ from the counter-culture of the late twentieth century, a mix of geeks, libertarians and counter-hegemonic cultural warriors, washed through with disengaged psychopaths. A successful player in the race hate field has been the increasingly influential US White Power community associated with the label ‘Alt_Right’, a term that refers back to the earliest period of the Internet, when the ‘alt.’ hierarchy of newsgroups and bulletin boards hosted alternative, non-mainstream discussion. In the next section, I explore what makes the Alt_Right such a pernicious and effective movement, tying together as it does psychological delight in gaming power and a commitment to ‘politically incorrect’ (or rather politically abusive) attitudes on race, gender and sexuality.

Alt_Right and Social Media: the dark side of the web

There is growing agreement that the Internet has played such a powerful role in building the edifices of contemporary racism due to three interacting factors (Klein, 2017):

  1. The political economy of the Internet favours freedom over control, facilitated by technologies that magnify the anonymity of racist protagonists.

  2. The ideology of the Internet has long been flavoured with attachments to freedom without limits, boundless interactions magnified by the heavy ideological and legal commitment of the United States (US) to unfettered freedom of speech.

  3. The particular configuration of activity on the Internet, that places billions of lone individuals before their screens, interacting effectively with anyone they choose, enhances the psychological dimensions of anonymity, disengagement, and dis-inhibition, particularly where we find people whose personalities accord with the Dark Triad or Tetrad (Buckels et al., 2014).

This last, personality-oriented characteristic of web trolls has emerged as the most significant parameter in the growth of Internet hate. Research on the producers of hate speech has identified a triad of associated personality traits: narcissism (a passionate fixation on oneself), Machiavellianism (the enjoyment gained from manipulating the behaviour of others), and psychopathy (a lack of empathy). While in the offline world these traits can occur independently, in the digital space they appear together with ominous regularity. A fourth trait, often identified in conjunction with the others (thus the tetrad), is sadism, the enjoyment of cruelty through inflicting pain on others (March et al., 2017).

The pursuit of targets online has come to be called ‘trolling’ (Stein, 2016). Trolling has become an omnipresent part of the web experience for anyone of any celebrity (or indeed anyone, based on their appearance, clothing or opinions). It began its life when social media for the first time made it possible to identify and harass other people anonymously. The heartland of trolling has been the ‘Politically Incorrect’ (/pol/) board of the website 4chan, essentially a US-based old-style bulletin board which allows people to post ideas totally anonymously, and then see where they run (Hine et al., 2016). /pol/, which serves as ground zero for the rise of cyber hate in the period of social media, has been the development space for aggressive memes, for the capture and re-purposing of Pepe the Frog as a meme asserting the dominance of White Power online (Koebler, 2016), and for strategies to bypass attempts by major Internet providers to develop automatic devices (bots) to block race hate (Pearson, 2016). Moreover, it has been the birthplace of the Alt_Right hashtag and cyber group, identified publicly as overtly racist when US Democratic presidential candidate Hillary Clinton outed the nascent movement during the 2016 US presidential election (Chan, 2016). 4chan, amongst its many other features, has published a dystopic set of ‘rules for the internet’, reflecting the gamer language and self-enhancing narcissism of its many users.

While proponents of race hate have been avid users of the Internet since its inception (Jakubowicz, 2012), the explosion in racist demagoguery since at least 2010 has been fuelled by political, economic and social crises around the world. As identity politics has deepened along the fault lines of race, gender and sexuality, the Internet has become not merely a battleground but a real weapon in the conflicts over resources, power and life choices (Klein, 2017). That is, the Internet has been used to harass targets, including through the practice of ‘doxing’ (the release of private documents), in which the real-world information of targets is revealed so that they can be further pursued, for instance through harassment of their children both online and off.

Racists are more likely to exist in societies that are racially demarcated, with histories of racist oppression, and in hierarchies where race is associated with privilege or liability. Metropolitan societies of former empires are very likely to generate racist hierarchies, though racially-conflicted societies clearly also exist where the remnants of colonial regimes have left unresolved inequalities. Racially-associated resentments can continue for many generations where it has proved as yet impossible to ensure equity between racial groups.

The US, occasionally described after the election of President Barack Obama as ‘post-race’, has more recently seen a re-ignition of racial conflict and racial antagonisms. Seeking to frame the origins and structural continuities of racism, especially for African Americans, Joe Feagin has noted:

In the United States, racist thought, emotion and action are structured into the rhythms of everyday life. They are lived, concrete, advantageous for whites and painful for those who are not white. Each major part of the life of a white person or person of color is shaped directly or indirectly by this country's systematic racism (Feagin, 2014).

First published in 2000, Feagin’s study tracks the structural impact of racism in the US and explores how that structure affects the agency, opportunities and life outcomes of majority Whites and a range of ethno-racial minorities. He denies, though, that the country can be truly analysed as post-racial, reflecting on the 97% of Americans who opted for a single racial label in the 2010 Census (p. 251). Feagin argues that US society has been dominated by a White Racial Frame, such that the narrative of the society and the development of its institutions have been systemically bent towards the interests and perspectives of Whites, especially White men. While Australia has a much smaller historical involvement with slavery than the US (mainly in relation to indentured workers from the Pacific), the idea of White framing that Feagin uses may well prove helpful in reflecting on Australian racism.

Australia has a particular constellation of histories of invasion, extermination, slavery, and exclusion – it was founded by the invasion from the United Kingdom by military forces and forced settlers (Jupp, 2002). Thereafter governments imposed various regimes of racial oppression and exclusion, mainly but not only on Indigenous people, Asians and Pacific Islanders, such that in the formal creation of the Commonwealth of Australia in 1901, racial hierarchy was central to the project and its legislative priorities (Williams, 2012). Thus for at least three generations after the establishment of the Commonwealth, White privilege drove the social, political and economic life of the country. Indigenous people were effectively without recognition until the mid-1960s, a time also when White Australia immigration restrictions began to erode. It was not until the late-1970s that the celebration and defence of White privilege was abandoned by government, though core values and worldviews of Whiteness remained circulating within the society, as an insistent counter-narrative (Jakubowicz, 1997). Multiculturalism was adopted as formal policy in 1978 (Jakubowicz, 2013), though there has been sustained resistance to it among many sectors of the society, which continue to bemoan the reduction in White privilege. Institutions geared towards countering racism have been in place since 1976, though not without resistance and sustained hostility (Soutphommasane, 2014).

An Australian case study: The Dingoes as an exemplar of gaming the system in the name of White power

The example used in this article of the trolling/4chan approach, set up in Australia during the US presidential election, is a project of a group calling itself TheDingoes. Perched on a service provided by .xyz (a new service platform that hosts many thousands of clients), TheDingoes exemplifies all the elements of a state-of-the-art antisemitic and racist online presence; Buzzfeed reported that the founders of TheDingoes were intentionally using as wide a range of social media as they could, skirting rules and testing boundaries, in order to normalise racist hate speech (Di Stefano, 2017, Di Stefano and Esposito, 2016). Typically the members remain disguised behind pseudonyms and delight in their anonymity, particularly the opportunity it gives them to ‘bant’ (banter).

Their use of the .xyz domain demonstrates a close knowledge of Internet trends. The .xyz domain name was released to the general public in mid-2014, as part of ICANN’s refresh of generic top-level domain names. Google adopted it for its corporate Alphabet site, and by June 2016 it was the fourth most registered global top-level domain name, behind .com, .net and .org. The name is managed by a company called Generation XYZ, which describes itself as ‘a global community inspired by the Internet and its limitless potential… to connect with the world in a whole new way… you can focus on connecting with your audience anywhere in the world’. It represents a further layer of defence for users against retributive pursuit by the people they harass.

TheDingoes appeared online in 2016, their website registered in January, followed by a Twitter account in June. A number of the people associated with the group also joined Twitter about that time, including one tweeter whose display image contained the anti-immigration slogan ‘Fuck Off, We’re Full’. TheDingoes (once the name of a 1970s Australian band that left for the US) described itself as ‘#AltRight, but not in the way that violates #Rule1’. Here they refer to Rule 1 of 4chan’s /b/ board, ‘Do not talk about /b/’ (which is also Rule 2); /b/ is the general posting board for 4chan users. They also display ‘88’ on their page, which stands for the initials ‘HH’, a code for ‘Heil Hitler’ (H being the eighth letter of the alphabet). As of February 2017, TheDingoes had 1,461 followers online, had posted 3,640 tweets, garnered 5,507 likes, and was following 442 other tweeters; by September 2017 it had grown to 2,146 followers (gaining about 100 followers a month), with 4,615 tweets and 7,500 likes, though it had culled some of its followed friends (down to 420). The site followed a range of micro-nationalist groups, a raft of conservative online commentators and some ‘lulz’ (a corruption of ‘LOL’, laugh out loud) antisemitic posters, such as one identifying as ‘Goys just want to have fun’, and another as ‘Dachau Blues’, backed by an image of the Auschwitz ‘Arbeit Macht Frei’ sign.

The 400+ Twitter accounts that TheDingoes follow provide a helpful geography, across Australia, the USA and Europe, of both the antisemitic old Right and the new Alt_Right, and show how fragmented, competitive and attention-seeking such groups can be. TheDingoes consciously incorporate three other rules from 4chan’s /b/, numbers 13 to 15: ‘nothing is sacred; do not argue with a troll – it means they win; the more beautiful and pure a thing is, the more satisfying it is to corrupt it’. These rules are compounded by two other insights (Rules 19 and 28) generated by the Dark Triad souls who drive the machine: ‘The more you hate it, the stronger it gets’; and ‘There will always be more fucked up shit than what you just saw’. These views are nihilist rather than conservative, angry and pathological rather than intellectual or analytical.

TheDingoes are aligned with a number of ultra-Right sites in the US, where they posted podcast interviews with former ALP leader and later conservative commentator Mark Latham[2] and National Party MP George Christensen[3], hosted globally on The Right Stuff (TRS) (The Dingoes, 2017). Commenting on the Christensen interview, one poster (self-styled as ‘rabbijoeforeskinsucker’) challenged the interviewers for focusing too much on Muslims (Christensen was strongly anti-Muslim), declaiming: ‘Lot's [sic] of talk about Muslims, not much about the Jew. Are you guys kosher nationalists by any chance? Or are you just cowards?’

The slackbastard blog, an anarchist monitor of far right politics in Australia, reported that TRS ‘obtained its semi-popularity on the AltRight, in part, by its willingness to address The Jewish Question, ie to name the Jew as being responsible for All The (Bad) Things’ (@ndy, 2017).

By February 2017, TheDingoes had become national news, when their TRS interview with Christensen became a point of attack by Jewish leaders on the Australian Prime Minister Malcolm Turnbull, on the day Benjamin Netanyahu, then Prime Minister of Israel, arrived in Australia for an official visit. BuzzFeedNews reported that:

Dr Dvir Abramovich, whose Anti-Defamation Commission tracks anti-Semitism online … was angered to hear Christensen speaking to the podcast network. Abramovich said ‘The Right Stuff’ network started one of the most prominent anti-Semitic memes on the internet, which involved racist trolls putting parentheses, or ‘echo’ symbols, around the names of Jewish journalists. ‘While we do not know what is in Christensen’s heart, for an elected official to be interviewed on a podcast that traffics in bigoted and demeaning stereotypes is deeply troubling,’ he said (Di Stefano, 2017).

While researching racism online this author found himself a target of TheDingoes. I have a Jewish heritage, my immediate family being Holocaust survivors who escaped to Australia, via China, in 1946. Many members of my close family were murdered by the Nazis in Poland during the Second World War. Since 2011, I have been a regular contributor to The Conversation, a global website publishing articles by academics from universities across the world, written for an intelligent lay audience. Over this time, I have contributed over 40 pieces, mainly on multiculturalism and racism. On 6 December 2016, I published a piece on ethnocracy, applying to Australia the ideas of scholars who had analysed Israel and Northern Ireland in terms of their different populations’ unequal access to democracy (Jakubowicz, 2016a, Jakubowicz, 2016b). The article attracted 5,400 hits and 120 published comments. Among the comments (since removed by the community standards guardian at The Conversation) were a number of curious posts. The first, from a Clara Newsom (most likely a pseudonym), asked ‘To what extent are Jews like you overrepresented in positions of power?’. She followed with: ‘I want to know how many university professors are Jews’. Then she wrote: ‘Good take on this over at The Dingoes. “Ethnocracy” is a pretty shabby concept tbh, not worthy of a real sociologist. Looks like more ethnically motivated, anti-Australian animus to me!’ She then posted a link to TheDingoes web page (published on 8 December), which referred to the ‘Skype-rat’ jacubowicz [sic]. The comments continued, ending with ‘What if academics of a (((certain ethnicity))), e.g. are disproportionately guilty of sowing white ethnomasochist memes such as “white privilege” …. Try this instead: the Dingoes link’.

I had not, at that stage, heard about the moves made on 4chan or TRS to label people as Jewish and therefore a problem, or picked up on the bracket creep ((( ))). The (((echo))) device was first launched by the TRS blog in June 2016 (note its logo names it as an ‘Echo Free’ site), as a way of capturing Jewish names ‘that echoed through history’ (Fleishman and Smith, 2016b). The echo brackets are supposed to represent graphically the ringing of a bell that continues over time, enhancing the names inside the brackets with the implied negative impact of Jewish perfidy through the ages. Soon after, a related device was trialled by altrightmedia, which uploaded an extension for Google Chrome called ‘Coincidence Detector’. The extension drew on a regularly updated list of supposedly-Jewish names; wherever one of those names appeared during a user’s web searching, it would be echo-bracketed for that user. This process was designed as a ‘Jew detector for unknowing goyim’, ensuring that those so inclined could see American media and politics draped in the brackets (Fleishman and Smith, 2016a).

So that explained the echo brackets that Newsom had used in her The Conversation comment, and which re-appeared in TheDingoes’ attack. Indeed the story about the brackets had broken in the US and global media some weeks before; in resistance, many people adopted the brackets as a sign of solidarity with the targeted Jews.

But Skype-rat was something else again. As it soon turned out, a new game was being played with Google: the trolls at 4chan had invented a strategy for identifying Jews (and Blacks and Mexicans) by attaching proxy labels drawn from major commercial identifiers on the Internet. Hine and colleagues have studied 4chan, ‘the dark underbelly of the Internet’, and its politically incorrect board /pol/. In particular they looked at ‘Operation Google’, a response to Google’s announcement of anti-hate machine-learning technology and similar initiatives by Twitter, designed to sabotage the then-extant anti-hate strategies (Hine et al., 2016). The Alt_Right trolls proposed using ‘Google’ to replace ‘nigger’, and ‘Skype’ to replace ‘Kike’ (Jew). The call went out on September 22; that day use of the word ‘Google’ on the Internet increased fivefold by word count, while ‘Skype’ doubled. By September 26 the usage had declined, though the words remained part of the /pol/ vernacular. Ten weeks later the word ‘Skype’, added to the old antisemitic label of ‘rat’, was up and running in Australia.
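Shifts of this kind are measured by comparing per-word term frequencies before and after an event. A minimal sketch of that kind of fold-change calculation (the function names and sample texts below are illustrative, not taken from the Hine et al. study) might look like:

```python
import re
from collections import Counter


def word_counts(text):
    """Lower-cased word counts and total word count for a text sample."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(words), len(words)


def fold_change(term, before, after):
    """How many times more frequent `term` is in `after` than in `before`.

    Compares per-word rates rather than raw counts, so samples of
    different sizes remain comparable; 1.0 means no change.
    """
    before_counts, before_total = word_counts(before)
    after_counts, after_total = word_counts(after)
    before_rate = before_counts[term] / before_total
    if before_rate == 0:
        raise ValueError(f"'{term}' does not occur in the baseline sample")
    after_rate = after_counts[term] / after_total
    return after_rate / before_rate
```

Applied to posts collected before and after 22 September, fold changes of roughly 5 for ‘google’ and 2 for ‘skype’ would correspond to the spike the study reported.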

Within two days of my The Conversation article being published, the Skype-rat piece, written by ‘Carl’, had been posted on their website, on Twitter and on Facebook. It contained over 2,000 words focusing on the ‘Jewish Marxist’ race-traitor motif that has been a common trope of neo-Nazis (and indeed the original Nazis). The post opened with my history laid out, the first item being my (((Polish))) parents, under an image of a canopy of photographs of Jews killed in the Holocaust, taken from a Jewish memorial site, the Yad Vashem Hall of Names.

By May 2017 The Dingoes had made it onto the front page of the Sun Herald, a leading Sydney newspaper. The article reported a forthcoming Dingoes conference, where the guest speaker was to be US neo-Nazi ‘Mike Enoch’ (aka Mike Peinovich), founder of TRS (The Right Stuff) and host of The Daily Shoah podcast. The article concluded by quoting ‘Mammon’, a Dingoes spokesperson, as claiming Australia should become a white ‘ethnostate’, a term curiously redolent of my argument which the site had attacked some months before (Begley and Maley, 2017).

In investigating this group’s emergence, I tracked the layers of technology it had used to create the webpage. The site itself does not supply ownership information. The publishing software they use is made by Ghost, a US firm, which describes its mission as ‘to create the best open source tools for independent journalists and writers across the world… Behind the scenes, we're just a group of weird, fun-loving humans who enjoy experimenting with new technology. We believe in creating as much freedom in the world as we can, and everything we do is based on that core principle…’. In an email to me, John, the principal at Ghost, responded that ‘our code of conduct covers our customers, and the user in question is not one of our customers. It’s a person on the Internet who has downloaded our free-of-charge code and made his own website with it… there’s absolutely a legal precedent for Digital Ocean, of whom this person is a customer, to take responsibility for what he publishes’.

I then contacted Digital Ocean, the US company which hosts the site. Despite three attempts the company failed to respond beyond an automated message saying ‘we are not seeing any abuse complaints’. The domain name they used, from .xyz, was not covered by anti-abuse provisions, as ‘XYZ exercises no control over the uses to which a domain name may be put and no control over the content or operations of any website’. A ‘Whois’ search revealed that the domain had been created in January 2016, with an eight-year registration.

Meanwhile another Alt_Right site, XYZ, accused me of being a member of the ‘regressive left’, collecting a series of descriptors from my The Conversation profile (Hiscox, 2016). The XYZ name was chosen as the Alt_Right response to the ABC (Australian Broadcasting Corporation), while also acknowledging its political proximity to the US site Breitbart. The site has, however, no relationship to the .xyz domain registry.

Andrew Anglin, founder of the Daily Stormer (genuflecting in its name to the Nazi Party’s Der Stürmer), described his Alt_Right grouping thus:

Trump-supporting White racial advocates who engage in trolling or other activism on the internet … The core concept of the movement, upon which all else is based, is that Whites are undergoing an extermination, via mass immigration into White countries which was enabled by a corrosive liberal ideology of White self-hatred, and that the Jews are at the center of this agenda (Anglin, 2016).

The bursting forth of the Alt_Right into the sphere of antisemitism online has been paralleled by a rapid rise in threats directed against Jews in the real world, especially in the US. It is important to recognise that the trolling online has close similarities with the real-world attacks being tried out in cities and towns, be it the rapid spread of graffiti on trains and walls, for instance ‘Jew’ scrawled on images of rats stencilled on walls in US Chinatowns to bring in the Year of the Rat (Chernikoff, 2016), or the neo-Nazi graffiti stickers that have appeared in Melbourne.

As part of the same coterie, 4Channers have described themselves as:

the people devoid of any type of soul or conscience. Products of cynicism and apathy, spreading those very sentiments daily. Anonymous is the hardened war veteran of the internet. He does not forgive or forget. We have seen things that defy explanations; heard stories that would make any god-fearing, law abiding citizen empty their stomach where they stand. We have experienced them multiple times and eagerly await their return[4].

The bizarre world of trolls and cyber Nazis has become bound together with centuries-old antisemitism, while the characteristics of the Internet contribute to, and the personalities of the trolls may well ensure, the flourishing of that hate.

Countering racism online: lightening the impact of the Alt_Right White

While authoritarian regimes with strong Internet controls, such as the People’s Republic of China, have almost managed to curb unauthorised racism online (Hornby, 2017), most democratic societies and their associated capitalist economies have found the challenges far greater (McGonagle, 2012a). Constrained by ideologies promoting freedom of speech above freedom from hate, and concerned to ensure economic growth and the profitability of online enterprises, governments have been loath to intervene in cyber behaviours that are not overtly criminal. Indeed the Australian government has rejected options to do so a number of times, most significantly over the European Convention on Cybercrime, where Australia at first flagged, in a draft agreement, and then withdrew acceptance of the optional protocol on cyber racism (Council of Europe, 2001; Council of Europe, 2003). Two major forums in 2002 and 2010 sponsored by the Australian Human Rights Commission heard calls for, but rejected, any legislative intervention on cyber race hate speech (Australian Human Rights Commission (AHRC), 2013; Jakubowicz, 2010). In launching the Children’s E-Safety Commissioner, the Australian government noted that its concerns were for sexual abuse and psychological damage to children, carefully stating that freedom of speech priorities would apply to any cyber race hate issues affecting adults (Parliament of Australia, 2014).

Of course the same government’s relentless battles over its unsuccessful attempts to reduce the protections offered under Section 18C of the Racial Discrimination Act in the period from 2014 to 2017 reinforce public bemusement about the commitment of the state to protecting vulnerable targets from online racist abuse. It was only in August 2017, when the 18C battle had been pushed into the background, that the E-Safety Commissioner was able to initiate action around the threats young people experienced on the basis of race hate. The E-Safety Commissioner found that more than half of Australians between the ages of 12 and 17 had witnessed racist or hateful comments about cultural or religious groups online (Giakoumelos, 2017).

While the United Nations and its agencies such as UNESCO and the ITU, the European Union (European Commission against Racism and Intolerance (ECRI), 2015; Cerase et al., 2015) and other international bodies have recognised the social, political and economic harm caused by online racism (Gagliardone et al., 2015; McGonagle, 2012b; Global Forum for Combating Antisemitism, 2015), there has been little progress in collaboration at the global level between government, civil society and the economic actors, given the resistance by the largest corporations and their agents to any mention of governmental, and especially intergovernmental, regulation. Such collaboration will ultimately be crucial, as unconstrained vitriolic hate speech contributes to declining civility both within and between nations.

Potential Australian initiatives

However there are possibilities at the national level, and locally, in countries such as Australia. There are federal government agencies and departments concerned to enhance social cohesion, reduce the contributors to violence and conflict, and support civil communication. Meanwhile the onshore representatives of the global economic players on the Internet can be brought together, while their industries are also well organised (Internet Industry Association, Australian Interactive Multimedia Industry Association). An argument can be made that oligopolistic economic actors such as Google or Facebook have very significant social obligations, however uncomfortable such claims might be for the owners of those platforms. For instance Napoli has argued, in relation to journalism online, that the preventative and individualistic approach adopted by most of the platforms in relation to their freedom to deliver content as they wish should be replaced with a public interest test for the algorithms used by the platforms (Napoli, 2015). Yet advancing such arguments remains constrained by the absence of national lobbies of civil society actors concerned with such a public interest test.

Civil society has no national peak body concerned with online racism, other than the rather amorphous ‘Racism. It Stops With Me’ network established by the Australian Human Rights Commission. Unfortunately, the most influential civil society organisation, and one that opposes any constraints on hate speech, the Institute of Public Affairs, has close ties to the News Limited print media, where the Alt_Right also has a base on its Sky News cable channel. Their links reveal the close alliance between conservative political forces and major economic actors in the media environment. Logically, then, the broader regulatory context in Australia would be well served by a national body to promote a more civil and protected online communication space, with the authority and influence to negotiate with government and industry to support effective monitoring, responsiveness and, ultimately, intervention.

Online civil society activism depends on information about the extent, form and impact of Internet racism, combined with networks that can support targets, applaud resisters, and pursue perpetrators. Racism works best when it terrifies, fragments and immobilises its targets. Racist propaganda travels quickly through the Internet, cleansing sites of opponents, and intimidating bystanders. It can be slowed when creative interventions and systematic resistance are produced and sustained, racists called out, and their platforms pursued to deny them communication space. The anonymity of trolls currently provides them with their greatest weapon, while the preference of platforms to protect targets by allowing them to cut themselves off from attack leaves the target isolated and the perpetrator free to operate elsewhere with their defamatory and harassing posts and activities left untouched in the growing number of secret cells of private groups.

Governments often do not see racism as important enough an issue to provide the level of investment necessary for civil society action against it. Organisations such as the Online Hate Prevention Institute, identified globally as innovators in finding and outing racism online, struggle to survive as Australian governments focus on protecting children and tracking terrorist recruitment. While these other related spheres are clearly very important, racism online contributes greatly to both of them, threatening children and justifying in the minds of some their recruitment into violence against their racialised ‘enemies’.

If we understand that online racism has all the characteristics of multiple conflicting and competing social movements, then a social movement model might best provide the way to limit its impact and hold both perpetrators and supporters to account. There are many civil society groups, from local churches, mosques, synagogues, temples and ashrams, through online activist groups such as GetUp!, to trade unions, neighbourhood committees, and sports clubs, that have an interest in a more civil and comprehending cyber sphere.

Within the sphere of the economy there are real challenges for the Internet industry: its fundamental economic logic may need to be somewhat unstitched and then recombined in ways that allow the sticky spots, which attract the swarms that deliver the profit, to operate with less negative impact on more vulnerable and less resourced users of the Internet. While governments at the state level have recognised the importance of social media in promoting social cohesion, they have been slow to think through the systematic processes they will need to research, design, implement, review and refine. This article has sought to illuminate why action against cyber racism has so quickly become as important as it has, while suggesting some frameworks for innovation in the context of the way the Internet industry is organised. As civil society decries the rise of the cyber racists, racists are making hay as they discover ever more ways to ‘bant’ the system supposedly there to limit their impact.

Some current proposals in Australia

In this final section a summary is provided of the range of recommendations that the CRaCR project has made to its industry partners, namely the Victorian Health Promotion Foundation (VicHealth), the Australian Human Rights Commission (AHRC), and the Federation of Ethnic Communities Councils of Australia (FECCA). While the partners will make their own decisions about their priorities, the range indicates the multiple levels at which cyber racism must be addressed.

There are three areas of law that could be addressed. At the global level, Australia could withdraw its reservation to Article 4 of the International Convention on the Elimination of All Forms of Racial Discrimination. Such a move has been flagged in the past, but stymied by relentless opposition from an alliance of free speech and social conservative activists and politicians. Perhaps with the new data released by the E-Safety Commissioner on the exposure of children to hate speech, government and opposition parties could be led to a common cause. Similarly, Australian law could move to recognise the European legislation on cybercrime, adopting the Additional Protocol as it has the overall Convention. Finally, Australia could adopt a version of New Zealand’s approach to cyber hate, where platforms are held ultimately accountable for the publication of online content that seriously offends, and users can challenge the failure of platforms to take down offensive material in the realm of race hate. Taken together these elements would mark out to providers and users of Internet services that there is a shared responsibility for reasonable civility.

However there are many initiatives in civil society that would empower those who are currently the targets, and disempower those who are the current perpetrators, of race hate. For organisations concerned with sustaining civility and community mental health, a multi-layered approach becomes crucial. Firstly, people who are targeted by racists need support and affirmation; this underpins the approach that the E-Safety Commissioner has taken in developing a Young and Safe portal. The portal offers stories and scenarios designed to build the confidence of young people, and provide them with the skills to grow. The Online Hate Prevention Institute has become a reservoir of insights and capacities to identify and pursue perpetrators. There could be a CyberLine for tipping and reporting race hate speech online, for follow-up and possible legal action. Anti-racism workshops (some have already been run by the E-Safety Commissioner) have aimed to push back against hate, while building structures where people can come together online. Modelling and disseminating best practice against race hate speech offers resources to wider communities that can then be replicated elsewhere.

The Point magazine, an online youth-centred publication of the government agency Multicultural New South Wales, reported on two major events where government sponsored industry and community collaboration to find ways forward against cyber racism (Fares, 2016). In Sydney a YouTube Content Creators Bootcamp brought together the industry with young creatives to find ways of building counter-narratives. In Melbourne, the Federal government, focusing on countering radicalisation, collaborated with industry groups and young people in a media online hate prevention forum. However the industry participants could not see, in this sort of collaboration, an end to cyber racism: there would need to be ongoing industry and community collaboration, whatever that might mean.

Finally we need to recognise that the growth of cyber racism marks the struggle between a dark and destructive social movement that wishes to suppress or minimise the recognition of cultural differences, and an emergent social movement that treasures cultural differences and egalitarian outcomes in education and wider society. Advocacy organisations can play a critical role in advancing an agenda of civility and responsibility through the state, the economy and civil society. The social movements of inclusion will ultimately provide the pressure on the state and in the economy that ensures the major platforms do in fact accept full responsibility for the consequences of their actions. When the population confronts the industry, demonstrating that it wants answers, then we will begin to see responsibility emerge. Cyber racism has been produced by the intimate relations of power generated by the state, the economy and civil society. The focus for civil society social movements will necessarily be to loosen the suction that holds state bodies so closely in thrall to the industry.


@ndy 2017, 'Bonus Latham!', slackbastard (11 February). Available at: (Accessed 23 February 2017).

Alba, D. 2017, 'YouTube’s ad problems finally blow up in Google’s face', Wired (25 March). Available at: (Accessed 13 April 2017).

Amend, A. and Morgan, J. 2017, 'Breitbart Under Bannon: Breitbart's comment section reflects Alt-Right, anti-Semitic language', Hatewatch. Available at: (Accessed 23 February 2017).

Anglin, A. 2016, 'A Normie’s Guide to the Alt-Right', The Daily Stormer (31 August 2016). Available at: (Accessed 2 November 2017).

Australian Human Rights Commission (AHRC) 2013, 'Human Rights in Cyberspace', Background Paper, Available at: (Accessed 1 March 2017).

Begley, P. and Maley, J. 2017, 'White supremacist leader Mike Enoch to visit Australia'. Available at: (Accessed 13 May 2017).

Berners-Lee, T. 2017, 'Tim Berners-Lee: I invented the web. Here are three things we need to change to save it', The Guardian. Available at: (Accessed 15 March).

Brookings, E. and Singer, P. 2016, '"War Goes Viral" How social media is being weaponized across the world', The Atlantic, (November).

Buckels, E., Trapnell, P. and Paulhus, D. 2014, 'Trolls just want to have fun', Personality and Individual Difference, vol. 67(September), pp. 97-102.

Cadwalladr, C. 2017, 'Robert Mercer: the big data billionaire waging war on mainstream media', The Guardian. Available at: (Accessed 14 March 2017).

Cerase, A., D’Angelo, E. and Santoro, C. 2015, 'Monitoring racist and xenophobic extremism to counter hate speech online: ethical dilemmas and methods of a preventive approach', VOX Pol. Available at: (Accessed 4 June 2017).

Chan, E. 2016, 'Donald Trump, Pepe the frog, and white supremacists: an explainer That cartoon frog is more sinister than you might realize', Available at: (Accessed 2 November 2017).

Chernikoff, H. 2016, 'In D.C., Chinese Zodiac Symbol Becomes Jew Rat Graffiti', Forward (5 August). Available at: (Accessed 28 March 2017).

Council of Europe 2001, 'Convention on Cybercrime', ETS No.185.

Council of Europe 2003, 'Additional Protocol to the Convention on Cybercrime, concerning the criminalisation of acts of a racist and xenophobic nature committed through computer systems', ETS No.189. Available at: (Accessed 23 July 2015).

Curran, J., Fenton, N. and Freedman, D. (eds.) 2012, Misunderstanding the Internet, Routledge, London.

Di Stefano, M. 2017, 'Jewish leaders are disgusted that a government MP appeared on an “anti-Semitic” podcast', BuzzFeedNews, Available at: (Accessed 23 February 2017).

Di Stefano, M. and Esposito, B. 2016, 'Australia has an Alt-Right movement and it’s called #DingoTwitter: The white nationalists are down under', Buzzfeed News. Available at: (Accessed 4 May 2017; no longer available).

European Commission against Racism and Intolerance (ECRI) 2015, 'Combating Hate Speech,' ECRI General Policy Recommendation No.15, Available at: (Accessed 19 November 2016).

Fares, W. 2016, 'Social media platforms battle online haters', The Point Magazine. Available at: (Accessed 17 June 2017).

Feagin, J. 2014, Racist America: Roots, Current Realities, and Future Reparations, 3rd ed. Routledge, New York.

Fleishman, C. and Smith, A. 2016a, '"Coincidence Detector": The Google Chrome extension white supremacists use to track Jews', Tech.Mic. Available at: (Accessed 23 February 2017).

Fleishman, C. and Smith, A. 2016b, '(((Echoes))), Exposed: The secret symbol neo-Nazis use to target Jews online', Tech.Mic. Available at: (Accessed 23 February 2017).

Fuchs, C. 2009, 'Information and communication technologies and society: A contribution to the critique of the political economy of the Internet', European Journal of Communication, vol.24, no.1, pp. 69-87. Available at: (Accessed 23 August 2015).

Fuchs, C. 2017, Social Media: A Critical Introduction, 2nd ed. Sage, London.

Gagliardone, I., Gal, D., Alves, T. and Martinez, G. 2015, Countering Online Hate Speech, UNESCO, Paris. Available at: (Accessed: 21 February 2017).

Giakoumelos, P. 2017, 'Racism, hate and a new online tool to help young people', SBS News. Available at: (Accessed 31 October 2017).

Global Forum for Combating Antisemitism 2015, GFCA 2015 Final Statement on Combating Cyberhate and Antisemitism on the Internet.

Hine, G. E., Onaolapo, J., De Cristofaro, E., Kourtellis, N., Leontiadis, I., Samaras, R., Stringhini, G. and Blackburn, J. 2016, 'A longitudinal measurement study of 4chan’s Politically Incorrect Forum and its effect on the Web (version 3)'. Available at: (Accessed 20 February 2017).

Hiscox, D. 2016, 'XYZ vs AGE (part 1): Regressive Left run for cover', XYZ. Available at: (Accessed 25 February 2017).

Hornby, L. 2017, 'China battles to control growing online nationalism', The Financial Times (9 January). Available at: (Accessed 23 January 2017).

International Telecommunications Union 2017, 'What does ITU do?' Available at: (Accessed 21 January 2017).

Jakubowicz, A. 1997, 'In pursuit of the anabranches: Immigration, multiculturalism and a culturally diverse Australia', in Gray, G. & Winter, C. (eds.) The Resurgence of Racism; Howard, Hanson and the Race Debate, Monash Publications in History. Clayton Department of History, Monash University.

Jakubowicz, A. 2010, 'Cyber Racism, Cyber Bullying, and Cyber Safety', Conversation at the AHRC Cyber-Racism Summit 2010, Available at: (Accessed 31 October 2017).

Jakubowicz, A. 2012, 'Cyber Racism', in More or less: democracy and new media [Online]. Version. Available at: (Accessed: 16 July 2016).

Jakubowicz, A. 2013, 'Comparing Australian multiculturalism: the international dimension', in Jakubowicz, A. & Ho, C. (eds.) 'For Those Who've Come Across the Seas...': Australian Multicultural Theory Policy and Practice, Australian Scholarly Press, Melbourne, pp. 15-30.

Jakubowicz, A. 2016a, 'First the word, then the deed: how an ‘ethnocracy’ like Australia works', The Conversation. Available at: (Accessed 12 March 2017).

Jakubowicz, A. 2016b, 'Once upon a Time in … ethnocratic Australia: migration, refugees, diversity and contested discourses of inclusion and exclusion', Cosmopolitan Civil Societies: An Interdisciplinary Journal, vol.8, no.3, pp. 144-167.

Jakubowicz, A., Dunn, K., Paradies, Y., Mason, G., Bliuc, A.-M., Bahfen, N., Atie, R., Connelly, K. and Oboler, A. 2017, Cyber Racism and Community Resilience: Strategies for Combating Online Race Hate, Palgrave Macmillan, London.

Jakubowicz, A., Goodall, H., Martin, J., Mitchell, T., Randall, L. and Seneviratne, K. 1994, Racism, Ethnicity and the Media, Allen and Unwin, Sydney.

Jupp, J. 2002, From White Australia to Woomera: The Story of Australian Immigration, Cambridge University Press, Port Melbourne, Vic.

Kelly, H. 2017, 'The Zuckerberg Interview', CNN Business. Available at: (Accessed 24 June 2017).

Klein, A. 2017, Fanaticism, Racism and Rage Online: corrupting the digital sphere, Palgrave Macmillan, Cham, Switzerland.

Koebler, J. 2016, 'Hillary Clinton Is Right: Pepe Is a White Supremacist', Motherboard (15 September). Available at: (Accessed 21 February 2017).

Maier, L. 2016, 'Germany investigates Mark Zuckerberg and Facebook over slow removal of hate speech', Forward. Available at: (Accessed 23 March 2017).

March, E., Grieve, R., Marrington, J. and Jonason, P. K. 2017, 'Trolling on Tinder® (and other dating apps): Examining the role of the Dark Tetrad and impulsivity', Personality and Individual Differences, no.110, pp. 139-143.

Mattelart, A., Mattelart, M. and Delcourt, X. 1984, International Image Markets: In Search of an Alternative, Comedia, London.

McGonagle, T. 2012a, 'Minorities and online “Hate Speech”: A parsing of selected complexities', European Yearbook of Minority Issues Online, vol.9, no.1, pp. 419-440.

McGonagle, T. 2012b, 'The troubled relationship between free speech and racist hate speech: the ambiguous roles of the media and internet', Day of Thematic Discussion “Racist Hate Speech”, Available: UN Committee on the Elimination of Racial Discrimination. Available at: (Accessed 24 May 2016).

Mostrous, A. 2017, 'YouTube hate preachers share screens with household names', The Times (17 March). Available at: (Accessed 25 March 2017).

Nagle, A. 2017, Kill All Normies: Online culture wars from 4chan and Tumblr to Trump and the alt-right, Zero Books, Alresford, Hants.

Napoli, P. 2015, 'Social Media and the Public Interest : Governance of News Platforms in the Realm of Individual and Algorithmic Gatekeepers', Telecommunications Policy, vol.39, no.9, pp. 751-760.

Parliament of Australia 2014, Enhancing Online Safety for Children Bill 2014, Explanatory Memorandum Canberra: House of Representatives. Available at: (Accessed: 15 September 2016).

Pearson, J. 2016, 'Scientists invented a tool to expose 4chan’s racist trolling campaigns', Motherboard. Available at: (Accessed 21 February 2017).

Simon Wiesenthal Center 2017, 'Simon Wiesenthal Center’s 2017 Digital Terrorism & Hate Report Card: Social Media Giants Fail to Curb Online Extremism', Simon Wiesenthal Center. Available at: (Accessed 15 April 2017).

Soutphommasane, T. 2014, 'Unconscious bias and the bamboo ceiling'. Available at: (Accessed 24 October 2016).

Statt, N. 2017, 'YouTube is facing a full-scale advertising boycott over hate speech: The biggest brands continue to leave', The Verge (24 March). Available at: (Accessed 31 October 2017).

Stein, J. 2016, 'How Trolls Are Ruining the Internet', Time (18 August/12 September). (Accessed 21 February 2017).

The Dingoes 2017, 'The Convict Report Episode 62: Mark Latham 2 – Dingo Boogaloo', The Right Stuff Radio: Echo Free Entertainment (7 February). Available at: (Accessed 11 April 2017).

Thompson, J. 1994, The Media and Modernity: a social theory of the media, Blackwell, London.

Williams, G. 2012, 'Removing racism from Australia's Constitutional DNA ', Alternative Law Journal, vol.37, no.3, pp. 151-155. Available at: (Accessed 25 March 2017).

World Wide Web Foundation 2017, 'Delivering Digital Equality: The Web Foundation’s 2017 – 2022 Strategy', Web Foundation. Available at: (Accessed 12 March 2017).

Zuckerberg, M. 2017, 'Building Global Community', Facebook. Available at: (Accessed 2 May 2017).

[1] The CRaCR research team is made up of Andrew Jakubowicz (University of Technology Sydney), Kevin Dunn (Western Sydney University), Gail Mason (University of Sydney), Yin Paradies (Deakin University), Ana-Maria Bliuc (Western Sydney University), Nasya Bahfen(La Trobe University), Rosalie Atie (Western Sydney University), Karen Connelly (University of Technology Sydney), and Andre Oboler (OHPI and La Trobe University).



