Public hearing 28 - Violence against and abuse of people with disability in public places, Brisbane - Day 5

CHAIR: Good morning, this is the final day of Public hearing 28 of the Royal Commission into Violence, Abuse, Neglect and Exploitation of People with Disability. We will begin, as always, with an Acknowledgment of Country. I wish to acknowledge the Jagera and Turrbal people as the traditional owners and custodians of the land on which the city of Brisbane is now located and on which this public hearing of the Royal Commission is taking place. We pay our respects to their elders past, present and emerging. We also acknowledge and pay our respects to any First Nations people who are following these proceedings, whether from the Brisbane hearing room, or via the live broadcast. Yes, Ms Bennett. 

MS BENNETT: Before I hand to Mr Fraser, Commissioners, I would like to make an observation. The Royal Commission is being greatly assisted this week, as ever, by evidence from people with lived experience of disability. Hearing evidence from people with disability is essential if we are to carry out our task. A witness who has given evidence earlier this week appears to have been the subject of abuse because of their appearance. This is a matter, Commissioners, that we will investigate fully. 

It is timely, however, to make the observation, to remind those following the Royal Commission, that under section 6M of the Royal Commissions Act it is a criminal offence for any person to cause or inflict any violence, punishment, loss, damage or disadvantage to any person for - or on account of - reasons including the person having given evidence before this Royal Commission, and to say, so that there be no doubt, that the Royal Commission will not hesitate to take steps within its powers to ensure that witnesses and those who otherwise engage with us are protected. May it please the Commissioners. 

CHAIR: Yes, thank you, Ms Bennett. Yes, Mr Fraser. 

MR FRASER: Commissioners, our next witness is Ms Carly Findlay OAM. Before I ask for Ms Findlay to be affirmed, I note that there is an appearance to be announced by the lawyer representing Ms Findlay.

CHAIR: Yes.

MS HEALEY: Good morning, Chair, Commissioners. My name is Ms Healey. I have the privilege of appearing for Ms Carly Findlay OAM. 

CHAIR: Thank you very much, Ms Healey.

MR FRASER:  Ms Findlay is present with us and I ask that she be affirmed. 

CHAIR: Ms Findlay, thank you very much for coming to the Royal Commission in Brisbane in order to give evidence today. We appreciate both your attendance and the statement that you have provided, which has been made available to each of us and which we have read in advance of the evidence that you will give today. In a moment, I will ask Mr Fraser to ask you some questions, but if you would be good enough to follow the instructions of my Associate, who is seated just in front of me, she will administer the affirmation to you. Thank you very much. 

ASSOCIATE: I will read you the affirmation. At the end, please say yes or I do. Do you solemnly and sincerely declare and affirm that the evidence which you shall give will be the truth, the whole truth and nothing but the truth? 

MS FINDLAY: Yes. 

<CARLY FINDLAY, AFFIRMED

CHAIR: Thank you very much, Ms Findlay. Yes, Mr Fraser. 

<EXAMINATION BY MR FRASER

MR FRASER: Ms Findlay, you have provided a written statement to this Commission dated 28 September 2022, which includes 13 annexures. Is that right? 

MS FINDLAY: Yes. 

MR FRASER: Have you had a chance to read those in preparation for your evidence today? 

MS FINDLAY: I have. 

MR FRASER: Are the contents of that statement true and correct to the best of your knowledge and belief? 

MS FINDLAY: They are. 

MR FRASER: Commissioners, a copy of that statement is at tab 123 of the Tender bundle, followed by the annexures. I tender that statement with the annexures. I ask that the statement be marked as Exhibit 28 36, and the annexures be marked sequentially from 28 36.1 through to 28 36.13. 

CHAIR: Yes, Ms Findlay's statement will be admitted into evidence and given the marking of Exhibit 28 36 and the annexures thereto will be admitted into evidence and given the markings indicated by Mr Fraser. 

<EXHIBIT 28 36 STATEMENT OF CARLY FINDLAY DATED 28 SEPTEMBER 2022

<EXHIBITS 28 36.1 TO 28 36.13 ANNEXURES TO STATEMENT OF CARLY FINDLAY DATED 28 SEPTEMBER 2022

MR FRASER: Now, Ms Findlay, you wanted to make some acknowledgments before you started your evidence today. Can I invite you to do that?

MS FINDLAY: Sure. Thank you so much for having me today. I acknowledge we are on the lands of the Turrbal and Jagera people, and I extend my respects to Aboriginal people who have given evidence, who are watching and who are in the room today. This always was and always will be Aboriginal land. I would also like to thank everybody - particularly disabled people - who have given evidence at this hearing and the hearings before, but particularly this hearing, because the themes are so consistent throughout. It has been quite hard to hear, but it is very important that we all hear them and that change comes, because we have put a lot of time into this Royal Commission and we need to see change effected. 

MR FRASER: Now, Ms Findlay, can I ask you to tell the Commissioners a little bit about yourself. 

MS FINDLAY: Sure. So, I'm from regional New South Wales originally. I was born to Annette and Roger Findlay. My parents have had an interesting story in that they came to Australia to marry during the Apartheid era in South Africa. My mum is, as she describes, a coloured South African. She's here supporting me today. And my dad is a white Englishman. And they couldn't get married because of the racial segregation law in South Africa at the time, so they came here to Australia, and I was born soon after that in a regional town called Albury. 

My parents are incredibly supportive and resourceful, stoic - just get on with it kind of people. And I'm very, very lucky to have been born to two extremely supportive parents. They didn't have much money when they came here, they didn't know anyone, and then I was born, and I was born with a rare severe skin condition called Ichthyosis. 

In the early 1980s, it wasn't known of, really, and where I lived there was very little dermatology support, and so my parents would travel to Melbourne or Sydney to the specialists regularly, and also for me to stay in hospital. You know, until I was probably about 13, I would travel to Melbourne regularly to stay in hospitals for treatment when I was unwell. And I think that that was, you know, a very, very hard time for them, not having any family support around them. 

I went to a mainstream primary school and high school and I really enjoyed writing. I'm a writer today and that was my dream, to be a writer, and I'm living it now. School was really hard. It was hard being in a small country town, particularly a very religious country town. I think that until I was about 10 years old, my mum might have been the only black person to live in our small country town of Walla Walla. That was hard. 

People used to whisper that I was red because my mum and dad are black and white, which isn't true. You know, that's not how genetics works. You know, I guess being black and white was seen as a sin then. Sorry, I should have clarified that. It's not true. Yes. I found it very hard to be an outcast in that town. And I couldn't wait to leave. I couldn't wait to leave the small town and move to Melbourne. 

Yes, so I went to, you know, a mainstream high school and primary school. I felt very excluded there, and also I didn't identify as being disabled then because I didn't see anyone like me. You know, Ichthyosis impacts one in a million, and so I didn't know anyone that looked like me. I didn't think it was a disability because I didn't see anyone else. And back then we only really saw disability in the Paralympics, if it was shown on TV, which was quite rare then, or on terrible tabloid TV shows. And so, yes, it took me a long time to identify as being disabled, and I think if I had back then, I could have asked for the support I needed. Instead, I didn't. 

MR FRASER: So you moved from that town and you ultimately went to Melbourne, and you obtained a Bachelor of e-Commerce and a Masters of Communication; is that right? 

MS FINDLAY: Yes, I studied for my Bachelor of e-Commerce in Wodonga and then after that I moved to get a job in the public service in Melbourne in 2003, and then I got a Masters of Communication from RMIT. 

MR FRASER: And when you were in Melbourne, you started volunteering as a TV presenter on a local television show. Is that right? 

MS FINDLAY: I did. I started volunteering as a presenter on No Limits, which was a disability-led TV show run by disabled people. You may remember Stella Young was on that show, John McKenna. Lots of incredible disabled people. 

MR FRASER: And you have also published a number of books; is that right? 

MS FINDLAY: Yes, I wrote a memoir, Say Hello, which was released in 2019. And I edited Growing Up Disabled in Australia, which was released at the start of last year and features 46 stories by disabled Australians. 

MR FRASER: And in 2020, you were awarded the Order of Australia Medal for service to disabled people. Is that right? 

MS FINDLAY: Yes. 

MR FRASER: And you're very active online, aren't you? 

MS FINDLAY: Yes. 

MR FRASER: In your statement you talk about some experiences of abuse that you had in the community and in the online community. Can I start not with the online community but with what we might call the physical world first. You talk in your statement, from around paragraph 22, about some of your experiences. Can I invite you to tell the Commission a little bit about that? 

MS FINDLAY: Sure. I'm going to refer to that, if that's okay. Similar to the people who have spoken this week, to the witnesses who have spoken this week, most days when I leave the house there's a comment, stare or question on my appearance. Not every day, and, you know, I am very grateful to have a media profile, which means that often they recognise me from that or online, which is lovely. And so I would say that, thanks to my work, the negative comments around my appearance have decreased a little, because they recognise me from that. 

Yes, as I said in my statement, I can't think of a single day that I have left the house without something being said or mocked or laughed at or questioned about my appearance. And I note that earlier in the week, Fiona Strahan was talking about how often, when she's walking down the street, she's not thinking about her short stature. I'm often not thinking about my redness, and then I'm interrupted by someone asking "what happened to you", or "she's so sunburnt", or a laugh or, you know, a gasp, or sniggers. So, there is that. It can be very disruptive for the day, and the evidence this week has been so relatable because of that. 

MR FRASER: And you have these sorts of experiences on public transport? 

MS FINDLAY: Absolutely. Yes, I remember a time - I think in the first year I had been living in Melbourne - I was on a tram, and a person who seemed to be quite intoxicated was yelling at me, and she had a broken bottle in her hand, and I didn't know what she was going to do. But she kept on talking about my face and, you know, saying how ugly I was. And no one defended me on the tram, and I ended up getting off and going into a pub and saying what happened. 

And then I remember going to work on the Monday, and I told my boss what happened. And she said, you know, are you okay? I think you need to go to the counsellor. This doesn't happen to everyone. And it was sort of from then that I realised, I guess, that this doesn't happen to everyone. That really impacted me, I think. I was scared for my safety, and I was scared just with no one sort of speaking up. 

MR FRASER: And the experiences you've had aren't just limited to public transport. Even trying to take private transport? 

MS FINDLAY: Yes, taxis. I just said to my mum this morning, actually, when we got a taxi here, that putting on a mask increases my chance of getting a taxi, unfortunately. Taxi drivers have been extremely discriminatory. I can't count how many times I've been discriminated against, particularly since 2013. I just shared on my Facebook last night, actually, something I wrote in 2012: 

"No, taxi driver, my appearance is not relevant to this trip."

And that still stands today. You know, I have taxi drivers asking me what's wrong with my face, refusing to take me because of the way I look. In 2013, I got in a cab, or tried to, and the taxi driver said that I would ruin his seats. Now, I don't know anyone that's rubbed their face on a taxi seat, but apparently he thought I would. And I made a complaint to the taxi company, to the Taxi Commission in Victoria and then to the Human Rights Commission, because when I put my complaint in, I said that we need to increase driver education around what disability looks like. Because it's so diverse. 

And I was told, no, we are going to wait until another incident happens before we do that. I said no, and I texted Graeme Innes, who was the Disability Discrimination Minister - Commissioner, sorry - at the time, and I said we have to change this. And I took it to the Human Rights Commission, and we did an education video as a result. But taxi abuse still happens. It happened to me with a disabled friend on Saturday night in Melbourne. Not directly to me, but to my friend. 

And I just keep thinking, what more do we have to do? I've done training, I've logged complaints. Last year I was in a video or two videos for the Taxi Commission Victoria. I have talked about it relentlessly on social media and nothing changes. And I've had enough. 

MR FRASER: Well, can I move to social media, as you mentioned. I think you said before you have an active online presence. 

MS FINDLAY: Yes. 

MR FRASER: Do you consider that to be an important part of your working life? 

MS FINDLAY: Absolutely. I have been online since the internet came to Albury. I was researching Savage Garden back in 1996. And my school librarian told me I should be researching people with Ichthyosis, and I did. And, you know, researching Savage Garden changed my life, but researching people with Ichthyosis did as well, because I met this community and I saw people who looked like me and people who experienced what I did. And I have been blogging since 2002. So, 20 years. Probably longer, actually. 

And when I was doing my masters, I thought, I have to start a portfolio of proper written work. And I did. And that's just led to so much work. I started blogging, which led to writing for the Office of the Disability Commissioner in Victoria, which led to mainstream writing work, which led to No Limits, and I have been writing professionally online for about 13 years now, and, you know, I have travelled overseas because of my writing. 

I've written for CNN because I did something that was broadcast online. It's been incredible. And I've met my, you know, friends. I've met the disability community. I've met my partner online. Yes. 

MR FRASER: And, in a sense, online is your workplace. 

MS FINDLAY: Yes. 

MR FRASER: You talk in your statement about some abuse that you experienced online. For example, starting with an experience in 2013 with Reddit. What's Reddit? 

MS FINDLAY: Yes. Reddit is an online discussion community. It has a lot of different threads - thousands, millions maybe, I don't know. And I had been blogging on this particular blog, the current one I've got now, for probably three years before that. And I woke up and - at that time, I would check my blogging statistics to see, you know, how many hits I had got. I don't do that anymore. It doesn't faze me anymore. But I noticed I had a whole heap of hits, and I hadn't blogged for some time. 

I had just met my boyfriend, now husband, and, you know, I hadn't blogged during that period. And I was like, why? Why had there been all these hits to my blog? And they had come from Reddit. So, someone had posted my photo, with me holding a glass of champagne, on a forum called the What the Fuck forum, and below that there were hundreds of comments about my face. People had said things like, what does your vagina look like? What the fuck is that? It looks like something that was partially digested by my dog. They described me as a lobster. And they diagnosed me as well. They said that I should be killed with fire. 

But there were some supportive comments as well. When I first came across it - I sort of knew that my photo would be misused on the internet someday, and it took me a long time to put my photo online. You know, I used to chat to lots of people on ICQ back in the day, and I had a relationship with someone online, and it took me a very long time to show him that photo. Also, back in the 90s, it was very hard to put a photo online with the technology we had then. 

But yes, you know, even in 2013 I was worried about what would happen with my photo. And so I sort of had a feeling it could be misused. And so I didn't put it on for a long time. I wrote this Facebook post about Reddit and then I copied it and modified it a bit for a response. I responded. And I said, you know, this is me, yes, and, like, while you're talking about a stranger on the internet, I was out seeing, you know, my favourite singer with my boyfriend and leading a happy life. 

And that changed the conversation, when I posted. And it was interesting. The original poster gave me an apology, and I did notice, actually, when I was collating this, that the original post has been taken down. I have all the screenshots from it, though. And then it made, like, national news, international news. You know, I was getting emails from Fox News, CNN. It was featured on The Project. Which was bizarre. But that importance of own voices - of us, actually disabled people, telling our story and intervening to say, hey, this isn't okay, and yes, this is me, I'm proud and confident in the way I look - is really important. 

I also want to say that, you know, I think for visibly disabled people or people with visible differences, people don't think we can be confident in how we look. And so it's a surprise when we are. Because people's expectations of disabled people are so low that they look at us and think, poor her or him, or poor them. And so - sorry, this is a tangent, but being visible online is really important for visibly disabled people especially, because it shows that we are confident, that we are present, and it shows younger people with these same impairments, same diagnoses, what is possible and that it is okay to be visible and not apologetic about our appearance. 

CHAIR: Forgive my profound ignorance of social media, which is probably likely to continue. With the comments on Reddit, were they from people all over the world? 

MS FINDLAY: Yes. 

CHAIR: That's not just confined to Australia. 

MS FINDLAY: Yes. Yes. And, interestingly, when I responded, I got a comment back from this guy that we met in a taxi in New York. Like, that's how far it reached. Mum and I were on holidays the previous year in New York, and someone was like, "I met this lady when we shared a cab in New York." Like, it was profound in the reach, you know. I had people offering to buy me a beer. It was amazing. I mean, the horribleness, but then also the positiveness after I, you know, intercepted that thread. 

MR FRASER: And these instances of this type are not just on one particular platform, are they? 

MS FINDLAY: No. No. 

MR FRASER: So, in your statement, you talk about an experience that you had on Twitter, comments on a YouTube video. 

MS FINDLAY: Yes. 

MR FRASER: Commentary on a forum website. So, it's not limited to one particular platform? 

MS FINDLAY: No. And I think I have become very good at not reading the comments. I was just in the news recently talking about a footballer - anyway, I hate football, but I seem to be commentating on that a little bit when they are in the media around disability. And I think social media platforms - I will give them credit for making it easier for people to lock down our platforms so that we don't endure comments onto our platforms. 

And so my comments are quite strict in who can comment. But I have then got all these emails from fragile white men telling me how terrible I was for talking about this footballer. So, yes, it's not limited to one platform, it's everywhere. 

MR FRASER: And when this sort of abuse occurs, have you taken steps to try to get it taken down or make it - 

MS FINDLAY: Yes. 

MR FRASER: Make it cease? 

MS FINDLAY: Yes, absolutely. I've reported to Facebook and Twitter and YouTube. I have contacted - 

MR FRASER: Let's take the example you give at paragraph 42 of your statement. You talk there about having some comments made on a YouTube video and then trying to take some steps to have that remedied. So, what did you do? How did you know where to go? 

MS FINDLAY: Yes. I mean, the comments were actually quite sexually abusive, and I ended up taking them to the police. I called the police or went to the police. And they asked if I knew who made them. I didn't know, and I don't feel like the law has caught up with the fact that online is real life, but also we don't often know our trolls. And so, yes, I took it to the police, and they asked me if I knew who made the comments, and I said no. And then they said it will be difficult to investigate because they didn't know who made them. 

I was also advised to report the abuse to YouTube or contact the Australian Federal Police. So, I contacted the Federal Police, who directed me back to the Victoria Police and also back to YouTube. And I was just going around in circles. Yes. 

MR FRASER: So, one of the difficulties there was not knowing who the abuser or abusers were. 

MS FINDLAY: Yes.

MR FRASER: I think in your statement at paragraph 55, you talk about the different approach when you did know who the particular person was. 

MS FINDLAY: Yes. I had a serial harasser for a few years. And I knew who he was. I had met him. But it was happening online. And a friend and I went to the Victoria Police in the city in Melbourne, and they said to go immediately to the Magistrates Court, which I did. And the process was so different. You know, I collated 25 pages of evidence of this ongoing abuse online by this person. And because I knew who they were, something could be done, and an IVO was taken out against him for a year. 

MR FRASER: You also give an example in 2017 about some comments being made on your blog and an approach you made to the Office of the eSafety Commissioner. Can I ask you to tell the Commission about that experience? 

MS FINDLAY: Yes, could you refer back to the paragraph?

MR FRASER: Paragraph 39. 

MS FINDLAY: Sure. Yes, I have contacted the eSafety Commissioner a number of times about anonymous abuse and also abuse from people I know, and literally nothing can be done. Their parameters - even with the new social media protections this year - are not broad enough. I don't understand why they will only act if you're suicidal or if a naked photo is leaked or if it's abusive towards a child. I absolutely understand the importance of acting on those things. They are serious offences. 

But it should not take someone being suicidal for the eSafety Commissioner to act. I have reported so much. I reported, you know, the anonymous abuse from people who write to me on my blog, who, you know, say that I should be dead, or make derogatory comments about my appearance, or who are threatening to hack my website. There was a person who I know who made a comment that they were threatening to hack my website, and nothing could be done. 

MR FRASER: When it comes time to do the reporting, to make the report, have you found it to be a quick process? 

MS FINDLAY: No. The process - the onus is on us, on the abused. And we have to provide reams of evidence. I have been really fortunate to have friends collate things so that I don't have to see them. In the experience where I knew the person who was abusing me, I paid a friend to collate the stuff because it was so much work. It's a lot of work, and the onus is always on the abused, and it shouldn't be. 

MR FRASER: And that time which you spend putting together these complaints is time you would otherwise be doing other things? 

MS FINDLAY: Yes, it is. And also having to look at it, having to see it. And, you know, I know some of it that's written about me that's not true, and some of it, you know, is fair commentary as well. But it's hard reading that stuff. Also quite laughable. Yes. 

MR FRASER: So, obviously you say it can be hard reading those   those sorts of things. In your statement, you talk about the impact that this sort of abuse has had on you. Can I ask you to tell the Commission a little bit about that? 

MS FINDLAY: Yes. I mean, I think there is the time it takes. I think Lindy West said this about trolling - it robs you of the time you could be doing something else. Lindy is an activist and writer in the US, and she came face to face with her troll on This American Life, the podcast. When it's been trolling from people I know, I've had panic attacks. I've had time off work. My husband's sometimes gone in to bat for me, to say, hey, that's not okay, and then he's been accused of further abuse when he hasn't done that at all, when he's just said, "Stop it." 

It's the panic attacks. It is also - you know, I said my parents are incredibly supportive, particularly my mum, and sometimes I don't want to tell her because she doesn't understand the impact of it. Because she's maybe not using social media like many other people. So, you know, yes, it can be great sometimes and a privilege to be in the media, but with that comes the trolling and, you know, the comments and that. So, you know, sometimes I'd just rather not tell her about it. 

CHAIR: How far can you protect yourself by blocking someone, not reading material that comes in, having some mechanism to delete whatever it may be that contains key words? Are these mechanisms available to you? 

MS FINDLAY: Yes. On Instagram and Facebook, I know that there are key words that you can block so they don't appear in your comments. So, I've blocked disability slurs like the R-word or the M-word from my account. I think the same with Twitter as well. And you block people. My block list is very long. Sometimes I don't block and just let them carry on and look silly. But in terms of other places, you can't block that. I remember I did You Can't Ask That a few years ago, which changed my life in terms of the comments that I got about my appearance - they reduced. Like in public. 

And more people would ask me if I had been on You Can't Ask That, or say they have seen me on that, than, you know, ask "Are you sunburnt", which was incredible. And I remember, at the time I was writing for Fairfax's Daily Life. I was a regular columnist then. And I got trolled on an article that I did for them, and the trolling was coming in the comments. And Daily Life moderated those for me. 

You know, many of us, the many women online, are not protected by the people we write for or work for on their social media. So, you know, I had to flag with the editors, hey, I'm getting a lot of trolling through that, and they blocked those. But sometimes not. And yes, we can choose not to read it, but we can't just switch offline. 

MR FRASER: Can I ask you this: you have talked about things you can do, where obviously the burden is on you to take those steps. Is it an option to simply remove yourself from the online space altogether? 

MS FINDLAY: No. Why should we leave? No. The online world is my workplace. It is a source of community. It's where disabled people hang out, particularly in the pandemic. I mean, COVID is not over, and there are so many disabled people that are not able to go out and not wanting to go out. And the other thing is, you know, we connect with other activists online. We see ourselves in those people. We have opportunities for work. So, no. We can't just switch off and we shouldn't have to just switch off. 

MR FRASER: I think you talk at paragraphs 117 and 119 of your statement about the importance of the online space. Employment - as you said, it's your workspace. Would removing yourself from the online space detrimentally impact your income? 

MS FINDLAY: Yes. I mean, a lot of the work I get is because I'm online. I get, you know, messages through my contact form on my blog or via my speaking agent. My writing agent, for example, Jacinta, knew me because I had an online presence, and she took me on as one of the writers she represents. Publishers see, you know, my record of work. Absolutely it would. Yes. 

MR FRASER: And also your friendships that you make online as well. 

MS FINDLAY: Yes. I think they are the most important to me. Particularly people from the disability community. You know, finding other disabled people has changed my life, and finding other people with Ichthyosis particularly - you know, it's this very rare, very severe skin condition, and we often feel alone, and through sharing our experiences, we know we are not alone. If I may, I'm going to read something out that I wrote in my statement. 

A few years ago, probably about seven years ago, I met a woman called Caroline online, and I know she's okay with me mentioning it because I asked her if I could write about this in my statement, and I have written about her in my book as well. And Caroline knew no one with Ichthyosis until she found me. And she carried so much shame. So much. She said that she hid her skin for over 40 years because of the shame. I'm going to read this from my statement: 

"After she found me and heard about my experiences, she felt the courage to be open about her Ichthyosis and her other diagnoses. She's no longer afraid to wear sleeveless dresses. And she's told me that she's begun to feel human again. My friend is now an incredible disability advocate and ally within the community. Examples such as this are the reasons why I continue to show up and advocate publicly, and this is why, despite the abuse that I   and hate I receive, I refuse to be silenced."

MR FRASER: You have taken the time in your statement to set out your observations on what needs to change. I'm at paragraph 98 of your statement. One of the things you talk about is that the representation of disabled people and people with facial differences in the media needs to be addressed. Can you tell the Commission what you mean by that? 

MS FINDLAY: Absolutely. And I think this has been a real theme this week. You know, we have heard from Debra, we have heard from Tracy, we have heard from Fiona, all incredible women that have spoken this week, and they have all said that. We need to see more people with disabilities, with facial differences, in the media, but also in everyday situations, so that the fear reduces, so that the stigma reduces, so that discrimination reduces. And we need to have the world become used to us. You know, it sounds ridiculous to say that in 2022. We are here, we are existing, and we are proud, and we need to show the world that. 

MR FRASER: You also mention the need for positive representations. So, it's not just a question of prevalence, but the type of representations. Is that right? 

MS FINDLAY: Absolutely. People with facial differences particularly are used in horror movies, in, you know, Halloween costumes - shown as the evil, villainous people in the media. A few years ago I did a radio interview with a prominent broadcaster on the ABC, and he made an observation that my face wouldn't be good at Halloween. I mean, who says that? That made the media as well, and as a result of that I got some trolling, but mostly positive responses, which was incredible. 

But, you know, when our faces are used as Halloween costumes, or used as a scary character or other evil character in film, it gives permission for the public to fear us, to hate us, to ridicule us. And that needs to stop. You know, I'm not saying that we have to be the super crip. We don't need to be this heroic superhero character. We just need to be ordinary, and not have the back story around how it got like this, because for many of us we didn't get like this. It's just how we were. 

And Stella, Stella Young, wrote one day, it doesn't matter how we got like this. You don't need to know. And so, yes, that positive representation, but also not having to explain the back story - that's really important. The incidental representation. You know, I love what Kurt Fearnley is doing on One Plus One. I love, on The Cook Up - which I have been on, which has been incredible - you know, Adam Liaw has talked to Ellie Cole and Kurt Fearnley and other people, and disability became secondary. And we need to be seen doing the things we love and the things we are good at, and not necessarily in a negative portrayal. 

MR FRASER: You mention in your statement what happens when you encounter young children. Can you tell the Commission about that, and then whether you think that idea of positive representation will assist also? 

MS FINDLAY: Yes, I mean, young children - it's hard, because we understand children have to learn and they're curious, but for someone with a visible disability or visible difference or facial difference, it might be the first time someone in the public has seen us. But that day, it might be the 10th time someone has commented on our appearance. And so when we get a curious child - "Mum, what's wrong with her face", or a look, or fear - that can be really hard. 

I want to tell a really lovely story, actually, that just happened last week. I work at Melbourne Fringe. It's the best job. I work at an arts festival and I'm an access advisor helping deaf and disabled people put on artwork and helping people make their shows accessible. And we had our launch last week. And there was a child stylist there. And they were making us all costumes, I guess. And I sat down with the child stylist. They were eight. 

And I got some pipe cleaner glasses and a cape, and they said to me, "Do you want a post-it note?" I said, yes, please, I will have a post-it note. And they gave me a pink post-it note with hearts over it. And on the post-it note, they had written "Pink people are cool." And that was pretty amazing, to have kids recognise that. And I mean, Melbourne Fringe is an incredibly progressive arts festival, and, you know, the artists are incredibly progressive, but for the child to see the whole me - yes, I was wearing this pink dress with this ridiculous pink collar, but they also saw me and they didn't deny who I was. And they recognised who I was and they celebrated who I was, and that was pretty amazing. 

MR FRASER: If I can turn back to some of your suggestions for reform or for the future. You know, based on your own experiences online, you make some observations about the online space at paragraph 107 of your statement, and particularly about your views on what can or might be changed. Can I invite you to speak to the Commission about that? 

MS FINDLAY: Yes, sure. So, I will read out what I said: 

"In the online space it seems to me that the existing legal frameworks are insufficient. In Australia, we have a Federal Government office, the eSafety Commissioner, responsible for online safety but apparently it cannot do much to prevent the kind of abuse I and many other disabled people have experienced. There's a very high threshold that must be met for something to be classified as cyber abuse that can be acted on by the eSafety Commissioner. This leaves me and other disabled people liable to ongoing hurtful online comments and abuse from complete strangers or people we know and with no recourse other than to repeatedly contact the social media sites or platforms to try to get them to take action. This is both exhausting and should not be on our shoulders alone."

The other thing I think is that people with facial differences and visible differences - and perhaps things that aren't classified as disability to that person, if they don't identify as disabled - really need to be protected by the Disability Discrimination Act. You know, I don't believe the Disability Discrimination Act sufficiently protects disabled people, and pursuing any kind of action under this Act is onerous. 

There's an organisation that I am an ambassador for, Face Equality International. It is currently working on a global campaign for greater legal protections for people with facial differences. And Face Equality International is working to position facial difference as a social, legal and human rights issue. And I might just mention, in the UK this week it is Hate Crime Awareness Week, and in the hearings that we have had earlier in the week, many of the witnesses have talked about how this is hate crime. We need to take it seriously. 

Fiona Strahan at the start of the week said that it's hate, not ignorance. They know exactly what they are doing. Exactly. You know, when I experience discrimination by taxi drivers, or by people in the street, or by, you know, the shopkeepers, people say to me, "Maybe they are having a bad day." They are not having a bad day. They know what they are doing. You know, it's not just a careless thought. They know what they are doing. 

MR FRASER: At the end of the statement you have addressed what you say are your hopes for the future. Can I invite you to tell the Commission what those hopes are? 

MS FINDLAY: Yes. Sure. I will read it out, if that's okay: 

"I very much hope that the new generation of disabled children are able to speak up more about their experiences. I hope the ableism that I endured growing up and still do now shape   stops at this generation. I hope that society becomes more aware and accepting of difference and celebrates all people's disabilities and diversity. I also really want more considered representation of disabled people in the media, for there to be a character with facial difference on television or in film who puts people at ease or who saves world rather than trying to destroy it. I want to see a character with a facial difference whose empowering and also one that's ordinary. 

In addition, I would like there to be greater recognition that online platforms and sites are incredibly important for the disabled community. Without an online space, many of us would not be able to connect, have work opportunities, and find people with similar experiences who we can form lasting relationships with. It is therefore particularly important for there to be initiatives and resources dedicated to making online spaces safe for disabled people. 

Ultimately, for me personally, I would love to be able to appear in public to discuss issues other than my facial difference or disability. This would finally mean that I was viewed as someone with many important experiences and opinions on a range of issues and topics in addition to disability. With respect to the online space, based on my considerable past experiences with abuse and harassment following appearances I have made in the news or the media, I have come to anticipate such behaviour when I decide to partake in anything publicly. 

I believe it is important for the Royal Commissioners and the public to hear about my experiences, which is why I've decided to provide this statement. However, I do have concerns about the repercussions I could face from people online as a result of media about my evidence here today. It is my experience that online hate has a pack mentality and online trolls egg each other on. It can be overwhelming to be on the receiving end of it. However, I have also found that online love has a pack mentality as well and it surpasses that hate. 

When love is shown online, it gives permission for more love to be shown. Along with the work and social relationships which come out of being online, the infectious positivity of so many people in online communities means that I don't want to and cannot leave the online world. And the positive impact of being public, particularly online, is evident for me. I'm able to connect with friends around the world. I have been connected with parents of children with Ichthyosis and I'm able to see them grow up. I have been able to connect with people I admire, and I'm now able to connect with fans of my own. That's really important." 

MR FRASER: Commissioners, subject to any questions that you may have, that's the end of my questions for Ms Findlay. 

CHAIR: Thank you very much. Ms Findlay, I will ask my colleagues if they have any questions to put to you, starting with Commissioner Galbally. 

COMMISSIONER GALBALLY: Thank you for your statement and appearance. I would like to ask you about paragraph 109 where you say that laws need to be developed which criminalise hate crimes and hate speech against disabled people. Do you want to expand on that? 

MS FINDLAY: Sure. So, I said that laws need to be developed which criminalise hate crimes and hate speech against disabled people - for example, the Racial and Religious Tolerance Amendment Bill 2019, which was introduced into the Victorian Parliament and sought to extend the protections of the racial and religious vilification laws. We need to ensure that those cover the disability community. The bill did not pass the Upper House, but was the subject of a Legislative Assembly Committee Inquiry in the Parliament of Victoria. 

Fiona Patten MP has been particularly vocal about that, and I was lucky to meet her this year, actually, during Face Equality Week. She spoke about the importance of protecting disabled people and people with facial differences in Parliament during Face Equality Week, which was incredibly important, and I was so thankful to her for mentioning that. 

COMMISSIONER GALBALLY: Thank you. 

CHAIR: Commissioner Ryan? 

COMMISSIONER RYAN: Thank you, Mr Chair. Thank you so much for your evidence today. I was just going to ask you about paragraphs 48 and 49 of your statement, where I couldn't help but compare the difference between the response you got from Twitter and the response you got from the eSafety Commissioner about two complaints that appeared to be the same. Did you want to draw attention to that or comment on that? 

MS FINDLAY: Yes, sure. So, late last year, an influencer hate forum thread was started about me. I hate being called an influencer, by the way; I'm a writer. And this forum has particularly focused on influencers in the UK but has now mentioned a lot of influencers in Australia. It mentions influencers that have 10,000 followers or more. And the laws changed earlier in the year around online safety, and I remember at the end of the year I contacted the eSafety Commissioner about it, and then when the laws changed I contacted them again and said, "What can you do now?" 

I stopped reading it; I just didn't want to subject myself to it anymore. So I had two friends actually collate the information and send it on, and the eSafety Commissioner said that while the material on the forum may be menacing, harassing or offensive, it was not considered likely to be intended to cause serious harm. They said that I could obtain legal representation if I wished and that I should sue the people. 

I don't know who these people are. I mean, some of the writing on it seemed to be similar to what I've seen before, so it could be people I know. I'm not sure. But it was really disappointing and, you know, why should it take someone feeling suicidal or taking their own life before the eSafety Commissioner acts on it? Why should it take a naked photo being, you know, leaked on there? 

Some of the comments on that forum have been around how I'm apparently not at greater risk of COVID compared to other disabled people, and that I shouldn't have received my fifth vax because it's taking it away from people who need it most, apparently. Now, when COVID became more prevalent in May 2020, my dermatologists were absolutely committed to contacting me and telling me that, because I have a skin condition and the skin is the biggest organ in your body and it's very susceptible to infection, of course I'm at risk of COVID more than most people. 

You know, there have been derogatory things said about my husband, about my mum, about me. Mostly that I'm insufferable, apparently. But, you know, that's apparently not a reason for the eSafety Commissioner to take action. And I'm pretty strong - even though genetically I have thin skin, I'm pretty thick-skinned. But if I wasn't thick-skinned, what could happen, you know? So, yes, it's disappointing that literally nothing can be done. 

COMMISSIONER RYAN: I was pointing out that you had a similar complaint to Twitter. What did they do? 

MS FINDLAY: The forum was separate to Twitter. The eSafety Commissioner in the past has actually contacted Twitter to act on abuse, and I believe that some of the accounts were taken down from Twitter because of eSafety's intervention, but that forum is not linked to Twitter. 

COMMISSIONER RYAN: I'm just thinking, though, that when you reported something to Twitter, they immediately did something - took it down.

MS FINDLAY: Absolutely, they did something. 

COMMISSIONER RYAN: It seems that - whilst, of course, they have more power over the forum - they seemed keener to do something than the eSafety Commissioner. 

MS FINDLAY: Yes. 

CHAIR: Not necessarily. 

MS FINDLAY: Sometimes. 

CHAIR: There may be questions of power. 

COMMISSIONER RYAN: The other thing I wanted to ask you, you reported something to the police that had been said about you on YouTube. And you said the person was anonymous. I imagine the person was anonymous to you. That would be right, wouldn't it? 

MS FINDLAY: Yes. 

COMMISSIONER RYAN: But they wouldn't be anonymous necessarily to YouTube, because usually to use YouTube - 

MS FINDLAY: You need an account. 

COMMISSIONER RYAN: - you have to sign in and be identified. Does it not disappoint you that it's not possible for someone - not you, but someone - to track that person back, because what they said was pretty revolting and, in other circumstances, would - 

MS FINDLAY: Yes, can you refer - sorry, maybe you can refer to the quote, because I just want to read that out. 

COMMISSIONER RYAN: I think it's paragraph 42. 

MS FINDLAY: Yes, I want to read this out, because it was revolting, and they actually said that, you know, it wasn't directed to me - except it was on my account. You know. And the excuse that these perpetrators can't be found out, that doesn't fly with me. Not when, as you said, you have to sign up to an account. Sure, they can be anonymous, but there's an IP address - which also can be hidden. But yes. Yes, on YouTube someone said to me: 

"Get used to doggie style and get laid with the lights off and forget about daytime sex unless your partner's blind. Don't forget to pull those tampons out."

And the police, both Victoria and Federal, couldn't do a thing. 

COMMISSIONER RYAN: Thank you Mr Chair. 

CHAIR: Yes. Thank you very much indeed, Ms Findlay, for giving evidence and for telling us of your experiences, both in your statement and today. We know it's not an easy thing to do in this forum and we are very grateful for the assistance you have provided to us in conveying your experiences and the impact these activities, these actions have upon you. Thank you very much. 

MS FINDLAY: Thank you. 

<THE WITNESS WITHDREW

MR FRASER: Chair, I request that we adjourn until 11.15 am. 

CHAIR: Yes, we will adjourn now until 11.15 am.

<ADJOURNED 10:58 AM

<RESUMED 11:16 AM

CHAIR: Yes, Ms Dowsett. 

MS DOWSETT: Thank you, Chair. If it pleases the Commission, I would like to deal with the tender of some documents from yesterday before we call the next witness - witnesses, sorry.

CHAIR: Yes. 

MS DOWSETT: So, tendering the documents first from the police panel. There is a statement - sorry, an undated response from South Australia, which is at tab 90 in the bundle. I tender that and ask that it be marked Exhibit 28 20, together with its two annexures - sorry, with one annexure, to be marked 28 20.1. 

CHAIR: Yes, well, the response of South Australia will be admitted into evidence and given the marking of Exhibit 28 20 and the annexure thereto will have the marking of Exhibit 28 20.1. 

<EXHIBIT 28 20 RESPONSE FROM SOUTH AUSTRALIAN POLICE

<EXHIBIT 28 20.1 ANNEXURE TO RESPONSE FROM SOUTH AUSTRALIAN POLICE

MS DOWSETT: The next document is at tab 92. It's a further response from South Australia. I tender it and ask that it be marked 28 21. 

CHAIR: Yes. That can be done and will have the marking you have indicated. 

<EXHIBIT 28 21 FURTHER RESPONSE FROM SOUTH AUSTRALIAN POLICE

MS DOWSETT: The response from Queensland, which is at tab 93, together with its annexures. I ask that the response be marked 28 22, together with the annexures, to be marked 28 22.1 through to 28 22.9. 

CHAIR: Yes. The Queensland response will be admitted into evidence and the annexures will also be admitted with the markings you have indicated.

<EXHIBIT 28 22 RESPONSE FROM QUEENSLAND POLICE

<EXHIBITS 28 22.1 TO 28 22.9 ANNEXURES TO RESPONSE FROM QUEENSLAND POLICE

MS DOWSETT: I tender a further response from Queensland and ask that it be marked 28 23. 

CHAIR: Yes. That too can be done. 

<EXHIBIT 28 23 FURTHER RESPONSE FROM QUEENSLAND POLICE

MS DOWSETT: I tender the response from New South Wales Police Force dated 23 August 2022, together with its annexures, and ask that they be marked Exhibit 28 24, with the annexures to be 28 24.1 through to 28 24.8. 

CHAIR: New South Wales response and annexures will be admitted into evidence with those markings. 

<EXHIBIT 28 24 RESPONSE FROM NEW SOUTH WALES POLICE FORCE DATED 23 AUGUST 2022 

<EXHIBITS 28 24.1 TO 28 24.8 ANNEXURES TO RESPONSE FROM NEW SOUTH WALES POLICE FORCE DATED 23 AUGUST 2022

MS DOWSETT: Response from the New South Wales Police Force dated 23 February 2022, I tender and ask that it be marked Exhibit 28 25. 

CHAIR: Yes. That can be done.

<EXHIBIT 28 25 RESPONSE FROM NEW SOUTH WALES POLICE FORCE DATED 23 FEBRUARY 2022

MS DOWSETT: Moving on to the New South Wales Ageing and Disability Commissioner, the 2020/2021 Annual Report I tender and ask that it be marked Exhibit 28 27. 

CHAIR: Yes. That can be admitted with that marking. 

<EXHIBIT 28 27 2020/2021 ANNUAL REPORT OF THE NEW SOUTH WALES AGEING AND DISABILITY COMMISSIONER

MS DOWSETT: The New South Wales Ageing and Disability Commission Guide to establishing collaboratives, I tender and ask that it be marked Exhibit 28 28. 

<EXHIBIT 28 28 NEW SOUTH WALES AGEING AND DISABILITY COMMISSION GUIDE TO ESTABLISHING COLLABORATIVES

CHAIR: Yes. That can be done. 

MS DOWSETT: And the Safeguarding adults vulnerable to abuse - a public policy framework document, I tender and ask that it be marked Exhibit 28 29. 

CHAIR: Yes. That too can be done. 

<EXHIBIT 28 29 SAFEGUARDING ADULTS VULNERABLE TO ABUSE - A PUBLIC POLICY FRAMEWORK DOCUMENT

MS DOWSETT: And, finally, Ms Veiszadeh's biography from her website, I tender and ask that it be marked Exhibit 28 32. 

CHAIR: Yes, that biography will be admitted into evidence and given that marking. Thank you.

<EXHIBIT 28 32 BIOGRAPHY OF MARIAM VEISZADEH FROM WEBSITE

MS DOWSETT: Thank you, Chair. And I now call a panel with representatives from Twitter, who are appearing by audio visual link and are present on that link now: Ms Kara Hinesley and Ms Kathleen Reen. 

CHAIR: Yes. Thank you, Ms Hinesley and Ms Reen, for coming to the Royal Commission remotely to give evidence and thank you for the information you have provided in writing. I understand that you have already been affirmed, therefore, I shall ask Ms Dowsett now to ask you some questions. 

<KARA HINESLEY, CALLED

<KATHLEEN REEN, CALLED

<EXAMINATION BY MS DOWSETT 

MS DOWSETT: Ms Hinesley, if I can begin with you, you are the Director of Public Policy, Australia and New Zealand, for Twitter. 

MS HINESLEY: Hi, Ms Dowsett. Yes, thank you, I can confirm that that is my title. 

MS DOWSETT: Could you briefly outline your role and responsibilities? 

MS HINESLEY: Yes, my role here at Twitter is to basically lead the public policy team for the ANZ region, as well as helping look after Southeast Asia, and we conduct public policy engagements, liaise with governments, non-profits and civil society, and run a number of philanthropic programs. 

MS DOWSETT: Turning to you, Ms Turner, you are the Senior Director of Public Policy for the Asia Pacific? 

MS REEN: Ms Reen, yes, Ms Dowsett. I am the Senior Director for Asia Pacific Public Policy and Philanthropy. 

MS DOWSETT: And could you briefly outline your role and responsibility? 

MS REEN: Similar to Ms Hinesley, I am responsible for the public policy team across the Asia Pacific region. I help lead our programs and partnerships, engaging with governments, non-profits and academic stakeholders regarding public policy and regulatory issues as they pertain to Twitter. 

MS DOWSETT: Twitter has provided a statement to this Royal Commission in response to a notice. Have each of you had an opportunity to read that statement in preparation for your evidence today? 

MS HINESLEY: Yes. 

MS REEN: Yes. 

MS DOWSETT: Were you involved in the drafting of the statement? 

CHAIR: I think it would help if you directed your attention to one or other of the witnesses or both. 

MS DOWSETT: I will begin with you first, Ms Hinesley. Were you involved in the drafting of the statement? 

MS HINESLEY: Yes, along with our other cross functional teams, I was involved in the drafting of the statement. 

MS DOWSETT: And, Ms Reen, were you involved? 

MS REEN: Yes. 

MS DOWSETT: Chair, I tender the statement from Twitter and ask that it be marked Exhibit 28 37. 

CHAIR: Yes. The statement will be admitted and given that marking of Exhibit 28 37. 

<EXHIBIT 28 37 STATEMENT FROM TWITTER

MS DOWSETT: In her opening for this Public hearing on Monday, Senior Counsel Assisting the Royal Commission described Twitter as one of the public squares of the 21st century. Ms Hinesley, would you agree with that characterisation? 

MS HINESLEY: Thank you for the question, Ms Dowsett. I would say that Twitter is part of the overarching, over-the-top sort of service that exists on the internet to assist people to instantaneously communicate with each other without barriers, and our mission is to facilitate the public conversation. So, in everything that we do, from a product perspective and a policy perspective, we work to facilitate open, free communication and free expression, which requires a diverse range of perspectives. 

MS DOWSETT: Twitter has a relationship with its users. In order to sign up to Twitter you need to agree to certain things? 

MS HINESLEY: That is correct. 

MS DOWSETT: And you can't comment on Twitter, you can't Tweet, unless you've signed up? 

MS HINESLEY: Correct. So, we do have very, very clear limits that are imposed on what users can do on the Twitter service. And it's called the Twitter user agreement, which every user has to agree to whenever they want to create an account on Twitter. 

MS DOWSETT: And part of the Twitter user agreement includes the Twitter rules? 

MS HINESLEY: It does. There are actually three prongs to the Twitter user agreement. And if it's helpful, I would be happy to quickly tell you what they are. 

MS DOWSETT:  Yes, please. 

MS HINESLEY: Yes. So, globally, the Twitter user agreement has three elements that really encapsulate or help create the agreement itself: the terms of service, the privacy policy, and then the Twitter rules, which you referred to. Now, the terms of service govern a user's access to and use of the Twitter service. That's everything from the app and website to our application programming interfaces, which are called APIs. That's where the terms of service really govern and make sure that a user is bound by those terms. 

We also have a privacy policy, which of course governs how we handle any information that's shared by users and how that information is collected, used and stored. And then we have the Twitter rules, which you referred to, Ms Dowsett, which ensure that all people can participate in the public conversation freely and safely. So that really looks at content governance and making sure that whenever people are utilising the service - when they are Tweeting, when they are replying, when they are liking any type of content - they have to abide by those rules. So, to give a brief example, for instance, the type of content - 

MS DOWSETT: I will just pause you there because we will come to the specific policies. Commissioners, you have a copy of the Twitter rules at tab 145 of the bundle. I tender that and ask that it be marked exhibit - 

CHAIR: I think if we can just leave the tenders together until the end of your examination, I think. 

MS DOWSETT: Certainly. So, under the rules there is a section relating to safety, and it provides - is it accurate to say - an overview of the requirements on users. And then there are some specific policies that sit underneath the rules? 

MS HINESLEY: Yes. We do have overarching policy or thematic buckets within our rules, and then more detailed information that governs more specific incidents, behaviours and content on the platform. 

MS DOWSETT: And I think you were going to go on to mention them before, but one of those policies is the hateful conduct policy. 

MS HINESLEY: Yes. We have a hateful conduct policy. 

MS DOWSETT: And there is also an abusive behaviour policy? 

MS HINESLEY: That is correct. 

MS DOWSETT: And, Commissioners, for your information, you have copies of those respectively at tabs 138 and 139. Could I ask you, please, Ms Hinesley, if you could give us an overview of what is prohibited by these policies and how they differ? 

MS HINESLEY: Sure. Of course, Ms Dowsett. Thank you for the question. So, the types of content that are prohibited on the Twitter service include content that's unlawful, but also any content that threatens violence, engages in targeted abuse of someone or tries to incite people to harass others. That's part of our abuse policy that you referred to earlier. 

With reference to our hateful conduct policy, that is a policy that prohibits people from basically bringing forward any violence, or threatening or harassing other people, on the basis of a variety of protected categories, which do include disability. With hateful conduct in particular, one of the elements, when we were developing the policy, was recognising that people cannot express themselves on the platform if they feel like they are going to experience abuse or if they are going to encounter any sort of hatred, prejudice or intolerance. 

So, when we were developing our hateful conduct policy, we wanted to make sure that people felt like they would be able to not only come to Twitter to be able to see information, but to also be able to participate in the global public conversation and be able to do so in a safe way. 

MS DOWSETT: And if I could just ask you to focus on the hateful conduct policy in response to my next questions: firstly, are we correct to understand that the definition of hateful conduct doesn't require the potentially contravening conduct to be targeted at a particular individual? It can be hateful conduct at large? 

MS HINESLEY: That is correct. The hateful conduct policy focuses on protected categories, of which disability is included. 

MS DOWSETT: And, Ms Reen, I noticed you nodding. Is there anything that you would wish to add at this point? 

MS REEN: Thank you for the question, Ms Dowsett. Simply that when we talk about protected categories, we are referring to a variety of groups and communities regardless of where they may live, where they are located or how they are formed. And those protected categories, like the rest of the Twitter rules and the policies in development, are for us living documents. So, they are in constant redevelopment as we continue to research and learn about online behaviours and how those communities are developing and expressing themselves online, especially on Twitter. 

MS DOWSETT: Thank you. I might just pause you there because we will come back to the development of the policies a little later. But back to the definition of hateful conduct, there is also no requirement in that definition for any particular degree of harm to be caused by the conduct? Is that correct, Ms Hinesley? 

MS HINESLEY: In the actual definition, when we look at harm, we are focused very specifically, in the hateful conduct policy, on not allowing people to promote violence against, directly attack or threaten other people on the basis of those protected categories, which include race, ethnicity, national origin, caste, sexual orientation, gender, gender identity, religious affiliation, age, serious disease or disability, which I know is of interest to the Royal Commission today. So, when we're looking at those thresholds, we are looking at not allowing people to promote violence against, directly attack or threaten people on those bases. 

MS DOWSETT: Under the hateful conduct policy, there is a heading that says When This Applies. And the policy provides: 

"We will review and take action against reports of accounts targeting an individual or a group of people with any of the following behaviour, whether within Tweets or direct messages..."

And then there is conduct listed, including conduct against people in the protected categories. I just wanted to ask about the report. So, does Twitter's action rely upon somebody making a report of a suspected violation of this rule? Ms Hinesley, if you could answer first, please. 

MS HINESLEY: Thank you for the question, Ms Dowsett. So, with regards to actual reporting options, one of the key things that we have been trying to do, as a service that has hundreds of millions of Tweets a day and hundreds of millions of users in the world, is to utilise technology in a way that responds at scale and improves the experience for people on the service. And so we have been working very hard - 

CHAIR: Ms Hinesley, sorry, sorry to interrupt. First of all, I want to make sure I'm pronouncing your name correctly because it's been pronounced in several different ways. It's Ms Hinesley, is it? 

MS HINESLEY: Yes, correct. Thank you, Chair. I believe it is the Chair speaking. 

CHAIR: Okay. Would you be good enough to pay attention to the question and answer the question that's been asked. I will ask Ms Dowsett to ask the question again. 

MS DOWSETT: Does Twitter require a report to be made of potentially contravening conduct so that it can take action under the hateful conduct policy? 

MS HINESLEY: Thank you, Ms Dowsett. One of the key things I wanted to highlight is that, depending on what kind of report has been filed - 

CHAIR: Could you please answer the question. I'm sorry, could you please answer the question. It's a very simple question. 

MS HINESLEY: The issue here, Chair, is that sometimes with reports, some of them are able to be surfaced via technology, and sometimes the reports require people to actually file the report so that they self-identify. So, what I needed to explain - yes - was just that - 

CHAIR: Ms Dowsett - Ms Dowsett will clarify your answer. Thank you. 

MS HINESLEY: Thank you. 

MS DOWSETT: So, starting with those where somebody is required to self-identify, you describe in your statement, at page 408, a process of reporting that is referred to as "symptoms first" reporting. Can you briefly outline what that is and how it operates from Twitter's perspective? 

MS HINESLEY: Of course. So, one of the key things with the symptoms first reporting which I was referring to earlier was the fact that, because we have such a high volume of content, what we have been trying to do is leverage technology to surface potentially violative content for our teams to review. And so, utilising this technology, we also started to look at what kind of online behaviours people were experiencing and how we could actually turn that into enforceable reports. 

So, we actually launched a new reporting process in June of this year, just a few months ago, that made it easier for people to report unhealthy or unwanted content, and it also tries to lift the burden from individuals of having to interpret the violation themselves. So, instead of asking them if they think a rule on Twitter has been violated, we ask them what happened to them, and we guide them through a reporting flow that basically asks them to describe what happened, and it closes the loop to show which Twitter rules would apply to that report and why. 

So, this was a way of using that technology to streamline the process and make reports more actionable and more effective, so that people are not having to parse through the Twitter rules themselves to figure out what might be applicable to the situation they are experiencing. 

MS DOWSETT: And you refer at page 5 of your statement to Twitter's capacity to use technology to detect and take down contravening conduct. What proportion of enforcement action does Twitter take in response to things that are identified by technology as opposed to by a human report? 

MS HINESLEY: So, utilising the recent updates that we made to the reporting flow, we have actually seen 50 per cent more action taken in response to these reports than before the new symptoms first reporting flow was introduced. So, we got better at interpreting the issues that are reported with the new flow, and given the additional information that we are receiving, it's allowing our teams to be much more efficient and effective in how they resolve and respond to the reports. 

CHAIR: Ms Hinesley, I'm still not clear as to whether the process you're describing requires as its initiating point a complaint or report by a user, or are you saying that this new technology self generates reports? 

MS HINESLEY: Thank you, Chair. With this specific reporting flow, this is by a user reporting it. So, this would be instigated by a person. 

CHAIR: Well, this is - this was the original question asked, that is to say, does the process that Twitter uses to investigate or deal with actions that violate the policy depend upon a report being made? And from what you are saying, the answer is yes? 

MS HINESLEY: In this instance, yes, Chair. But apologies if I'm not being very clear. We - we have - 

CHAIR: Thank you. 

MS HINESLEY: Yes. We have two different avenues by which people and reports are actioned. So, one is by users reporting - content - 

CHAIR: I understand, but they have to report. That's the point. 

MS HINESLEY: Not all the time. We are also utilising machine learning technology. 

CHAIR: Well, could you explain, then, how it is possible for Twitter to receive and/or act upon a violation of policy without a report being made by someone who is affected? 

MS HINESLEY: So, there - since we have a range of rules - and apologies for not being clear, we - 

CHAIR: Don't worry about the apologies. Just give me the answer and we will be fine. 

MS HINESLEY: Of course. Thank you, Chair. So, we have two different avenues by which content can be reviewed and actioned. One is by user reports. One is by proactive technology or machine learning. And machine learning is usually better at picking up behaviours or content like spam, for instance, or issues where there are going to be violations that don't necessarily require a person to put their hand up and say, "I've experienced this specific behaviour or violative kind of action based on who I am as a person or based on a protected category." 

And so when Ms Dowsett was asking the question about the updated reporting flow that we have, that has been something that has helped people, especially when they are having to report something themselves and say, you know, "I have experienced something because I have a disability, because I fall within a protected category, and I need to be able to report this to Twitter." We have updated those processes to make it easier for them to do so. But we do have proactive technology that is working to take down content proactively without needing a report, and it's better at handling certain types of content that don't, again, require that self-identification. 

CHAIR: And the latter is what was started in June of this year? 

MS HINESLEY: The symptoms first updated reporting flow was started in June of this year. 

CHAIR: Is it possible to put an image on Twitter as opposed to words? 

MS HINESLEY: Yes, Chair, we do allow people to post photos. 

CHAIR: Alright. What happens if a Swastika goes up? 

MS HINESLEY: Chair, that would fall within our hateful conduct policy and hateful imagery. 

CHAIR: Yes, but what happens? 

MS HINESLEY: If that content was going to be either reported or it was picked up through proactive technology, then it would be turned over to our content moderation teams for review. 

CHAIR: Would it be picked up by your proactive technology? 

MS HINESLEY: I wouldn't be able to answer that with any degree of certainty at this stage. 

MS DOWSETT: Ms Reen, are you able to answer the Chair's question? 

MS REEN: Thank you, Ms Dowsett. Just to supplement what Ms Hinesley said  

MS DOWSETT:  I'm sorry, if you could  

MS REEN: Can you hear me?

MS DOWSETT: If you could firstly answer the Chair's question, would Twitter's technology enable it to proactively identify a Swastika if it was uploaded to your platform? 

MS REEN: Yes, may I make a quick clarification? If you will allow?

MS DOWSETT: Please. Yes. 

MS REEN: Thank you. Context really matters. There are companies and video streaming services that are making documentaries about the history of World War II, and they may have Swastikas in their content, and they may post promotions, for example, about such content which would appear in the form of imagery. This is an example of the type of complexity that means Twitter needs to make sure that we use a combination of that technological review, the proactive review, as well as the human review. 

So, sometimes it's a question of whether this is violative, and for that reason we would rely on reports, and sometimes it's a matter of us picking it up. But, where possible, we are also trying to avoid what we call the false positive problem, where we would be taking down content that appears harmful only because it lacks any context. Thank you. 

CHAIR: Ms Reen, you may take it that I fully understand that a Swastika may appear as part of something other than an attempt to engage in hate speech or to threaten violence. I fully understand that. That is why I asked what does Twitter do. How do you work out whether the Swastika is appearing as an element in a threat of violence or in an attempt to intimidate as opposed to a documentary or a scholarly interchange? What does Twitter do to work out which is which? 

MS REEN: Thank you, Chair. So, it is a combination of our detection capabilities, our ongoing machine learning capabilities and the technological reviews. Remember, there are more than half a billion Tweets being posted every day. So, this is an enormous corpus of work and effort happening globally, all over the world, in real time. So, we rely on that technology in order, one, for global healthy conversations to happen freely in multiple languages in real time, 24/7, wherever they may be and wherever they are able to happen. 

But on the other hand, we also want to make sure that our detection capabilities are catching that violative content, and where we can, we try. And we report on that twice a year in the Twitter Transparency Report, which is made available on our website, the Twitter Transparency Centre. 

In addition - and this is why the human review is so important, and why sometimes it's so profoundly critical - there is human review, and there are people available to take in the reports from the most vulnerable people and groups who are reporting to us, so that they are prioritised. That sits alongside the technological review, which gives us scale, so that images which may be reposted in a spammy or scammy kind of way are caught by the systems we have built, and so that we are moving effectively against the ways in which bad actors try to circumvent the Twitter rules and the systems that we build. And that is the work - that is the job. 

MS DOWSETT: The human review that you just spoke of then, Ms Reen, this is undertaken by employees of Twitter? 

MS REEN: Yes. 

MS DOWSETT: And would it be correct to refer to those people as reviewers or moderators? 

MS REEN: Yes. 

MS DOWSETT: Are you able to tell the Royal Commission about the diversity in the moderator pool? To your knowledge, are there people who identify as people with disability in that pool? 

MS REEN: I wouldn't be able to comment on the detail of it for the health and safety and privacy of those moderators, given the extremely sensitive and profoundly difficult work that they do. But the short answer is, yes, we do. 

MS DOWSETT: And are you able to tell us whether the moderators receive any training in regards to responding to reports of violence, abuse, and harassment made by people with disability? 

MS REEN: Yes, they do. And they also receive updated and regularised training. But I might turn to Ms Hinesley for additional information on this, if you would allow, please, Ms Dowsett?

MS DOWSETT: Yes, of course. Ms Hinesley? 

MS HINESLEY: Thank you, Ms Dowsett. So, we have a number of training programs that we have put into place internally for employees, which also include our content moderator team. There are three different ones: one is Foundations of Accessibility, the second is Disability Awareness and the third is Positive Communication. These are updated on an ongoing basis, and our teams, both employees and content moderation teams, receive these trainings year-round. 

MS DOWSETT: And do you know if that training is co designed with people with disability? 

MS HINESLEY: With regards to the way the training is designed, we have two different internal groups, basically. We have a Centre of Excellence and a Global Accessibility Group internally, and those teams help co-design these processes and programs for employees. And those teams do include people with disability. 

MS DOWSETT: And does the training deal with any particular vulnerabilities that certain groups may have in relation to particular kinds of abuse or harassment? Is there targeted training? 

MS HINESLEY: I wouldn't be able to provide more detailed information about the degree of targeted training, but I do know that, with the accessibility policies and training that we have, there is a basic understanding of why and how we need to make sure that there is accessibility and disability inclusion within the company. So, there are certain foundational accessibility tenets and courses built into this, but in terms of targeted training, I would have to take that on notice. 

MS DOWSETT: Some of the witnesses who have given evidence this week have spoken about non-consensual filming and photography. So, when they are out in public, people take photos and shoot videos of them without their permission. Can you talk - Ms Reen, I will begin with you - about whether Twitter can do anything - and if it can, whether it does - to prevent images of that kind being displayed on your platforms. So, non-consensual filming and photography? 

MS REEN: Ms Dowsett, if you will allow, I would like to clarify the question briefly first. Are you talking about any kind of filming in any public space whatsoever in Australia, whereby under Australian law it is legal to film and upload such material and whether Twitter has any policies separate from those?

MS DOWSETT: So, yes, you can   I'm talking about in Australia, in public places, people filming and photographing people with disability and then uploading it to your platform. Is there anything you can   Twitter can or does do about that? 

MS REEN: So, it depends how it's posted. So, if that - let's say there is a conference event where someone with a disability is on a keynote panel, and they are speaking about - 

MS DOWSETT: Perhaps - my apologies. I'm going to cut you off. We are - this hearing has been about violence and abuse in public places. So, let's assume that it's not a video of a keynote speaker taken at a conference; it's somebody going about their business, walking down the street, but they are a person with disability. They are only filmed because they are a person with disability in a public place. 

MS REEN: If there is any violence, or threat of violence or incitement of violence, or if this is posted with the intention of abuse or harassment of that individual, it would potentially be a violation of our abuse and harassment policy or fall within our hateful conduct policy. So, yes, but without knowing the specifics, Ms Dowsett, I am simply unable to say yes categorically to a question of that nature. I hope you understand. 

MS DOWSETT: So - 

CHAIR: The short answer - the answer that you would give - is that you wouldn't know the circumstances in which the photograph or the film was taken and, therefore, you would have no way of intervening if it was shown on the platform? 

MS REEN: Chair, this is why reporting is so valuable, because - 

CHAIR: Sorry, is that the position? That's all I want to know. 

MS REEN: No, not always, Chair. 

CHAIR: I'm not criticising you. I'm just saying is that the position? 

MS REEN: I understand the question, Chair. I'm saying that that is not the position, and it is not - this is a false binary that's been created here. 

CHAIR: Could you answer the question, please? 

MS REEN: Yes. This could violate our policies, and we would remove it. 

CHAIR: If it involved violence or threats. But we are not talking about that. We are talking about someone who is videotaped or filmed without their consent and that's it. Is there anything you can do about that? 

MS REEN: I understand that. Yes, there are things we can do about it because, under the hateful conduct policy, it includes abuse and harassment and incitement with that kind of intent. So, with a protected category - 

CHAIR: How would you know - how would you know the intent that was behind the filming or the photographs? 

MS REEN: We don't always know the intent. But because we have - I know - I think this is worth saying, if you will indulge me just for 20 seconds on this. I know it can also be a bit boring, but it really is how we get to the guts of this problem, Chair, and that is we - 

CHAIR: Okay. Let's start - let's start the 20 seconds now and give me a substantive answer. 

MS REEN: In the background with the technology, we refer to behavioural signals. One of the ways we can address intent is if the poster of the photograph has a history of violations for which they have been temporarily sanctioned, or if that poster has been involved in previous abuse or harassment of other individuals in other circumstances. This is one such example. There are many others. We cannot read intent at half a billion Tweets a day, you are absolutely right. And there will be times when we simply do not know. That is correct. But there are times when we do. 

COMMISSIONER RYAN: You mentioned you would take action if violence is intended. What about just straight-out significant humiliation of someone with a disability? In other words, someone with a disability has a feature which is - which can be characterised as - embarrassing, and it's posted specifically for the purpose of embarrassing that person, or it's used to create a meme that might embarrass them or humiliate them further. 

MS HINESLEY: I am happy to jump in. Sorry, go ahead, Kathleen. 

MS REEN: Go ahead, Kara. 

MS HINESLEY: Thank you for the question, Commissioner. I think what this might be getting at is also a policy we have for the right to privacy within Australia, which is consistent with local law. And so in that situation, Commissioner, if there was a Tweet or image that was put up with the intent, again, to cause humiliation, that would potentially run afoul of our abuse policies. Or, if we receive a report from the person that is depicted in the image or video and they say, "I did not consent to this", then under the right to privacy we would be able to take action on the content. 

MS DOWSETT: We have talked about two ways that potential violations come to your attention. Those are reports received from people who may be the target of the behaviour, and your machine learning or AI. You also receive referrals from the eSafety Commissioner. That's correct? 

MS HINESLEY: Yes, Ms Dowsett. We also work with the eSafety Commissioner on this. 

MS DOWSETT: Thank you, Ms Hinesley. And Twitter has an enforcement policy, as I understand it, and it can take enforcement action at either the Tweet level or the account level. Is that correct, Ms Hinesley?

MS HINESLEY: Yes, it is. 

MS DOWSETT: And does the type of enforcement action that can be taken alter depending upon whether the matter comes to your attention through each of those three potential avenues? 

MS HINESLEY: Yes, it would depend on the avenue and on which policy or local law we were being asked to adjudicate the content under. 

MS DOWSETT: You said that the symptoms first reporting function doesn't require the person making a complaint to identify a particular policy, just to say, "This is what happened." Have I understood that evidence correctly? 

MS HINESLEY: Yes, that is correct. 

MS DOWSETT: So why does it matter whether it comes to you by a report, by machine learning or from the eSafety Commissioner, in terms of what action Twitter can take? 

MS HINESLEY: In that instance, if there was a report that came through from the eSafety Commissioner or her office, then it would possibly be actioned, or we would be asked to look at the content, under the Online Safety Act, which is local legislation specific to Australia. Any other report that comes to us, especially through our own internal reporting functions, whether on the website or in the app, would be reviewed under our own terms of service or the policies that I detailed earlier. 

MS DOWSETT: And still to you, Ms Hinesley, is Twitter able to or does it need to identify perpetrators in order to take enforcement action? Say, for example, that the account user isn't using their legal name. They are using some other identifier. Do you need to know who the person is at the end of the offending conduct? 

MS HINESLEY: Thank you for the question. I believe, if I'm understanding it correctly, our rules are adjudicated regardless of who might hold an account. So, we make sure that our policies are enforced without bias and across all the different reports that come through. So, we wouldn't necessarily need to know the individual in order to take action to either suspend an account or remove a Tweet. 

MS DOWSETT: What is the timeframe for Twitter responding? If I lodge a report saying that I have been the subject of this conduct and I identify it to you, how long does it take for Twitter to respond to that? 

MS HINESLEY: We don't have any specific timeframes for responses, as each case is going to vary due to complexity or additional context that is sometimes needed. But, again, we do work very closely with our moderation teams, and we have the technology play a role in triaging the process. So, there is a sliding scale, essentially, for how we are able to respond to reports. 

Reports that are around the most abhorrent types of material, like child sexual exploitation material or terrorist material, go to the top of the queues so the moderators are able to look at them and action them as soon as possible. Reports at the other end of the spectrum, again more around spam, might be at the lower end in terms of how quickly we turn them around. But it just depends on what's been reported. 

MS DOWSETT: And what is as soon as possible? How quickly can you take a Tweet down? 

MS HINESLEY: We have sometimes taken down Tweets in a matter of minutes. And, again, that's usually at that higher end of the spectrum, with the more serious sorts of content or conduct that's occurring. 

MS DOWSETT: Turning to you now, Ms Reen, can I ask you to tell us, from Twitter's perspective, do you see that there is a gap between the legislative regulation of the online space and the things that Twitter can do to regulate its users? 

MS REEN: Ms Dowsett, would you mind if I allowed for Ms Hinesley to answer this question first, given her special expertise in Australian law?

MS DOWSETT: That's fine. 

MS REEN: Thank you. 

MS HINESLEY: Thank you, Ms Dowsett, for the question. So, one of the key things, especially when we are looking at legislation - and I referred to the eSafety Commissioner and the Online Safety Act earlier - is that we have had in place, since the Enhancing Online Safety Act was implemented in 2015, dedicated reporting channels for the eSafety office. 

And that office is able to send through content both that they would like us to review under the actual legislation that's in place, or they can also send through any type of report that they would like us to review under our own terms of service and our own rules. 

So, we try to maintain those different avenues and the breadth of options and combination of methods to report or review reports, so that we're able to engage with the eSafety office on both formal and informal bases and look at requests from the office. And so, in line with that guidance, we work with them to review content, again, both under the obligations of the OSA, the Online Safety Act, as well as under our rules and policies. Does that help answer your question?

MS DOWSETT: Is Twitter able to adjust its rules? If you see reports about a kind of conduct that may not fall foul of the rule as it currently exists, are you able to amend the rule to pick up that gap? 

MS HINESLEY: Yes. We do view our rules as a living document. So, we work very closely with a number of organisations, and we have our Trust and Safety Council, which is a global group which includes non-profits and civil society organisations around the world that we regularly conduct both open consultations with, as well as targeted consultations if there is a specific element of a rule that we are looking to update. And so we constantly work to iterate and make sure that we are evolving our rules to map the changing contours of conversation online. 

MS DOWSETT:  Thank you both very much. Chair, that is the end of the questions that I have for these witnesses. I hand them to you. 

CHAIR: Yes, thank you very much. I will inquire of my colleagues whether they have any questions to put to you. First, Commissioner Galbally. 

COMMISSIONER GALBALLY: Thank you. I would like to ask you about the whole issue of laws. Ms Findlay - I don't know whether you heard her evidence; she was the first witness - proposed that laws need to be developed which criminalise hate crimes against disabled people. What impact would that have on Twitter if that was to go ahead? 

MS HINESLEY: Thank you for the question, Commissioner. I would say, just in regards to the way that we operate both here in Australia and globally, we respect local law and we take into account any sort of submitted legal requests that come through. So, if the laws were to change here in Australia, that is something we would be able to take into account if we received a request under law to take action on content from our platform. 

COMMISSIONER GALBALLY: Thank you. 

CHAIR: Commissioner Ryan? 

COMMISSIONER RYAN: Thank you. Look, I think one of the concerns people have is the anonymity of using a social media platform. If a person commits what would be regarded as a crime, a hate crime, in, say, New South Wales or other parts of Australia where it is actually a crime to incite violence against an individual or a person on the basis of their sexuality or their race, does Twitter assist local police in potentially identifying an individual, or are there complications in doing that? Is it easy for police to get information that might help them track someone who has committed a crime using Twitter as the vehicle? 

MS HINESLEY: Thank you for the question, Commissioner. In this instance we do have open channels of communication with local law enforcement, both on the state and federal levels, and police can request information about an account that they are investigating any time via a dedicated portal that we have available for law enforcement called The Legal Request Submission site. 

So, if there is any sort of content that they are looking to investigate, it depends on what we hold - again, we do not hold as much information on certain accounts as maybe some other platforms that you have heard from in the past. But we do require a name, or a verified email and a phone number, for accounts, and we also hold content that is provisioned in our service, basically like an IP address, for instance. So, that would be the type of information that we would be able to provide to law enforcement if they sent through a scoped request to us. 

COMMISSIONER RYAN: My final question is one more of an impression. I have a number of friends on the platform Facebook who have reported that, on odd occasions, they have been what they called banned by Facebook for a period of time. Sometimes mistakenly. But there seems to be quite a significant awareness of the fact that someone is monitoring what they are doing. It would be my impression that Twitter is an area for pretty robust discussion, and yet, I have to say, I have not noticed a friend of mine ever complaining they have even accidentally been banned or somehow or other sanctioned by Twitter. Is it possible that your particular intervention policies are a bit of a lighter touch, perhaps, than the platform Facebook? 

MS HINESLEY: Thank you for the question, Commissioner. I'm happy to take that in the first instance. If I miss anything, I might hand to my colleague Kathleen. But I would say that our policies are very robust in that we work very hard to develop them in consultation with a number of partners, like I mentioned earlier, within the Trust and Safety Council, but also a number of safety partners that are based here in Australia. 

It's our top priority to make sure that people who are using the platform feel safe and free from abuse, so we are constantly looking to make sure we have clear rules in place to address any sorts of - again - bad behaviour, bad actors, or threats that can be levied at people online. 

I would say we also have very strong ban evasion policies in place to ensure that if people are suspended from the service and if they try to circumvent that enforcement action, that we do work very hard to make sure that they are not able to create new accounts and that they can't repurpose existing accounts. And so we work very hard to make sure that those detection methods are in place and that people who do violate our policies and are suspended are not able to come back on the platform. 

COMMISSIONER RYAN: Thank you, Mr Chair. And thank you for your evidence. 

CHAIR: In your response - your written response to the Commission - there are some figures given, such as that for the six-month period from 1 July 2021 through to 31 December 2021, Twitter required users to remove 4 million Tweets. Is that a worldwide figure? 

MS HINESLEY: Yes, Chair, it is. 

CHAIR: How does Twitter address the issue that's been raised of disparities in local law? Twitter began and operates in the United States, which has the First Amendment, which allows speech that would not be allowed in Australia. How does Twitter specifically address the different laws that apply in this country and which are designed to protect particular groups of vulnerable people, such as people with disability? How does Twitter do that? 

MS HINESLEY: Thank you, Chair. One of the key things that I mentioned earlier is that we do have these portals that allow legal requests, whether they are from law enforcement or from a court order, to be lodged with the company so that we are able to take action in accordance with local law. In the Transparency Report and some of the figures that you mentioned just a moment ago, we do have a specific section that details what kind of actions are taken in each country in response to either information requests or takedown requests that are specific, again, to those local laws. 

So, that was not in our written submission, as we had focused more on the disability issues and questions that had been brought forward. But we do have the sections that are available and published bi-annually in our Transparency Report that are specific country by country. 

CHAIR: So that it is possible to discern from the Transparency Report the actions that have been taken in Australia to shut down accounts or to remove Tweets? 

MS HINESLEY: Yes, Chair, in response to legal requests. So, you would be able to go to the - 

CHAIR: In response - yes, go on.

MS HINESLEY: So, in the Transparency Report you would be able to see the legal requests received, broken down country by country, and the compliance rate for those different legal requests for information or for takedown. 

CHAIR: But how does Twitter prevent the Tweets getting on in the first place if they are in violation of local law? 

MS HINESLEY: Well, again, Chair, this is where we are working to leverage our proprietary technology, which can be more proactive in the way that it can search out content that might be violative, or accounts that are acting in a way that violates our terms of service - behaving in ways that are inauthentic or that would amount to platform manipulation. This is where we are trying to leverage technology to get better at detecting potentially violative content. 

CHAIR: I see. Just one more question. Under the heading Hateful Conduct Policy - and this is on page 3 of the document you have provided - you say, or at least the policy apparently says: 

"We also do not allow accounts whose primary purpose is in inciting harm towards others on the basis of these categories."

Why does an account have to have as its primary purpose the incitement of harm as distinct from just doing it? 

MS HINESLEY: So that particular language, Chair, is in response to accounts that might be created for the sole purpose of directly attacking either an individual or a group of people. If an account engages in that behaviour just in other Tweets, then enforcement actions could still be brought against it. But we are also making sure in that language that it captures accounts that might have been stood up for that primary or sole purpose. 

CHAIR: I see. Thank you very much. Thank you very much for coming to the Commission and giving evidence, and thank you for the written statement. The Commission appreciates the assistance you have provided. Thank you very much. Ms Dowsett, what happens now?

MS DOWSETT: Thank you, Chair. If we could have a brief five-minute adjournment and then press on with the next witness. I did indicate to you yesterday afternoon that lunch would be at 1 and we would be finished by then. This is moving towards that goal. 

CHAIR: I'm very impressed. 

<THE WITNESSES WITHDREW 

<ADJOURNED 12:12 PM 

<RESUMED 12:24 PM

CHAIR: Yes, Ms Bennett. 

MS BENNETT: The next witness is the eSafety Commissioner, Ms Julie Inman Grant, who appears. 

CHAIR: Ms Grant, thank you very much for your written statement, which we have and which we have read in advance of your appearance, and thank you for coming to the Royal Commission in Brisbane at the Brisbane hearing room. I will ask Ms Bennett to ask you some questions in just a moment, but in the meantime if you would be good enough to follow the instructions of my Associate, who is sitting just in front of me, she will administer the affirmation to you. Thank you very much. 

ASSOCIATE: I will read you the affirmation. At the end, please say yes or I do. Do you solemnly and sincerely declare and affirm that the evidence you shall give will be the truth, the whole truth and nothing but the truth? 

MS INMAN GRANT: I do. 

<JULIE INMAN GRANT, AFFIRMED

CHAIR: Thank you very much. I will now ask Ms Bennett to ask you some questions. 

<EXAMINATION BY MS BENNETT SC

MS BENNETT:  Commissioner, you have made a statement to assist the Commission; is that right? 

MS INMAN GRANT: That is correct. 

MS BENNETT: Have you read the statement before appearing today? 

MS INMAN GRANT: Yes, I have. 

MS BENNETT: And are its contents true and correct? 

MS INMAN GRANT: Yes, they are. 

MS BENNETT: If it pleases the Commissioners, I tender the statement and I ask that it be marked Exhibit 28 38 with the attachments thereto being marked sequentially, 28 38.1 to 28 38.11. 

CHAIR: Yes. The statement can be admitted into evidence with the marking of Exhibit 28 38, and the annexures will be admitted into evidence also and given the markings indicated by Ms Bennett. 

<EXHIBIT 28 38 STATEMENT OF JULIE INMAN GRANT

<EXHIBITS 28 38.1 TO 28 38.11 ANNEXURES TO STATEMENT OF JULIE INMAN GRANT

MS BENNETT:  Now, Commissioner, you are the eSafety Commissioner; is that right? 

MS INMAN GRANT: That is correct. 

MS BENNETT: So, your role encompasses all of geographic Australia in the online space. Is that right? 

MS INMAN GRANT: That is correct. We started as the Children's eSafety Commissioner, and in 2017 we became the eSafety Commissioner covering all 26 million Australians. 

MS BENNETT: Given that the internet exists somewhat divorced from geographic land mass, how does someone fall within your jurisdiction? 

MS INMAN GRANT: Right. Well, fortunately, we have had an online content scheme in place for a long time, so very little content is hosted here in Australia, which means that almost all of our regulatory targets, including the social media platforms, gaming sites and websites that we are dealing with, are outside - 

MS BENNETT: I will ask you to slow down for our interpreters. Thank you. 

MS INMAN GRANT: Sure. They are outside of Australia. So, we need to use extraterritorial reach. 

MS BENNETT: I see. And any Australians using the internet can come to you with their concerns and, subject to your powers, you can consider that request? 

MS INMAN GRANT: That is correct. Every Australian - and that includes residents of Australia - can report to us through our website, eSafety.gov.au, for information and strategies and to report five different forms of abuse.

MS BENNETT: Are there accessible reporting pathways? I mean, accessible for people with disabilities? 

MS INMAN GRANT: Yes, we built our entire website with accessibility by design in mind. Just to give you some context, we worked with Vision Australia before we even designed and built it. We checked with them at midpoint. We did user testing with different members of the disability community with lived experience. We also made sure there were reviews that were done, and we looked at everything from making sure that there were appropriate contrasts, that people with dexterity issues could navigate, and that the pages worked with screen readers. And we have had one accessibility audit, and we are planning our second accessibility audit in February. 

MS BENNETT: Would you agree, Commissioner, that the online space is one that is increasingly important for people with disability? 

MS INMAN GRANT: It is absolutely critical. As we heard from Ms Findlay, people from the broad - and, I note, very diverse - disability community use the online world for connection, for community, to work, to learn, to create, to be entertained. It's an absolutely vital facet for them. 

MS BENNETT: And so can I take it from that, Commissioner, that you would accept that any so-called solution to the issue of online abuse that involves asking the victim to simply switch off, is not something that you would endorse? 

MS INMAN GRANT: No. I think part of our job is to make sure that all Australians are having more positive and safer experiences online, and that we are trying to harness the benefits for all users, whilst minimising the risks. And this means understanding, for those who are vulnerable or more at risk, how we can best help them. 

MS BENNETT: There's been evidence this week about the impact of both online abuse and real-world abuse directed at people with disability. And I think it would be fair to summarise that evidence by saying that that impact is severe and cumulative for people with disability. Would that be your understanding, Commissioner? 

MS INMAN GRANT: That would absolutely be my understanding. And that cumulative impact is very germane to this discussion. 

MS BENNETT: In what way do you say it's germane? 

MS INMAN GRANT: Well, if you are experiencing online abuse and isolation on top of having, for instance, daily discrimination and daily hate directed at you, then it is going to have a more significant effect. So, when we did hate speech research, we saw that 14 per cent of the general population have experienced some form of hate speech. For the broad disability community it was 18 per cent. So, higher. And in 25 per cent of those cases, the abuse directly targeted their disability, and in another 24 per cent, it directly implicated their physical appearance. 

MS BENNETT: So, how does the more significant effect of that online abuse directed at the disability community or people with disability, how is that reflected in your understanding of serious harm in the course of your regulatory responsibilities? 

MS INMAN GRANT: Well, I think it's worth noting that we don't require people to self-disclose whether or not they have a disability when taking a report. That often depends on the complaint scheme. So, we have a youth-based cyberbullying scheme, an image-based abuse scheme, which has to do with the non-consensual sharing of intimate images and videos - 

MS BENNETT: I'm just going to ask you to slow down. 

MS INMAN GRANT: Okay.

MS BENNETT: Our interpreters are very good, but they're limited. 

MS INMAN GRANT: Sorry, I'm just conscious that we are ending at 1. There's a lot to say. 

MS BENNETT: No, it's okay. We will go on as long as we need to go. 

MS INMAN GRANT: Alright. There is a youth-based - 

CHAIR: The guillotine won't necessarily come down precisely at 1 pm, so don't - we don't need to go fast for that reason. 

MS INMAN GRANT: Okay. I will slow down. Would you like me to go through generally what the complaints schemes are, or is that not - 

MS BENNETT: No, we will come to that. What I would like to focus on quite specifically is this: you have certain powers to take steps if you apprehend that there is serious harm by reason of online content. Is that right? 

MS INMAN GRANT: Right. That is right. But each scheme is different in terms of the thresholds that we will use to determine whether a formal action can be taken or whether it meets the threshold. So, for instance, for youth-based cyberbullying, the definition of serious cyberbullying is seriously harassing, threatening, intimidating or humiliating. And probably 2.5 per cent of the reports that we get through youth-based cyberbullying involve a person with a disability that we are aware of. We do give them an opportunity to indicate whether or not the disability is what is specifically being targeted in their abuse. If you look at our newer scheme, in - 

MS BENNETT: Sorry, to pause you there, that's for children; is that right? 

MS INMAN GRANT: That's for children. 

MS BENNETT:  And then for adults? 

MS INMAN GRANT: For adults, the complementary or the new scheme that was just introduced through the new Online Safety Act, and which came into force on 24 January this year, the determination was made by the government of the day, and ultimately the Parliament, that adult cyber abuse should be at a much higher threshold. 

The reasons stated in the Memorandum of Understanding are that adults are meant to be more resilient, that there were more concerns about interfering with freedom of expression, freedom of speech, and also defamation law and harm to reputation, and that this was meant to be at a higher standard and on par with the criminal threshold of section 474.17 of the Criminal Code. 

So we've got a two-pronged objective test. One, we have to prove serious intent to harm, and the second prong is that it's menacing, harassing and offensive in all cases to the ordinary reasonable person, and that's in line with the criminal statute. There was an amendment that was passed that also stated that, when we are assessing these cases, we couldn't consider just mere emotional or psychological distress. So, fear, anger, grief and distress cannot be the sole determinant of whether or not there is serious intent to harm. 

MS BENNETT: So, Commissioner, in light of the evidence you have heard this week, would you accept there is a gap in this space between the experience of people with disability and your capacity to take steps under that test? 

MS INMAN GRANT: Well, I would say that we can consider disability, and targeted disability, when making out serious intent to harm or whether it's menacing, harassing or offensive. I would say it is at a very high threshold, and so we will continue to use informal powers where we can, when it doesn't violate the terms of service on a particular platform and it doesn't meet the high threshold of serious adult cyber abuse. But it is a very high threshold. In the 12,000 - or, sorry, 1,200-plus reports that we've had over the first six months, we have issued three formal removal notices. 

MS BENNETT: The Commissioners will find the precise numbers at paragraph 25 of the Commissioner's statement: 1,243 complaints have resulted in three notices to remove since the commencement of the new regime. You said, Commissioner, that you can consider disability in considering the harm of conduct. So, leave aside intent for a moment. Look at the impact on the person on the receiving end. Can you take into account this cumulative effect we have been hearing about this week? 

MS INMAN GRANT: Those are circumstances that we can certainly take into consideration. I think something like 1.7 per cent of the 1,243 reports that we have received have been from someone who has declared having a disability. And certainly in the case that was discussed earlier today with Ms Findlay, that was a consideration when assessing the content. 

MS BENNETT: So, I'm not sure if I have understood correctly. Is the cumulative impact, which is what we have heard about this week, fed into your corporate knowledge, or is the disability something you take into account? 

MS INMAN GRANT: What I would say is we take into account intersectional factors like disability. It is not explicitly written into the legislation that we can consider intersectional factors or even cumulative effects. But what we do try to deliver is a fair, proportionate and compassionate citizen service, and we do think those elements are often germane to the cases that we are considering. 

MS BENNETT: And so where you find that you can't take steps because of the high threshold in the statute, you take other steps. Is that right? 

MS INMAN GRANT: That is correct. So, for instance, where we weren't able to use our formal removal notices under the new powers this year, there have been 11 instances where we have approached the platforms for informal removal. We have been successful in 10 out of 11 of those cases. But when we are dealing with complainants, we are often picking up the phone or communicating with or engaging with them. We provide what we call a 360 wraparound service. So, we will talk with them about how we can help them build their psychological armour, and we can refer them on to Legal Aid or mental health referral services. We can also talk about how they can use conversation controls, like blocking, muting and reporting, and other protective steps that they can take. 

MS BENNETT: Do you have people providing psychological support? 

MS INMAN GRANT: So, for youth-based cyberbullying - no. We have 43 investigators, some of whom do have a counselling background or have been frontline workers, but our role is not to deliver psychological support. For youth-based cyberbullying, we have a contractual relationship with Kids Helpline, and we have referred about 21,000 children to them. We work, of course, with a range of other mental health organisations, including Beyond Blue, headspace, Reach Out and others, that we can refer people to. 

MS BENNETT: We have heard this week about involuntary filming, particularly within the short-statured community. Have you heard or been briefed upon that evidence? 

MS INMAN GRANT: I have seen that evidence.

MS BENNETT: So these are instances where people are quite often filmed in public, and they have a fear that those images are going to end up online. If that were to be the case, is your office in a position to assist? What steps can your office take? 

MS INMAN GRANT: We do not have any regulatory powers to intervene in that kind of situation. We do have an image-based abuse scheme, but that refers to the non-consensual sharing of intimate images and videos, which would likely show a person in a sexual act or genitalia, or it will cover things like manga or, for instance, a woman of the Muslim faith captured without a hijab on or in a bathing suit, where that is shared without consent and it could cause harm. We can act in some of those cases. 

MS BENNETT: So there are some instances where a person's attribute might be taken into account when considering the harm or the impact of the harm? 

MS INMAN GRANT: That is the case. But the intent of the scheme is around preventing the threatened sharing or the sharing of intimate images and videos. So, it doesn't have that kind of broad application. Potentially, if an image - and this goes to the evidence I heard around people of short stature - was taken in a way that was meant to cause serious harm or ridicule, then, applying that two-pronged test, if it met that threshold, we could take formal removal action. But, again, it is quite a high threshold. 

MS BENNETT: Is there a gap there, Commissioner? 

MS INMAN GRANT: I think - 

MS BENNETT: Is the threshold too high, in your view, to address some of the harms we have been hearing about this week? 

MS INMAN GRANT: What I would start by saying is that we are the only government in the world that has any scheme similar to the five schemes that we operate, and the government of the day, and then the Parliament, decided to draw a line that says: when freedom of speech veers into the lane of serious online harm, we are going to draw that line. They drew the line at a very high level. 

We are in continued conversation with my new Minister and with our department about what's working, what might be able to work better, or what might be some factors that we would be able to consider to make it more effective. But we also have to consider the size of the aperture. We can't put the threshold so low that we open up the floodgates and won't be able to manage the reports. 

MS BENNETT: Are you able to assist the Commission in identifying where an appropriate threshold might be, if it's not where it presently is? 

MS INMAN GRANT: Well, we are nine months into testing this scheme. We haven't reached a formal conclusion. One thing that had been suggested during the debate, and may be of help, is if we could explicitly consider intersectional factors when they are present in the abuse that is directed towards an individual. Having that explicit language may be helpful in any of these cases. 

And when we look at the threshold for serious intent to harm, you know, I would argue that, in many cases, one of the first things we ask somebody who is reporting is what their level of mental distress is, because obviously we want to triage and address those reports where people are highly mentally distressed and at risk. But the exclusion of mere emotional distress does mean that we, again, have to set things at a higher level, and I can't give you definitive language at the moment. It's something that we want to consider forensically. 

MS BENNETT: But the capacity to consider the cumulative impact on people with a disability is something that, as you understand it, would need to be enshrined in legislation? 

MS INMAN GRANT: I mean, we try and push the edges where we can, and you will see we have done that in terms of our regulatory guidance. But, you know, things including character assassination, defamation and harm to reputation are not covered through the scheme. This does not mean we don't recognise, and don't feel horrible, that people are experiencing harm. It's just that we have limitations on where we can take formal action. 

MS BENNETT: I understand. I'm really trying to explore with you the ways in which that could be changed to meet some of the issues that we have been hearing about this week. Is it the case, Commissioner, that you are a reactive agency for the most part when it comes to online content, that you require a report to act? 

MS INMAN GRANT: That is correct. We were set up to serve as a safety net rather than a proactive monitor of the internet, and part of the reason is that we do believe it is the responsibility of the platforms themselves to police their own platforms and enforce their own policies. And that's the most expeditious way to get the content taken down. 

MS BENNETT: And so you expect proactive system monitoring from platforms operating within Australia; is that right? 

MS INMAN GRANT: There are varying levels of proactive detection and steps that companies take through some new systemic powers that were introduced in the Online Safety Act, including mandatory industry codes in the area of what we call class one content. So, illegal content like child sexual abuse material and prohibited content. What we are negotiating through this co-regulatory scheme, where the codes are owned by the platforms themselves and they are finishing consultation, is the steps that they are taking to be able to proactively monitor and restrict Australians' access to these kinds of harmful content. 

MS BENNETT: So are there particular standards of proactive monitoring that you expect from a platform? 

MS INMAN GRANT: This is what we are working through with the mandatory industry codes, and there is a lot of variability in terms of the capability of the platforms. It might also depend on the type of technology service we are looking at. So, we are looking at eight different sectors of the technology industry. So, while a large, well-resourced social media site may already be using proactive technology to detect illegal content, as the Twitter witnesses indicated might be happening, an ISP or a telco will not be able to use those same kinds of technology. 

MS BENNETT: So, I imagine it wouldn't necessarily all have to take the same form. Is it the case, Commissioner, that at present there is no particular requirement for proactive monitoring standards, but you are considering developing them in the future. Is that fair? 

MS INMAN GRANT: That is correct, and we have also used another systemic reform we have called the Basic Online Safety Expectations. 

MS BENNETT: Yes. 

MS INMAN GRANT: Which basically sets forth a set of expectations about the steps that companies operating in Australia should take to make sure that their platforms are safer. That might include proactive monitoring, but it also includes very basic expectations: that you have policies in place, that you have a trust and safety division, that you are enforcing your own policies and standards. Now, those aren't enforceable, but we do have some powerful provisions there that have enabled us to ask questions about what companies are and are not doing. And we just issued seven legal notices to major platforms around what they are doing to proactively detect child sexual abuse material and videos and what grooming detection technologies they're using. 

MS BENNETT: And has consideration been given in those various standards to how they might incorporate the experience of people with disability, including what we have been hearing about this week? 

MS INMAN GRANT: That is certainly an area that we could pursue through the Basic Online Safety Expectations and through the codes. In the outcomes-based paper that we wrote to industry in terms of what our expectations are, we specifically asked that, when they are trying to protect their users from illegal content, in the first instance, and from access to pornography, for the second set of codes, they give regard to diverse and vulnerable communities. 

MS BENNETT: Do you yourself have any input from people with disability into the way that you operate as a regulator? 

MS INMAN GRANT: Absolutely. I guess what I would say is we operate as an educator and a coordinator. So, we have a huge focus on prevention, protection through our regulatory schemes, what I call proactive change, and then partnerships. So, we have done some really important foundational work to develop an evidence base with, for instance, women with intellectual and cognitive disability who are subject to what we call technology-facilitated abuse, which is an extension of family and domestic violence and is very insidious for that population of women, particularly when it's used to stalk, monitor, gaslight, degrade or even isolate them by withholding their technology. 

So, we engaged with the disability support community and those with lived experience to do the research and to co-design the materials, and the three videos that we produced, just as an example, were done by the Fusion Theatre with actors with lived experience of disability. So, that co-design process is really important, as is the evaluation afterwards. And we had those products evaluated and found that 97 per cent of the target audience found that they were helpful and useful. 

MS BENNETT: The people with disability and the organisations with which you are engaged, is it fair to say they are from parts of the disability community, but there is scope to be broader? 

MS INMAN GRANT: So, we started with some informal consultations with the disability community in mid-2021, and we recognised that there was huge diversity, that there is no one size fits all, and that we need to work with a broad spectrum of those with lived disability and also of those in the disability community. And so that has morphed into a much broader formal consultation with multiple organisations and agencies, and we are starting some exciting projects around developing specialised resources for both adults and children. 

MS BENNETT: You have a role in promoting online safety. Is that right? 

MS INMAN GRANT: That is correct. 

MS BENNETT: So, part of your role is to respond to issues online, but you also have a substantial presence in promoting online safety? 

MS INMAN GRANT: We have a role in promoting online safety, and I often do that through media engagement and through social media. We haven't been funded as an agency to run an education campaign or an advertising campaign. We have a very modest marketing budget. But I do everything I can with the tools that I have to try to get things out more broadly, and that's also why the partnerships are so important. 

We try and work through schools and education bodies, through police agencies, and through non-governmental organisations to get information out, not only to the broader disability community, but to the people who support them. And that was, you know, really one of the insights that came from the work with these women experiencing technology-facilitated abuse: the carers and the disability support workers weren't really aware of where to go or how to identify what technology-facilitated abuse is. So we focused there, and we are about to release a learning management system module for that community. 

MS BENNETT: Is it fair to say this is an area of developing understanding for your office? 

MS INMAN GRANT: Absolutely. I think it's got to be continuous, and it's one of those situations where we find that when we discover issues, it usually uncovers more that we need to look into. But just so you know, through Aussie Kids Online, our broad research that we released in February, about a thousand of the sample of about 3,000 have lived disability, and so we are doing a specific analysis on those children with lived disability and their parents to understand the negative experiences they are having, their impacts and how they are responding. 

MS BENNETT: So, what is it that you do? Does your office take any steps to encourage more positive attitudes towards people with a disability online? 

MS INMAN GRANT: You know, we absolutely do that to the best of our ability. We just put out research this week, for instance, on a new survey that we had done with adults with intellectual and cognitive disability, and I wrote a blog about some of the findings. We try to put that out more broadly on social media. And whenever we put out information, we try and provide guidance about what the benefits are, what the risks are, and really simple, easy ways that they can mitigate the risks. 

We also have all of our content around all of our schemes in Easy Read English. You know, we have promoted things like International Disability Day and Face Equality Day, and we look for whatever opportunities we can to remind people to be as kind in the online world as they are in the real world. We know that the disinhibition effect does often cause people to be mean and cruel. They don't see the impacts when they are online, sitting behind a keyboard. So, sometimes it's cowardly, but sometimes there is just a real lack of awareness of the real impact that they are having on the person on the other side of the screen. 

MS BENNETT: And is it part of your role to try to educate people what the impact is on the other side of the screen? 

MS INMAN GRANT: Absolutely.

MS BENNETT: And as I have understood your evidence so far, that hasn't involved specific large-scale, if I can put it that way, disability-related education. But that's something that you could consider? 

MS INMAN GRANT: I would say we are early on in the journey in terms of disability education. To do this effectively, this is something I believe we need to do with the disability community, rather than coming up with what we think is going to be useful for the community. So, it does require that fundamental research. There isn't really an adequate evidence base, and that's what we are trying to build first. We are trying to do the engagement and the consultation, and that will lead to more of the co-design and then working together on broader distribution. 

MS BENNETT: Has that process begun yet? 

MS INMAN GRANT: Yes, that began in 2021. 

MS BENNETT: And do you have a sense of when it will be completed? 

MS INMAN GRANT: I think it will never be completed. I think it will be ...

MS BENNETT: The current phase? 

MS INMAN GRANT: Yes, the current phase. And I would also note that we have given a few grants to disability organisations to do their own specific projects, including Endeavour and Interactive Disability Services. The youth-based resources will be delivered in Q1. We are also co-developing new resources for adults with an intellectual and cognitive disability that will be released in Q3 of next year. 

MS BENNETT: And the balance of that work will be ongoing. Is that right? 

MS INMAN GRANT: It will continue. Yes, it will be ongoing. 

MS BENNETT: Do you, Commissioner, have relationships with law enforcement agencies? 

MS INMAN GRANT: We do, in fact. We have MOUs with every state, territory and federal law enforcement agency in Australia.

MS BENNETT: And do you share information about abuse and harassment that occurs online that might constitute a criminal offence? 

MS INMAN GRANT: Yes, we have referral pathways to state, territory and federal law enforcement agencies when it reaches that criminal threshold. The last I checked, about 4 per cent of the cases we receive are referred to law enforcement for criminal action. And, you know, a really important indicator is whether or not people feel that there is a threat of physical harm. 

MS BENNETT: And on the flip side, do you receive some reports from police where you might be able to take steps to ameliorate a threat that is manifesting online? 

MS INMAN GRANT: Yes, we do, and we have a number of complainants who have come to us after going to the police first and being told that the police don't have the powers or the capability and cannot help, and the police refer that person to the eSafety Commissioner.

MS BENNETT: And you have referred a few times to different data points. So, can I take it from that, Commissioner, that your office retains data about the nature of the complaints and concerns that are raised with it? 

MS INMAN GRANT: We do collect that information. Obviously, we are collecting very, very sensitive data, including child sexual abuse material, terrorist content and intimate imagery. So, we work from a privacy and data minimisation perspective, and security, of course, is really important in that regard. But I may not be answering your question. 

MS BENNETT: Well, do you obtain and collate data concerning the level of abuse directed at people with disability? 

MS INMAN GRANT: We are not collecting specific demographic data such as disability, gender or whether you are from a CALD community. Again, we ask only for what is required for the schemes, and I will give you a precise reason why. When I came into this role as eSafety Commissioner, we were the Children's eSafety Commissioner. We had the child cyberbullying scheme, and it took a child 15 minutes to fill out a form, which is a very long period of time. So, we had a 95 per cent drop-out rate. 

So, one of the first things we did in the 2017-18 period was look at how we could streamline the reporting forms, make them Easy Read and accessible for everyone to report, and ask just for the information that we absolutely need. What we do find, in the process of an investigation, when an investigator takes the case and engages with the complainant, is that there is often context or a declaration, for instance, as to whether disability might be a factor in that particular case. We do case note that. And we have actually built an Insights and Intelligence team, mostly so that we can understand some of these questions, including threat trends and prevalence. 

MS BENNETT: So, Commissioner, can I test this with you. If the Commissioners, these Commissioners, were minded to make recommendations concerning the eSafety space, one of the areas to which such recommendations could be directed would be the legislative standard which activates your responsive powers. Is that right? 

MS INMAN GRANT: I think the youth-based cyberbullying scheme works very well; we've got an 85 per cent compliance rate. With image-based abuse, which 56 per cent of people from the disability community have experienced, we have a 90 per cent success rate. I think the one that we are testing right now is the adult serious cyber abuse scheme, and that is something that I've already had conversations with our department and our minister about. There is a review provision in the bill after three years. So, that is something that could be looked at and recommendations made. 

MS BENNETT: If the Commissioners form a view that the test was too restrictive or not flexible enough to take into account the needs of people with disability, that would be one area in which their recommendations could focus; is that right? 

MS INMAN GRANT: Indeed. 

MS BENNETT: And, similarly, the standards that you spoke of around proactive monitoring and basic online safety, that would be another area that recommendations could focus upon. It would be the minimum content of those standards; is that right? 

MS INMAN GRANT: Certainly. 

MS BENNETT: And they could focus on ensuring that particular aspects of the experience of people with disability are accounted for in those standards? 

MS INMAN GRANT: Again, I think that could be made more explicit in things like the industry codes, where there isn't currently really a review provision. 

MS BENNETT: Are there any unknown unknowns? If the Commissioners were considering recommendations of that kind in relation to either of those spheres, are there any matters that they should be aware of that might counter-indicate? 

MS INMAN GRANT: No. Again, we are in the really early phases of implementing the Online Safety Act. It's the only set of laws and schemes of its nature in the world, so we are writing the playbook as we go along. And I want to make sure that, again, the codes are ultimately owned by the industry, and they are still being publicly consulted upon. I haven't decided whether or not I will register the codes and whether or not they meet the community safeguards, and that determination will be made in mid-November. 

So, it is early days. We still have some additional resourcing we need to do around the Basic Online Safety Expectations team because it's a very manual process. But what I would say is that we haven't been funded to work in the disability area or the vulnerable people area, and so I'm talking about a broader range of at-risk communities, whether it's LGBTQI+ people or Indigenous Australians. 

And I think, to do this right, we all have to be resourced with people, so that we can get more consistent training and can make sure that there are more people with lived experience, you know, working on our teams, particularly our investigations teams. You know, deep engagement, meaningful consultation, user groups. That all takes time, people and monetary resources, as does developing the right kinds of content and making sure that it is screen readable, that it is Easy English, and that we are using Auslan interpretation and captioning. All those things take time, money and funding. 

And so, understanding that there is a tight fiscal environment and that it depends on the priorities of the government of the day, if there is funding and programs earmarked to do that work, that will always be of assistance. 

MS BENNETT: Commissioners, those are the questions I have for the Commissioner. 

CHAIR: Thank you very much. If you don't mind, I will ask my colleagues if they have any questions of you, and I will ask Commissioner Galbally first. 

MS INMAN GRANT: Sure.

COMMISSIONER GALBALLY: Thank you very much. Very interesting. So, Australia is the only country in the world that has such an office and this legislation? 

MS INMAN GRANT: That is correct. Fiji does have an Online Safety Commissioner. They have two people and an Acting Commissioner, but they don't have legislative schemes quite like this. The UK is considering an online safety bill. Right now the Irish are looking at setting up a Digital Safety Commissioner, as are the Canadians. But, yes, we have got seven years of runway. We started small as the Children's eSafety Commissioner, then we went universal and broad to all people.

It was a decision that I made strategically in 2020 that I wanted us to really focus on vulnerable, at-risk and diverse communities, because we were seeing how online abuse was playing out. It was just disproportionately impacting Indigenous Australians, those from the LGBTQI+ community and those with a disability. 

COMMISSIONER GALBALLY: Is there a balancing act between online life being so incredibly liberating and valuable for people with disabilities, especially through COVID, and we have heard about that in this Commission, and then the whole issue of abuse and what to do about that? Is that quite a ...

MS INMAN GRANT: I think that's absolutely fair. I think we all would agree that, you know, the internet became an essential utility during COVID. It was the only way that many of us could engage, communicate, learn, explore and connect with people. Yet we also saw all forms of harm supercharged. So, we had to get a $10 million infusion of funding just to bolster our investigative ranks. We saw child sexual abuse double. We saw image-based abuse increase by 114 per cent. 

At the time, we didn't yet have the adult cyber abuse scheme, but, yes, harms proliferated. This is why the prevention work is so important: giving people tools, showing them how to report to a platform and how to use muting, blocking and conversation controls. Sometimes we do need to take some time out and understand when our online engagement is more damaging to us than helpful. 

And a really important finding that just came out of the report that we released this week, with adults with intellectual and cognitive disability, is that those individuals didn't always recognise when they were being abused. They might not be reading the tone or the cues right. But their disability support workers, parents or carers were definitely concerned that they were experiencing abuse. 

So, again, this is why we are really cautious not to treat the disability community as homogenous. In fact, it's very diverse, and the way that harms will manifest, the way they will impact different members of the community and the ways they need to respond are going to be very different and tailored, and our guidance has to reflect that. 

COMMISSIONER GALBALLY: And that example that you just used is even more delicate in that there's a view from people with intellectual and cognitive disability that they should decide themselves, as adults, and that it isn't families or support workers who should be deciding. So, that must be quite hard for you too. 

MS INMAN GRANT: Well, that's right, their agency is very important. 

COMMISSIONER GALBALLY: Thank you. 

MS INMAN GRANT: Thank you. 

CHAIR: Commissioner Ryan? 

COMMISSIONER RYAN: Thank you. You answered earlier a question with regard to your relationship with state and federal police services. I have been trying to work out, through reading the Online Safety Act, the relationship that your Act has to state and territory and even Commonwealth legislation. For example, in the Crimes Act of New South Wales, there are specifications about inciting violence against particular groups of people, and in Tasmania, I think they have laws about humiliation of people. What regard do you have to have, or do you have, to that legislation? So, for example, if something is declared by a particular state to be an illegal form of expression, do you have regard to that? 

MS INMAN GRANT: Well, we are obviously very aware of and need to have regard to that. If someone comes with a report to eSafety, wanting to, you know, leverage the adult cyber abuse scheme, we, of course, will assess it in accordance with the threshold and the two-pronged objective test on that basis. If we think that, for instance, we can't help them through those legislative tools, through formal removal action or through informal action, we can always refer to the state police agency. And I say the state police agency because the Australian Federal Police doesn't generally deal with online abuse of this nature. We would refer to the state. 

Some of the state police agencies also have fixated persons units, but those mostly apply to people who are in the public eye or politicians. So, sometimes it's hard for us to even identify the right people in the state law enforcement agencies that we can refer on to. 

COMMISSIONER RYAN: So is it fair to say that sometimes you cannot ask someone to take material down, even though it might be an illegal act under state or territory legislation or a restricted act under anti-discrimination legislation, unless it meets that very high threshold that you have got established in your Act? 

MS INMAN GRANT: That is correct. 

COMMISSIONER RYAN: Thank you, Mr Chair. 

CHAIR: It is called federalism. The definition of cyber abuse material imposes on you the very difficult task of determining whether an ordinary reasonable person would conclude that it is likely that the material was intended to have an effect of causing serious harm to a particular Australian adult. This is a compound expression that involves a number of criteria. In your experience of applying it, does it work? 

MS INMAN GRANT: Well, we are nine months into the scheme. As indicated earlier, we have had 1243 reports. We have issued three formal removal notices under the scheme, where they met both prongs of the test. So, in addition to that intent to seriously harm, we also have to prove that the material is menacing, harassing or offensive to an ordinary reasonable person. So, again, as I think has been echoed here, it is a very high bar. 

CHAIR: Can you give us an example from the three without providing identifying details? 

MS INMAN GRANT: One of the cases involved doxxing, which is when someone will put a person's personal details, like their address or their phone number, up online, and they might ...

MS BENNETT: Could I interrupt. I'm not sure these matters are well known, and it might not be something about which we should provide information, exactly how doxxing works. Perhaps we could do that on notice in writing. Would that be convenient to the Chair? 

CHAIR: Yes. Can't we listen to the rest of the answer?

MS BENNETT: Certainly, Chair. Yes. Certainly. 

CHAIR: I appreciate you want to educate us and I understand why. But ...

MS BENNETT: Perhaps at a high level you could ...

MS INMAN GRANT: Right, right. So, if somebody is inciting violence and providing people with information about how to find that person, obviously, that is concerning. And then there has been persistent, targeted online abuse towards an individual that we deemed was meant to cause serious harm and was menacing, harassing and offensive. So, it met both prongs of the test. 

CHAIR: Is there a breakdown of the 1243 complaints from adults as to the characteristics of the persons making those complaints, or perhaps the nature of the complaint? 

MS INMAN GRANT: Well, I can tell you that one third of those reports were deemed to really constitute defamation or harm to reputation which, as you know, is a legal tort and cause of action, and while we are not disputing that this causes the person harm, it doesn't meet that two-pronged test, and, unfortunately, we don't have a warm handoff place to send people. And we have said very publicly, through submissions on the former Online Trolling Bill, that, you know, it's not a great position to be in to say, "We can't help you with this free take-down service because it doesn't reach this high threshold of harm, and we can't provide you any" ...

CHAIR: Sorry, is there a specific exclusion in the legislation for defamatory content? 

MS INMAN GRANT: Yes, but a lot of people who are experiencing harm because they are experiencing defamation come to us wanting us to use our serious adult cyber abuse powers to remove that content. 

CHAIR: I see. So, that's one third of the 1243. Do you know whether any, and if so how many, were from people with disability complaining that they had been the target of adult cyber abuse material because of their disability? 

MS INMAN GRANT: Based on the data we have, about 1.7 per cent. 

CHAIR: Which would be, what ...

MS INMAN GRANT: Much smaller than the ...

CHAIR:   20 or 25. 

MS INMAN GRANT: Yes, so much smaller; it's not reflective of the general population. 

CHAIR: No, I understand. Alright. Well, thank you very much for the information you have provided to us, both in your written statement and in your evidence today. As Commissioner Galbally said, it has been very interesting and very helpful. Thank you. 

MS INMAN GRANT: Thank you very much. 

MS BENNETT: Commissioner, if I might ...

CHAIR: Yes. Do you want me to make certain directions?

MS BENNETT: If I could tender a couple of final documents. 

CHAIR: Yes, yes. Certainly 

MS BENNETT: Those are some of the attachments to the Twitter statement that are marked in the index ...

CHAIR: Sorry, Commissioner, you can by all means step down if you would like. 

MS INMAN GRANT: Thank you very much. 

<THE WITNESS WITHDREW 

MS BENNETT: Commissioners, if we could tender the exhibits to Twitter's statement, sequentially numbered 28-37.1 to 28-37.10, I would be grateful, and have them marked in accordance with that index. 

CHAIR: Yes, what we will describe as the Twitter documents will be Exhibit 28-37 and then the annexures thereto, 28-37.1 to 28-37.10. 

<EXHIBIT 28-37 DOCUMENTS FROM TWITTER

<EXHIBITS 28-37.1 TO 28-37.10 ANNEXURES TO DOCUMENTS FROM TWITTER

MS BENNETT: Commissioners, we then seek directions that any witnesses who took a question on notice ...

CHAIR: Well, I have got draft directions. 

MS BENNETT: Yes. 

CHAIR: Have they been circulated to any represented parties? 

MS BENNETT:  They have been.

CHAIR: They have. Well, then what I will do is read out the proposed directions. If there is any represented party who makes an objection, they have the opportunity to do so. Otherwise, these are the directions that will be made. 

"1. Any witness who took a question on notice during this hearing may provide his or her answers in writing to the Office of the Solicitor Assisting the Royal Commission by 28 October 2022. The answers should be targeted and concise and not address additional or unnecessary matters. Counsel Assisting the Royal Commission may tender any such responses in evidence. 

2. Counsel Assisting the Royal Commission will prepare written submissions following the hearing by 5 December 2022. These submissions will be provided on a confidential basis to parties with leave to appear and to any organisation that received a procedural fairness letter from the Office of the Solicitor Assisting the Royal Commission in preparation for this hearing. 

3. Any responses to Counsel Assisting's submission should be sent to the Office of the Solicitor Assisting by 21 December 2022. Those responses should be concise and should not include any additional evidence."

I shall pause for one moment to see if there is any objection to that. There is none; therefore, the directions will be made. Thank you. Is there anything else that you want to   

MS BENNETT: Save to thank the parties, the witnesses and their lawyers for their assistance and, in particular, the lived experience witnesses who have given such extraordinary evidence this week, there's nothing further from Counsel Assisting. 

CHAIR: Thank you. On behalf of the Commissioners, I also wish to thank everyone who has given evidence this week, particularly the short statured people and other people with lived experience of violence and abuse in public places who told us about their experiences. This evidence was, in many respects, harrowing, but these experiences have to be recounted and publicly exposed. 

The Royal Commission is very conscious of the importance of providing support to people who are giving evidence of traumatic experiences and of avoiding the risk of retraumatisation. And that support, we hope, has been provided. Even so, it requires considerable courage and resilience for people to recount these experiences in a public forum such as this, and we are very grateful to them. 

We hope that the people who have given this evidence have found the experience worthwhile, and I think we can see from the responses of the panel of senior police officers from whom we heard yesterday that the evidence has had a very powerful impact. Our job is to ensure, at least as far as we can, that the wider Australian community understands the harm that behaviour of the kind we heard about does to the people who are subjected to it. 

We therefore particularly want to thank the people with lived experience who gave evidence: Dr Debra Keenahan, Ms Tracy Barrell, Mr David Gearin whose statement was read, Elissa, Fiona Strahan, Jenni, Peta Stamell, Marie, Mr Tim Marks, Ms Ricki Spencer, Ashleigh and Ms Findlay, who gave evidence today. 

But we also wish to thank everybody else who has given evidence at this hearing. We heard a great deal of very helpful and thoughtful evidence from a range of perspectives. We heard from representatives of disability representative organisations: Ms Maree Jenner, on behalf of the Short Statured People of Australia, and Ms Butler, who gave evidence on behalf of members of Speak Out Advocacy. 

We heard from experts who gave extremely interesting, thoughtful and helpful evidence, Professor Llewellyn and Professor Asquith. We heard from government witnesses and statutory office holders: Ms Debbie Mitchell of the Commonwealth Department of Social Services; Mr Fitzgerald and Ms McKenzie of the Ageing and Disability Commission of New South Wales, who gave such insightful evidence to us; Ms Mason, the Director of the South Australian Adult Safeguarding Unit; and, just now today, Dr Julie Inman Grant, the eSafety Commissioner. 

In addition, as I have already mentioned, we heard from the panel of police: Assistant Commissioner Fellows from the South Australian Police, Acting Deputy Commissioner Cooke from the New South Wales Police Force, and Senior Sergeant Pickard of the Queensland Police Service. 

We heard from representatives of other organisations, community organisations: Ms Traynor from the Canterbury Bankstown Disability Abuse Prevention Collaborative, and Ms Veiszadeh, who told us, in a very helpful way, about the setting up and management of the Islamophobia Register. And today we heard from Ms Hinesley and Ms Reen of Twitter about the way in which Twitter has responded to manage, moderate and enforce rules of conduct on its platform. 

We are exceedingly grateful to everybody who has given evidence. 

As with every one of our hearings, 28 to date and more or less 130 hearing days over the life of the Commission, preparing for and presenting a hearing such as this takes an enormous amount of time and effort on the part of a large number of people: Counsel Assisting; the Policy and Engagement branches of the Royal Commission, including our outstanding counsellors who provide support to people with disability giving evidence; and the Corporate branch responsible for the logistics and the general organisation of these hearings. 

In addition, of course, we rely upon Law In Order for the technology; our excellent Auslan interpreters for the translation into Auslan and vice versa when required, who have to deal with the varying paces at which evidence is given; and our media team, which strives valiantly to get the message across to the mainstream media. So, our thanks to everybody involved in the preparation for and conduct of this hearing.

Our next public hearing is Public hearing 29. It will deal with the experience of violence against, and abuse, neglect and exploitation of, people with disability from culturally and linguistically diverse communities. 

That hearing will hear evidence from people with disability from those communities about different cultural attitudes and understandings of disability; intersectionality and identity for people with disability from CALD backgrounds; the language and other barriers experienced by CALD people with disability when accessing and interacting with various services and systems in Australia; and the importance of language acquisition and the impact of language deprivation experienced by the deaf, deafblind and hard of hearing communities.

Public hearing 29 will be held in Melbourne at the Melbourne Convention and Exhibition Centre from 24 to 28 October 2022. That is starting on Monday week. 

Thank you, everybody. We will now adjourn.

<ADJOURNED 1:24 PM