All changes made to the description and title of this division.


Change Division
senate vote 2024-08-21#3

Edited by mackay staff

on 2024-09-01 13:43:48

Title

  • Bills — Criminal Code Amendment (Deepfake Sexual Material) Bill 2024; Second Reading
  • Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 - Second Reading - Make an offence

Description

  • <p class="speaker">Kerrynne Liddle</p>
  • <p>One, two, three, four seconds: that's the time it takes to create a deepfake. In four seconds what could come next can be catastrophic. Deepfake imagery can be career harming for adults, but when used by criminals against our children the consequences have been and can be deadly. This bill recognises that improvement is needed to keep up with new developments in deepfake, but this bill does not make the existing legislation better. Instead, it introduces new legislation with significant gaps for vulnerable people and/or criminals to fill. It is becoming increasingly difficult to identify deepfakes and that makes this legislation incredibly important.</p>
  • <p>There is criticism of this bill from important stakeholders. The Law Council of Australia, in its submission, complained about Labor's process for consultation, declaring:</p>
  • <p class="italic">&#8230; expert advisory committees have not been able to consider completely all the issues raised by this Bill.</p>
  • <p>You would think the Law Council was a relevant contributor with a view worth taking very seriously, but it appears not. Australia's eSafety Commissioner has estimated deepfake imagery has soared by as much as 550 per cent year on year since 2019. Pornographic videos make up 98 per cent of the deepfake material currently online; 99 per cent of that deepfake imagery is of women and girls.</p>
  • <p>As shadow minister for child protection and the prevention of family violence, I have heard from parents whose lives have been shattered by the depravity and sheer evil of these online scammers. These criminal scammers use deepfakes and nude images to extort money from young children in a manner called 'sextortion'. Sextortion works like this. Criminals, maybe online and maybe in another country, prey on children through social media platforms and demand their money. They trick children into sending a compromising photo and threaten to leak those photographs, or the extorter threatens to send a fake photo to everyone in the recipient's contacts&#8212;schools, clubs and anywhere they've found out the young person is connected to.</p>
  • <p>Male children and young people online during the school holidays, likely with more cash or gift cards than usual, are targeted most. They are more than 90 per cent of victims. The Australian Centre to Counter Child Exploitation has seen a 100-fold increase in reports of sextortion crimes in 2023 compared to the previous year. Latest data from November 2023 shows around 300 reports of sextortion targeting children each month. Globally, sextortion is reported to have doubled from 2022 to 2023. I encourage everyone&#8212;carers, parents, grandparents, teachers&#8212;to know more about protecting our children from sextortion and to check out the ACCCE website for information.</p>
  • <p>A few months ago, News Ltd Australia brought a group of parents to Canberra as part of its Let Them Be Kids campaign, calling for children under 16 to be restricted from having social media accounts. As part of the campaign, a group of courageous parents, consumed by terrible grief, shared their stories of the loss of their children as a consequence of sextortion. The harrowing stories of these parents are real. Their stories are horrific.</p>
  • <p>Susan McLean, widely regarded as Australia's cyber cop, is quoted as saying that in her 30 years of policing she has never seen a crime type that tips mentally well young people to crisis and self-harm as quickly as sextortion does. One parent who is taking action is Wayne Holdsworth, who tragically lost his 17-year-old son, Mac, in October last year after being targeted by predators. Wayne created SmackTalk, doing all he can to reduce suicide by educating people on how to be better listeners.</p>
  • <p>Research released in June found one in seven adults, or 14 per cent, has had somebody threaten to share their intimate images. And if a worldwide survey involving Australia found that more than 70 per cent of people don't know what a deepfake is, then we need to do more to educate Australians. We need to ensure that young people understand how wrong it is and the harm it causes. We must do more to ensure children do not become victims or, indeed, perpetrators.</p>
  • <p>In July a teenager was arrested amid allegations that the faces of 50 female students in a Melbourne school were used in fake nude images generated using AI technology. No matter how deepfakes reveal themselves, whether the images shared online were doctored or are real photos unauthorised by victims, they cause harm. As a mum, my children were not allowed to use social media until their late teens. It was very unpopular to take that position, but I saw it as essential. Putting in place controls was the right thing for our family. But, on reflection, it would have been easier with another ally in the fight to protect them. This legislation goes some way to doing that but it could and should be better.</p>
  • <p>I went online and searched 'create AI-generated image'. In seconds, mere seconds, the search result was pages and pages offering easy guides to help me create deepfakes. There is a need for education on online safety for people at all stages of life. These protections include: agreeing on boundaries; reviewing controls, privacy and location settings; talking openly about the dangers and tactics of trolls and scammers; knowing where support can be found; and reporting abuse, crime and unwelcome behaviour.</p>
  • <p>Outside the personal life of Australians, deepfakes have also had an impact on the business sector. Research released in July found almost 25 per cent of Australian businesses have experienced a deepfake information security incident within the last 12 months&#8212;25 per cent of them! This is not to say that AI is all bad. When used well, it can be used for significant good, but there has always been the potential for people with malicious intent in the industry, known as bad actors, to turn it to their advantage. We need a strong deterrent for those bad actors. When opposition leader Peter Dutton was Home Affairs minister he funded and opened the $70 million Australian Centre to Counter Child Exploitation. Since then, the ACCCE has removed more than 500 children from harm. A coalition government will double the size of the Australian Centre to Counter Child Exploitation because there is evidence of its good work. In 2018, the coalition's Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Bill established crucial powers, now exercised by the eSafety Commissioner, to keep Australian children safe.</p>
  • <p>Those opposite shouldn't be introducing weak legislation that is inconsistent with the Online Safety Act, that is too broad and that risks capturing innocent victims and placing the onus of proof on them. There is no need for legislation that amends the definition of 'consent', leaving too much room for judicial interpretation. We don't want to amend the definitions of 'recklessness' and 'unnecessary exception', and we don't want to increase exponentially the risk of harm to the victim, increasing the need to cross-examine victim-survivors.</p>
  • <p>The bill is flawed. It is well intentioned but ill-informed. The Albanese government should not be allowed to keep the bill as it presents to the chamber but instead make the necessary changes to improve the legislation, to increase safety for all Australians.</p>
  • <p class="speaker">David Shoebridge</p>
  • <p>I first of all rise to acknowledge the, woe of my colleague, Senator Waters, in this space, for her work on the committee and her contribution to the chamber, and endorse whole heartedly her contributions and credit her work on the committee. This is a bill that does a very, very small thing. It extends an existing offence about the transmission of sexual material without consent, which has been on the statute books for years and years to also include material that was produced using technology including deepfakes. That's what it does. You would think, from the comments from the Prime Minister down in the government, that this was some extraordinarily significant achievement from the government.</p>
  • <p>The Greens have been on record as saying that they support this change and support this incremental change to the existing offence of using a carriage service to transmit sexual material without consent. We support that, but we really could have legislated for this in an afternoon with the broad support of all parties and a very rapid committee process, because the change is, to be quite frank, extremely marginal in terms of the impact it's going to have.</p>
  • <p>One of the concerns that has been repeatedly raised in this space with my office is that the existing offence under the Criminal Code of using a carriage service to transmit sexual material without consent is almost never policed, and another is that, when women make complaints to the police, at a state level they get told it's a federal matter, and at a federal level they get told it's really a state matter. The police seem largely uninterested in investigating. There don't seem to be any good guidelines for how the police should go about acquiring the relevant information needed to find out where the offence was committed. There don't seem to be the specialist resources in place to assist state, territory or federal police to undertake the necessary investigative work to work out from what device and by whom the material was sent. None of that seems to be in place. It would be good to see the government investing in those resources, asking those questions of the Australian Federal Police and, through the standing committee on police ministers and the attorneys-general, getting those questions asked of state and territory police. That's the hard work of government that needs to be done in this space, but we're not getting that. We got this bill as though it's a solution.</p>
  • <p>The concerns that I have are that we'll pass the law and the government will say they've taken this action and that, somehow, this passing of the law itself will make women safe in this space, but history would suggest that that, of itself, won't do it. The policing culture needs to change to respect women when they come and make complaints, and I would refer the chamber to the report that was released last week on missing and murdered First Nations women and children and the additional comments to that report given by my colleague Senator Cox and me, which pointed out the deep disrespect police can show women when they're reporting their missing children, the physical assaults that they have received or the violent threats that they've received. Does anyone seriously think that there's a huge gap&#8212;that there's suddenly a totally different set of police forces across the country who are respectfully responding to women when they talk about their images being shared without consent online? I don't believe there is any meaningful difference.</p>
  • <p>So, yes, let's amend this bill. Let's amend this law. Let's criminalise the non-consensual sharing of deepfakes. Let's do that, where it's involving sexual material without consent. Let's do that, but don't anybody pretend that the problem's fixed with the passing of this bill. Don't even pretend that it's fixed with passing this bill.</p>
  • <p>I also move the second reading amendment that's been circulated in my name&#8212;as I'm corrected by my august leader in this chamber, I foreshadow that I will be moving a second reading amendment as circulated in my name.</p>
  • <p class='motion-notice motion-notice-truncated'>Long debate text truncated.</p>
  • The majority voted against an [amendment](https://www.openaustralia.org.au/senate/?gid=2024-08-19.185.1) introduced by Queensland Senator [Larissa Waters](https://theyvoteforyou.org.au/people/senate/queensland/larissa_waters) (Greens), which means it failed. This amendment would have added the words below to the usual second reading motion, which is "that the bill be read a second time" (parliamentary jargon for agreeing with the main idea of the bill).
  • ### Amendment text
  • > *At the end of the motion, add ", but the Senate notes that Australia, having ratified the Convention on the Elimination of All Forms of Discrimination against Women, has an international obligation to make the creation of, and associated threat of the creation of, deepfake sexual material an offence."*