The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the app presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

  • aard@kyu.de · 10 months ago

    This was just a matter of time - and there isn’t really much that those affected can do (or, in some cases, should do). Shutting down that service is the correct thing to do - but that will only buy a short amount of time: training custom models is trivial nowadays, and both the skill and the hardware to do so are within reach of the age group in question.

    So in the long term we’ll see this shift to images generated at home, by kids often too young to be prosecuted - and you won’t be able to stop that unless you start outlawing most AI image-generation tools.

    At least in Germany, the handling of child/youth pornography was badly botched by incompetent populists in the government - the law would send any of those parents to jail for at least a year if they took possession of one of the generated pictures. Having one sent to their phone and going to the police to file a complaint would be sufficient to get a prosecution against them started.

    There’s one blessing coming out of that mess, though: for girls who did take pictures and had them leaked, saying “they’re AI-generated” is becoming a plausible way out.

    • taladar@feddit.de · 10 months ago

      In the long term that might even lead to society stopping its freak-outs every time someone in a semi-sensitive position is discovered to have nude pictures online.

  • rufus@discuss.tchncs.de · 10 months ago (edited)

    Interesting. Replika AI, ChatGPT, etc. crack down on me for writing erotic stories and roleplay text dialogues, and this Clothoff app happily draws child pornography of 14-year-olds? Shaking my head…

    I wonder why they have no address or legal information on their website, and why the app isn’t available in any of the proper app stores.

    Obviously the police should ask Instagram who is blackmailing all these girls… Teach them a proper lesson. And then stop this company. Fine them a few million for generating and spreading synthetic CP. At least write a letter to their hosting or payment providers.

      • rufus@discuss.tchncs.de · 7 months ago (edited)

        I didn’t follow how the story turned out that closely. I think it was a schoolmate who did this. I kind of split up my answer, because I think if a kid/minor is the offender, it’s not yet too late for them to learn how to behave (hopefully). But blackmailing people with nudes is a bit more than the usual bullying and the occasional fight between boys we had back in the day. I trust that a judge will look at the individual case and come up with a proper punishment that factors this in.

        What annoys me is the people who offer this service, who advertise use cases like this, and who probably deliberately didn’t put any filters in place - not even for pictures of minors. I think they should be charged and fined, and ultimately that business model should be banned. I (anonymously) filed a complaint after writing that comment in September. But they’re still online as of today.

        So in my opinion the kid should be taught a lesson and the company should pay for this and be closed for good.

  • rayyyy@kbin.social · 10 months ago

    The shock value of a nude picture will become increasingly humdrum as they become more widespread. Nudes will become so common that no one will bat an eye. In fact, some less endowed, less perfect ladies will no doubt make AI-generated pictures or movies of themselves to sell on the internet. Think of it as Photoshop × 10.

      • andrai@feddit.de · 10 months ago

        I can already get a canvas and brush and draw what I think u/DessertStorms looks like naked and there is nothing you can do about it.

        • DessertStorms@kbin.social · 10 months ago

          You’re not making the point you think you are; instead you’re just outing yourself as a creep. ¯\_(ツ)_/¯

        • ParsnipWitch@feddit.de · 10 months ago

          The lack of empathy in your response is telling. People do not care about the effect this has on teenage girls. They don’t even try to be compassionate. I think this will just become the next thing girls and women simply have to accept as part of their lives, on top of the sexism and objectification already targeted at them. But “boys will be boys”, right?

      • taladar@feddit.de · 10 months ago

        Photoshopped nude pictures of celebrities (and of people the photoshopper knew personally) have been around for at least 30 years at this point. This is not a new issue as far as the legal situation is concerned; only the ease of doing it has changed a bit.

      • Jax@sh.itjust.works · 10 months ago

        Have you ever posted a photo on Facebook or Instagram?

        If the answer is yes, congratulations! You gave consent.

        • Black616Angel@feddit.de · 10 months ago

          Please show me where exactly the terms and conditions mention the production and publication of AI-generated nudes on those sites.

          Also eww, I would not want to be near you in real life.

          • Jax@sh.itjust.works · 10 months ago (edited)

            You give them free rein to do literally whatever they want with your images the moment you post them. They OWN YOUR PHOTOS. The only reason you don’t know about it is because you’re fucking stupid and don’t read their terms of service.

            Signed: person who stopped using sites like Facebook and Instagram for this reason.

            Edit: Sorry, I realized that reading isn’t your strong suit, which is why you demanded I sift through their ToS for you. It’s under the privacy section of Meta’s terms of service: anything you post publicly immediately grants them rights to your image.

            You ever put an image on Tinder through Facebook? Congrats: consent achieved.

            I genuinely do not care if you are aware or otherwise. Your comment proves you’re fucking dumb, and deserve your images being used against you for not protecting yourself from predatory social media sites.

            • Black616Angel@feddit.de · 10 months ago

              You are right that they own my photos, but that of course doesn’t grant them the right to do anything with them, and it doesn’t give someone else that right either. But what do you know? You’re some lonely little shit harassing others online.

              Delete your CSAM collection and then yourself please. Do something for us all, thanks.

              • Jax@sh.itjust.works · 10 months ago

                Jesus Christ, you’re a fucking idiot. Maybe if you had gone through English class without writing every report via SparkNotes, you’d have developed the critical thinking required to understand what a TERMS OF SERVICE agreement is.

                It’s not too late, you can always go back to school. Although, reading your replies, you’re still too fucking dumb to gain anything from it.

                • Black616Angel@feddit.de · 10 months ago

                  Wow, you have to be one of the most stubborn, stupid, insolent, arrogant, self-absorbed assholes I have ever had the displeasure of exchanging words with.

                  Eat a dick!

  • Margot Robbie@lemm.ee · 10 months ago

    Banning diffusion models doesn’t work; the tech is already out there, and you can’t put it back in the box. Fake nudes used to be done with Photoshop; the current generative AI models only make them faster to produce.

    This can only be stopped on the distribution side, and any new laws should focus on that.

    But the silver lining of this whole thing is that nude scandals for celebs aren’t really possible any more, since you can just say it’s probably a deepfake.

    • GCostanzaStepOnMe@feddit.de · 10 months ago

      Other than banning the websites and apps that offer such services, I think we also need to seriously rethink our overall exposure to the internet, and especially rethink how, and how much, children access it.

      • MadSurgeon@sh.itjust.works · 10 months ago

        We’d need an AI-run police state to stop this technology. I doubt anybody has even the slightest interest in that.

        • GCostanzaStepOnMe@feddit.de · 10 months ago (edited)

          > We’ll need an AI run police state to stop this technology.

          No? You really just need to ban websites that run ads for these apps.

  • tetraodon@feddit.it · 10 months ago

    I feel somewhat bad saying this, but the wo/man (it will be a man) who can make an Apple Vision Pro work with AI nudifiers will become rich.

    • TheGreenGolem@lemm.ee · 10 months ago

      You know the old joke: if we could do anything with just our eyes, the streets would be full of dead people and pregnant women.

  • YurkshireLad@lemmy.ca · 10 months ago

    Maybe something will change as soon as people start creating and distributing fake AI nudes of that country’s leaders.

    • Risk@feddit.uk · 10 months ago

      Honestly, I’m surprised this didn’t happen first.

      It would be a great way to discredit politicians in homophobic states: show a politician taking it up the arse.

  • danhab99@programming.dev · 10 months ago

    I tried the AI with a pic of me. It was incredibly inaccurate and gave me something between a dick and a vagina. Nothing truly damaging.

  • iByteABit [he/him]@lemm.ee · 10 months ago

    Governments need to strike hard against all kinds of platforms like this, even if they can be used for legitimate reasons.

    AI is way too dangerous a tool to allow a free market and unregulated innovation around; it’s the number one technology right now that must be heavily regulated.

  • duxbellorum@lemm.ee · 10 months ago

    This seems like a pretty significant overreaction. Yes, it’s gross and it feels personal, but it’s not like any of the subjects were willing participants… their reputations are not being damaged. Would they lose their shit over a kid gluing a cutout of their crush’s face over the face of a pornstar in a magazine? Is this really any different from that?