just me

  • 0 Posts
  • 186 Comments
Joined 1 year ago
Cake day: October 3rd, 2023

  • i think it’s at least in part because we have always been taught to see Hitler as a monster instead of a person. We dehumanised him and the entire Nazi party so much that for many it sounds like a myth instead of history, and the takeaway seems simple - just don’t be a monster.

    The lesson was - some people are born evil

    Instead of - anybody can fall down the wrong path and find themselves committing atrocities. Even your friends, even your family, even you

    i’ve been saying this for a long time - Hitler wasn’t a monster, he was human just like you and me, and that’s a hundred times more terrifying





  • most of the time the difference between me and a cis man is not important, so i simply say i’m a man. Sometimes the difference is important, and then i clarify i’m a trans man

    90% of the time most people i meet will have no idea i’m transgender; the other 10% are doctors, people i want to have sex with, and those i’ve talked with about trans experiences






  • shneancy@lemmy.world to Memes@lemmy.ml · Ducks · 12 days ago

    after the goose comes the swan, which, though bigger, tougher, and stronger, has chilled the hell out a bit

    after the swan comes the Canada goose, which, even though it appears to be a return to the goose, actually has the might of a swan and the rage of 100 regular geese




  • honestly, this is not a terrible idea

    if you see someone on the verge of a panic attack, that means they’re fully in their head, spiraling - you can try to calm them down the normal way, but you can also try to force them out of their own head and ground them by saying something weird, ideally a question, so their mind can latch onto it. It won’t always work, but it might shock them just the right amount to ground them!



  • shneancy@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · 18 days ago

    this is not about wanting; this is about companies taking advantage of vulnerable people who should be grieving. This can cause lasting psychological harm

    you might as well be saying: if someone came to a drug maker, wanted some heroin, provided the ingredients for heroin, and agreed to whatever costs were involved, isn’t that entirely their business?



  • shneancy@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · 18 days ago

    wow, so many reasons

    • to create a mimic of a person you must first destroy their privacy
    • after an AI has devoured everything they’ve ever written or spoken on video, it will mimic that person very well, but most likely still be the legal property of the company that made it
    • in a situation like that you’d then have to pay a subscription to interact with the mimic (because god forbid you ever actually get sold something outright nowadays)

    now imagine having to pay to talk with a ghost of your loved one, a chatbot that sometimes allows you to forget that the actual person is gone, and makes all the moments where that illusion is broken all the more painful. A chatbot that denies you grief, and traps you in hell where you can talk with the person you lost, but never touch them, never feel them, never see them grow (or you could pay extra for the chatbot to attend new skill classes you could talk about :)).

    It would make grieving impossible and take constant advantage of those who “just want to say goodbye”. Grief is already hard as it is; widespread mimicry of our dead would make it psychological torture

    for more information, watch a prediction of our future: the fun sci-fi show Black Mirror, specifically the episode titled Be Right Back (the entire series is fully episodic, you don’t need to watch from the start)