• edwardbear@lemmy.world · ↑87 ↓3 · 6 months ago

    Oh, if they PROMISE.

    Fuck Adobe. I’ll pirate PS and AI until I die. Greedy fucking pigboys.

    • technocrit@lemmy.dbzer0.com · ↑6 · 6 months ago (edited)

      Those are the easiest apps to replace. I’ll just use Gimp and Inkscape until I die. Not even tempted by Adobe’s bloat, spyware, etc.

      • glimse@lemmy.world · ↑5 · 6 months ago

        It won’t have anything that relies on “the cloud”

        When you use the AI services in Photoshop, it tries to connect to their servers. But to crack Photoshop, you need to blacklist all those servers. If you try to, say, use the automatic background removal tool, Photoshop will give you a message saying that it will run the (worse) version locally because it can’t connect.

        Not that I’d know or anything. A friend of a friend told me. Basically a stranger. Don’t even know his name.

        • Jakdracula@lemmy.world · ↑2 · 6 months ago

          Right. Since it can’t call home, the advanced features won’t work, or at least won’t work as described. Is that correct?

          • glimse@lemmy.world · ↑1 · 6 months ago (edited)

            That is correct. Most (all?) of the services run on Adobe’s servers, not locally.

  • JeeBaiChow@lemmy.world · ↑49 · 6 months ago (edited)

    Every time we’ve trusted a big tech company’s unverifiable promise, they’ve ended up shafting us. Just sayin’.

  • restingboredface@sh.itjust.works · ↑49 ↓3 · 6 months ago

    Guys, seriously. The entire Affinity Suite is $150, and that one payment covers updates through the current version. It’s solid.

    Dump Adobe.

  • OhmsLawn@lemmy.world · ↑35 · 6 months ago

    “claims that the company often uses machine learning to review user projects for signs of illegal content”

    OK, so what happens when Florida starts deciding more content is illegal?

    Literally big brother shit.

    • HonorableScythe@lemm.ee · ↑4 · 6 months ago

      Exactly my thoughts. Adobe is not the police and they should not be the ones trying to deter crime by any definition. How many horrible things have governments done to “protect the children”?

  • fluckx@lemmy.world · ↑33 · 6 months ago

    Here’s a license change which implies we’re data-farming all your assets.

    Here’s my word that we’re absolutely not going to be doing that. Trust me bro.

  • retrospectology@lemmy.world · ↑27 · 6 months ago

    I’m betting the reason they want access to “moderate” your projects is to train their AI. Literally looking to steal artists’ work before it’s out the door.

    • ytsedude@lemmy.world · ↑10 · 6 months ago

      That’s absolutely what’s going on.

      A fun way to combat this would be to get every artist to add giant, throbbing dicks to everything they create in Photoshop with the hope that it creates the thirstiest, nastiest AI model out there.

      • retrospectology@lemmy.world · ↑3 · 6 months ago (edited)

        Not just dicks, but dicks mixed with other art so it just completely pollutes the training data and the AI has no idea how to draw anything without it kind of looking like a dick. Dicks with human and animal faces, boats shaped like dicks, dick buildings and landscapes etc.

        It would take an immense amount of bad data to actually work, but it would be funny.

  • CosmoNova@lemmy.world · ↑17 · 6 months ago

    Because when someone presents you with a lengthy document, one that describes, in detail, all the ways they claim ownership of your work (and work in progress), all that matters is how much they really mean what’s written down? Let me spare you the sarcasm and just say this doesn’t communicate the professionalism professionals are demanding. Quite the opposite.

  • cley_faye@lemmy.world · ↑9 · 6 months ago

    Interesting, we get to either hate them for going full big brother, or hate them for going full adobe in the first place. It’s nice to have a choice sometimes.

  • capital@lemmy.world · ↑9 ↓1 · 6 months ago (edited)

    “Adobe does not train Firefly Gen AI models on customer content. Firefly generative AI models are trained on a dataset of licensed content, such as Adobe Stock, and public domain content where copyright has expired.”

    This references one particular product, lol. If they’re training a model under a different name on customer data, the statement would still be true.

    The points about lawyers and NDAs hit the nail on the head. I thought something similar about the Windows Recall debacle. That’s a juicy set of data for anyone looking to identify a journalist’s sources or scrape a hospital’s network. In every case it relies on the end user (business or individual) knowing how to disable those features with GPOs/registry options… There’s no way 100% of them realize the issue and have the knowledge to fix it.
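
    For what it’s worth, here’s a minimal sketch of the registry route on Windows, using Python’s winreg module. The policy path (SOFTWARE\Policies\Microsoft\Windows\WindowsAI) and value name (DisableAIDataAnalysis) are assumptions on my part about the Recall policy, so check them against current Microsoft documentation before relying on this; it also needs an elevated prompt.

    import winreg  # Windows-only standard library module

    # Assumed policy location for turning off Recall snapshot saving -- verify before use.
    KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsAI"
    VALUE_NAME = "DisableAIDataAnalysis"

    def disable_recall_snapshots() -> None:
        # Create the policy key if it doesn't exist and set the DWORD value to 1.
        # Writing under HKEY_LOCAL_MACHINE requires an elevated (admin) prompt.
        with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                                winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, 1)

    if __name__ == "__main__":
        disable_recall_snapshots()
        print("Policy value written; run gpupdate or reboot for it to take effect.")

    There’s a matching Group Policy setting under Administrative Templates (if memory serves), but either way it only protects machines where an admin actually applies it, which was the point above.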

  • CrowAirbrush@lemmy.world · ↑6 · 6 months ago

    But “big brother” would mean they watch you.

    I read everywhere that they claim the rights to your projects, which is far worse than just watching over your shoulder, innit?

  • werefreeatlast@lemmy.world · ↑2 · 6 months ago

    They just wanna review your work 😀. What if you’re trying to put a penis on Trump’s face and it’s too big, or it’s pointing the wrong way or something? You know. Wouldn’t you want to be told stuff like “the police are coming unless you erase this now!”? You know, things like that? It would definitely come in handy to catch kids making nudes of others. Or adults making nudes of other adults who didn’t know. I wouldn’t want to end up in a collage of nudes that’s 20 MB at 1080p or 4K.