• thanksssmyman@alien.topB

    Here we go again with all the reddit bros pretending statistical regurgitation “is just like a human, trust me, don’t you get inspired by reading stuff as well”

    Oh, well, whatever… Soon enough all the data sets will be useless; maybe it will be time for these companies to hire writers at $5 an hour and lock them in a room to produce some unspoiled input.

    • Exist50@alien.topB

      Justice would be them having to pay the defense’s legal fees for filing a frivolous suit.

      • BrokenBaron@alien.topB

        If you are buying the hoax that genAI’s data laundering scheme is fair use, I would like you to spare me the frivolous argument!

        It is truly depressing to see so many people watch massive mega corporations practice unrestrained access to our property and personal data, then use it to replace our jobs and fill their own pockets, and be dumb enough to take their side.

        • Exist50@alien.topB

          If you are buying the hoax that genAI’s data laundering scheme is fair use

          Because it is. No legal scholar seriously doubts that argument. It comfortably meets all the requirements.

          It is truly depressing to see so many people watch massive mega corporations practice unrestrained access to our property and personal data

          Lmao, and you think abolishing fair use is somehow a win for people over corporations? Now I know you’re just trolling.

          • BrokenBaron@alien.topB

            Because it is. No legal scholar seriously doubts that argument. It comfortably meets all the requirements.

            A rationalization resting on the big corporations having good lawyers.

            Lmao, and you think abolishing fair use is somehow a win for people over corporations? Now I know you’re just trolling.

            You seriously think that's what I'm arguing for? Or are you composing a strawman to comfort yourself? Asking for data laundering scams to be regulated so they don't replace the working class's jobs the moment it makes a mega corporation a single buck should not be insane. It doesn't mean abolishing fair use. Helpful idiots like you are what these companies are depending on though.

            I thought I told you to spare me the frivolous argument … go bootlick somewhere else.

            • Exist50@alien.topB

              A rationalization resting on the big corporations having good lawyers.

              I’m not talking about just OpenAI’s lawyers. This is actually a very clear-cut matter, despite your attempts to throw doubt on it.

              You seriously think that's what I'm arguing for?

              Quite literally, yes. Training an AI model is rather clearly fair use, so to make that illegal, you need to either abolish fair use, or severely limit it from its current scope.

              Asking for data laundering scams to be regulated so they don’t replace the working class’s jobs the moment it makes a mega corporation a single buck

              And I’m sure you would have also suggested that we ban the automated loom for putting weavers out of business. There’s a reason the Luddites lost.

                • Exist50@alien.topB

                  What is it with you AI circlejerkers and constantly calling people Luddites?

                  Calling a spade a spade. You have a better term for someone who wants to hold back technology because it threatens some small population in an existing industry?

  • X0vvy@alien.topB

    When you don’t know the difference between stealing and copying

  • spezisabitch200@alien.topB

    Idiots: “Sam Altman was fired because of Super AI for some reason”

    Normal people: “OpenAI now has several lawsuits that might make it impossible to monetize without being forced to pay billions, and it’s doubtful that Sam told Microsoft, before he sold his chatbot, that he was stealing authors’ works”

    • Exist50@alien.topB

      And that explains why he’s now back? And has had MS’s support the entire time?

      • spezisabitch200@alien.topB

        Well, it was either fire him and lose all your money by losing your employees or keep him and try to salvage what you can.

        • Exist50@alien.topB

          Microsoft has always wanted to keep him. The OpenAI board fired him for ideological reasons/power struggle, realized they would be killing the entire company, and decided to salvage the company even at the cost of their jobs.

  • afwsf3@alien.topB

    Why is it okay for a human to read and learn from copyrighted materials, but it’s not OK for a machine to do so?

    • raisinbrahms02@alien.topB

      Because human beings have rights and machines don’t and shouldn’t. Humans read for enjoyment and self-fulfillment. These AI machines only read for the purpose of regurgitating a soulless imitation of the original. Not even remotely similar.

    • Sansa_Culotte_@alien.topB

      Why is it okay to own furniture, but not people?

      By the way:

      it’s not OK for a machine to do so

      There are no machines that read and learn. “machine learning” is a technical term that has nothing to do with actual learning.

      • platoprime@alien.topB

        There are no machines that read and learn.

        That’s exactly what large language models do.

        • Sansa_Culotte_@alien.topB

          That’s exactly what large language models do.

          I can see how you would come to that conclusion, given that you clearly are incapable of either.

      • bikeacc@alien.topB

        What? We as humans literally learn through pattern recognition. How is that different from what a machine is doing? Of course it is not exactly the same process our brains use, but it is by no means a “metaphor”.

      • pilows@alien.topB

        What’s the connection between owning slaves and using computer tools? I don’t really follow this jump in logic.

          • pilows@alien.topB

            I think they were talking about human slaves, not computer networks. The person above them asked why humans can learn from copyrighted materials, but machines aren’t allowed to. The next person asked why we can own furniture but not people. To me this seems like they are saying we don’t own slaves for the same reason computer programs shouldn’t be allowed to learn from copyrighted materials. I’d say we don’t own slaves because as a society we value and believe in individuality, personal choice, and bodily autonomy, and I don’t see how those relate to dictating what content you train computer models on.

      • afwsf3@alien.topB

        I fail to see how training an LLM with the material I choose is any different than me studying that material. Artists are just mad I can make awesome pictures on my graphics card.

      • ApexAphex5@alien.topB

        I guess you think “neural networks” work nothing like a brain, right?

        Of course machines can read and learn, how can you even say otherwise?

        I could give an LLM an original essay, and it will happily read it and give me new insights based on its analysis. That’s not a conceptual metaphor, that’s bona fide artificial intelligence.

        • FuckToiy@alien.topB

          I think anyone who thinks neural nets work exactly like a brain at this point in time has a pretty simplistic view. Then again, you said “like a brain,” so you’re already into metaphor territory, and I don’t know what you’re disagreeing with.

          Learning as a human and learning as an LLM are just different philosophical categories. We have consciousness; we don’t know if LLMs do. That’s why we use the word “like”. Kind of like “head-throbbed heart-like”.

          We don’t just use probability. We can’t parse 10,000,000 parameter spaces. Most people don’t use linear algebra.

          A simulation of something is not equal to that something in general.

      • BiasedEstimators@alien.topB

        Neural networks aren’t literally bundles of biological neurons but that doesn’t mean they’re not learning.

    • b_ll@alien.topB

      Pretty sure humans paid for the materials. That’s the whole point. Authors have to be compensated for their work.

    • Isa472@alien.topB

      Machines don’t have inspiration. They only do advanced versions of copy and paste.

      • anamericandude@alien.topB

        It’s funny you say that because now that I think about it, inspiration basically is advanced copy and paste

        • Isa472@alien.topB

          Except a human gets inspiration from their environment, their life, their emotions. Unique experiences.

          A bot only gets “inspiration” from other people’s work. And if that work is copyrighted… The author deserves compensation

          • ParksBrit@alien.topB

             Your argument boils down to the fact that humans have a more diverse data set. This is a terrible legal basis.

            • Isa472@alien.topB

              What are you saying… It’s not about the amount of information, it’s about whether the source of information is copyrighted work or not.

               Monet cultivated his own garden and painted the famous water lilies. That is 100% original work. No argument possible.

              • ParksBrit@alien.topB

                Your environment, emotions, and experiences are simply different forms of data and sources to pull from. Most stories are in some way inspired by other stories.

  • Tyler_Zoro@alien.topB

    This is going to go the way of the Silverman case. One quote from that judge:

    “This is nonsensical,” he wrote in the order. “There is no way to understand the LLaMA models themselves as a recasting or adaptation of any of the plaintiffs’ books.”

    • Area-Artificial@alien.topB

      The Silverman case isn’t over. The judge took the position that the outputs themselves are not infringement, as I think most people agree since it is a transformation, but the core of the case is still ongoing: that the dataset used to train these models contained their copyrighted work. Copying is one of the rights granted to copyright holders and, unlike the Google case a few years back, this is a commercial product and the books were not legally obtained. Very different cases. I would be surprised if Silverman and the others lost this lawsuit.

      • Xeno-Hollow@alien.topB

        Copyright is more about distribution and deprivation than copying.

        There is absolutely nothing preventing me from sitting down and handwriting the entirety of the LOTR in calligraphic script.

        I can even give that copy to other people, as it is a “derivative work,” and I’m not attempting to profit from it.

        There’s not even anything preventing me from scanning every page and creating a .pdf file for personal use, as long as I don’t distribute it.

        Hell, the DMCA even allows me to rip a movie as long as I’m keeping it for personal use.

        I don’t see anything here that can’t be countered with a fair use argument. The case is predicated upon the idea that if you give it the correct prompts, it’ll spit out large amounts of copyrighted text.

        If you were describing that as an interaction with a person, you’d call that coercion and maybe even entrapment.

        The intent of the scraping was not explicitly distribution.

      • Tyler_Zoro@alien.topB

        The Silverman case isn’t over.

        It is with respect to that argument. The claim in question was thrown out.

        The remaining claim is unrelated.

  • anaxosalamandra@alien.topB

    I’m so surprised at the number of people defending AI in this subreddit. It truly makes me feel like we failed as a species. I’m not a writer, artist, or musician, but art and culture have walked hand in hand throughout human history. I struggle to understand why we aren’t more protective of it and instead just hand thousands of years of human tradition over to machines. Just because we could doesn’t mean we should.

  • DoopSlayer@alien.topB

    Both Meta and OpenAI have been clear about pirating thousands of books for their training sets, so it’s not exactly surprising that lawsuits are following.