A New York Times copyright lawsuit could kill OpenAI
Authors and entertainers are also suing the tech company for damages that could total in the billions.

  • kjPhfeYsEkWyhoxaxjGgRfnj@lemmy.world

    I doubt it. It would likely kill any AI company not backed by a tech giant, though.

    Microsoft has armies of lawyers and cash to pay. It would make life a lot harder, but they’d survive.

  • makyo@lemmy.world

    I always say this when this comes up because I really believe it’s the right solution - any generative AI built with unlicensed and/or public works should then be free for the public to use.

    If they want to charge for access, that’s fine, but they should have to secure the legal rights first. If that’s impossible, they should look for profits some other way, like add-ons such as internet-connected AI and so forth.

    • Drewelite@lemmynsfw.com

      A very compelling solution! It allows a free-use model while still giving businesses an avenue to spend time developing it.

    • miridius@lemmy.world

      Nice idea, but how do you propose they pay for the billions of dollars it costs to train and then run said model?

          • nickwitha_k (he/him)@lemmy.sdf.org

            If we didn’t live under an economic system where creatives need to sell their works to make a living or even just survive, there wouldn’t be an issue. What OpenAI is doing is little different from any other worker exploitation, however. They are taking the fruits of the labor of others, without compensation of any kind, then using them to effectively destroy their livelihoods.

            Few, if any, of the benefits of LLMs and related tech are improving things for anyone but the already ultra-wealthy. That is the actual reason we can’t have nice things: the greedy are obsessed with taking and taking while giving less than nothing back in return.

            Just as no one is entitled to own a business that can’t afford to pay a living wage, OpenAI is not entitled to run a business aimed at destroying the livelihoods of countless thousands, if not millions, of creatives by building its tools out of stolen works.

            I say this as someone who supports trying to create actual AGI and potentially “uplift” other species, making humanity less lonely. I think OpenAI doesn’t have what it takes and is nothing more than another scam to rob workers of the value of their labor.

            • General_Effort@lemmy.world

              This is the wrong way around. The NYT wants money for the use of its “intellectual property”. This is about money for property owners. When building rents go up, you wouldn’t expect construction workers to benefit, right?

              In fact, more money for property owners means that workers lose out, because where else is the money going to come from? (well, “money”)

              AI, like all previous forms of automation, allows us to produce more and better goods and services with the same amount of labor. On average, society becomes richer. Whether these gains should go to the rich, or be more evenly distributed, is a choice that we, as a society, make. It’s a matter of law, not technology.

              The NYT lawsuit is about sending these gains to the rich. The NYT has already made its money from its articles. The authors were paid, in full, and will not get any more money. Giving money to these property owners will not make society any richer. It just moves wealth to property owners for being property owners. It’s about more money for the rich.

              If OpenAI has to pay these property owners for no additional labor, then it will eventually have to raise subscription fees to balance the cash flow. People who pay a subscription probably feel that it benefits them, whether they use it for creative writing, programming, or entertainment. They must feel that the benefit is worth at least that much in money.

              So the subscription fees represent a part of the gains to society. If a part of these fees is paid to property owners who did not contribute anything, then that part of the social gains is funneled to property owners, i.e. mainly the ultra-rich, simply for being owners/ultra-rich.

    • Pacmanlives@lemmy.world

      Not really how it works these days. Look at Uber and Lime/Bird scooters. They’d basically just show up in a city and say, “the hell with the law, we’re starting our business here.” We just call it disruptive technology.

      • makyo@lemmy.world

        Unfortunately true, and the long arm of the law, at least in the business world, isn’t really that long. Would love to see some monopoly busting to scare a few of these big companies into shape.

    • dasgoat@lemmy.world

      Running AI isn’t free, and AI calculations pollute like a motherfucker.

      This isn’t me saying you’re wrong from an ethical or judicial standpoint, because on those I agree. It’s just that, on a practical level, considerations have to be made.

      For me, those considerations alone (and a ton of other considerations, such as digital slavery, child porn, etc.) make me just want to pull the plug already.

      AI was fun. It’s a dumb idea for dumb, buzzword-spewing Silicon Valley ghouls. Pull the plug and be done with it.

  • Melllvar@startrek.website

    If OpenAI owns a Copyright on the output of their LLMs, then I side with the NYT.

    If the output is public domain, meaning you or I could use it commercially without OpenAI’s permission, then I side with OpenAI.

    Sort of like how a spell checker works. The dictionary is Copyrighted, the spell check software is Copyrighted, but using it on your document doesn’t grant the spell check vendor any Copyright over it.

    I think this strikes a reasonable balance between creators’ IP rights, AI companies’ interest in expansion, and the public interest in having these tools at our disposal. So, in my scheme, either creators get a royalty, or the LLM company doesn’t get to Copyright the outputs. I could even see different AI companies going down different paths and offering different kinds of service based on that distinction.

    • tabular@lemmy.world

      I want people to take my code if they share their changes (GPL). Taking and not giving back is just free labor.

    • Grimy@lemmy.world

      I think the copyright currently resides with the one doing the generating, not with OpenAI itself. Officially it’s a bit unclear.

      Hopefully, all gens become copyleft just for the fact that AIs tend to repeat themselves. Specific faces will pop up quite often in image gen, for example.

  • SatanicNotMessianic@lemmy.ml

    The NYT has a market cap of about $8B. MSFT has a market cap of about $3T. MSFT could take a controlling interest in the Times for the change it finds in the couch cushions. I’m betting a good chunk of the c-suites of the interested parties have higher personal net worths than the NYT has in market cap.

    I have mixed feelings about how generative models are built and used. I have mixed feelings about IP laws. I think there needs to be a distinction between academic research and for-profit applications. I don’t know how to bring the laws into alignment on all of those things.

    But I do know that the interested parties who are developing generative models for commercial use, in addition to making their models available for academics and non-commercial applications, could well afford to properly compensate companies for their training data.

    • LWD@lemm.ee

      The danger of the rich and evil simply buying out their critics is a genuine risk. After all, it’s what happened to Gawker when Peter Thiel decided he personally didn’t like them, neutering their entire network.

      Regarding OpenAI the corporation, they pulled an incredibly successful bait and switch, pretending first to gather data for educational purposes, and then switching to being a for-profit as soon as it benefited them. In a better world or even a slightly more functional American democracy, their continued existence would be deemed inexcusable.

      • SatanicNotMessianic@lemmy.ml

        I completely agree. I don’t want them to buy out the NYT, and I would rather move back to the laws that prevented over-consolidation of the media. I think that Sinclair and the consolidated talk radio networks represent a very real source of danger to democracy. I think we should legally restrict the number of markets a particular broadcast company can be in, and I also believe that we can and should come up with an argument that’s the equivalent of the Fairness Doctrine that doesn’t rest on something as physical and mundane as the public airwaves.

  • db2@lemmy.world

    Oh no, how terrible. What ever will we do without Shenanigans Inc. 🙄

  • 800XL@lemmy.world

    YES! AI is cool, I guess, but the massive AI circlejerk is so irritating.

    If OpenAI can infringe upon all the copyrighted material on the net then the internet can use everything of theirs all for free too.

  • Grimy@lemmy.world

    This would bring up the cost of entry for making a model and nothing more. OpenAI will buy the data if they have to, and so will Google. The money will only go to the owners of the New York Times and its shareholders; none of the journalists who will be let go in the coming years will see a dime.

    We must keep the entry into the AI game as low as possible or the only two players will be Microsoft and Google. And as our economy becomes increasingly AI driven, this will cement them owning it.

    Pragmatism or slavery, these are the two options.

      • Even_Adder@lemmy.dbzer0.com

        He’s not arguing for OpenAI, but for the rest of us. AI is a public technology, but we’re on the verge of losing our ability to participate due to things like this and the megacorps’ attempts at regulatory capture. Which they might just get. Their campaign against AI is a lot like governments’ attempts to destroy encryption. Support open source development; it’s our only chance. Their AI will never work for us. John Carmack put it best.

        Fuck "Open"AI, fuck Microsoft. Pragmatism or slavery.

        • LWD@lemm.ee

          If Grimy supports abolishing OpenAI and making their unethically gained data set available to all, I would be interested in hearing that.

          There are also ways to hold giant megacorporations to a different set of standards than independent developers. Corporations valued at millions to billions of dollars should pay higher fees for access to someone else’s work than a private individual would.

          • Grimy@lemmy.world

            If you want to know my personal political stance, I think every company with more than 50 or so employees should be owned by the state. I’m for the dismantling of the stock market and the owner caste. I’m also a realist and understand those things won’t come to pass anytime soon. OpenAI will remain and they will happily eat all the fines if it guarantees them a monopoly.

            I wasn’t playing devil’s advocate. My point is that legislation like this only helps companies like OpenAI while bringing no benefit whatsoever to any of us.

            There are also ways to hold giant megacorporations to a different set of standards than independent developers.

            Yes but that isn’t what is being currently proposed, is it?

            • LWD@lemm.ee

              That’s a pretty good trick, trying to conflate regulation of OpenAI with other impossible ideals you claim to hold, and drawing a hard line between that and your own suggestion: to let OpenAI win.

              Heaven forbid anybody fight OpenAI, you think the best thing to do is avoid regulations. You claim to dislike OpenAI, yet you align with them. Where have I heard that one before?

              • Grimy@lemmy.world

                I never claimed to be a copyright lawyer and there is literally no other copyright discussion except the ones pertaining to AI. I touched on my ideals because you were implying I was pro big business.

                I always try to have a reasonable discussion with you, but you always end up writing these kinds of comments while never addressing my actual arguments. Have a good day, bro.

                • LWD@lemm.ee

                  Do you have any actual contention with how you use scummy tactics to defend OpenAI, or was the label the only thing that mattered to you?

              • Grimy@lemmy.world

                You edited your comment after I responded. This is what you originally posted:

                "That’s a pretty good trick, trying to conflate regulation of OpenAI with other impossible ideals you claim to hold, and drawing a hard line between that and your own suggestion: to let OpenAI win.

                I feel sorry for your clients.

                (By the way, Grimy claims to be a copyright lawyer, but for some reason he only crawls out of the woodwork when OpenAI is discussed. Sam Altman himself seems like a less biased source for how AI should be treated.)"

                • LWD@lemm.ee

                  Grimy, you always end up writing these kinds of comments while never addressing my actual arguments.

  • Tony Bark@pawb.social

    The problem with copyright is that everything is automatically copyrighted. The copyright symbol is purely symbolic at this point. Both sides are technically right, even though the courts have ruled that anything an AI outputs is actually in the public domain.

    • Even_Adder@lemmy.dbzer0.com

      Works involving the use of AI are copyrightable. Also, the Copyright Office’s guidance isn’t law. Their guidance reflects only the office’s interpretation based on its experience; it isn’t binding on the courts or other parties. Guidance from the office is not a substitute for legal advice, and it does not create any rights or obligations for anyone. They are the lowest rung on the ladder for deciding what the law means.

  • AutoTL;DR@lemmings.world

    This is the best summary I could come up with:


    Late last year, the New York Times sued OpenAI and Microsoft, alleging that the companies are stealing its copyrighted content to train their large language models and then profiting off of it.

    Meanwhile, the Senate Judiciary Subcommittee on Privacy, Technology, and Law held a hearing in which news executives implored lawmakers to force AI companies to pay publishers for using their content.

    In its rebuttal, OpenAI said that regurgitation is a “rare bug” that the company is “working to drive to zero.” It also claims that the Times “intentionally manipulated prompts” to get this to happen and “cherry-picked their examples from many attempts.”

    A growing list of authors and entertainers have been filing lawsuits since ChatGPT made its splashy debut in the fall of 2022, accusing these companies of copying their works in order to train their models.

    Developers have sued OpenAI and Microsoft for allegedly stealing software code, while Getty Images is embroiled in a lawsuit against Stability AI, the makers of image-generating model Stable Diffusion, over its copyrighted photos.

    In that 2013 decision (the Google Books fair-use ruling), Judge Chin said its technology “advances the progress of the arts and sciences, while maintaining respectful consideration for the rights of authors and other creative individuals, and without adversely impacting the rights of copyright holders.” And a 2023 economics study of the effects of Google Books found that “digitization significantly boosts the demand for physical versions” and “allows independent publishers to introduce new editions for existing books, further increasing sales.” So consider that another point in favor of giving tech platforms room to innovate.


    The original article contains 1,628 words, the summary contains 259 words. Saved 84%. I’m a bot and I’m open source!

  • kaitco@lemmy.world

    I never thought that the AI-driven apocalypse could be impeded by a simple lawsuit. And, yet, here we are.

    • maynarkh@feddit.nl

      One has to wonder why in Star Trek the Federation did not simply sue the Borg.

      • BearOfaTime@lemm.ee

        Hahahahahahaha hahahahahahaha omg, thank you for the very real, actual laugh-out-loud moment.

        Now I’m envisioning Picard on one side, Borq Borg (wtf autocorrect?) Queen on the other, and what, Q as judge, looking older by the minute, just hating life.

      • kaitco@lemmy.world

        Well, that comes down to the particular venue. Who’s going to rule? The Kardassians??