• 1 Post
  • 16 Comments
Joined 1 year ago
Cake day: June 23rd, 2023


  • Storytime! Earlier this year, I had an Amazon package stolen. We had reason to be suspicious, so we immediately contacted the landlord and within six hours we had video footage of a woman biking up to the building, taking our packages, and hurriedly leaving.

    So of course, I go to Amazon and try to report my package as stolen… which traps me for a whole hour in a loop with Amazon’s “chat support” AI, repeatedly insisting that I wait 48 hours “in case my package shows up”. I cannot explain to this thing clearly enough that, no, it’s not showing up, I literally have video evidence of it being stolen that I’m willing to send you. It cuts off the conversation once it gives its final “solution”, and I have to restart the convo over and over.

    Takes me hours to wrench a damn phone number out of the thing, and a human being actually understands me and sends me a refund within 5 minutes.


  • I don’t necessarily disagree that we may figure out AGI, and even that LLM research may help us get there, but frankly, I don’t think an LLM will actually be any part of an AGI system.

    Because fundamentally, it doesn’t understand the words it’s writing. The more I play with and learn about it, the more it feels like a glorified autocomplete/autocorrect. I suspect problems like hallucinations, “Waluigis”, and “jailbreaks” are fundamental to a language model trying to complete a story, as opposed to an actual intelligence with a purpose.


  • Hazzard@lemm.ee to Linus Tech Tips@lemmy.ml · Here’s the plan. (New video from LTT) · 11 months ago

    Agreed, I thought mentioning those statistics was a tasteful way of addressing that conversation as well as possible in a YouTube video, and those “people will be fired” comments felt like a clear commitment to rooting out, and going as far as firing, anyone creating that kind of environment.

    The amount of “Linus didn’t even talk about X” in this thread is crazy to me; it feels like bad reading comprehension when he directly addressed most of the conversation (HR, work hours and environment, etc.) and even committed to firing people in a video his staff will all be watching.


  • Frankly, this whole situation boils down to exactly what I expected. LTT has always produced content at an insane velocity, and issues like these are the inevitable result: miscommunications, errors that need to be tidied up, and compromises such as that water block video not being redone with the proper setup. LTT doesn’t have the ability to reverse course on an emergency like that; they’re already at such a breakneck pace that they can’t make a change of that scope without missing deadlines. If it wasn’t this, it would’ve been something else.

    Is that evil? I don’t know. It’s the business strategy they’ve gone with, and much of why they’re in the position they are. An LTT that put out half the videos they do might never have made it to this position. This is a good wake-up call as to the costs of that kind of operation, and it’s up to you how you choose to react to it.


  • Yeah, I respect that. I actually really liked the formatting of this post, with the little summary and the open discussion. Much better than having some bot just dump the link here for every video!

    That’s actually part of why I chose to drop the first comment; hopefully these threads can be hopping with some good engagement going forward. I think that, like many people, I often have thoughts or want to discuss these videos, but YT comments are just a nightmare if you want to do anything more than skim them.


  • I also feel like a lot of the value of a chronological feed is lost if I think I’m looking at algorithmic recommendations. If I don’t know I’m browsing the latest, I’ll likely just assume the algorithm is serving up some garbage. Especially somewhere like Facebook, where people haven’t really been curating their feeds for years, just… following whoever to be polite and letting the algorithm take care of it.