Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

  • caseyweederman@lemmy.ca · 23 points · 8 months ago

    Remember when EVGA decided they would rather leave the market entirely than spend one more day working with Nvidia?

  • hark@lemmy.world · 14 points · 8 months ago

    So many options, with small differences between them, all overpriced to the high heavens. I’m sticking with my GTX 1070 since it serves my needs and I’ll likely keep using it a few years beyond that out of spite. It cost $340 at the time I bought it (2016) and I thought that was somewhat overpriced. According to an inflation calculator, that’s $430 in today’s dollars.
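That inflation figure is easy to sanity-check; a minimal sketch, assuming approximate US CPI-U annual index values (the index numbers below are assumptions, not from the comment):

```python
# Rough inflation adjustment for the ~$340 GTX 1070 bought in 2016.
# CPI-U annual-average index values are approximate assumptions.
CPI = {2016: 240.0, 2024: 313.0}

def adjust(price: float, from_year: int, to_year: int) -> float:
    """Scale a price by the ratio of CPI index values."""
    return price * CPI[to_year] / CPI[from_year]

print(round(adjust(340, 2016, 2024)))  # 443 with these assumed values
```

Depending on the CPI series and month used, the result lands in the $430-445 range, consistent with the comment's figure.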

  • DingoBilly@lemmy.world · +15/−3 · 8 months ago (edited)

    What’s going on? It’s overpriced and completely unnecessary for most people. There’s also a cost of living crisis.

    I play every game I want to on high graphics with my old 1070. Unless you're working on very graphically intensive apps, or you're a PC-master-race moron, there's no need for new cards.

    • chiliedogg@lemmy.world · +4/−1 · 8 months ago

      I still game 1080p and it looks fine. I’m not dropping 2500 bucks to get a 4k monitor and video card to run it when I won’t even register the difference during actual gameplay.

    • n3m37h@sh.itjust.works · 2 points · 8 months ago

      It was a night and day difference going from a 1060 6 GB to a 6700 XT. The prices are still kinda shit, but that goes for everything.

  • ReallyActuallyFrankenstein@lemmynsfw.com · 11 points · 8 months ago

    Yep, it’s the RAM, but also just a mismatched value proposition.

    I think it’s clear at this point Nvidia is trying to have it both ways and gamers are sick of it. They used pandemic shortage prices as an excuse to inflate their entire line’s prices, thinking they could just milk the “new normal” without having to change their plans.

    But when you move the x070 series out of the mid-tier price bracket ($250-450, let's say), you'd better meet a more premium standard. Instead, they're throwing mid-tier RAM into a premium-priced product that most customers still feel should be mid-tier priced. It also doesn't help that it's at a time when people generally have less disposable income.

  • FiskFisk33@startrek.website · 11 points · 8 months ago

    GPUs haven’t been reasonably priced since the 1000 series.

    And now there’s no coin mining promising some money back.

  • Kazumara@feddit.de · 8 points · 8 months ago

    $600 for a card without 16 GB of VRAM is a big ask. I think an RX 7800 XT for $500 will serve you well for longer.

    • NIB@lemmy.world · 0 points · 8 months ago (edited)

      12 GB of VRAM is not a bottleneck in any current game at reasonable settings. There is no playable game/settings combination where the 7800 XT's 16 GB offers any advantage. Or do you think a 15 fps average is more playable than a 5 fps average (because the 4070 Super is VRAM-bottlenecked)? Is this indicative of future bottlenecks? Maybe, but I wouldn't be so sure.

      The 4070 Super offers significantly better ray tracing performance, much lower power consumption, superior upscaling (and frame generation) technology, better streaming/encoding features, and even slightly better rasterization performance than the 7800 XT. Are those worth sacrificing for €100 less and 4 GB more VRAM? For most people they aren't.

      AMD's offerings are competitive, not better. And the internet should stop sucking their dick, especially when most of the internet, including tech-savvy people, doesn't even use AMD GPUs. Hell, LTT even made a series of videos about how they had to "suffer" using AMD GPUs, yet they usually join the Nvidia-shitting circlejerk.

      I have an AMD 580 and have bought and recommended AMD GPUs since the 9500/9700 Pro days. But my next GPU will almost certainly be an Nvidia one. The only reason people are complaining is that Nvidia can make a better GPU (as shown by the 4090) but chooses not to, while AMD literally can't make a better GPU and chooses merely to price theirs "competitively" instead of offering something better. Both companies suck.

  • trackcharlie@lemmynsfw.com · +9/−1 · 8 months ago

    Less than 20 GB of VRAM in 2024?

    The entire 40-series line of cards should be used as evidence against Nvidia in a lawsuit over the intentional creation of e-waste.

    • BorgDrone@lemmy.one · 4 points · 8 months ago

      The real tragedy is that PCs still have to make do with discrete graphics cards that have separate VRAM.

  • Dra@lemmy.zip · +9/−2 · 8 months ago (edited)

    I haven't paid attention to GPUs since I got my 3080 on release day back during COVID.

    Why has the acceptable level of VRAM suddenly doubled versus 4 years ago? I don't struggle to run a single game on max settings at high frame rates @ 1440p, so what's the benefit that justifies the cost of 20 GB of VRAM outside of AI workloads?

    • Eccitaze@yiffit.net · 22 points · 8 months ago

      An actual technical answer: apparently it's because, while the PS5 and Xbox Series X are technically regular x86-64 architecture, their design lets the GPU and CPU share a single pool of memory with no loss in performance. That makes it easy to allocate a shit load of RAM for the GPU to store textures very quickly. But it also means that as the games industry shifts from developing first for the PS4/Xbox One X (both of which have separate pools of memory for CPU and GPU) to the PS5/XSX, VRAM requirements are spiking, because it's a lot easier to port to PC if you keep the assumption that the GPU can hold 10-15 GB of texture data at once instead of refactoring your code to reduce VRAM usage.
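To put rough numbers on that texture budget, here's a back-of-the-envelope sketch; the texture count and sizes are illustrative assumptions, not figures from any real game:

```python
def texture_bytes(width: int, height: int, bytes_per_texel: int = 4,
                  mipmaps: bool = True) -> int:
    """Uncompressed size of one RGBA8 texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

# A scene streaming a few hundred 4K textures fills VRAM fast:
total = 300 * texture_bytes(4096, 4096)
print(f"{total / 2**30:.1f} GiB")  # prints "25.0 GiB" uncompressed
```

Real engines use block-compressed formats (BCn) that are roughly 4-8× smaller than RGBA8, which is how working sets land in the 10-15 GB range the comment mentions rather than 25+ GiB.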

    • Space_Racer@lemm.ee · 9 points · 8 months ago

      I’m maxed on VRAM in VR for the most part with a 3080. It’s my main bottleneck.

    • Asafum@feddit.nl · 9 points · 8 months ago

      Lmao

      We have your comment: "What am I doing with 20 GB of VRAM?"

      And one comment down: "It's actually criminal there is only 20 GB of VRAM."

    • Blackmist@feddit.uk · 7 points · 8 months ago

      Current gen consoles becoming the baseline is probably it.

      As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU.

      That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.

    • Hadriscus@lemm.ee · +4/−1 · 8 months ago (edited)

      Perhaps not the biggest market, but consumer cards (especially Nvidia's) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They're the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.

    • Obi@sopuli.xyz · 1 point · 8 months ago

      Personally I need it for video editing & 3D work but I get that’s a niche case compared to the gaming market.

  • Altima NEO@lemmy.zip · 4 points · 8 months ago

    The RAM is so lame. It really needed more.

    Performance exceeding the 3090, but limited by 12 gigs of RAM.

  • Binthinkin@kbin.social · 4 points · 8 months ago

    You all should check prices comparing dual-fan 3070s to 4070s; there's a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry, and wound up getting blue-balled hard.

    Aren't they taking the 4080 completely off the market too?

    • elvith@feddit.de · 1 point · 8 months ago

      I have a 2060 Super with 8 GB. The VRAM is currently enough for FHD gaming, or at least isn't the bottleneck, so 12 GB might be fine for that use case. BUT I'm also toying around with AI models, and some of the current models already ask for 12 GB of VRAM to run the complete model. It's not that I would never get a 12 GB card as an upgrade, but you can be sure I'd research all the alternatives first, and then it wouldn't be my first choice but a compromise, as it wouldn't future-proof me in this regard.
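For the AI side, a rough sketch of why weight precision decides whether a model fits a given VRAM budget; the parameter count and overhead factor are illustrative assumptions, not from the comment:

```python
def model_vram_gb(params_billions: float, bytes_per_param: float,
                  overhead: float = 1.2) -> float:
    """Weights only, times a fudge factor for activations/KV cache."""
    return params_billions * bytes_per_param * overhead

# A hypothetical 7B-parameter model at different precisions:
print(round(model_vram_gb(7, 2.0), 1))  # fp16: 16.8 GB -> over a 12 GB card
print(round(model_vram_gb(7, 0.5), 1))  # 4-bit quantized: 4.2 GB -> fits in 8 GB
```

This is why quantized builds of the same model run on an 8 GB card while the full-precision release asks for 12 GB or more.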

  • randon31415@lemmy.world · +3/−1 · 8 months ago

    Is this the one they nerfed so they could sell it in China and get around the US AI export laws?

  • wooki@lemmynsfw.com · 1 point · 8 months ago

    If they don't drop the price by at least 50%, goodbye Nvidia.

    So no more Nvidia. Hello Intel.