• KingOfTheCouch@lemmy.ca · 76 points · 21 days ago

    Thousand times this. For actual builders that care about the nuance it all probably makes sense, but then there is me over here looking at pre-builts, wondering why the fuck two seemingly identical machines have a $500 difference between them.

    I’m spending so much time poring through spec sheets to find “oh, the non-Z version discombobulator means this cheaper one is gonna be trash in three years when I can afford to upgrade to a 6megadong tri-actor unit”.

    I’m in this weird state of too cheap to buy a Mac and can’t be arsed to build my own.

    • OtherPetard@feddit.nl · 1 point · 3 days ago

      Yeah, and when you check the detail pages of the games and other software you are upgrading it for it’ll turn out the 6 megadong tri-actor unit should work well in general, but there’s a certain crashing bug near the end of this game I already bought that the devs haven’t patched yet…

      And even after all those considerations modded Minecraft will be just about functional.

  • lorty@lemmy.ml · 58 points · 21 days ago (edited)

    Just go here and check the charts for the kind of work you want the PC to do. If one looks promising you can check specific reviews on YouTube.

    For gaming, the absolute best CPU/GPU combo currently is the 9800X3D and an RTX 4090, if budget is no object.

    Yes, the part naming is confusing, but that’s intentional.

    • arc@lemm.ee · 13 points · 21 days ago

      I saw a video on Gamers Nexus about how shitty a company they are. Hopefully word spreads amongst gamers & builders that they’re no good and they should be avoided.

      • fishbone@lemmy.dbzer0.com · 4 points · 21 days ago

        What’s the deal with them? The only NZXT component I’ve had is my current case, which has awful airflow (an older H710 model I think, bought five-ish years ago).

        • ipkpjersi@lemmy.ml · 6 points · 21 days ago

          Apparently their PC rental program is a worse value than illegal loans that are likely mafia-backed.

        • boonhet@lemm.ee · 2 points · 20 days ago

          Apparently they very recently got acquired, or took on investors, and are probably looking to increase profits tenfold in under a year so the company can be dumped before it all crashes.

    • Possibly linux@lemmy.zip · 2 up / 1 down · 20 days ago

      If you’re blindly renting things without running the numbers, you have bigger issues.

      Always read the terms and do the long-term calculations.
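
      A sketch of that kind of long-term math, with made-up placeholder figures (the rental fee, purchase price and resale value below are all assumptions, not real pricing):

      ```python
      # Back-of-the-envelope rent-vs-buy comparison.
      # All figures are made-up examples, not real pricing.

      monthly_rent = 120.00     # assumed rental fee per month
      months = 24               # assumed rental length
      purchase_price = 1800.00  # assumed price to buy an equivalent PC outright
      resale_value = 700.00     # assumed resale value of that PC two years later

      total_rent = monthly_rent * months
      net_cost_of_buying = purchase_price - resale_value

      print(f"Renting for {months} months: ${total_rent:,.2f}")
      print(f"Buying and reselling later: ${net_cost_of_buying:,.2f}")
      ```

      Swap in real numbers; the point is just that the gap only shows up once you multiply the monthly fee out.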

  • Lvxferre@mander.xyz · 29 points · 22 days ago

    I recently had to go through this maze. I hate it. And I’m glad that my PCs tend to live ~10 years, which means I’m not doing it again in the foreseeable future.

  • MonkderVierte@lemmy.ml · 27 points · 21 days ago (edited)

    Meanwhile the data I care about, efficiency, is not readily available. I’m not gonna put a 350-watt GPU in a 10-liter case if I can get the same performance at 250 watts.
    At least TomsHardware now includes efficiency in its tests of newer cards.
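
    A rough sketch of that comparison, performance per watt, from whatever benchmark numbers a review does publish (the FPS and board-power figures below are placeholders, not measurements):

    ```python
    # Performance-per-watt for two hypothetical GPUs.
    # FPS and board-power numbers are placeholders, not measured values.

    cards = {
        "350 W card": {"avg_fps": 140, "board_power_w": 350},
        "250 W card": {"avg_fps": 132, "board_power_w": 250},
    }

    for name, c in cards.items():
        fps_per_watt = c["avg_fps"] / c["board_power_w"]
        print(f"{name}: {fps_per_watt:.2f} FPS per watt")
    ```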

    • addie@feddit.uk · 5 points · 21 days ago

      Tell me about it. The numbers that I’m interested in - “decibels under full load”, “temperature at full load” - might as well not exist. Will I be able to hear myself think when I’m using this component for work? Will this GPU cook all of my hard drives, or can it vent the heat out the back sufficiently?

      • Fizz@lemmy.nz · 3 points · 20 days ago

        I wish this data was more available. I upgraded my GPU to a 6800 XT and it’s so loud. I can’t enjoy sitting at my desk without hearing a loud whine and a bunch of other annoying noises. It’s probably because the card is second-hand, but still.

        • cevn@lemmy.world · 2 points · 20 days ago

          Maybe not, cuz I have a first-hand 7900 XTX and if I load it up it whines horribly lol.

      • Zanz@lemmy.ml · 1 point · 20 days ago

        Temperature is meaningless unless you want OC headroom. A watt into your room is the same no matter what temp the part runs at.

        • addie@feddit.uk · 2 up / 1 down · 20 days ago

          That’s not correct, I’m afraid.

          Thermal expansion is proportional to temperature; it’s quite significant for ye olde spinning-rust hard drives, but the mechanical stress affects all parts in a system, especially a gaming machine that isn’t run 24/7 and therefore goes through thermal cycling. Mechanical strength also decreases with increasing temperature, which makes things worse.

          The second law of thermodynamics says heat only moves spontaneously from hotter to colder. A 60° bath can melt more ice than a 90° cup of coffee - it contains more heat - but it can’t raise the temperature of anything above 60°, which the coffee could. A 350W graphics card at 20° couldn’t raise your room above that temperature, but a 350W graphics card at 90° could do so. (The “runs colder” card would presumably have big fans to move the heat away.)
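
          The bath-versus-coffee point in rough numbers, assuming a 100 kg bath at 60 °C and a 0.3 kg cup at 90 °C (both masses are made up for illustration):

          ```python
          # Rough heat-content comparison for the bath vs. coffee example.
          # Masses and temperatures are assumed for illustration only.

          c_water = 4186           # specific heat of water, J/(kg*K)
          latent_heat_ice = 334e3  # heat needed to melt ice, J/kg

          q_bath = 100.0 * c_water * 60.0  # 100 kg of water cooling from 60 C to 0 C
          q_coffee = 0.3 * c_water * 90.0  # 0.3 kg of coffee cooling from 90 C to 0 C

          print(f"Bath could melt up to   {q_bath / latent_heat_ice:.1f} kg of ice")
          print(f"Coffee could melt up to {q_coffee / latent_heat_ice:.2f} kg of ice")
          ```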

          • Zanz@lemmy.ml · 1 point · 19 days ago

            That is fundamentally not how PC cooling works. Each part is a closed system, with the PC an open system so long as you have fans. The heatsink temp over ambient could be what you’re looking for, but even that wouldn’t work the way you describe if you’re looking at hot-spot temps. If you ran a Threadripper at 500W in a closed space, the air temp would end up hotter than with a 350W graphics card, yet the CPU (if not throttling) would sit about 30°C over ambient while the GPU core would be about 45°C over ambient. The effect on your room is that the 500W CPU raises the ambient temp more than the 350W GPU over the same period of time. The air in your room is what cools the components, and air at a given humidity has a specific heat capacity, which is your limiting factor. With your bath example, you would need a much larger volume of 60°C water to melt the ice, since the specific heat of water doesn’t change while it’s a liquid.

            You have a fundamental misunderstanding of the first law of thermodynamics and of what a “system” means in the second.

            For your HDDs, you want them running at 45-60°C; running them colder will impact their lifespan. The drive will try to heat itself up if it’s under 30°C to prevent damage.
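
            For anyone following along, the two quantities being argued over can be written down separately: the heat dumped into the room depends only on wattage and time, while the reported die temperature also depends on the cooler. A small sketch, with assumed (not measured) thermal resistances:

            ```python
            # Heat into the room vs. component temperature, treated separately.
            # Thermal resistances (K/W) are assumed for illustration only.

            ambient_c = 25.0
            parts = {
                # name: (power in W, cooler thermal resistance in K/W, hours at load)
                "500 W CPU, big cooler": (500.0, 0.06, 2.0),
                "350 W GPU, small cooler": (350.0, 0.13, 2.0),
            }

            for name, (power_w, r_th, hours) in parts.items():
                heat_kwh = power_w * hours / 1000      # what warms the room
                die_temp = ambient_c + power_w * r_th  # what the sensor reports
                print(f"{name}: {heat_kwh:.2f} kWh into the room, "
                      f"~{die_temp:.0f} C at the die")
            ```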

    • skibidi@lemmy.world · 32 up / 1 down · 21 days ago (edited)

      I’d be very careful relying on that site… just flipped through some of the builds and it was very strange.

      E.g. they were recommending a $500 or $900 CASE at the highest tiers - not even good cases; you can get something for less than half the price with better performance. They recommended a single PCIe 4.0 SSD and a SPINNING HARD DRIVE for a motherboard with PCIe 5.0 M.2 slots. They recommend CPU coolers far, far in excess of requirements (a 3x140mm radiator for a 100W chip? Nonsense). Memory recommendations for AMD builds are also sus - DDR5-6000 CL30 is what those CPUs do best with, yet they were recommending DDR5-5600 CL32 kits for no reason.

      Just strange… makes me question the rest of their recommendations.
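
      On the memory point, one quick way to compare the kits being discussed is first-word latency, computed from transfer rate and CAS latency (the kits below are generic examples, and this ignores platform-specific factors such as AMD’s memory-controller ratios):

      ```python
      # First-word latency for two generic DDR5 kits.
      # latency (ns) = CAS latency * 2000 / transfer rate (MT/s)

      kits = {"DDR5-6000 CL30": (6000, 30), "DDR5-5600 CL32": (5600, 32)}

      for name, (mt_s, cl) in kits.items():
          latency_ns = cl * 2000 / mt_s
          print(f"{name}: {latency_ns:.2f} ns first-word latency")
      ```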

      • Jesus_666@lemmy.world · 10 points · 21 days ago

        Mind you, recommending a PCIe 4.0 SSD is the one part that makes sense. Right now very few people will gain noticeable benefits from a PCIe 5.0 SSD, AFAIK. The rest though… yikes.

        • skibidi@lemmy.world · 5 points · 21 days ago

          The price differential doesn’t really exist anymore, though. If they were recommending 4TB, then I’d agree (there are only a few 4TB 5.0 drives and they are quite pricey), but at 2TB you’re looking at like a $10 difference between something like the MP700 and the SN850X they recommend (not counting all the Black Friday sales going on).

  • johannesvanderwhales@lemmy.world · 16 points · 21 days ago

    Power consumption is part of the equation now too. You’ll often see newer generation hardware that has comparable performance to a last gen model but is a lot more power efficient.

    • eyeon@lemmy.world · 5 points · 21 days ago

      Or you’ll see something equally efficient and equally performant at the same power level… except newer gens or upgraded SKUs are allowed to pull more power.

    • qyron@sopuli.xyz · 14 points · 22 days ago

      Honestly my preferred manufacturer since I started putting together my own machines.

    • Artyom@lemm.ee · 15 up / 3 down · 21 days ago (edited)

      Make sure to get your 5900X3D with your 7900XTX. Note that one is a CPU and the other is a GPU. For extra fun, their numbers should eventually overlap given their respective incrementation schemes. The 5900X3D is the successor to the 5900XD, which is a major step down in performance even though it has more cores.

      I’m gonna give this award to Intel, which has increased the numbers on their CPU line by 1000 every generation since before the 2008 housing crash.

      • Valmond@lemmy.world · 2 points · 21 days ago

        Isn’t it still “higher is better” at AMD? With the obvious X or “M” suffixes, but usually the price reflects the specs when the numbers are the same.

      • lorty@lemmy.ml · 2 points · 21 days ago

        The only thing you should realistically expect to read from the naming conventions is the relative generation and which price/performance bracket the part targets. Assuming more than that is just a mistake.

    • VeganCheesecake@lemmy.blahaj.zone · 6 points · 21 days ago

      Just ordered another CPU from them. Downside is that there isn’t any modern AMD desktop platform that works with coreboot, which seems to be the only workable way to deactivate the Management Engine/Platform Security Processor after boot.

      Was really considering swapping to Intel for that, but got a good deal on a Ryzen 9 that fits in my socket, so…

      • VeganCheesecake@lemmy.blahaj.zone · 1 point · 15 days ago

        They had at least two or three halfway sensible naming schemes, which they then proceeded to abandon after like one generation.

        I fault the marketing departments at the chipmakers, which are trying to somehow justify their existence.

        • Possibly linux@lemmy.zip · 1 point · 20 days ago

          Explain how AMD naming works. I’m so confused, as it is pretty hard to understand, plus they’ll randomly violate their own conventions.

  • arc@lemm.ee · 11 points · 21 days ago

    I occasionally “refresh” my PC with a new board, CPU, etc. I never buy the top-of-the-line stuff, and quite honestly there is little reason to. Games are designed to play perfectly well on mid-range computers, even if you have to turn off some graphics option that gives a slight improvement in image quality.

      • kerf@lemmy.world · 7 points · 21 days ago

        For many games you can set the graphics rendering to, for example, 1080p but run the whole game at 4K, so text, menus and so on are super crisp while the game still runs very light. But maybe it’s good advice to never even start, because I can’t imagine going back to 1080p after using 2K and 4K screens.
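
        A quick sketch of what that buys you, assuming the 3D scene renders at 1080p while the display (and UI) stay at 4K:

        ```python
        # Share of pixels actually shaded when the 3D scene runs at 1080p
        # on a 4K display, with UI and text still drawn at native resolution.

        render_w, render_h = 1920, 1080
        native_w, native_h = 3840, 2160

        fraction = (render_w * render_h) / (native_w * native_h)
        print(f"3D workload: {fraction:.0%} of the pixels of native 4K")
        ```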

  • chickenf622@sh.itjust.works · 12 up / 1 down · 22 days ago

    I always go by the rule that the larger the number and the more letters, the better. The exception being M, which usually means it’s made for mobile devices.

  • Godnroc@lemmy.world · 10 up / 1 down · 22 days ago

    I just go by PassMark’s ratings for CPUs and GPUs. It may not be the most nuanced rating, but it does give numbers that can be easily compared.
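
    One simple way to use those ratings is score per dollar (the scores and prices below are placeholders, not real PassMark numbers):

    ```python
    # Comparing parts by benchmark score per dollar.
    # Scores and prices are placeholders, not real PassMark data.

    candidates = {
        "CPU A": {"score": 45000, "price_usd": 320},
        "CPU B": {"score": 38000, "price_usd": 220},
    }

    for name, c in candidates.items():
        print(f"{name}: {c['score'] / c['price_usd']:.0f} points per dollar")
    ```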