• salton@lemmy.world

      Oh no, I’ll have to actually play one of the many great games in my library that run perfectly on my hardware instead of buying another remastered game that runs like hot garbage.

      • vxx@lemmy.world

        Alan Wake 2 is brand new, not a remaster, and by the looks of it, it has fantastic graphics and is being praised by reviewers.

    • rar@discuss.online

      The industry needs to appreciate QA and optimization more than ever. I don’t feel like getting the latest GPU for a couple of rushed and overpriced digital entertainment software, the same way I don’t feel like getting the newest iPhone every year because of social pressure.

      • GenderNeutralBro@lemmy.sdf.org

        I’ve skipped a couple already. I’m on a 1080 now. It’s showing its age a bit but still generally does well at 3440x1440. I will turn settings down as needed to maintain 60-100fps.

  • Klaymore@sh.itjust.works

    Since that GPU has 24 GB of VRAM, the game might be using more than it really needs, just because it can. The best way to test the importance of VRAM would be to get two cards of the same tier with different VRAM amounts (like the A770 8GB and 16GB) and see how that impacts performance.
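
    A minimal sketch of how that comparison might be logged (assuming an NVIDIA card with nvidia-smi on the PATH; an Intel card like the A770 would need a different vendor tool):

    ```python
    # Hypothetical helper: sample VRAM usage once per second during a
    # benchmark run, so traces from the 8GB and 16GB cards can be compared.
    import csv
    import subprocess
    import time

    def sample_vram_mib() -> int:
        """Current VRAM usage in MiB on the first GPU, per nvidia-smi."""
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        return int(out.strip().splitlines()[0])

    def log_run(path: str, seconds: int = 300) -> None:
        """Write a timestamped VRAM trace to a CSV file."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["t_seconds", "vram_mib"])
            start = time.time()
            while time.time() - start < seconds:
                writer.writerow(
                    [round(time.time() - start, 1), sample_vram_mib()])
                time.sleep(1)

    log_run("vram_trace.csv")  # run once per card, then compare the traces
    ```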

    • Lojcs@lemm.ee

      Looked at the review. The 4070 Ti (12GB) and 3090 Ti (24GB) scale similarly until 4K RT / 4K PT, at which point most 12GB cards stop scaling and drop to a couple of fps. The 6700 XT (12GB) and 7700 XT (12GB) don’t seem affected in RT. With PT, only the 7700 XT survives, with a whopping 7 fps. A similar thing happens to 8GB cards at 1440p.

      Edit: edited out a750

      • Hasuris@sopuli.xyz

        According to the posted picture, this should happen at 1440p with >14GB of VRAM used. It doesn’t. 4K native is unplayable territory for every 12GB card anyway.

    • Turun@feddit.de

      There are also plenty of totally reasonable settings that require less than 12GB, 1440p at maximum settings for example. If you want the best of the best, obviously you have to pay for the best of the best.

      (It’s still a lot, and a minimum of 12GB is already ridiculous. I’m just saying the claim that 16GB isn’t enough is kinda dishonest.)

  • circuitfarmer@lemmy.sdf.org

    I find myself saying “but why?” at all these spec requirements for Alan Wake 2. Is it some kind of monstrous leap forward in technical prowess? Because usually outliers like this suggest poor optimization, which is bad.

    • ShadowRam@kbin.social

      If these games made proper use of Resizable BAR, VRAM size wouldn’t be an issue.
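
      For what it’s worth, a rough heuristic for checking whether Resizable BAR is actually active (a sketch with assumptions: an NVIDIA card, and it parses the textual `nvidia-smi -q` report). With ReBAR off, the BAR1 aperture is typically a fixed 256 MiB; with it on, BAR1 spans the whole framebuffer:

      ```python
      # Compare the BAR1 aperture size to total VRAM from `nvidia-smi -q`.
      import re
      import subprocess

      def rebar_looks_active() -> bool:
          report = subprocess.check_output(["nvidia-smi", "-q"], text=True)

          def first_total_mib(section: str) -> int:
              # Take the first "Total : N MiB" line after the section header.
              tail = report.split(section, 1)[1]
              return int(re.search(r"Total\s*:\s*(\d+)\s*MiB", tail).group(1))

          vram_mib = first_total_mib("FB Memory Usage")
          bar1_mib = first_total_mib("BAR1 Memory Usage")
          return bar1_mib >= vram_mib  # aperture covers the whole framebuffer

      print("Resizable BAR looks", "active" if rebar_looks_active() else "inactive")
      ```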

    • DragonTypeWyvern@literature.cafe

      Yeah, as someone who got bored in the first part of the first one, what could possibly justify this for the series?

      Honest question. Do they need to look like actual people before the shadow monsters or whatever attack?

      Because mostly the series seemed to be about picking up collectables in the dark while hoping your flashlight doesn’t go out.

      • circuitfarmer@lemmy.sdf.org

        I mean, I know many people like the series, and I agree it doesn’t seem like it should be terribly demanding. I may just be wrong, and maybe it’s meant to have the best graphics ever, but I suspect that on release we’ll see a lot of “meh”, and potentially backlash, if these reqs don’t translate into something no one has seen before.

    • Skcyte@lemmy.dbzer0.com

      Well, the game is an Nvidia-sponsored title, so you can expect shit to hit the fan. They want you to use their tech.

  • nevemsenki@lemmy.world

    At the same time, Armored Core 6 has stunning visuals and runs pretty well even on a 2060. Almost like graphics can be done well with a good art style and optimisation, rather than by just throwing more hardware at the issue.

  • _sideffect@lemmy.world

    I honestly couldn’t give two fucks about how a game looks if it’s going to cost me $2000 to run it.

  • LoamImprovement@ttrpg.network

    These requirements are such horseshit. What’s the point of making everything look hyperrealistic at 4K if nobody can run the damn game without raiding NASA’s control room for hardware?

    • kadu@lemmy.world

      It will run fine.

      This testing methodology of “idk, run it on a 4090 and log how much VRAM it uses!” means absolutely nothing, as VRAM is a dynamic cache.
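
      For anyone who wants to see that allocation/need gap concretely, PyTorch’s caching allocator behaves much like a game engine’s and exposes both numbers (sketch assumes PyTorch and a CUDA GPU):

      ```python
      # Frameworks hold on to VRAM they are not currently using: nvidia-smi
      # reports the reserved amount, not the live working set.
      import torch

      x = torch.empty(1024, 1024, 256, device="cuda")  # ~1 GiB of float32
      del x  # the tensor is gone, but the allocator keeps the block cached

      print(torch.cuda.memory_allocated())  # bytes in live tensors: ~0
      print(torch.cuda.memory_reserved())   # bytes held from the driver: ~1 GiB
      ```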

  • HidingCat@kbin.social

    Yikes, even 1440p isn’t safe. My 12GB 6700 XT is looking a bit outdated already. It just barely has enough at max settings without the fancy stuff.

    • kattenluik@feddit.nl

      I think it’s relatively easy to avoid these games; they’re obviously not utilizing these resources well.

  • GaMEChld@lemmy.world

    It’s ok, thanks to Nvidia’s amazing value, I have a whopping 10GB on a 3080 that I paid way too much for! My old Vega 64 from 2017 had 8GB.

  • user_2345@lemmy.world

    I really don’t mind reading and other IRL hobbies. Games are kind of shitty nowadays.

  • treesquid@lemmy.world

    My 3070 apparently can’t run it in low detail at the native resolution of my monitor. Weak.

    • vxx@lemmy.world

      Did you ever expect to play at 4K with that card?

      I have an RTX 3060 and never thought that it would make sense.

    • salton@lemmy.world

      My 2080 is plugging along perfectly fine. I’m actually happy that I didn’t upgrade to the 30 or 40 series, when I’d have had to pay over a grand to get enough VRAM to make this generation of horribly optimized games run properly anyway.

    • Player2@sopuli.xyz

      While this is not a good thing, we have to remember that games will take advantage of more resources than needed if they’re available. If keeping more things in memory just in case increases performance even a little bit, there’s no reason that they shouldn’t do it. Unused memory is wasted memory.
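
      A toy sketch of that behaviour (every name and size here is made up): feed the same texture request stream to two different VRAM budgets, and the measured “usage” tracks the budget, not the game’s minimum need.

      ```python
      # Budget-driven texture cache: usage tracks the budget, not the need.
      from collections import OrderedDict

      class TextureCache:
          def __init__(self, budget_mib):
              self.budget_mib = budget_mib
              self.used_mib = 0
              self.resident = OrderedDict()  # texture name -> size in MiB

          def request(self, name, size_mib):
              if name in self.resident:
                  self.resident.move_to_end(name)  # hit: mark recently used
                  return
              # Miss: evict least-recently-used textures until the new one fits.
              while self.used_mib + size_mib > self.budget_mib and self.resident:
                  _, evicted_mib = self.resident.popitem(last=False)
                  self.used_mib -= evicted_mib
              self.resident[name] = size_mib  # "upload" the texture
              self.used_mib += size_mib

      # Identical request stream, two different cards:
      requests = [f"tex_{i % 60}" for i in range(1000)]  # ~15 GiB working set
      for budget_mib in (10_240, 24_576):                # 10 GiB vs 24 GiB card
          cache = TextureCache(budget_mib)
          for tex in requests:
              cache.request(tex, 256)                    # every texture 256 MiB
          print(f"{budget_mib} MiB card -> {cache.used_mib} MiB 'in use'")
      ```

      Both runs render the same stream at the same quality; the bigger budget simply keeps more resident.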

  • Matriks404@lemmy.world

    Me with a GTX 1060 3 GB: Ok.

    That said, I’m probably going to finally upgrade to an RTX 3060 this year or next (or some AMD equivalent, if I switch back to GNU/Linux).

    • Kaldo@kbin.social

      My 3060 Ti has been serving me very well; I’ve played games with it that look unbelievably good (Death Stranding, for example), but these recent requirements are crazy. Especially with UE5 games, I can’t help but think it’s just shitty optimization, because they don’t look good enough to justify this.

    • NIB@lemmy.world

      The 4070 is consistently faster than the 7800 XT, and even the 7900 XT in ray tracing, at almost all settings. Only at 4K with ray tracing is it VRAM-bottlenecked. But even though the 7800 XT and 7900 XT aren’t VRAM-bottlenecked there, their performance is shit at those settings anyway (sub-30fps), so that’s irrelevant.

      I don’t see how having 20fps is better than having 5fps. Both are unplayable settings for either card.

      • Ranvier@sopuli.xyz

        Wasn’t trying to compare to any specific other cards; this game is going to destroy a lot of them. Just commenting on Nvidia skimping on the VRAM for some very pricey cards.

  • MxM111@kbin.social

    Joke’s on you, I have a 15’’ 1024x768 CRT monitor. So my older-generation RTX 3090 is just fine.