4GB is not enough in the world of 2018, when you could buy a used card with 8GB for like $250. In the world of 2022, you just need some fucking GPU, any fucking GPU, and maybe yeah, you have to turn down texture quality, but it's still better than an iGPU.
Hardware Unboxed did a test with a 5500 XT with 4GB vs 8GB and various PCIe speeds. The result was that 4GB instead of 8GB was not that big a deal for most games (1080p), but PCIe 3.0 x4 was a huge limiter. So their original blog post was marketing bullshit.
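For the curious, some napkin math on why the narrow link hurts so much. The per-lane figure is just the PCIe 3.0 spec rate; the 1 GB/s spill number is invented for illustration, not anything measured in the video:

    # PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding,
    # i.e. roughly 0.985 GB/s of usable bandwidth per lane.
    PER_LANE_GBPS = 8 * 128 / 130 / 8

    for lanes in (4, 8, 16):
        bw = lanes * PER_LANE_GBPS
        # Hypothetical: a 4GB card streaming 1 GB/s of overflow
        # textures from system RAM (made-up workload).
        print(f"x{lanes}: {bw:.2f} GB/s, 1 GB/s of spill eats {1.0 / bw:.0%} of the link")

At x16 that spill traffic is background noise; at x4 it's a quarter of everything the card can move.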
That assumes you want to be able to run the games on high. Which is nice, of course, but when dealing with shortages, beggars can't be choosers. I only have a 4GB card myself (RX 570) and it does what it needs to do.
How? 2-3 games show a significant difference out of 12.
The games he tested at medium quality were the highest playable settings for that card. No one is saying you don't need more than 4GB for 4K gaming.
You are right, I see what you are saying now.
Hey man, my pixel art indies don't need more than 4GB of VRAM, so clearly this blog post is full of shit.
VR games also benefit from having >= 6GB.
It's not about screen framebuffer (4K vs 1080p), it's about textures. You didn't read his post.
Doom Eternal uses 8 GB of VRAM with hi-res textures.
To be charitable, the argument could be "high-res textures won't be visibly different at 1080p," which is not generally true, but at least it makes sense as an argument.
I did not know that high resolution textures don't impact performance significantly as long as you have enough VRAM.
If you can't even store them in VRAM, you'll have massive performance issues as you fetch them from system memory. Imagine having to go back and forth between system memory and VRAM multiple times to draw a single frame; your framerate will tank.
If they are already fully stored in VRAM, along with their full mipmaps, the performance hit in the pixel copying is much smaller (in fact, it's quite negligible on modern GPUs).
There seem to be a lot of things you're missing here.
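If you want numbers: a full mip chain is a geometric series where each level is a quarter the size of the one above, so it converges to about 4/3 of the base texture. A quick sketch, uncompressed RGBA just to keep the math obvious:

    def texture_bytes(width, height, bytes_per_pixel=4, mips=True):
        # Base level plus every mip level down to 1x1.
        total = width * height * bytes_per_pixel
        while mips and (width > 1 or height > 1):
            width, height = max(width // 2, 1), max(height // 2, 1)
            total += width * height * bytes_per_pixel
        return total

    print(texture_bytes(4096, 4096, mips=False) / 2**20)  # 64.0 MiB
    print(texture_bytes(4096, 4096) / 2**20)              # ~85.3 MiB, ~4/3 of base

So mips cost you a third more VRAM, and in exchange the GPU samples whichever level matches the pixel density, which is why resident hi-res textures are nearly free.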
Everything is fine until you actually need more VRAM than you have; then games are completely unplayable.
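It really is a cliff, not a slope. Toy model (all numbers invented, and real drivers evict smarter than this, but the shape is the point): assume everything past the VRAM limit has to cross PCIe again every frame.

    def fps(working_set_gb, vram_gb=4.0, gpu_ms=8.0, pcie_gbps=15.75):
        # Whatever doesn't fit in VRAM is re-fetched over PCIe each frame
        # (pessimistic assumption; caching softens this in practice).
        spill_gb = max(working_set_gb - vram_gb, 0.0)
        transfer_ms = spill_gb / pcie_gbps * 1000.0
        return 1000.0 / (gpu_ms + transfer_ms)

    for ws in (3.5, 4.0, 4.5, 5.0, 6.0):
        print(f"{ws} GB working set -> {fps(ws):.0f} fps")

You go from 125 fps to 25 fps the moment you're half a gig over, and it only gets worse from there.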
Remember when this happened with Bitcoin, but then ASIC Bitcoin miners came out and the market was flooded with cheap second-hand video cards? Why isn't that happening with whatever GPU miners are mining?
If I'm not mistaken (which I could be, as I'm not a crypto guy), most GPU mining is Ethereum, which was designed to be difficult to build an ASIC for, and even if an ASIC were designed for it, good luck getting it manufactured in the current year.
When they do get made, the Chicoms mine the piss out of them until the new version comes out and they fill the orders with the old ones.
Remember when you used to be able to walk into Best Buy, peruse the DVDs, then walk over to the computer section and grab a new GPU off the shelf, and then be badgered at the register to buy their worthless warranty? Those were the days.
I mean, it's not wrong if you want to play anything besides solitaire.
Some arguments could be made that even 8GB of RAM isn't enough.
Heck, my 32GB monstrosity has difficulty playing some newer games.