AMD recently posted a graphic comparing its RX 5500 XT, which is available with 4GB or 8GB of VRAM. The results were eye-opening.


With the Radeon RX 5500 XT, AMD tested models with 4GB and 8GB of VRAM to see how much difference the extra memory makes at 1080p, uncovering a 12-24% performance change across five modern games. Beyond the raw performance figures, the lower-VRAM cards can also suffer from more gameplay stutters and increased texture pop-in.

I noticed an immediate improvement moving from the GTX 1060 3GB to the old RX 480 8GB. The difference was substantial: on the RX 480, I observed that many 64-bit games wanted more than 4GB of VRAM.
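To see why 4GB runs out so quickly, here is a rough back-of-the-envelope sketch of texture memory footprints. The numbers assume uncompressed RGBA8 textures with a full mip chain; real games use block compression (BC1-BC7), which shrinks this considerably, but they also keep many more assets resident, so the arithmetic still illustrates the squeeze.

```python
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM footprint of one texture.

    A full mip chain adds roughly one third on top of the base
    level (1 + 1/4 + 1/16 + ... converges to 4/3).
    """
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# One uncompressed 4K (4096x4096) RGBA8 texture with mipmaps:
one = texture_bytes(4096, 4096)
print(f"{one / 2**20:.0f} MiB per texture")  # ~85 MiB

# How many such textures fit in 4 GiB vs 8 GiB of VRAM
# (ignoring framebuffers, geometry and other allocations):
for vram_gib in (4, 8):
    count = (vram_gib * 2**30) // one
    print(f"{vram_gib} GiB holds about {count} such textures")
```

Roughly 48 uncompressed 4K textures fill a 4GB card before the framebuffer, geometry, or anything else is accounted for, which is why texture pop-in shows up first when VRAM runs short.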

Now, by testing the same GPU in two memory configurations, AMD has demonstrated the performance hit that 4GB video cards endure across a wide range of games. I tested Rise of the Tomb Raider, Halo: MCC and Thief, all of which showed that VRAM mattered.

Given these results, marketing a video card with only 4GB of VRAM, even at the mainstream level, seems like a terrible idea. It also gives good reason to expect new cards to feature 8GB of VRAM or more.

Recently ASRock, Asus and MSI have offered RX 550 cards with only 2GB of VRAM, which falls well short of what the market actually demands. Such cards have negligible resale value due to their very limited capacity.


When the issue came up on a mining forum, the consensus was that 8GB cards outperformed 4GB cards. Small wonder so many worn-out cards are being sold cheap on eBay; mining cards tend to be used until they overheat or the earnings drop too low. Both the GTX 1060 and RX 480 I mentioned are ex-mining cards, and the VRAM story speaks volumes for mining as much as it does for gaming.