Iceberg Tech: https://www.youtube.com/@IcebergTech
HL Discord: https://discord.gg/c2Ug8pAJch
Timecodes:
00:00 – Brief Introduction
00:46 – GPUs overview
09:20 – 2012
13:14 – 2013
17:31 – 2014
21:37 – 2015
24:10 – 2016
27:51 – 2017
32:03 – 2018
35:44 – 2019
40:35 – 2020
43:51 – 2021
46:40 – 2022
49:22 – Conclusion
Background Tracks:
Unicorn Heads – 808 Doorbell Chime
plenka – Catharsis
plenka – Plasma Water
plenka – Spirit
plenka – Lost the time
plenka – Muddy House
Case & Point – Error Code
plenka – Bbrokenn
Portwave – Oh, M
Arti-Fix – The Mercury Man
Arti-Fix – Cybernatic Sect
Arti-Fix – The Untold Story
Arti-Fix – Dangerous
Arti-Fix – Liquidator
Arti-Fix – Your Time
Might want to see GCN 1.0 versus GCN 2.0 versus the GTX 780 Ti, since GCN 1.0 has the same API support as the GTX 780 Ti. (And secondly because I just ordered an HD 7950 3GB for about 40 USD, lol.)
I was thinking of getting a GTX 780, but this made up my mind for an R9 290 or Fury.
The 3900x is wondering if it’s still on the Home Screen or not.😂
Those were the days of flashing a 290 with Hynix memory to a 290X…
TechPowerUp's relative performance isn't quite accurate these days. I think buyers should just watch YouTube benchmarks and comparisons, or go to Tom's Hardware's GPU hierarchy page, which isn't 100% accurate either but is still better than TechPowerUp.
Damn, the 780ti was my dream card back in the day, I should've dreamt of something else instead.
The R9 290X was the 4090 of its time. That thing was a beast.
I bought the R9 390 back in 2015 over the 970, and looking back I am very glad I made that decision. I was using it up until this year, when I had to upgrade because of the lack of driver support. If AMD hadn't abandoned driver support for the 300 series cards so early, I could easily have gotten another 2 years of use out of it, but since it was over 6 years old I didn't mind upgrading to a new GPU (from AMD, of course). I learned from past experience not to buy NVIDIA cards due to how poorly they age, and I'm glad to see people are starting to see past the hype.
Nvidia only shipped the required instructions; with modern APIs it shows how badly the 600/700 and 900 series aged versus the GCN architecture, which contains those instructions. Bizarre to see DX11 bash the 780 Ti to destruction.
Just clicked play and didn't realize it's 50 mins long until I got to the 35th minute. Every song in there is a banger. Got a playlist or the name of the genre?
Very good retrospective work — more of this kind of benchmarking is needed. I was surprised to see so many performance issues with Vulkan API titles on Kepler. Is it just the VRAM size limitation or memory asset mismanagement in the driver — who knows. One big difference between the Kepler and GCN architectures is the inability of the former to run compute and graphics kernels at the same time. This is probably the reason for the extra performance drop in Doom Eternal with its specific implementation of forward rendering, which certainly causes GPU pipeline under-utilization on Kepler, on top of the chronic Vulkan runtime problems.
I'd appreciate it if you enable subs/closed captions 😄
I wonder if TressFX is what's mainly affecting the results in favor of AMD in Tomb Raider 2013
TR2013 with TressFX is not a fair comparison, as TressFX is/was an AMD API… In that case, start testing your GPUs with Hairworks, PhysX and the whole shebang and you'll see the FPS tank like crazy on an AMD card — if it can even turn the feature on, and if not, that means 0 FPS and thus a clear, overwhelming win for nVidia, which is where the extra price justification comes in. Also, if you keep an eye on the VRAM usage, you will see that most of the time nVidia's (up to 2016 or so) will be lower because of better (lossless) compression built into the hardware; that is also why 3GB was enough for nVidia back then, until about 2016 when texture sizes just went bonkers :). And if you have a CPU limit, you should use a higher resolution to shift it back to the GPU. Not an nVidia shill here, but testing methodology is everything, as is keeping everything in account. Riftbreaker was mostly CPU limited, btw.
Love to see a channel that has only 600 subscribers push out such long videos of quality content.
Upgraded from a water-cooled RX 480 to a Vega 64, and the performance increase in GTA V was insane.
Not as insane as going from an RX 480 to a 6900 XT in my main rig, though!
It's quite an interesting test and must have taken a lot of work. I have one criticism, however — not of what you are doing, since the point you are trying to make is clear, but there is a variable not taken into account.
Drivers.
From my point of view, what you are missing is using time-appropriate drivers. Looking back, it's easy to see how things behaved, but it would have been nice to check whether, during their respective release windows, the drivers for AMD or Nvidia played a role here.
There is no point in looking back and saying "hey, look, AMD has aged better" while using the newest drivers, if during a game's release window the driver optimization (from either AMD or Nvidia) was so bad that the game ran worse than it looks here.
Yes, I want a card that behaves better or keeps its performance over time, but if it was bad from the start, then I would prefer to change my GPU more often than to deal with a couple of years of bad game performance while waiting for better drivers.
I thought this premise was dumb and pedantic, but now I’m fully on board. Looking at relative performance without looking at the specific games you play seems really dumb now.
I chose the XFX Merc 319 RX 6750 XT Black for 599€ in June 2022 over a used RTX 3070 of any kind starting at 650€. I don't use RT, and I'm already getting the same performance in Nvidia-sponsored titles and better average scores plus better consistency/latency in all the other games. We will see if this GPU keeps me ahead of a 3070 in raster at 1080p@144Hz and 1440p@144Hz until at least RDNA4 🤔
I went from a GTX 970 directly to this RX 6750 XT and the jump is 💥
edit: I use SAM with an R7 5800X, running the VRAM maxed out at 2312MHz with fast timings, along with an OC to 2823MHz, a small undervolt to 1152mV, and +6% PL.
I got 14530 pts (gfx score) in 3DMark Time Spy, 14988 pts in the FurMark donut at 1080p, and 9380 pts in the FurMark donut at 1440p.
(1000+ pts more than a 3070 in 3DMark, and better scores than a 3070 Ti in the FurMark donut.)
I can tell AMD framerates feel smoother, like JayzTwoCents said 🤣
Bravo! spot on!
Great thumbnail!
nice production, well done
I really wanted to see a comparison chart (or charts) at the end.
Show averages by year, chart the percentages by each API, etc.
Do your research — you mean like watching Hardware Unboxed? They have these great charts that show how a bunch of GPUs perform in a single game, and sometimes charts that show how two GPUs compare across many, many games, where you can see how they perform per game as well as on average. HUB is a great resource — not the only resource, as literally no one has that much time to do that much testing, but a great one nonetheless.
Kinda neat to see how these retro gaming cards compare today.
I think finding an RTX 4090 in 5 years for such a comparison will not be an easy task, given how power-hungry they are and the low quality of their new 12VHPWR power cables.
I find it strange that AMD stopped supporting GCN on Windows, as Mesa's Vulkan driver, RADV, uses the same compiler for both GCN and RDNA.
wait
the R9 cards were from the same era as the 7xx–9xx??
fuck me wtf
I only ever bought an Nvidia GPU when it made sense to me — for example, between the 5700 XT and the 2070 Super. Yes, they trade blows from time to time, but ultimately I liked the concept of ray tracing and DLSS, and back then there was no FSR, thus I went with that. However, with current prices and Nvidia's shitty business practices, eh, I didn't go for the 3070, nor the 3070 Ti, and won't go for the 4070 Ti (the old 4080 12GB) either, as it may only be as good as or below the 3090 Ti, and probably not for 500–550. Instead I'll wait for the 7750 XT, which I hope is better than a 3090 Ti, or at least in the same ballpark, for 500 bucks — and hopefully with 16GB, or at the very least 12, as I really don't need more at 1440p.
Meh, disagree. You're better off looking at the relative performance charts for the specific game you want to play and finding a GPU that can handle your games at the resolution and framerate you want (corresponding to your monitor's resolution and refresh rate). The consumer needs to go monitor shopping at the same time they go GPU shopping, or have a monitor in mind already, and say: okay, for my desk and my eyes I need an "X"-size monitor — and oh look, 4K monitors in that size are more than I can afford, so that's out of the equation. So then it's this monitor; it's only 1080p but it's 120Hz. And I like to play Call of Duty and I want to try Cyberpunk, so go look at the relative performance charts, find out which GPUs can play those titles at 1080p and get 120+ fps, and choose that card. That's the best way of picking a graphics card. It makes zero sense to purchase a GPU based on features when you have no idea how they will be implemented and utilized in the future, including the amount of VRAM, etc. Just look at the performance tables for the games you want to play, at the resolution of your selected monitor, and pick a GPU that can achieve that target. The other thing to consider is that the relative performance charts are game-specific, so the claim that they are invalid because new games render them obsolete isn't true — a new relative performance chart comes out for that game. Presumably, if you're deciding between a 290X and a 780 Ti in 2022, you're going to be looking at a performance chart for Cyberpunk and not Skyrim. Lastly, there are modern GPUs that cost less than the sticker price of either of these, drastically outperform them, and include things like ray tracing. Just seems like a really moot video — not to mention I think they only have like 3-year warranties…
But you did highlight an interesting phenomenon, so an entertaining video nonetheless, I suppose.
Amazing study! Subbed.
The 290X was a great card. I still can't believe it holds up that well despite being older than the PS4 and Xbox One. I bought a used 290 in 2015 for $200 that lasted me years.
Awesome video, my man. It's really nice to see how these cards stack up over time.
My GPUs over my lifetime:
ATI Radeon X800 PRO VPU
->
GeForce 9800 GT
->
GTX 560ti 1gb + Radeon 7750
->
GTX 970 (longest held)
->
RTX 3080 10gb + RX 6500XT
I love technology. OGs remember the ATI days.