You can pay $1599 for an XTX fairly easily.
$1629 has been the cheapest I have seen, but I agree the 4080 is a no-brainer in our market if you don’t want the extra VRAM.
Hopefully AMD can catch up on RT/PT next gen to remove that drawback and create some solid competition.
Don't forget less VRAM to utilise all those features :)
True, but unlike the stingy VRAM on lower-model cards (especially the 3000 series) I think 16GB will be fine for a good while. The 24GB on the 4090 can be utilised well for AI-related stuff etc, but given that isn’t AMD’s strong suit I don’t see the extra VRAM as a huge advantage. Maybe I’ll be wrong about that.
I don’t hold onto cards too long anyway, usually upgrade every 2nd generation. If I was trying to milk 6-7 years out of a card I’d be more concerned with VRAM.
The XTX kinda makes sense at US MSRP where it’s $200 cheaper. At local street pricing though, I’m left scratching my head a bit at the value proposition.
Don't forget less VRAM to utilise all those features :)
True, tho it is more than the consoles so even lazy ports from useless developers will be fine.
True, but unlike the stingy VRAM on lower-model cards (especially the 3000 series) I think 16GB will be fine for a good while. The 24GB on the 4090 can be utilised well for AI-related stuff etc, but given that isn’t AMD’s strong suit I don’t see the extra VRAM as a huge advantage. Maybe I’ll be wrong about that.
16GB seems good, maybe 18GB or 20GB is better. I've noticed in various games where my 3080 10GB would use all the VRAM at 1440p, I'm now seeing 16GB used, or at least reserved, and maybe that reservation is part of what makes a game play smoothly. I always get VRAM anxiety when I see maxed-out VRAM, so if this mystical 4080 Ti comes out with 20GB of VRAM at a reasonable price I'd go for that.
As for going above 16GB, I mostly see that with DaVinci Resolve; it can use all of my 24GB of VRAM, but so far I'm not seeing it spill over into shared GPU memory. The other useful thing is having two or more programs open at once that use VRAM. With 10GB I had to annoyingly keep closing software to get VRAM back, load the new software, then on completion close that and reopen the previous software. Now I don't need to do that. It's non-gaming use where I'm seeing the big benefits of having more than 16GB of VRAM.
I've noticed in various games where my 3080 10GB would use all the VRAM at 1440p
There are many games that will just allocate whatever VRAM is available. E.g. Division 2 used almost 8GB on my 1080; after upgrading to the 4080, on the same settings (before I changed them) it's now allocating 15GB.
It's non-gaming use where I'm seeing the big benefits of having more than 16GB of VRAM
For sure, which is why workstation accelerator cards usually have far more VRAM than the comparable (core-performance) consumer GPU.
16GB seems good, maybe 18GB or 20GB is better. I've noticed in various games where my 3080 10GB would use all the VRAM at 1440p
Most games allocate all the VRAM but only use some of it. There is a setting in RivaTuner Statistics Server that shows both values in the OSD. However, with my RTX 3080 10GB, some titles such as RE4 Remake will use more than the 10GB available at 1440p. The result is either a single bad 0.5s stutter, or everything suddenly turns into slow motion for about 0.5 to 1 second (looks like motion blur at 5fps).
Personally, I would get 16GB for 1440p and at least 24GB for 4K.
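If anyone wants to sanity-check allocated vs actually-used VRAM outside of RTSS, here's a rough Python sketch using NVML via the pynvml bindings. Just an illustration assuming an Nvidia card with pynvml installed; NVML's numbers won't line up exactly with the RTSS per-process counters, but it shows the same idea of device-wide committed memory vs what individual processes have resident:

# Rough sketch: device-wide VRAM in use vs per-process usage via NVML (pynvml).
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo, nvmlDeviceGetGraphicsRunningProcesses,
)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Device-wide view: how much of the card's VRAM is currently committed.
mem = nvmlDeviceGetMemoryInfo(gpu)
print(f"device: {mem.used / 2**30:.1f} GiB used of {mem.total / 2**30:.1f} GiB")

# Per-process view: what each running graphics process (e.g. a game) holds.
# usedGpuMemory can be unavailable on some setups, hence the fallback.
for proc in nvmlDeviceGetGraphicsRunningProcesses(gpu):
    gib = proc.usedGpuMemory / 2**30 if proc.usedGpuMemory else None
    print(f"pid {proc.pid}: {gib:.1f} GiB" if gib else f"pid {proc.pid}: n/a")

nvmlShutdown()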
Maybe I’ll be wrong about that.
Overall you won't be. For most use cases it will be perfectly adequate. Until...
...you encounter the one situation where for whatever reason you hit the limit hard and it becomes an issue. It's usually the one app/game that you really want to work flawlessly too. Thankfully they are usually the exceptions though.
I’m left scratching my head a bit at the value proposition.
Value? How quaint. I remember that was once a reality in the long ago :)
Overall you won't be. For most use cases it will be perfectly adequate. Until...
...you encounter the one situation where for whatever reason you hit the limit hard and it becomes an issue
Certainly possible. On the flip side, I get to enjoy the benefits of Nvidia right now: playing through CP2077 again in RT Overdrive mode (path tracing) and getting 90-120fps or so, depending on whether I set DLSS to Balanced or Quality, with FG on. Looking at benchmarks, it seems like the XTX might average about 30-35fps with FSR Performance. The game is stunning in RT Overdrive mode.
I love how some mostly poorly optimised games have everyone wigging out. I used to do 4K on a 3080 but apparently that's impossible without "at least" 24GB lol.
I love how some mostly poorly optimised games have everyone wigging out. I used to do 4K on a 3080 but apparently that's impossible without "at least" 24GB lol.
Yep a few terrible PS5 ports are the new normal apparently.
I don’t hold onto cards too long anyway, usually upgrade every 2nd generation.
Fair enough. I know there are some outlier games where even 16GB is insufficient at high resolutions, but I'm really hoping that isn't becoming the norm and is just a by-product of the abysmal releases we've had lately.
tho it is more than the consoles so even lazy ports from useless developers will be fine.
Yeah, I'd imagine as long as you've got more VRAM than the current-gen consoles have in total memory, you'll be fine for the most part.
the new normal
Couldn't blame anyone for thinking it's the case when essentially every "AAA" game release so far this year has been one of them.
Didn’t Atomic Heart perform abnormally well at launch?
Well, its heavily advertised (including by Nvidia) ray-tracing mode ran at 0 fps regardless of hardware at launch ;-)
Yep a few terrible PS5 ports are the new normal apparently.
Well, it seems like it may well be.
Numerous devs have said it's significant work to get games to fit into 8GB of VRAM. They essentially had to do it anyway because of the old consoles, but now that games will only be developed for PS5/Series X they don't need to, so a lot probably won't, as it won't be worth it given how much smaller the PC market for a given game is.
Games made for PC first you'd hope still would, but if they're also releasing on console, even those might choose not to.
So it may just be some "terrible PS5 ports" that are the problem right this moment, but if every game that comes out is a "terrible PS5 port" then that's no longer the problem, it's just the norm, and cards that only have 8GB will be the problem.
So Gigabyte Gaming OC 4080 is $1699 at Umart at the moment. Anyone think it will get lower any time soon? Thinking of pulling the trigger.
Also is there any way to see a size comparison between that and a 2070S?
So Gigabyte Gaming OC 4080 is $1699 at Umart at the moment. Anyone think it will get lower any time soon? Thinking of pulling the trigger.
It was $1679 on the Computer Alliance eBay store recently.
So Gigabyte Gaming OC 4080 is $1699 at Umart at the moment. Anyone think it will get lower any time soon? Thinking of pulling the trigger.
They've been dropping for a while now. I picked up a 4080 about 2 months ago for $1759. They are already at some of the lowest prices in the world for 4080s.
Also is there any way to see a size comparison between that and a 2070S?
My 4080 (a PNY card) is huge. A lot of 4080s use the same heatsink as the 4090, so it's completely overkill. You should be able to find the dimensions online though.
So Gigabyte Gaming OC 4080 is $1699 at Umart at the moment. Anyone think it will get lower any time soon? Thinking of pulling the trigger.
This is getting close to typical RTX 3080 prices around launch; I doubt you'll see much lower, to be honest, given the current environment.
My money is on the next generation returning to some kind of better value, as RTX 4000 sales are looking pretty weak. I can't imagine why :P
Hopefully AMD can catch up on RT/PT next gen to remove that drawback and create some solid competition.
Well, once FSR 3.0 releases, AMD users won't feel so bad about the ~30% slower 'native' RT/PT perf they get.
Also, I hope this NV NTC compression stuff doesn't end up being a 50-series exclusive thing. Kind of getting sick of generational hardware exclusivity. At least AMD is trying not to be that way, at least not outright (newer hardware still benefits more).