As am I, but it's been struggling for a while with the sims I try to play in VR.
Same experience here. 10fps in FS2020 at medium settings in VR is nauseating! It was interesting that in its first year, the 3080 was a card that could run everything at Max/Ultra 1440p easily. But in the second year it just ran out of steam with the newer games that came out, even ones on an older engine that worked fine previously. I guess developers will always aim to use whatever processing power is out there in the market.
I also recently connected it to my 4K TV and have to turn settings way down for it to run at all.
I gave up with 4K (my VR headset is 4K), so I actually bought a 1440p 34" monitor instead when I wanted to move on from 1080p. It has helped extend its life for a bit, but there are still some poorly optimised games that have issues with the 3080 when running at max settings 1440p due to the 10GB memory.
Price leaks
Nvidia considering prices in these ranges.
I put even less stock in pricing leaks than in normal leaks. It's well known that Jensen will sometimes decide final pricing on stage just before he walks out; usually not even AIBs get to know what Nvidia is thinking. That being said, I honestly won't be surprised to see a $2k USD 5090.
Well, according to sources, N4 wafer costs are approximately $20k per wafer. How many chips you get will depend on yield and binning. That's a 25% increase on the prices from 2021.
We can sort of, kind of work out what they cost knowing what the wafers cost, using a die-per-wafer calculator. I still hear AdoredTV's Scottish accent when I type that lol. If this doesn't interest anyone, feel free to skip reading. I like doing this and think it's cool to think about.
A 4090 is roughly 24x25mm. For a 30cm/300mm wafer, leaving the rest as defaults (since I don't know what they are exactly for TSMC wafers), that's around 90 4090 dies per wafer, assuming perfect yields. We don't know how big GB102 will be, but assuming it's on N4P (Ada Lovelace is on N5), TSMC cites a 6% density improvement over N5, so that gets you to roughly 95 dies per wafer. This assumes the 5090/GB102 die isn't any larger than the 4090's die (it probably will be), which could bring it back down to around 90 dies per wafer, or even lower if Nvidia pushes die size to the reticle limit. Yields also obviously decrease this number; N4P's yields are apparently around 70%.
Assuming perfect yields, if Nvidia is paying $20k a wafer on N4, then each (of the 95) 5090 dies is going to cost them around $210 USD. This is wildly variable, obviously, and can change depending on what Nvidia is paying, how large the wafers are, how large the die is, yields, if Nvidia is paying more for any additional packaging steps, etc. I'd wager potentially as high as $250 USD a die taking into account the above variables. It's probably (?) about the same for the BOM for the FE 5090 design (whatever that ends up being) leaving Nvidia's total cost price for 5090 hardware at an "up to" $500 USD. I've seen comments regarding the 4090's total BOM cost as being about $450 USD from earlier this year, so if true this probably isn't too far from the truth.
Assuming ~70% yields, that's around 66 usable dies per wafer, and puts the cost at around $300 USD a pop. I'd say as high as $350 USD if there's anything "extra" happening in the fab process, and to account for larger than 4090 dies (and thus lower amounts of them from a wafer). Plus other BOM/hardware costs, probably closer to the $600-$650 USD mark in total for a 5090 for Nvidia especially if this generation's FE requires more advanced cooling, e.g. an AiO.
We're already making a lot of assumptions here, so let's assume another 20-25% increase in cost to Nvidia for a 5090 (over a 4090) means a 25% increase in MSRP for the 5090 over the 4090, and, lo and behold, you get bang-on $1999 USD ($1599 × 1.25 ≈ $1999). This is, IMO, the likely bare-minimum MSRP for a 5090, assuming Nvidia only wants to make the same margin on a 5090 as they do on a 4090.
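If anyone wants to poke at these numbers themselves, the whole napkin calculation above fits in a few lines of Python. The die size, wafer cost and yield figures are just this thread's assumptions, not confirmed numbers:

```python
import math

def dies_per_wafer(die_w_mm, die_h_mm, wafer_d_mm=300):
    # Classic dies-per-wafer approximation: wafer area divided by die
    # area, minus a correction for partial dies lost at the edge.
    # Ignores scribe lines and edge exclusion, so treat it as a rough guide.
    die_area = die_w_mm * die_h_mm
    return int(math.pi * (wafer_d_mm / 2) ** 2 / die_area
               - math.pi * wafer_d_mm / math.sqrt(2 * die_area))

def cost_per_good_die(wafer_cost_usd, dies, yield_rate):
    # Spread the full wafer cost across only the dies that pass.
    return wafer_cost_usd / (dies * yield_rate)

dies = dies_per_wafer(24, 25)  # ~4090-sized die -> 90 candidate dies
perfect = cost_per_good_die(20_000, dies, 1.00)    # perfect yield
realistic = cost_per_good_die(20_000, dies, 0.70)  # ~70% yield
print(dies, round(perfect), round(realistic))
```

That lands in the same ballpark as the figures above, roughly $220-$320 per die depending on yield, and a bit lower again if you apply the N4P density bump to get ~95 dies per wafer.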
Anyone had a recent RMA with gigabyte? my 3080ti watercooled GPU crapped itself and PLE RMA'd it back to Gigabyte.
No idea what to expect.
Let us know how it goes; there are good and bad stories about them. Interested in how long it takes, whether they repair or replace, and, if they replace, a common complaint is that you get someone else's repaired (refurbished) GPU. They may do that to avoid long wait times, so another person will eventually get your card as a replacement.
Anyone had a recent RMA with gigabyte? my 3080ti watercooled GPU crapped itself and PLE RMA'd it back to Gigabyte.
Just had my 4090 in for a power connector that crapped itself
Took 4 weeks from dropping off to Scorptec to picking it back up. Can't tell if they just gave me a new unit or replaced only the 16-pin connector.
Can't tell if they just gave me a new unit
The serial number of the PCB would be different, presumably, if this were the case. If you still have the box for your original card you can compare the SNs to find out (or if you can see the SN you input for the warranty claim).
Same experience here. 10fps in FS2020 with medium settings VR is nauseating! It was interesting that in the first year, the 3080 was a card that you could run everything at Max/Ultra 1440p easily. But in the 2nd year, it just ran out of steam with the newer games that came out,
Are you saying you have a 3080 and only getting 10fps, in MSFS2020..?
If so, you have something (maybe card settings), very wrong!
Or you have everything on ULTRA and a low powered CPU and/or RAM.
Either PM me, or goto the MSFS2020 ------- /thread/36rkp283?p=225
Page 225, at the bottom; there's a link there about checking card settings.
But PM me if that doesn't help. What's your VR headset though?
I have an Oculus 3 (even my 'old 2' was good), an RTX 3070, Intel i9 10850 and 32GB RAM, and my MSFS2020 runs very smoothly with most settings on Medium and some on High or Ultra.
Not trying to hijack the thread, but even my O2 ran at 120hz.
Took 4 weeks from dropping off to Scorptec to picking it back up. Cant tell if they just gave me a new unit or replaced only the 16pin connector.
You might find an additional tamper evidence sticker on the PCB or screw as well. You can compare it to photos of reviews of your exact model of 3080Ti.
Are you saying you have a 3080 and only getting 10fps, in MSFS2020..?
If so, you have something (maybe card settings), very wrong!
Or you have everything on ULTRA and a low powered CPU and/or RAM.
It is on a Reverb G2, which is 4K, on a 5900X with 64GB RAM. It was fine at high altitudes but got very bad at low altitudes, even away from cities. This was back when the Top Gun Maverick content was released, likely around May 2022. The Darkstar challenge was super smooth and great, but the F-18 canyon run was horrible. I haven't gone back to try again since then to see if further updates have improved it; I'm sure it would be better today than it was back then.
Edit: My FS2020 is from the MS/Xbox game store.
Problem is, if the leaked specs are true, the 5080 is unlikely to even match the 4090 and will likely lag behind it. As it stands, a 4090 is around 25% faster than a 4080S, so it seems unlikely a 5080 would make a worthwhile upgrade.
Largely the same problem for Ampere 3080 owners: the 4080 was not a significant enough upgrade for the price, and many people did not want to support the 4090 price point; however, a certain percentage did. Nvidia obviously did the math and determined this was the way $$.
It will hardly be surprising if Nvidia pulls the same stunt with the 5080, especially as AMD will seemingly have even less to compete with this time around.
As am I, but it's been struggling for a while with the sims I try to play in VR.
I hear you. I built a new high-end PC for sim racing and only held back on the GPU (picked up a used 3070 to hold out for more 50xx release info). For VR I just couldn't get reliable framerates at anything that worked for me visually. Picked up a brand new 4090 Founders Edition for a price I was very happy with on eBay. Set it up yesterday and OMFG, I get the hype with these cards. Simply incredible.
My 3080 Ti has been great for VR. In MSFS2020 I lock the framerate at 45 and reproject to 90Hz on my Reverb G2.
I use OpenXR/OpenComposite instead of SteamVR, however.
Just looking at Umart – wondering if it's worth upgrading from my 3070.
They have a few RTX 4070 Ti's at $999.
That's a 61% improvement!
Some 4070's at just over $850.
https://www.umart.com.au/pc-
The 4090's are still king. But for less than a grand, I'd be going a generation upwards and a tad (Ti), over.
Humm..
Largely the same problem for Ampere 3080 owners: the 4080 was not a significant enough upgrade for the price, and many people did not want to support the 4090 price point; however, a certain percentage did. Nvidia obviously did the math and determined this was the way $$.
It will hardly be surprising if Nvidia pulls the same stunt with the 5080, especially as AMD will seemingly have even less to compete with this time around.
Not quite the same, 3080 -> 4080 was a 49% gain at 4K. In this hypothetical case where a 5080 may be slower than a 4090 that gain would be reduced from 49% to circa 25%. Also, 4080 was still better value cost-per-frame than a 4090 at MSRP so whilst some may have decided a 49% gain didn’t warrant 3080 -> 4080, from a value perspective you were no better off with a 4090 (and actually were worse off.)
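To put that value point in numbers (US launch MSRPs, with the roughly 25% performance gap quoted in this thread; the normalisation to 4080 = 1.00 is my own framing):

```python
# MSRP in USD, performance normalised so the 4080 = 1.00
cards = {"4080": (1199, 1.00), "4090": (1599, 1.25)}

for name, (msrp, perf) in cards.items():
    # Dollars per unit of 4080-level performance: lower is better value.
    print(f"{name}: ${msrp / perf:.0f} per performance unit")
```

That works out to $1199 vs roughly $1279 per performance unit, i.e. the 4090 was about 7% more expensive per frame at MSRP, which is the "actually worse off" bit.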
The 4090's are still king. But for less than a grand, I'd be going a generation upwards and a tad (Ti), over.
Humm..
Not to mention that depending on the exact SKU of 3070 you have, they're selling for around $400 and up according to eBay. So your total potential outlay is <$500 (assuming you sell your existing card) for a 60% performance uplift. Seems pretty good to me.
from a value perspective you were no better off with a 4090 (and actually were worse off.)
Leather Jacket hates this man for exposing his secrets! Learn why here!
Not quite the same, 3080 -> 4080 was a 49% gain at 4K.
Depending on how you run the math, the 3080 10GB delivered around 70% of the vanilla 4080's performance at 4K at launch. The bigger issue with the 4080 was the near 45% price increase over the 3080, so I absolutely stand by my comment: the 4080 was DOA :P.
We only have guesses and speculation for the 5080 from the usual sauces, don't be surprised to see claims about "Nvidia changed the specs at the last minute" once the real specs become available. I really doubt the 5080 will be tangibly slower than the 4090, but a large performance gap between the 5090 and 5080 is all but locked in at this point.
If I recall correctly there were also rumours about the 4090 costing $2k USD a month or so before the announcement, but they didn't pan out. I'm hoping that's the case again, but I digress; as everyone says, TSMC prices went up.
everyone says TSMC prices went up
Indeed they did. It's why it's good for everyone if Intel manages to get their Foundry services off the ground with leading-edge nodes because right now, Samsung ain't up to the task. Pricing will continue to increase for wafers as newer nodes come out, but TSMC is essentially the Nvidia of the fab world at the moment (with Samsung arguably representing AMD).
I've already done the napkin maths to prove that a $1999 USD 5090 is essentially bang-on the money assuming Nvidia wants the same margins as they had with the 4090, but we'll have to wait and see. I'd rather be pleasantly surprised don't get me wrong.
If I recall correctly there were also rumours about the 4090 costing 2k usd a month or so before the announcement but didn’t pan out, I am hoping that to be the case
Have a look at how few (or even zero) 4090s retailers have in stock, and it's still months until the 5090 is released. That will create a situation where 4090 prices, new and second-hand, continue to go up, and paying $4000+ for a 5090 may not seem so bad. I'm surprised you can still get 4090's for $3K indicating demand is still balanced with supply for some models.
I'm surprised you can still get 4090's for $3K indicating demand is still balanced with supply for some models.
I'm surprised the price hasn't come down. I would've thought those willing to spend 3k on a GPU would have done so long ago. I don't know why anyone would upgrade to a 4090 now.
4070/ti seems to be the card people have been eating up of late.