I recommend trying Midjourney first to see if ML image generation is something you find fun and useful and will use regularly.
Thanks for that. I have heard of Hugging Face and Stable Diffusion but not Midjourney. I'll add it to my bookmarks.
They're launching a 16GB 4070
I might have to put off getting a 3090 then ... ;)
With all the rumoured new variants, Nvidia's MSRP should be looking like this in ascending order
- 4060 8GB
- 4060 Ti 8GB
- 4060 Ti 16GB
- 4070 12GB
- 4070 16GB
- 4070 Ti 12GB
- 4080 16GB
The safest thing when it comes to Nvidia is to wait a year after the release of the x90 GPU for them to bring out their entire product range for that generation.
Once all their cards are on the table, only then do you pick one :P
This sounds like solid advice lol.
Whirlpool Illuminati writes...
I wouldn't get a 4070 or 4060 ti for future proofing based on 16gb of vram. The actual cards will probably need replacing due to performance in new games before the vram is an issue. 4070 ti probably makes the most sense this generation or the 4090 if you have the cash for it. Then the 4080. Below those I wouldn't be worried about vram too much. 12gb should be the minimum though.
Yeah idk it's a hard choice for me. I have a shit ton of older games to get through (and I do mean an absolute shit ton lol) and the 4070 will nail them all either at 1080p144 on the pc monitor, or 4k60 on the tv no worries (with some settings tweaks depending on the game & DLSS). But yeah I want to have both bases covered more or less.
Most single-player gaming will be on the TV, but most co-op games, along with RTS & FPS etc., will be on the PC monitor. I don't mind having to turn down some settings in the future to keep it running smooth. So I could probably get away with 12GB for a while given all the older games I've still got to play. But I feel 16GB would help a lot more in the future, especially for 4k60 or 1440p60 if I want to play something more recent, so idk.
A 4080 or 4090 is definitely out of my price range though, and I don't feel confident having one of those in my system with all the potential PSU cable-melting issues, it'll just give me anxiety lol. And yeah, I'd need a new power supply as well, so that's nearly 2 grand going with a 4080.
A 4070 Ti 16GB would definitely have my attention for around $1,200 though (provided a 750W Seasonic PSU would be enough to power it with just the standard cables and no adapters).
For those interested, NVIDIA 40 series and Diablo 4 bundle has been announced https://www.nvidia.com/en-us/
For those interested, NVIDIA 40 series and Diablo 4 bundle has been announced
Says a lot about the marketing. Please correct me if I'm wrong, but as I recall, games used to be bundled with the release of new GPUs (and other components) to encourage people to buy at launch. To be bundling games now suggests they misjudged something in their marketing.
Out of interest, why? Images of what, and then what do you do with them, other than fill up your hard drive?
It's just crazy technology and it's a lot of fun.
I've used a couple images for cover pages on my university assessments and profile pictures on Discord but those are only a very small handful of images.
I did also make a LOT of video game textures for some game projects I've been messing around with. It was super helpful with that.
Also p0rn
Whirlpool Illuminati writes...
4070 ti probably makes the most sense this generation or the 4090 if you have the cash for it.
Its memory bus doesn't match the power of the card, so you get a slowdown purely from that, creating situations where the 4070 Ti performs worse than a 3080 12GB. That's also something to check with the 4060 Tis and their even narrower memory bus.
Yikes.
Exactly my thinking. Would be miffed if there ends up being a plethora of 16GB SKUs and I'd originally bought a 4070 Ti with only 12GB. You'd be back to the 3060 12GB having more VRAM than everything above it up to the 3080 12GB.
With all the rumoured new variants, Nvidia's MSRP should be looking like this in ascending order
It's going to end up being Ampere's product stack, all over again. Maybe the confusion is intentional misdirection at this point?
I did also make a LOT of video game textures for some game projects I've been messing around with. It was super helpful with that.
I've seen some pretty cool stuff in that regard.
Says a lot about the marketing.
I agree. I initially thought it was a bundle to promote the 4060 but the card's not even mentioned.
I still don't understand why NVIDIA are releasing a 16GB version of the 4060 Ti – won't that compete with 4070 sales, which are already underperforming in the market? I'm happy to have more VRAM, but the marketing seems very reactive for such a well-established company.
I still don't understand why NVIDIA are releasing a 16GB version of the 4060 Ti – won't that compete with 4070 sales, which are already underperforming in the market? I'm happy to have more VRAM, but the marketing seems very reactive for such a well-established company.
They don't want to drop 4070 pricing. It is better for the company to release a cheaper card like the 4060ti, and reduce manufacturing numbers for the 4070 instead.
Exactly my thinking. Would be miffed if there ends up being a plethora of 16GB SKUs and I'd originally bought a 4070 Ti with only 12GB. You'd be back to the 3060 12GB having more VRAM than everything above it up to the 3080 12GB.
It doesn't even really make sense, other than pissing off their own customers that got the 4070 (ti) 12gb cards. It's like shooting themselves in the foot for no reason. Strange if true.
Whirlpool Illuminati writes...
It doesn't even really make sense, other than pissing off their own customers that got the 4070 (ti) 12gb cards. It's like shooting themselves in the foot for no reason. Strange if true.
Lol, since when does Nvidia care about pissing off their customers... they've been doing it for years.
Whirlpool Illuminati writes...
It doesn't even really make sense, other than pissing off their own customers that got the 4070 (ti) 12gb cards. It's like shooting themselves in the foot for no reason. Strange if true.
They just want 4070 12GB owners to upgrade to the 16GB version. They don't care about pissing people off, as long as they get the sale.
Thanks for that. I have heard of Hugging Face and Stable Diffusion but not Midjourney. I'll add it to my bookmarks.
Midjourney is great, arguably the best AI image generation out there, although you won't be able to run it locally. You pay a subscription and they generate the images on their servers.
I guess if you're into AI art and don't have lots of VRAM, you've always got MidJourney. Would end up cheaper than buying a 4090 anyways (;
Whirlpool Illuminati writes...
other than pissing off their own customers that got the 4070 (ti) 12gb cards.
To be fair it wouldn't be the first time Nvidia's done that.
They just want 4070 12GB owners to upgrade to the 16GB version.
^ This. If there's a 4070 16GB, people can't complain about 12GB being insufficient anymore because Nvidia can just say "Here's an option for you". Nvidia stopped caring about pissing people off quite a while ago.
MidJourney is great, arguably the best AI image generation although you will not be able to run it locally.
There's a Midjourney-derived model you can download to use locally. I don't think it's a first-party Midjourney model, but it's trained on their data.
There's a Midjourney-derived model you can download to use locally. I don't think it's a first-party Midjourney model, but it's trained on their data.
Yeah, it's trained on results from the Midjourney Discord. It's decent, but it's nowhere near as good as Midjourney and, in my opinion, falls behind other models like Deliberate.
^ This. If there's a 4070 16GB, people can't complain about 12GB being insufficient anymore because Nvidia can just say "Here's an option for you". Nvidia stopped caring about pissing people off quite a while ago.
That's why we stopped caring about them and went team red instead...
it's nowhere near as good as Midjourney and, in my opinion, falls behind other models like Deliberate.
Would agree with you there for sure. Some of the stuff I've seen Midjourney itself do are bonkers.
That's why we stopped caring about them and went team red instead...
To each their own; go with what works best. I'm still on a 6900 XT myself; neither Ada (outside of the 4090) nor RDNA3 is enough of an uplift to warrant the price.
If there's a 4070 16GB, people can't complain about 12GB being insufficient anymore
I still don't get the full strategy. What confuses me is that Nvidia are releasing a 4060 Ti with 16GB (not a 4070). This messes up the upgrade sequence (e.g. 4060 < 4060 Ti < 4070 < 4070 Ti) and it will impede sales of existing 4070 stock (and that assumes Nvidia have full control to stop 4070 production at a whim... not sure what the agreement is with TSMC, but any change to the order would likely come at a loss of some sort).
Perhaps they plan to give up on the whole 4070 series but that's a big hit to their market and PR.
Unfortunately, Nvidia has an overabundance of GPU dies. Only the AD102 dies are selling reasonably well, while the AD103 (4080) and AD104 (4070 Ti/4070) dies are just stockpiling on shelves or in warehouses. Nvidia are even repurposing AD103s, cutting them down to 4070 specifications, to try to get rid of them. So these new SKUs with higher VRAM amounts are just a way for Nvidia to clear out the stockpiled dies. At least they're starting a lot earlier at clearing excess stock than last generation, when they were left with millions of 30-series GPUs while launching the 40 series.