NVIDIA GeForce RTX 2080 or 2080TI - Anyone getting one?

phaZed

So, preorders are open for the new hot cards!

I went whole hog for a 2080 Ti Founders Edition. And no, I have never done anything like this before; this is my first time being an early adopter! I have been rocking a GTX 970 for the last four years, and its age is starting to show in new titles.

Is anyone else getting one for themselves?
 
No way. I'd rather just game in 1080p and keep my GTX 970. Most technology that has come out over the past decade has been either gimmicky or such a small improvement that it's not worth paying more than a few extra dollars for. Examples:

1. Touchscreen laptops
2. VR
3. "3D" TV's
4. Windows 8
5. Alexa/Google Home/Siri BS
6. Smartwatch
7. Computer processors (there has been so little advancement between 2nd gen and newer gen processors that it's not even worth mentioning)
8. NVMe SSDs/Intel Optane - I use both of these technologies and the leap from SATA-based SSDs to these drives is minor at best
9. Pretty much everything smartphone related
10. Graphics cards - it seems like anything after the GTX 900-series cards just offers so little extra power for a LOT of extra money

I'm not saying that some of these technologies won't eventually evolve into something greater that's actually worth paying for, but the amount of useless technology that's been coming out is mind-boggling.

We've entered an era where each new advancement is so small that it's almost unnoticeable. For example, the difference in picture quality between VHS and DVD was HUGE. The difference between DVD and 1080p was HUGE. 1080p vs. 4K? Not so much. I love my 4K TV and I wouldn't want to go back to a 1080p display, but that's only because I use it as a computer monitor and there's a practical use for the higher pixel density. If I were just watching TV on it, I never would have upgraded to a 4K set.

Yes, technically the RTX 2080 is better than the GTX 1080, but the difference is minor. The real benefit comes from the new ray tracing technology, but it's in the early stages and won't be a big thing for another 5 years at least. So I'll upgrade to the RTX 3080 or RTX 4080. That's when it will be worth it to upgrade, not now.

That being said, if I were building a brand new gaming computer from scratch, would I buy the GTX 1080? Of course not. You don't buy last gen tech unless you can get it REALLY cheap. If I had a GTX 780 or older, you bet I'd upgrade to the latest card. But coming from a 970 or above, there just isn't enough of an improvement to justify the expense of these newer generation cards.
 
So, preorders are open for the new hot cards!

I went whole hog for a 2080 Ti Founders Edition. And no, I have never done anything like this before; this is my first time being an early adopter! I have been rocking a GTX 970 for the last four years, and its age is starting to show in new titles.

Is anyone else getting one for themselves?
I'll probably grab a bunch. They should make Space Invaders look amazing!!
 
[...]. Most technology that has come out over the past decade has been either gimmicky or such a small improvement that it's not worth paying more than a few extra dollars for. [...]

Welcome to the new finance/marketing-driven (IT) world... So sad :(
 
@sapphirescales Dare I say that you simply are not using the hardware to its full potential, or do not need the full potential for your situation.

That's the point. No one needs to game in 4K. There are no practical benefits; the difference is so small it's not even worth mentioning. If you game in 1080p, there's no benefit to going with anything higher than a GTX 970 at this time. Now, I do a lot of video rendering, which would benefit from a newer graphics card, but the gains aren't worth $1,200 to me. If I were doing video editing for 8 hours a day, every day, then sure. But I might do 5 hours a week. An RTX 2080 Ti might reduce that to what...4.5 hours? Yup, that's worth spending $1,200. NOT! Now if it could reduce my rendering time by 80%, then THAT might be worth it. But these teeny tiny performance bumps make it seem like they just keep rebranding the same old crap every year.
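
To put some rough numbers on that, here's a quick back-of-the-envelope sketch in Python (the 5 hours/week is my own workload; the reduction percentages and the $1,200 price are just assumptions, not benchmarks):

# Back-of-the-envelope: hours saved per year for a few hypothetical
# render-time reductions, assuming ~5 hours of GPU rendering per week.
# The reductions and the $1,200 card price are assumptions, not measurements.
hours_per_week = 5.0
card_price = 1200.0

for reduction in (0.10, 0.50, 0.80):  # 10%, 50%, 80% shorter render times
    hours_saved = hours_per_week * reduction * 52
    print(f"{reduction:.0%} reduction: {hours_saved:5.1f} h/yr saved, "
          f"${card_price / hours_saved:.2f} per hour saved in the first year")

Even at a 50% reduction that's roughly $9 for every hour saved in the first year; at the 10% I'd realistically see, it's more like $46 an hour.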

I'm not against paying a lot of money for good quality and high end components. The custom built system in my bedroom cost about $5,000 to build. I have an i7-7800X processor, 128GB of RAM, a 2TB NVMe SSD, and 4x 6TB hard drives for storage. I pair that with an Nvidia branded GTX 970 graphics card and it handles 4K games just fine.

I've played plenty of titles in 4K on my GTX 970 and while the frame rates were lower than in 1080p, it was still perfectly playable. Still, I prefer higher frame rates to a higher resolution, so I stick with 1080p.

I'm not saying that the RTX 2080 Ti isn't more powerful than the GTX 970. I'm saying that the differences are minor and it's not worth upgrading at this time. The only reason someone with a GTX 970 or higher graphics card should upgrade is if they have money to burn. Now if you have a really old card (older than the GTX 900 series), then it's a worthy upgrade.

With the system in my bedroom I also went with a reasonable processor. The i7-7800X isn't the most powerful processor that's compatible with LGA 2066 motherboards. But going beyond that is just paying through the nose for very minor performance increases.

http://cpu.userbenchmark.com/Compare/Intel-Core-i3-2100-vs-Intel-Core-i3-8100/m41vs3942

Yep, 90% effective performance increase with 20W lower TDP isn't even worth mentioning. YAWN

I agree with most, if not all, of the other examples though. Especially Windows 8!

I'm very excited about 8th gen CPUs and the upcoming 9th gen CPUs. It's been a LONG time since Intel has released a worthy CPU upgrade, and the 8th gen is a noticeable upgrade from even the 7th gen CPUs. When I talk about CPUs only having minor speed differences, this is what I'm talking about:

Intel Core i7-3770K CPU - Passmark score of 9,525:
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-3770K+@+3.50GHz&id=2

Intel Core i7-7700K CPU - Passmark score of 12,054:
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-7700K+@+4.20GHz&id=2874

Four new CPUs in almost five freaking years, and that's only a 26% increase in speed. Each generation increases the speed by 1.5% to 7%. Why even release a new CPU when the differences in performance are so small? Now look at the 8th gen CPU:

Intel Core i7-8700K CPU - Passmark score of 15,974:
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-8700K+@+3.70GHz&id=3098

That's a 32% speed increase in only ONE generation! THAT is a good upgrade! We should be seeing these speed increases EVERY SINGLE YEAR instead of 1.5% to 7% teeny tiny bumps in performance. Moore's Law (which says that computer speed, or more specifically transistor count, doubles every 2 years) now looks more like doubling every 8-10 years.

Too much R&D money has been spent on mobile CPUs and on reducing power draw rather than on making more powerful processors. I don't care if the stupid thing draws 500 watts as long as it's fast and powerful. If Moore's Law had continued with CPUs, the 8th gen i7 would have a Passmark score of around 50,000. Even the fastest consumer CPU (which costs about $2,000!) only has a Passmark score of around 22,000. Where's my 50,000 Passmark CPU? If Intel keeps up its past trend of 1.5% to 7% performance increases per year, I'll be looking forward to a 50,000 Passmark CPU in about 2040-2050. That's unacceptable.
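
For anyone who wants to check my math, here's a rough Python sketch comparing the actual Passmark scores above with where a doubling-every-two-years pace would have put them (the launch years are approximate, and the simple compounding model is obviously an idealization):

# Rough sketch: compare actual Passmark scores with a hypothetical
# "doubles every 2 years" (Moore's Law) pace, starting from the i7-3770K.
# Launch years are approximate; the compounding model is an idealization.
baseline_year, baseline_score = 2012, 9525                      # i7-3770K
actual = [("i7-7700K", 2017, 12054), ("i7-8700K", 2017, 15974)]

for name, year, score in actual:
    moore_pace = baseline_score * 2 ** ((year - baseline_year) / 2)
    print(f"{name}: actual {score:,} vs. Moore's Law pace ~{moore_pace:,.0f}")

That pace works out to roughly 54,000 by 2017, which is where my "around 50,000" figure comes from.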
 
The RTX series is basically useless for gaming right now. First, developers would have to support real-time ray tracing, and that comes with a price: the FPS drops drastically. While it looks nice, if it's not 60 fps most people would rather not use it. Then there's the price tag; I'm not going to spend more money on a video card just for that.
I bought all kinds of things for the PS3 and PS4 that never caught on and that developers no longer support, e.g. PlayStation Move; only about 4 titles support it, so it was nothing more than a waste of money.

This reminds me of Betamax vs. VHS: while Betamax looked really nice, it did not catch on and faded away.
https://research.nvidia.com/publication/infinite-resolution-textures
This might be a better approach to better-looking games.
 
It's not an end-all-discussion picture; however, NVIDIA is usually fairly accurate in its benchmarking claims, if history is any indicator.

Looks like the 2080 is 40-50% faster than the 1080 head to head, and roughly 2x faster with the AI features turned on. (Caveat emptor: supported only in certain games for now, of course.)

[Image: NVIDIA GeForce RTX 2080 performance comparison chart]
 
I've seen the reviews and benchmarks, and I'm not impressed. The top-end card is not much faster, about 12% on benchmarks, and in the ray-tracing demos I don't really see enough of a difference to warrant a 60% frame rate drop; average FPS in games that support it is 30-ish.
Where does this card shine? Well, ray tracing in Maya, 3ds Max, etc. It does not work with Blender at this time, lol; it crashes the program.
Even Linus agrees this card was rushed. I would wait and see whether game designers actually use this a year after launch.
There is also the problem of compatibility: note that Blender crashes with an RTX installed, and it is widely used. I've heard of issues in some games too, and the pre-order release has been delayed.
 