NVIDIA GeForce RTX 2080 or 2080TI - Anyone getting one?

I still dabble in 3D rendering, so I'm pretty excited to see how that aspect plays out. If I could get a reasonably accurate real-time preview of a raytraced scene before rendering it, and the time for that final render were much shorter, I'd be sold.

I don't think the tech is there yet though, especially for games. This is a first iteration, so the performance will be pretty crap and third-party dev support is negligible. Since I just bought a 1080 earlier this year, I'll likely wait for the tech to mature and become more commonplace before I buy a new card.
 
I want some of what you're drinking @Galdorf - lol!

CUDA was a mess when it got released, too... but I'm pretty sure everyone has had a fun time using CUDA since.

Instead of the RTX 2080 Ti shipping on the 20th, it's now shipping on the 27th.

Drivers and kernel updates are going to be the main thing holding RTX back ATM. The hardware is there; the software simply needs to be developed a bit more. Sure, NVIDIA sucks for not having that ready for launch... but this isn't the first or second time that has happened with a new card launch.

The RTX API simply needs to be "plugged into" a game to enable that functionality. The Battlefield developers implemented RTX within a week. That's impressive, yet also telling as to why devs are delaying their games or offering RTX as a downloadable update in November... they have to optimize their giga-rays and whatnot, lol.

Blender has already stated that they will have RTX support in their upcoming EEVEE version. The current crashing in Blender (not using RTX) is down to the drivers, Blender's own open-source implementation, or a combination of the two. Some Blender forum posters have stated that it's likely due to Blender calling the CUDA cores with an unsupported toolkit and/or the just-in-time kernel compiler failing... this has happened with Blender before, on other cards, and is documented on their site (a quick way to check your own driver/toolkit combo is sketched below the excerpt):

https://docs.blender.org/manual/en/dev/render/cycles/gpu_rendering.html
This error may happen if you have a new Nvidia graphics card that is not yet supported by the Blender version and CUDA toolkit you have installed. In this case Blender may try to dynamically build a kernel for your graphics card and fail.

In this case you can:

  1. Check if the latest Blender version (official or experimental builds) supports your graphics card.
  2. If you build Blender yourself, try to download and install a newer CUDA developer toolkit.
Normally users do not need to install the CUDA toolkit as Blender comes with precompiled kernels.
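
Not from the Blender docs, just a rough way to gather the bits that excerpt talks about: a minimal Python sketch, assuming nvidia-smi (and optionally nvcc) is on your PATH, that prints the GPU, driver version, and installed CUDA toolkit so you can compare them against what your Blender build supports.

```python
# Minimal sketch (not from the Blender docs): print the GPU, driver version,
# and installed CUDA toolkit so they can be compared against what a given
# Blender build supports. Assumes nvidia-smi (and optionally nvcc) is on PATH.
import subprocess

def run(cmd):
    """Run a command and return its stdout, or None if it isn't available/fails."""
    try:
        out = subprocess.run(cmd, capture_output=True, text=True, check=True)
        return out.stdout.strip()
    except (OSError, subprocess.CalledProcessError):
        return None

# GPU model and driver version, as reported by the NVIDIA driver.
gpu_info = run(["nvidia-smi", "--query-gpu=name,driver_version",
                "--format=csv,noheader"])
print("GPU / driver:", gpu_info or "nvidia-smi not found")

# The CUDA toolkit only matters if you build Blender yourself; official builds
# ship precompiled Cycles kernels.
nvcc_info = run(["nvcc", "--version"])
if nvcc_info:
    print("CUDA toolkit:", nvcc_info.splitlines()[-1])
else:
    print("No CUDA toolkit (nvcc) found - fine for official Blender builds.")
```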

Bottom line: the 2080 Ti is competing with/beating out the $3,000 Titan V... for a third of the price, on incomplete/buggy drivers... Seems like progress to me.

Remember that the 1080/1080 Ti cards have seen a 20-40 FPS improvement from drivers alone over the past year or two. The same thing will happen here.
 
https://forums.geforce.com/default/...x-20-series/pre-orders-are-now-delayed-dumb-/
Delayed.
 
Instead of the RTX 2080 Ti shipping on the 20th, it's now shipping on the 27th.

I didn't get an email like they did and I haven't been charged for the card yet, either.

From NVIDIA - 9/14/2018 - https://forums.geforce.com/default/...ries/geforce-rtx-2080-ti-availability-update/

Hi Everyone,

Wanted to give you an update on the GeForce RTX 2080 Ti availability.

GeForce RTX 2080 Ti general availability has shifted to September 27th, a one week delay. We expect pre-orders to arrive between September 20th and September 27th.

There is no change to GeForce RTX 2080 general availability, which is September 20th.

We’re eager for you to enjoy the new GeForce RTX family! Thanks for your patience.

Update 9/20:

Hey everyone, some of you have received emails recently regarding the 2080 Ti Founders Edition, we expect to give everyone a ship date for your GeForce RTX 2080 Ti Founders Edition in the next few days. Please stay tuned.
 
If crypto mining keeps these cards priced way beyond the point of reasonable, I'll keep rocking my HD 5870... it's quite old and struggles a bit with modern games, but I really don't game a whole lot these days, and when I do it's rarely on PC. Finally... even IF I do game on PC, it's games like Hearthstone or Diablo 3... not games that need a 1070 or 1080 GPU. I regret not pulling the trigger on a $300 1070 about a year and a half ago.

I feel like anything beyond the mid-tier of cards is just too heavily influenced by crypto mining, and in turn their prices get driven sky high.


@sapphirescales

There was so little innovation or improvement from 2nd gen onward in the Intel lineup because, until Ryzen launched, they had no reason to do anything else. Minimal effort each year was quite enough to keep chugging out new chips and charging through the nose for them. Remember what happened when Ryzen launched? Intel started slashing CPU prices, and the next chips to launch were marked improvements by Intel's recent standards... imagine that. They couldn't sit on their behinds any more...

But also don't forget that even a 10-15% increase each generation compounds... two or three generations' worth of difference starts to add up. 15% year over year isn't much... but roughly 50% over three generations is. Plus the increased power efficiency is nothing to sneeze at. I'd say upgrading every year is wasteful... but every 3 or so wouldn't be, if your use case warrants it.
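
For anyone who wants the arithmetic spelled out, here is a tiny sketch of how those per-generation gains stack up. The 15% is just the example figure from above, not a measured number.

```python
# Rough illustration of how modest per-generation gains compound over a few
# generations. The 15% figure is only the example number from the post.
per_gen_gain = 0.15
for gens in range(1, 4):
    total = (1 + per_gen_gain) ** gens - 1
    print(f"{gens} generation(s): {total:.0%} faster than the starting point")
# 1 generation(s): 15% faster ... 3 generation(s): 52% faster
```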

Another point I disagree with slightly... smart phones. Maybe year over year the improvement is relatively minimal, but look at the Galaxy S5 vs the Galaxy S8.... quite a big difference. Same with iPhone 4S vs iPhone 7
 
Another point I disagree with slightly... smart phones. Maybe year over year the improvement is relatively minimal, but look at the Galaxy S5 vs the Galaxy S8.... quite a big difference. Same with iPhone 4S vs iPhone 7

Samsung Galaxy phones have always sucked. I owned an S5, then reluctantly went to an iPhone after several cheap crap Android smartphones that didn't even last 6 months each. I tried to get back into Android again and I went out and bought a Note 8, but it sucked big time. There was so much garbage on there that it wasn't even usable. I'd sooner use a Celeron eMachines from 2003 with the OEM image installed on the hard drive (you know, with all the garbage on it?). At least that crap's removable on a computer. Not so much on a smartphone.

I hate both Android and iOS, but Android is definitely worse because all the manufacturers cram it so full of junk. I checked out the Pixel XL because it has a pretty clean install of Android on it, but I don't like my OS changing every time I update my phone. I only use my phone for calling, texting (minor), GPS, minor web browsing, and quick email checking. I don't want to have to re-learn the OS every time my smartphone manufacturer feels like moving sh*t around for fun. My phone is a business device. I can't have it crashing or unstable or changing itself around whenever it wants to.

In addition, the Apple Watch is absolutely essential, as are the AirPods. I need to be able to answer the phone and use my GPS on the road without having my phone out. Not to mention all the times when I have to answer the phone when I'm right in the middle of working on a computer. I'm hoping the new Apple Watch is as good of an upgrade as everyone says it is, because there are some serious limitations to the one I have that I'm hoping are fixed. The main things I want are a bigger screen and a louder speaker, and the new one has both.

I also use Apple Pay all the time because I have a nasty habit of forgetting my wallet. It's so convenient to just put my watch next to the card reader and be done with it. And almost every business supports it. *Almost* every business. I was p*ssed off because the Dairy Queen I went to the other day didn't have it and I had forgotten my wallet. So not everyone has it, but it's rare for a business not to have an Apple Pay enabled card reader. The only other place that doesn't have it is my chiropractor, but I'm not surprised; smaller businesses are much less likely to have it.

All that being said, I still think smartwatches and AirPods are gimmicky. I definitely wouldn't own them if I didn't run a business. I don't have an Apple Watch or AirPods for my personal iPhone, and I have no desire to get them for it.

I REALLY want FULL Windows 10 on a smartphone (NOT Windows RT or RT 2.0 (i.e. "Windows 10 S")). Give me a big a$$ 6.5" Windows 10 smartphone with a stylus and I'll be in heaven. I REALLY miss my T-Mobile Wing (my first smartphone purchased in 2006 with Windows Mobile). It's too old to be used nowadays, but it was really nice to have Windows on a smartphone.
 
My second-hand Samsung Galaxy S6 works perfectly and has never let me down.

Windows 10 on a Smartphone?.....Seriously?
To each his own I suppose.
 
My second-hand Samsung Galaxy S6 works perfectly and has never let me down.

Windows 10 on a Smartphone?.....Seriously?
To each his own I suppose.

My S6 was doing OK; I got it refurbished in Feb 2016, so I made it a solid two and a half years with an S6 I paid $300 for. Pretty happy with that, but for the past 6 months it's been pretty poor. Battery life was bad, but they were never known for having good battery life to begin with. Then the phone started getting pretty laggy. It's decent if you factory reset it and don't load too many apps on it, but I'd rather not have to limp around with it.

I just got an S8+ that came with a 32GB SD card for $349. Can't complain... it came in pretty good shape for a used phone. Screen/glass almost perfect, which is rare for these since they crack easily. Most of the wear is around the charging port... like someone drunkenly trying to put their key in their car door and missing a few times? I don't know how it wears so heavily around the charging port... but the rest of the phone is pretty clean, so I'm happy with it. I hate spending money on phones, but the S8+ is a marked improvement... it runs much faster than the old phone, has great battery life, and has one of the best screens I've ever seen on a smartphone.

Now here's my problem... ads on the YouTube app. I've been spoiled by uBlock Origin stripping the ads out of YouTube on my PC. I don't know if I can deal with watching the ads in the app!

I do find the Android 8.0 update "rearranged" a lot. While I've never been a fan of moving stuff for the sake of moving it, it doesn't make me lose sleep either.
 
My S6 was rescued after a bath in the sea. I cracked it open and flooded it with methylated spirits.
After it was dry I put it back together and promptly dropped it - cracking the screen.
It works perfectly but has a star-shaped crack in the bottom right corner of the screen. I'm not bothered by the crack.
As for battery life I get at least 2 days between charges. It's the original battery as well.

I have it firewalled and rooted (always sounds weird when Aussies say that, 'cause it has different meanings here).
I only use it for calls, text, MMS, some Square payments, podcasts, the odd pic and music.
The only apps installed are JottaCloud, Square POS, Emsisoft Mobile Security, Podkicker Pro, Camera Blocker, SnoopSnitch and NetGuard Firewall.
All the other preinstalled garbage is locked down by the firewall and has no access to or from the outside.

Works for me.
 
After looking at all the benchmarks: yes, the RTX 2080 is faster than the GTX 1080, BUT only at resolutions greater than 1080p. The higher the resolution, the greater the increase; at 4K it's as much as 50% faster. Is it worth the price? If you have a 1080p monitor, no; if you're running 4K, yes.
Still, it seems pricey. Most people won't spend over $1,000 on a video card, and older games have trouble supporting 4K. If you're doing 3D rendering, though, it's really worth picking up one of the RTX cards.

[Attached image: RTX2080-REVIEW-44.jpg]
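
Plugging some made-up numbers into that logic shows how the "worth it at 4K, not at 1080p" argument works. The prices and frame rates below are purely illustrative placeholders, not benchmark results.

```python
# Illustrative only: made-up prices and frame rates to show how the uplift and
# cost-per-frame comparison works. Swap in real benchmark numbers to use it.
prices = {"GTX 1080": 500, "RTX 2080": 800}           # hypothetical prices (USD)
fps = {                                               # hypothetical averages
    "1080p": {"GTX 1080": 120, "RTX 2080": 135},
    "4K":    {"GTX 1080": 40,  "RTX 2080": 60},
}

for res, cards in fps.items():
    uplift = cards["RTX 2080"] / cards["GTX 1080"] - 1
    print(f"{res}: RTX 2080 is {uplift:.0%} faster")
    for card, frames in cards.items():
        print(f"  {card}: ${prices[card] / frames:.2f} per average frame")
```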
 
Samsung phones never sucked; maybe that's just your personal experience, @sapphirescales.
I have owned S products since the S4 - never had an issue.

In regard to cards, the jump was 40% from, say, a 960 to a 1080. This current gen, no real remarkable change. It has ray tracing - yes, that might be good for 3D renders, but not for gaming.

It's an overkill card - but I am poor, so am sad - still using the card @Barcelona sent me lol
 
Samsung phones never sucked; maybe that's just your personal experience, @sapphirescales.
I have owned S products since the S4 - never had an issue.

In regard to cards, the jump was 40% from, say, a 960 to a 1080. This current gen, no real remarkable change. It has ray tracing - yes, that might be good for 3D renders, but not for gaming.

It's an overkill card - but I am poor, so am sad - still using the card @Barcelona sent me lol
I'm thinking of getting a 1080Ti for Chrissy, so - if I do - I'll send you my 1050Ti as a gift. ;):)
 
I could never understand why people need to overclock to get a performance boost and all the instability that comes with it.
Why run the risk of destroying a very expensive card?
Why not just buy a more powerful card to begin with?
 
These ppl don't run the risk of ruining a perfect card either - or their bank account.
Just curiosity, I suppose. I have been that way - why? += why not.

Back in other times it was the same simple volt-bridge trick; did that with an AMIGA 500 board lol - to 1MB, and with the expansion card I had 1.5MB, OMGERD!
 
These ppl don't run the risk of ruining a perfect card either - or their bank account.
Just curiosity, I suppose. I have been that way - why? += why not.

Back in other times it was the same simple volt-bridge trick; did that with an AMIGA 500 board lol - to 1MB, and with the expansion card I had 1.5MB, OMGERD!
Back in the C64 days I built a serial-to-parallel converter for the floppy drive and wrote software to load data at rates 500x faster, so I could format or copy a floppy in 10 secs instead of 5-10 mins.
 
I could never understand why people need to overclock to get a performance boost and all the instability that comes with it.
Why run the risk of destroying a very expensive card?
Why not just buy a more powerful card to begin with?

Once you get it stable it generally stays stable for years... it sometimes requires a voltage bump in some scenarios, but it's not a big deal.

"Why run the risk of destroying a very expensive card?"
If I have the fps I want, I am most certainly not doing any OC that requires a voltage adjustment on a super expensive card like that or anything to void the warranty until it ends (not sure how loose the warranties are these days)

"Why not just buy a more powerful card to begin with?"
We are buying more powerful cards, for the most part the majority of overclockers do not buy weaker cards unless they can't afford the more expensive cards or that particular card they purchase is known to be a good card to overclock for huge price/performance gains.
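
If you do end up dialing in an overclock, something like this rough sketch can log clocks and temperature while a stress test runs, so instability or throttling shows up in the numbers rather than as a crash. It assumes nvidia-smi is on the PATH; clocks.gr, clocks.mem, and temperature.gpu are standard nvidia-smi query fields.

```python
# Rough stability-test logger: sample core clock, memory clock, and GPU
# temperature every few seconds while a benchmark or stress test runs, so a
# drooping clock or a climbing temperature is visible in the log.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=clocks.gr,clocks.mem,temperature.gpu",
         "--format=csv,noheader"]

for _ in range(12):          # roughly a minute of samples, 5 seconds apart
    sample = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(time.strftime("%H:%M:%S"), sample)
    time.sleep(5)
```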
 