Thursday, June 9th 2016

NVIDIA GeForce "Pascal" 3-way and 4-way SLI Restricted to Select Non-Gaming Apps

In a move that's not going to go down well with gamers looking for 4K 60 Hz gameplay with the eye-candy maxed out, NVIDIA has changed the way it approaches 3-way and 4-way SLI support for the GeForce GTX 1080 and GTX 1070. Out of the box, you can enable 2-way SLI using either an SLI HB bridge (recommended for certain high resolutions) or even a classic 2-way SLI bridge, but 3-way and 4-way SLI support will be restricted to a select few non-gaming apps.

At the launch of the GTX 1080, NVIDIA told the press that it would not officially support 3-way and 4-way SLI for GeForce "Pascal" GPUs, but that it would provide recourse for enthusiasts by setting up an "SLI Enthusiast Key" webpage, from which they could obtain a software key that unlocks 3-way and 4-way SLI support using classic bridges. NVIDIA would merely optimize its drivers for up to 2-way SLI, and the odd lucky gamer would be able to take advantage of 3-4 GPUs if a game developer got generous. That is no longer the plan.
NVIDIA has reportedly removed the entire "software key" process of unlocking 3-way and 4-way SLI support. You should still be able to enable SLI for 3-4 GPUs, but for now, only a short list of apps selected by NVIDIA will be able to take advantage of more than two GPUs in your setup. These include popular 3D benchmarks such as Unigine Heaven, 3DMark FireStrike, and Catzilla. NVIDIA may expand this list with future driver updates.

The idea behind this appears to be to appease overclockers, who chase scores on synthetic benchmarks, without having to deal with complaints of choppy display output from gamers. Overclockers only care about the numerical score a benchmark spits out, not the fluidity of the benchmark's 3D scene; gameplay, on the other hand, relies on smooth output.
Source: PC Perspective

46 Comments on NVIDIA GeForce "Pascal" 3-way and 4-way SLI Restricted to Select Non-Gaming Apps

#1
Patriot
Let me guess... 3dmark will be an allowed program.
#2
dj-electric
I'm kinda with NVIDIA on this one. Having 2 GPUs can make a lot of sense, while having more was never really a good thing, not in most cases.
#3
FordGT90Concept
"I go fast!1!11!1!"
SLI/Crossfire will be phased out as D3D12 and Vulkan gain ground. It makes sense for them to start scaling it back now.
btarunr: "These include popular 3D benchmarks such as Unigine Heaven, 3DMark FireStrike, and Catzilla. NVIDIA may expand this list with future driver updates."
Patriot: "Let me guess... 3dmark will be an allowed program."
Yes. On that note: Futuremark needs to get a D3D12 benchmark out so SLI/Crossfire no longer matters.
#4
PP Mguire
But 3 and 4 cards look so cool in my machine :(
#5
RCoon
On the one hand, they're underhandedly going back on a promise of software keys.

On the other, why bother wasting resources coding for a setup that <1% of NVidia owners actually have, and that doesn't achieve a whole lot anyway (4-way scaling is abysmal)? This isn't going to affect many people on this forum at all. Benchmarkers can still run benchmarks, and crunchers never needed SLI anyway.
#6
Slizzo
RCoon: "On the one hand, they're underhandedly going back on a promise of software keys. On the other, why bother wasting resources coding for a setup that <1% of NVidia owners actually have, and that doesn't achieve a whole lot anyway (4-way scaling is abysmal)? This isn't going to affect many people on this forum at all. Benchmarkers can still run benchmarks, and crunchers never needed SLI anyway."
Yup and yup. While it sucks, it makes perfect sense business-wise, and also market-wise.
#7
FordGT90Concept
"I go fast!1!11!1!"
PP Mguire: "But 3 and 4 cards look so cool in my machine :("
I hope you have 9-12 monitors. AMD and NVIDIA never really put in the effort on the driver side to make >2 cards perform the way they should.
#8
PP Mguire
FordGT90Concept: "I hope you have 9-12 monitors. AMD and NVIDIA never really put in the effort on the driver side to make >2 cards perform the way they should."
3 cards ran great for the games I play. I never really bothered with 4 way unless I was exclusively doing Firestrike anyways.
#9
bogami
Well, again, one more restriction. Whoever wants to build a multi-monitor PC with high requirements of at least 60 FPS can currently only turn to older GPUs. And you could even have expected an additional charge ($?) for the key!
Another fuck-up; this has become something to expect from nVidia. It's like buying a good race car and not being able to drive it on the race track at full capacity.
How many times has this company screwed us over with prices, bad products, and software bugs. We should get GPUs as gifts in damages! Not to mention the prices.
#10
Brusfantomet
If AMD follows, one of the biggest advantages of X99 in gaming disappears: since Z170 has 20 PCIe lanes and Z270 gets 24 (8x PCIe 3.0 gets the same performance as 16x PCIe 3.0 here), needing more will be a hard sell.
bogami: "It's like buying a good race car and not being able to drive it on the race track at full capacity."
I agree that it's stupid of them to remove the possibility (just don't optimize for it in the drivers, but letting users enable it and tweak the settings themselves would be a better solution, I think), but some sports cars use GPS to find out if you are on a race track and then remove the speed limiter while you are on the track.
#12
TheGuruStud
Typical nvidia behavior lol

I like how SLI/xfire just doesn't work for shit nowadays. I remember my 4890s/8800GTs; I almost NEVER had a problem with dual cards. Gaming was fantastic. The drivers AND games were far superior then. And if you older guys remember, before broadband was common, games were incredibly polished and ran with amazing performance. I could ALWAYS max out every game and play with vsync at a bare minimum of 60 fps solid, if not 85 or 120 (CRT ftw).
#13
Octavean
PP Mguire: "But 3 and 4 cards look so cool in my machine :("
Well, if looks are all you care about, then you can still have those 3 and 4 cards in your system.
#14
Breit
If they don't wanna support 3- and 4-way SLI, why don't they just state this and leave everything else up to us users and the developers of games and applications? I mean, it's fine if they don't want to spend resources on this, but it turns out now they do spend resources just to hinder people from using more than two cards?!

I wouldn't be surprised if the new and shiny 1080 Ti/Pascal-based Titan magically supports 3- and 4-way SLI once more (for a premium, of course). :cool:

Just like SLI could only be used on licensed chipsets/mainboards in the past (where you had to pay a premium, of course), even if it would've worked on a non-licensed mainboard as well.
#15
efikkan
Dj-ElectriC: "I'm kinda with NVIDIA on this one. Having 2 GPUs can make a lot of sense, while having more was never really a good thing, not in most cases."
SLI customers might be a minority, but they are still among the most loyal big spenders, and Nvidia would be stupid to piss them off.

Once there is support for AFR, this can in theory scale to any number of GPUs; there is no need for the software to add support for each multi-GPU configuration. So it's not like this is extra development work for them; it's mostly just QA.

Games with good multi-GPU support usually scale well up to three GPUs, though some struggle to utilize four. That's due to bottlenecks in the game engines; in theory there should be no problem scaling up to four GPUs on current hardware.
FordGT90Concept: "SLI/Crossfire will be phased out as D3D12 and Vulkan gain ground. It makes sense for them to start scaling it back now."
Not quite. Direct3D 12 and Vulkan will expose the same hardware features that are currently utilized through each vendor's own path. The advantage would be that game developers no longer have to interface with each native API, but we'll see in time whether it works just as well.
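
To make the explicit-API point concrete, here is a minimal C++ sketch (an illustration, not anything from NVIDIA's driver) of how an engine discovers a linked GPU set under Vulkan's device-group model, which shipped after this discussion and was standardized in Vulkan 1.1. The enumeration calls are real Vulkan API; the AFR policy in the comments is just the obvious frame-modulo-N scheme.

    // Enumerate Vulkan device groups; a group with physicalDeviceCount > 1
    // is a linked SLI-style set the application itself can drive,
    // no driver profile required. Assumes a Vulkan 1.1 instance.
    #include <vulkan/vulkan.h>
    #include <vector>
    #include <cstdio>

    void listDeviceGroups(VkInstance instance) {
        uint32_t count = 0;
        vkEnumeratePhysicalDeviceGroups(instance, &count, nullptr);

        std::vector<VkPhysicalDeviceGroupProperties> groups(count);
        for (auto &g : groups)
            g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
        vkEnumeratePhysicalDeviceGroups(instance, &count, groups.data());

        for (uint32_t i = 0; i < count; ++i) {
            std::printf("group %u: %u GPU(s)\n", i, groups[i].physicalDeviceCount);
            // Classic AFR on an N-GPU group: render frame F on device F % N,
            // selected per submit via device masks
            // (VkDeviceGroupDeviceCreateInfo / VkDeviceGroupSubmitInfo).
        }
    }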
#16
GhostRyder
Well... This is unfortunate, but I was planning to go back to two cards anyway, so it does not bother me much. The only unfortunate thing is I spent the extra on the 5930K just for the PCIe lanes, and now I am wishing I had just opted for the 5820K. Welp, at least when I change out cards this year I get full 16x PCIe 3.0 on both cards :P.

The only problem I have with this is them backing out of what they said. Personally, it's actually understandable that they want to go ahead and drop the support; I just wish they had been up front about it instead of getting people's hopes up.
#17
qubit
Overclocked quantum bit
Way to go to reduce your support costs, NVIDIA. :rolleyes:

Clearly, it's because there are very few 3-4 way SLI setups out there, so it's not economical for NVIDIA to support them, but it still sucks for those with deep pockets who would like to try it.

At least I can still run my old twin GTX 590s whenever I feel like some hot and sweaty 4-way SLI action, lol.
#18
PP Mguire
Octavean: "Well, if looks are all you care about, then you can still have those 3 and 4 cards in your system."
I'll just keep one of my Titans in there for PhysX.
#19
Dicfylac
As far as I understand Nvidia's move, there is a possibility that these GTX 1080/1070s run as well as Quadro/Tesla cards in a tri- or four-way SLI configuration.
Otherwise you wouldn't buy an 800€ card with a simple SLI bridge.
But then again, I could be wrong.
What's been said above feels like nonsense to me; they're cutting into their own profit :banghead:.
Just a thought.
#20
the54thvoid
Intoxicated Moderator
GhostRyder: "Well... This is unfortunate, but I was planning to go back to two cards anyway, so it does not bother me much. The only unfortunate thing is I spent the extra on the 5930K just for the PCIe lanes, and now I am wishing I had just opted for the 5820K. Welp, at least when I change out cards this year I get full 16x PCIe 3.0 on both cards :P. The only problem I have with this is them backing out of what they said. Personally, it's actually understandable that they want to go ahead and drop the support; I just wish they had been up front about it instead of getting people's hopes up."
Isn't it the case that two cards require 2 x 16 lanes (i.e. 32), so you need a 40-lane CPU for two cards anyway (the 5820K has 28 lanes)? Nvidia aren't dropping SLI, they're dropping 3- and 4-way.

And I expect the most vocal people will be those who either (a) don't use Nvidia cards or (b) don't use 3 or 4 cards. Go figure. FWIW, I googled a couple of older 2- to 4-way SLI comparisons, and guess what: until you're at 4K with ultra settings (and it's very game-dependent even then), there is little benefit to 3- or 4-way. Most titles benefit greatly from 2-way, 3-way adds a smaller 10-20% increase, and 4-way yields very little extra performance for most titles.

Much crying over little milk
#21
GhostRyder
the54thvoid: "Isn't it the case that two cards require 2 x 16 lanes (i.e. 32), so you need a 40-lane CPU for two cards anyway (the 5820K has 28 lanes)? Nvidia aren't dropping SLI, they're dropping 3- and 4-way. And I expect the most vocal people will be those who either (a) don't use Nvidia cards or (b) don't use 3 or 4 cards. Go figure. FWIW, I googled a couple of older 2- to 4-way SLI comparisons, and guess what: until you're at 4K with ultra settings (and it's very game-dependent even then), there is little benefit to 3- or 4-way. Most titles benefit greatly from 2-way, 3-way adds a smaller 10-20% increase, and 4-way yields very little extra performance for most titles. Much crying over little milk."
Well yes, but what I meant was that they run just as well at 8x as at 16x, so I could have just gone with the cheaper 5820K and opted for fewer lanes, seeing as 3- and 4-way is not really supported anymore (or only by AMD). But like I said, I was planning to go back to only two cards anyway, so it's partly my fault, and it does not bother me much in the end, since I will just have a two-card setup running at full 16x PCIe lanes, reducing any chance of bottlenecks. So it's a big meh at the end of the day; it was really a niche thing anyway. Just going to miss having that many GPUs, as a personal thing :p
#22
TheinsanegamerN
TheGuruStud: "Typical nvidia behavior lol. I like how SLI/xfire just doesn't work for shit nowadays. I remember my 4890s/8800GTs; I almost NEVER had a problem with dual cards. Gaming was fantastic. The drivers AND games were far superior then. And if you older guys remember, before broadband was common, games were incredibly polished and ran with amazing performance. I could ALWAYS max out every game and play with vsync at a bare minimum of 60 fps solid, if not 85 or 120 (CRT ftw)."
That's what happens when the gaming world fosters a "pre-order now, patch in 3 months" culture.

I don't have many issues with SLI, but then I always wait 6+ months before buying a game. The only game I didn't wait for was Battlefield, which has great SLI performance.
#23
cdawall
where the hell are my stars
I don't know why people are upset about it not supporting more than 2 cards. The less than 1% of gamers affected aren't exactly a majority of anything. Having played that game for years, I can say multi-GPU configs leave a lot to be desired. To keep people from complaining that they just spent nearly $3k on video cards alone, nvidia is basically saying: don't be dumb.

Not that this really matters if you are playing in DX12; DX12 will let you use all 3 or 4 GPUs regardless of what nvidia says you can or cannot do (see the sketch after this post).
Patriot: "Let me guess... 3dmark will be an allowed program."
Don't know, you tell me... oh wait:
btarunr: "These include popular 3D benchmarks such as Unigine Heaven, 3DMark FireStrike, and Catzilla. NVIDIA may expand this list with future driver updates."
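
A minimal C++ sketch of that DX12 path, under the assumption cdawall describes (the calls are real DXGI/D3D12 API, but the program is just an illustration, not anything a shipping engine or NVIDIA's driver does): with explicit multi-adapter, the application enumerates every GPU itself and creates a device on each, with no SLI profile involved.

    // Enumerate all hardware adapters and create a D3D12 device on each.
    // Link against d3d12.lib and dxgi.lib.
    #include <d3d12.h>
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<IDXGIFactory4> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP

            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device)))) {
                // GetNodeCount() > 1 means a linked (SLI-style) adapter;
                // unlinked GPUs each appear as their own adapter in this loop.
                wprintf(L"adapter %u: %ls (nodes: %u)\n",
                        i, desc.Description, device->GetNodeCount());
            }
        }
        return 0;
    }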
#24
vega22
RCoon: "On the one hand, they're underhandedly going back on a promise of software keys. On the other, why bother wasting resources coding for a setup that <1% of NVidia owners actually have, and that doesn't achieve a whole lot anyway (4-way scaling is abysmal)? This isn't going to affect many people on this forum at all. Benchmarkers can still run benchmarks, and crunchers never needed SLI anyway."
Don't you think that once DX12 and Vulkan become the norm, SLI/xfire is dead in the water anyway?
#25
cdawall
where the hell are my stars
vega22: "Don't you think that once DX12 and Vulkan become the norm, SLI/xfire is dead in the water anyway?"
That is what I am personally hoping.