Dual-GPU Setup For Even More FPS With LSFG!
By IvanVladimir0435
Do you still have your previous GPU with you? Do you want to give it a new life? Then this is for you! (you can watch a video tutorial of this guide here!)
Why is using two GPUs relevant?
Lossless Scaling is a wonderful tool with many use cases, but not every feature in it comes free of performance penalties.

If you turn on LSFG in a game that is already using your GPU at 100%, you'll notice the base FPS drop before any frame generation is applied. That drop happens because frame generation is itself an expensive process. Turning on performance mode alleviates it slightly, but some loss is inevitable: it can turn 80 FPS into 60, or 60 into 40 before duplicating them, which makes latency feel awful.

You can see an example of that here:

https://steamuserimages-a.akamaihd.net/ugc/2502404027099807410/3CA2556F51BEB7AF9CDF1EA0FACD54E5B475A326/?imw=256&&ima=fit&impolicy=Letterbox&imcolor=%23000000&letterbox=false
Said "load" can be offloaded from your main GPU to a second GPU that's capable enough, and thus avoiding the performance penalty.

If you have a second GPU lying around, or are looking for an easier way to increase your FPS without spending a lot of money on a higher-tier replacement GPU, this guide might help you a lot.
Requirements
There are a few requirements you need to meet for a Dual-GPU LSFG setup to work:

-You need a motherboard that lets you plug in two GPUs at the same time (of course), and the second GPU's slot must run at PCIe 3.0 x4 at the very least. Going from 3.0 x16 to 3.0 x4 is already a big loss of bandwidth; any further drop in PCIe generation or lane count will make the second GPU more of a hindrance than an advantage for LSFG, turning the whole image stuttery and blurry. For 4K, PCIe 4.0 x4 is recommended (see the bandwidth sketch after this list).

-You need a second GPU capable enough to run LSFG at your resolution and base framerate. A GTX 1050 Ti would struggle at higher resolutions like 4K and 1440p but can reach 180 FPS with performance mode at 1080p; an RX 5600 XT has no trouble handling 1440p, while for 4K something like an RX 6600 is more advisable. There's always room to run your own tests and judge what's acceptable and what isn't.

-You need to be running either Windows 10 or Windows 11, so you can set which GPU is the performance (render) one and which is the power-saving (LSFG) one.

-And finally, of course, your PC must handle both GPUs running at the same time: the PSU has to be capable of powering both, and the case must not heat them to death.
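
As a rough sanity check on that PCIe requirement, here's a back-of-the-envelope sketch. It's my own approximation, assuming uncompressed 8-bit RGBA frames, not an official LSFG figure:

```python
# Back-of-the-envelope estimate of the one-way render GPU -> LSFG GPU traffic.
# Assumes uncompressed 4 bytes/pixel (8-bit RGBA); HDR formats are heavier.

PCIE_ONE_WAY_GBPS = {"PCIe 3.0 x4": 3.9, "PCIe 4.0 x4": 7.9}  # approx. usable

def traffic_gbps(width: int, height: int, base_fps: float, bpp: int = 4) -> float:
    return width * height * bpp * base_fps / 1e9

needed = traffic_gbps(3840, 2160, base_fps=60)  # 4K at a 60 FPS base
for link, bw in PCIE_ONE_WAY_GBPS.items():
    print(f"{link}: 4K60 needs ~{needed:.1f} of ~{bw} GB/s ({needed / bw:.0%})")
# Raw numbers look comfortable, but the transfers are bursty and share the
# link with everything else, which is why 4.0 x4 is the safer choice for 4K.
```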

Setting It Up
Step 1
When you install your LSFG GPU into your computer, make sure your monitor's cable is connected to it, rather than to the render GPU. The render GPU sends its frames to the LSFG GPU, which processes them and generates new frames from them; having to send all of that back to the render GPU for display can cause a lot of issues. The flow should be render GPU -> (PCIe) -> LSFG GPU -> (display cable) -> monitor, similar to what's in the image:


Step 2
Once you're in the Windows install you'll be using, you must set the preferred render GPU. The process varies depending on your version of Windows:

For Windows 10:
On Windows 10, first open Device Manager, right-click your render GPU, open its Properties, and locate the GPU's hardware ID in the Details section.


In my case, I want my RTX 3060 to be the render GPU, and its ID shows as PCI\VEN_10DE&DEV_2504&SUBSYS_397D1462

Yours will of course be different. After locating your ID, you must turn it into a usable value by removing the PCI\ prefix and the VEN_, DEV_ and SUBSYS_ markers from the original ID; mine ended up being 10DE&2504&397D1462. That's the value you'll use in the next step.


Now open regedit and head to
HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences
Once there, create a string value named "DirectXUserGlobalSettings" and set its value to "HighPerfAdapter=GPUID", replacing GPUID with your render GPU's ID from the previous step.

The purpose of this is to let you manually choose which games use the render GPU: Windows 10 will usually only show one option when choosing between "power saving" and "high performance" GPUs, and this fixes it.
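
If you'd rather script this than click through regedit, here's a minimal sketch using Python's standard winreg module (the hardware ID below is the example from this guide; substitute your own):

```python
# Minimal sketch: turn a Device Manager hardware ID into the usable value and
# write the Windows 10 GPU preference for the current user.
import re
import winreg

# Hardware ID from Device Manager -> Properties -> Details (replace with yours).
raw_id = r"PCI\VEN_10DE&DEV_2504&SUBSYS_397D1462"

# Strip the PCI\ prefix and the VEN_/DEV_/SUBSYS_ markers -> 10DE&2504&397D1462
gpu_id = re.sub(r"^PCI\\|VEN_|DEV_|SUBSYS_", "", raw_id)

key = winreg.CreateKey(winreg.HKEY_CURRENT_USER,
                       r"Software\Microsoft\DirectX\UserGpuPreferences")
winreg.SetValueEx(key, "DirectXUserGlobalSettings", 0,
                  winreg.REG_SZ, f"HighPerfAdapter={gpu_id}")
winreg.CloseKey(key)
print(f"DirectXUserGlobalSettings = HighPerfAdapter={gpu_id}")
```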



If you're still having trouble selecting the performance GPU, open regedit and go to Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}. There you'll find folders named with 4 digits each; check the "DriverDesc" value inside each one to identify the folder that belongs to your performance GPU, as shown in the image.

Once you've located that folder, create a new DWORD value called "EnableMsHybrid" and set it to 1 to mark the performance GPU. If you're instead having trouble selecting the "power saving" (LSFG) GPU, set the value to 2 in that GPU's folder.
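
This step can be scripted the same way. A sketch is below; it must run from an elevated Python since it writes under HKEY_LOCAL_MACHINE, and the "0000" folder is a placeholder you must replace with the one whose DriverDesc matches your GPU:

```python
# Sketch: set EnableMsHybrid on one display adapter's class folder.
# Requires administrator rights; "0000" is a placeholder folder name.
import winreg

SUBKEY = (r"SYSTEM\CurrentControlSet\Control\Class"
          r"\{4d36e968-e325-11ce-bfc1-08002be10318}\0000")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, SUBKEY, 0,
                    winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
    print("Targeting:", winreg.QueryValueEx(key, "DriverDesc")[0])
    # 1 = mark as the "high performance" GPU, 2 = mark as "power saving"
    winreg.SetValueEx(key, "EnableMsHybrid", 0, winreg.REG_DWORD, 1)
```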

Once all that's done, restart your computer and you should be able to choose your preferred GPU for each game.

For Windows 11:
For Windows 11 the process is more straightforward, as you don't need to touch regedit to get it working.
Go to System > Display > Default graphics settings and set "Default high performance GPU" to your render GPU.



In both versions of Windows, if your render GPU is an Nvidia card, also go to Nvidia Control Panel and set the OpenGL rendering GPU to your render GPU, so it's used in OpenGL games like Minecraft.

Step 3
Now you just need to tell Lossless Scaling which GPU you want doing the frame generation: open Lossless Scaling, scroll down to the GPU and Display section, and set your Preferred GPU to the LSFG GPU you want to use.
Results
Now you should be able to use frame generation on your dual-GPU setup, avoiding its performance hit and achieving higher frame rates.

Some examples:


Extra Notes
There are some things that one must keep in mind:
-Although higher-performing GPUs generally do better at LSFG than weaker ones, there are special cases, like an RX 6400/6500 XT being far better for LSFG than a GTX 1660 Super, even though the latter performs better at rendering games.

-RDNA 1 and newer GPUs handle LSFG far better, even against GPUs of similar performance (an RX 5500 XT fares better at LSFG than an RX 580 by a decent margin).

-AMD GPUs keep up far better as the LSFG GPU than Nvidia GPUs. It's currently unknown why, but the theory is that this behavior is related to the amount of FP16 throughput AMD gave them (an RX 5600 XT can output more frames than an RTX 3060 12GB).

-There isn't a lot of data on the many possible dual-GPU setups, so if a weaker GPU like a GTX 1050/1060 gives you results you like at your resolution and framerate, please share them.

-GPUs have a "limit" to how many frames they can output; an RX 5600 XT was observed topping out at 82x2 at 1440p with LSFG 2.3.

-Higher resolutions take more processing power, of course, but so do higher framerates: a GPU might struggle to turn 80 FPS into 160 at 1440p yet handle 60 into 120 just fine. Adjust your expectations based on how old and how performant your GPU is.

-HDR takes a toll on how hard LSFG is to run; you'll see around 20% better results with SDR.

-If your LSFG GPU is more than enough for the job you want it for (it doesn't reach 100% usage), it's recommended to undervolt and underclock it to save on power consumption and heat.

-For performance mode, an RX 6400 will be enough to reach 1440p 165 FPS. This guide shows an RTX 3060 12GB paired with an RX 5700 XT because that's what I had on hand and was testing, but that doesn't mean weaker GPUs can't do it; if you manage your expectations, a lot of GPUs do the job just fine. [Update] That note applies to LSFG 2.3, which offered performance and quality modes; LSFG 3 runs with the same performance penalty as LSFG 2.3's performance mode but with higher quality. Check the performance chart for LSFG 3.0 numbers.

-The Lossless Scaling Discord has been constantly running tests for a secondary GPU performance chart[docs.google.com].
Credits
- Ravenger: for the massive amount of useful information he provided about dual-GPU setups and the testing he's done; the screenshots showing an RTX 4060 Ti are his, and he made the Secondary GPU LSFG Performance chart
- u/sobaddiebad: For the regedit findings that allow fixing the high performance/power saving selection issues in Windows 10 ( https://www.reddit.com/r/AMDHelp/comments/18fr7j3/configuring_power_saving_and_high_performance/ )
- The Lossless Scaling discord members: Constant testing and findings about LSFG
- THS: Developing Lossless Scaling