AI upscaling is the future

>540p internal with DLSS 2.0 is better than 1080p native

How did they do it?

Attached: Control vs DLSS 2.0 Can 540p Match 1080p Image Quality Full Ray Tracing On RTX 2060.mkv_snapshot_10.43.712.png (1920x1080, 1.57M)


the power of Meme Learning

Upscaling is just the beginning.
youtube.com/watch?v=nCpGStnayHk

>transparent hair is better resolved in the upscale
am i being rused?

nope
youtube.com/watch?v=YWIKzRhYZm4
540p DLSS v 1080p native at 10:30, everything before is 1080p v 4K

>dlss finally does something useful
just took them, what, around 2 years to get their advertised result? wow.

>How did they do it?
by adding 70-110 ms of input lag

Attached: file51.jpg (391x484, 21.2K)

Of course it looks good in cutscenes, because those always play out the same. It looks much worse in regular gameplay. The DF guys are too dumb to understand how machine learning works

Now show it in motion with those garbage artifacts, you disingenuous piece of shit nvidia astroturfer. Fucking do it, I dare you.

did you watch the video? i don't think they claim to know how it works. the new version uses tensor cores and looks better in gameplay than the old one, that's it

It objectively looks better in that example, assuming it's genuine, but I doubt there's no drawback.
Probably needs some retarded processing power, or adds horrible delay from running every single frame through a filter.

Attached: 1387225780354.gif (320x240, 1.83M)

I like how ShillFoundry conveniently forgets to mention it adds a solid 100ms of input lag.

I got into Meme Learning recently. After thinking it was gibberish, I finally found good sources on the internet, started building my own models, and it's a really fucking cool piece of tech.

>Probably needs some retarded processing power
It needs a GPU with tensor cores because deep learning is all about matrices and vectors. Trying to run that on a normal GPU (probably doable on Nvidia thanks to CUDA) would be a catastrophe performance-wise.
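The matrix point can be shown with a toy example: a single neural-network layer is just a matrix multiply plus a bias, which is exactly the kind of dense math tensor cores accelerate. A minimal NumPy sketch (the layer sizes here are made up for illustration, nothing to do with the actual DLSS network):

```python
import numpy as np

# A toy fully connected layer: output = activation(W @ x + b).
# Tensor cores accelerate exactly this kind of dense matrix math
# (in reduced precision, on big batches), which is why a network
# like DLSS wants them. Sizes are arbitrary, for illustration only.
rng = np.random.default_rng(0)
x = rng.standard_normal(64)        # input feature vector
W = rng.standard_normal((32, 64))  # learned weight matrix
b = rng.standard_normal(32)        # learned bias

y = np.maximum(W @ x + b, 0.0)     # matrix multiply + ReLU

print(y.shape)  # (32,)
```

A real inference runs thousands of these multiplies per frame, so hardware that does them natively is the difference between "free" and unusable.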

It does increase processing cost compared to plain native rendering, and it adds input lag, so you're correct on both counts.
It's about 30% more intensive performance wise than equivalent native resolution.
As in, your framerate at 540p+DLSS will be 30% worse than at native 540p (still much higher than at native 1080p though which is the point of this tech).
And it adds roughly 100ms of input lag.
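The arithmetic behind that claim (taking the poster's ~30% overhead figure at face value, and assuming render cost scales linearly with pixel count) works out like this:

```python
# Back-of-the-envelope cost model for the numbers quoted above.
# Assumes render cost scales linearly with pixel count, and takes
# the quoted ~30% DLSS overhead at face value; fps figures are
# illustrative, not measured.
px_540p = 960 * 540     # 518,400 pixels
px_1080p = 1920 * 1080  # 2,073,600 pixels

print(px_1080p / px_540p)  # 4.0 -- 540p renders a quarter of the pixels

fps_540 = 120.0                          # hypothetical native 540p framerate
fps_540_dlss = fps_540 / 1.3             # ~92 fps after the 30% hit
fps_1080 = fps_540 * px_540p / px_1080p  # ~30 fps under the linear model

print(round(fps_540_dlss, 1), round(fps_1080, 1))  # 92.3 30.0
```

So even after paying the upscaling overhead on top of 540p rendering, you come out roughly 3x ahead of native 1080p, which is the whole point of the tech.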

youtube.com/watch?v=yxeSmZ_vLDM

I dare you to find a fragment in this video where, for even like 2 seconds, you can see something wrong without zooming in or pausing. And remember, this is with YT's garbage-ass compression.

>Muh 100ms input lag
You get that much, or even more, just by playing on consoles, and literally nobody cares

(You)

inb4 switch 2 is another underpowered shitterhouse that uses DLSS in all games as a shitty attempt to make up for it

I mean, the Switch is already running on Nvidia tech; if they can get tensor cores on a mobile GPU, they theoretically could.

if it's as fast as the Shitch 1 but has tensor cores for DLSS support then, i mean, games that would run at 540p at 30fps on Switch would probably look pretty nice. Don't most games run at like 720p+ there? And they look fine.

Just play on a CRT and it evens out bro.
t. John Linneman

Then go buy Stadia, retard.

>And it adds roughly 100ms of input lag.
ewwwwww
hard pass, unless my framerate is chugging I wouldn't use this thing

>adds 100ms input lag
[citation needed]

>the future of realtime graphics is upscaling and frame interpolation

Attached: 1482971691-2918053c6cb840fd887ae7a9be0c8403.jpg (500x281, 14.91K)

upscaling, yes
Frame interpolation I doubt; that would involve predicting future frames, which in any game, let alone fast-paced ones, is a crapshoot.
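Worth separating two things here: interpolation blends between two frames that have both already been rendered (which costs latency, not guesswork), while extrapolation has to predict a frame that doesn't exist yet. A toy NumPy sketch of the interpolation case (real techniques warp pixels along motion vectors rather than blending in place):

```python
import numpy as np

# Toy frame interpolation: synthesize a frame halfway between two
# rendered frames by blending them. This shows why interpolation
# needs the *next* frame before it can output anything (hence the
# added latency), whereas extrapolation would have to predict it.
frame_a = np.zeros((4, 4))      # stand-in for rendered frame N
frame_b = np.ones((4, 4)) * 10  # stand-in for rendered frame N+1

t = 0.5  # halfway point in time between the two frames
frame_mid = (1 - t) * frame_a + t * frame_b

print(frame_mid[0, 0])  # 5.0
```

Blending in place smears anything that moves, which is why production interpolators lean on per-pixel motion vectors; the prediction problem only shows up in the extrapolation variant.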

Why? Even in more complex games the things you can do are small and easily predictable.

I like how nobody replied to this post. Truth is, if you showed this to anyone who didn't know the game uses DLSS, they would think it looks native even in motion. This tech is great and I hope weak-ass consoles especially will use it

>basically photorealistic-looking games running at 540p upscaled to 1440p, looking native in 99% of cases
>Shittier looking games running at native 1440p

Guess which one i will take

DLSS is only good because TAA is horrible