DLSS 2.0

Anyone used it? I've only used it in Minecraft. Ray tracing in Minecraft looks shitty to me, but DLSS impresses me. It's hard to judge in a game like Minecraft, though; how does it look in Control? And does it really solve the same issues TAA does without the blur, or would you still need TAA to get rid of checkerboard hair and shimmering pixels?

Attached: maxresdefault.jpg (1280x720, 113.91K)

I don't think it's worth the processing power for something that can look just as good baked. In this example it looks like the janitor forgot to mop up properly

The one on the right uses less processing power than the one on the left

Do you know what's going on in the picture?

Looks too good to be true

Attached: 1573516456509.webm (1024x768, 1.58M)

It's allegedly true. No doubt in my mind that actually playing at the native resolution will still look better. But I've played games at 4K and you STILL need TAA to blur the shit out of everything and get rid of blindingly bad shimmering on edges in some games. DLSS is supposed to solve that issue too: not only upscaling from a lower resolution, but also applying something that does the same job as TAA without the blur.
So basically, rendering at a low resolution and having DLSS upscale it to 4K can actually look better than 4K+TAA and run significantly better (the raw pixel counts below show why).
But I need some people in this thread that have actually used it. Like I said, I only tried the new Minecraft update, and the textures in that aren't detailed enough for me to really judge. Turning it on still looks like 1440p on my monitor, even though it's supposedly rendering at 720p or something.
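
For scale, the raw pixel math alone accounts for most of the claimed performance gap. Nothing below is measured; it's just the standard resolutions:

# shaded pixels per frame at each standard resolution
res = {"720p": 1280 * 720, "1440p": 2560 * 1440, "4K": 3840 * 2160}
print(res["4K"] / res["720p"])   # 9.0  -> native 4K shades 9x the pixels of 720p
print(res["4K"] / res["1440p"])  # 2.25 -> and 2.25x the pixels of 1440p

So if DLSS can render at 720p and reconstruct a passable 4K image, the shader workload drops by roughly 9x before the upscale cost is paid back.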

If DLSS 2.0 takes off big, how can Radeon realistically respond to it? This shit seems like a complete game changer

DLSS is just Waifu2x for video streams. All comparisons show it against heavily blurred screenshots that don't represent an actual 1080p still shot anyway.

Don't get me wrong, the technology is fucking impressive, but not for upscaling 540p to 1080p then claiming to have a huge performance increase. No shit you have a performance increase.

Do an actual AA-off 1080p vs. 540p->1080p DLSS comparison and you'll just say "huh".

Or wait, do you actually think Control is blurry like vaseline has been smeared all over your screen and DLSS fixes that?

It's plausible. Of course, the takeaway from this isn't that DLSS is amazing; it's that 4K+TAA is just a shitty idea and we should never have bothered with it.

It's an extreme comparison, but not a completely unfair one. The one on the left is blurry because TAA is basically a requirement for a modern game, where hair and some shadows look like a screen door by design without it. It should still be sharpened some, though (quick sketch of what I mean below).
You've got to understand that DLSS is also acting as AA without the blur.
Your point isn't invalid, though. It reminds me of the RTX OFF vs. RTX ON comparisons, where the game is just completely devoid of shadows or reflections until it's turned on.
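
By "sharpened" I mean something like an unsharp mask: add back the difference between the image and a blurred copy of itself, which is roughly what in-game sharpening sliders do. A minimal numpy sketch (the function names and the cheap box blur are my own simplifications, not any game's actual filter):

import numpy as np

def box_blur(img, radius=1):
    # cheap separable box blur, standing in for the Gaussian a real sharpener uses
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 0, img)
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, out)
    return out

def unsharp_mask(img, amount=0.5, radius=1):
    # sharpen by adding back the difference between the image and its blurred copy
    blurred = box_blur(img, radius)
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

# toy example: a soft gradient gets its edge contrast boosted
img = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
print(unsharp_mask(img, amount=1.0))

It can't recover detail TAA already averaged away, it just boosts local contrast, which is why over-sharpened TAA looks ringy instead of actually sharp.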

I don't doubt the right is true, but the left looks like complete marketing BS to skew the comparison.
>haha dude twice the resolution is totally THIS blurry, trust us

Woah, the newest buzzword, TSLAA X36 MNAA EXPGA DSLN MNM 2.0 XL Edition. Buy the newest Video Card, goy.

Fuck off.

Unless you're stupid, you'll have noticed that this came out just after RTX hit the market.
Allow me to explain:

RTX comes out; it looks good, but it obliterates performance, to the point that people aren't happy when even their top-of-the-line 2080 isn't producing playable results.
So what do you do? Do you work on a more powerful card and a more efficient RTX pipeline? Well, yes you do. But that takes time, and the RTX 2000 series cards are still new, so what do you do in the meantime?

It's a stroke of genius. You convince people to render at a lower resolution and then use the tensor cores, which otherwise sit largely unused in games, to upscale the image back to their native resolution (the naive version of that upscale is sketched after this post, for contrast). This actually is very smart, because it massively softens the performance hit, but it isn't putting you anywhere ahead of where you started pre-RTX.

It's a huge marketing push and a patch, DLSS will be dropped when the 3000 series launches and it'll just be quietly forgotten about.

inb4; hurr durr u must be an amd shill go fuck ur mom
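
Since that post leans on "upscale the image back to native," here's the dumbest possible baseline in numpy for contrast (names mine; this is obviously not what DLSS does, it's the naive upscale any learned reconstruction has to beat):

import numpy as np

def upscale_nearest(img, factor=2):
    # dumbest possible upscale: repeat each pixel factor x factor times
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

low = np.arange(9, dtype=float).reshape(3, 3)
print(upscale_nearest(low, 2).shape)  # (6, 6): back at "native" resolution, zero new detail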

It's the first time in Nvidia's history that turning a feature on actually improves performance, though; otherwise I'd agree.

cope

>make boring, shit game
>blow all your load into "graphics"
>game is forgotten 3 months after release

devs never fucking learn

>TAA is basically a requirement for a modern game
It annoys me no end that games use TAA and not TSSAA, which is basically the same thing but better. Doom (and Eternal) are basically the only games to use it, and it gives best-in-class results.

Looks like her clothes are made out of static... wtf...

that's Control?
looks like that one shitty hallway where you play as Jake and Sherry in China in RE6

TSSAA looks really good, but you've still got to sharpen the blur out, and you're technically losing detail that way. It's barely noticeable, though.
I never truly understood just how bad TAA could be until I played RDR2. Holy shit, that could use TSSAA or DLSS.

It uses less processing power on the rendering side. DLSS runs on the tensor cores, which normally never get used for games.

>Left side
>Blurry, obviously zoomed-in, low-res screenshot saved as jpg
>Right side
>Sharp, high-res screenshot with every post-processing effect available, saved as bmp
mmmmmmmmmmmm

The importance of resolution has been massively overstated by marketing campaigns that have tried to sell you newer monitors and newer graphics cards. Now the same people who were telling you to buy new kit because 4K is worth it are telling you to buy their new kit because 4K is unnecessary. The marketing is marketing, but that doesn't mean 4K resolution ever actually mattered.

All AA is cancer, designed as a workaround for low resolution displays.

Increase the base resolution enough and no AA is needed; you can have a perfect 1:1 pixel representation of the scene. But no, we still have people buying 720p screens.
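
That's essentially the supersampling argument: brute-force SSAA renders above the target resolution and box-filters down, so edge pixels end up with fractional coverage instead of stairsteps. A minimal numpy sketch of the resolve step (names mine, not any driver's actual implementation):

import numpy as np

def ssaa_downsample(hires, factor=2):
    # naive SSAA resolve: average each factor x factor block down to one output pixel
    h, w = hires.shape
    assert h % factor == 0 and w % factor == 0
    return hires.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# toy example: a hard diagonal edge rendered at 4x, resolved to 1x
size, factor = 16, 4
y, x = np.mgrid[0:size, 0:size]
hires = (x > y).astype(float)           # aliased binary edge
print(ssaa_downsample(hires, factor))   # edge pixels become fractional coverage

The cost is the problem: 4x SSAA means shading 16x the pixels, which is exactly the budget DLSS is trying not to spend.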

Its benefit is better shown in video comparisons, since it's eliminating pixel crawl without a blurry AA. The problem is, videos are compressed too. Unfortunately, I haven't seen anyone in this thread who's actually played the game confirm whether it's too good to be true or not.

Not true. You still get shimmering regardless of resolution because, surprise, it's something that also occurs in real life. But on an LCD display it's particularly unpleasant, so it's important to get rid of it.

pic unrelated?

There's way too much shimmering and screen door hair even at 4K in games like RE2 and RDR2.

If this thread is still up in an hour, we'll still see someone on a new IP who didn't read the thread, thinking we're talking about another meme anti-aliasing method that reduces performance.

So many people in this thread have no idea what they're talking about. Ignore most of what's being said.
It does a better job than TAA at a lower performance cost; that's simply been demonstrated by now, and there are research papers that back up the accuracy. TAA is entirely inferior in every regard, including on alpha transparencies and fine details like hair.

>And does it really solve the same issues TAA does without the blur or would you still need TAA to get rid of checkboard hair and shimmering pixels?
It is a complete replacement for TAA; you can't combine the two with positive results. Without DLSS, TAA is required by the multi-frame pixel accumulation that many modern PBR rendering techniques now depend on. DLSS replaces that with clever upscaling plus some of its own, different temporal techniques.
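
For anyone wondering what "multi-frame pixel accumulation" means in practice, here's a minimal numpy sketch of a TAA-style resolve: exponentially blend a clamped history buffer with the current jittered frame. All names are mine, and real TAA also reprojects the history with per-pixel motion vectors; this only shows the accumulation that produces both the stability and the blur:

import numpy as np

def taa_resolve(history, current, alpha=0.1):
    # clamp history to the current frame's 3x3 neighborhood to reject stale samples
    neighbors = [np.roll(np.roll(current, dy, 0), dx, 1)
                 for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    lo = np.minimum.reduce(neighbors)
    hi = np.maximum.reduce(neighbors)
    clamped = np.clip(history, lo, hi)
    # exponential accumulation: mostly history, a little of the new frame
    return alpha * current + (1 - alpha) * clamped

# toy loop: a noisy (jittered) signal converges toward the clean image
rng = np.random.default_rng(0)
truth = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
history = truth + rng.normal(0, 0.2, truth.shape)
for _ in range(16):
    frame = truth + rng.normal(0, 0.2, truth.shape)  # per-frame jitter noise
    history = taa_resolve(history, frame)
print(np.abs(history - truth).mean())  # error shrinks as frames accumulate

The blend that averages the jitter noise away is the same blend that smears detail when the clamp misjudges motion, which is the blur everyone complains about.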

Now I'm gonna leave this thread before the stupidity of armchair graphics experts gives me a fucking aneurysm.

P.S. If you think AMD isn't going to implement very similar technology in the near future, think again.

Attached: 1382220359312.jpg (250x284, 15.49K)