
>uses formerly worthless gimmick rtx components to upres the game using nvidia 'tensor cores' (their version of the neural net shit people were using to upres old video and old game footage)
>looks decent enough in videos/screenshots

according to reviews, it looks a bit off with extreme brightness, but on the flip side it can render certain details like hair much better from a distance, and it offers at least a 40% performance boost

raytracing seemed like a total gimmick, so if this pans out even a little they salvaged that trash

Attached: control-nvidia-dlss-2.0-comparison-004[1].png (850x750, 804.31K)

DLSS has always been a part of the 20 series.
raytracing cores are still being used for raytracing, even while in DLSS mode.
Shows how little you know and how worthless your opinionated post is.

not memeing, what's better about the updated DLSS on vs off? it looks subtly different but not better

I already bought the rtx 2070 super faggot, no need to suck them off. Raytracing provided no value and this does, you're right.

dlss is running at like half the resolution, so it gets a massive performance boost, so the bottom is like 540p while top is real 1080p

the original version was trash, but now they've updated it to be better

Attached: control-1920x1080-ray-tracing-nvidia-dlss-2.0-quality-mode-performance[1].png (850x450, 56.09K)

It's an upscaling tool. It can turn 1080p into 4K through deep learning algorithms
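if you want the rough numbers behind that (the per-axis scale factors here are ballpark figures from reviews, not anything official, and the snippet is just toy math, not nvidia's pipeline):

    # ~0.50 per axis for performance mode, ~0.67 for quality mode are the commonly reported factors
    def internal_res(out_w, out_h, scale):
        return int(out_w * scale), int(out_h * scale)

    w, h = internal_res(3840, 2160, 0.5)   # -> 1920 x 1080
    print((w * h) / (3840 * 2160))         # -> 0.25, only ~1/4 of the pixels actually get shaded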

oh cool. guessing 1060 6gb doesn't support this?

nope unfortunately, they just introduced it with the rtx 20x0 line

they only got it on three or so games right now (MW5, control, and one other) so it's not really useful right now to you anyway unless you really like you some mechwarrior

>off
>pure 4K
>updated on
>1080p upscale to 4K
you get a massive performance boost

thanks for betatesting my 3080 ti, cuck

DLSS looks worse than just lowering your resolution
even adaptive sharpening has more worth

Thanks for betatesting my 4080Ti, cuck

Ironically it shows how little you know: you didn't even realize the original DLSS was 1.0, and it's now up to 2.0, offering the extreme improvement in quality and performance visible in the bottom of OP's picture. kys faggot

The RTX and tensor cores are completely different things. Some Titan and Quadro cards had tensor cores before they added RTX

>DLSS has changed
yes, obviously. that doesn't detract from the fact it's been a part of the 20 series cards from the get go.
Did you even read my post?

No it doesn't.

>shitpost: the post

not worth the trouble until DLSS becomes usable on games that don't have to develop native support for it.

Yes I did! But you didn't read OPs post. kys faggot

dlss 1.0? yeah of course. dlss 2.0? not only is the gap much smaller, it actually looks BETTER in some ways

2.0 looks like it solved a lot of the issues, like mouths flapping and eyes blinking looking off, or the middle image in the OP's shit

it's not going to let you replace a titan with a 2060 rtx, but it seems that as long as devs put in the effort it will be a 10

Thanks for betatesting my 5080ti, cuck

not entirely true. It's going to be a part of all forthcoming Nvidia cards, so there's no real 'trouble' you'd have to go through to be eligible.

Kinda this, we need AI for this shit and optimization. Imagine a world where PC gets the same extensive optimization as consoles, I bet a 1080Ti would kick next gen around.

So dlss on is faster than no dlss at all?

Fortunately with DLSS 2.0 that's easier than ever because the AI doesn't need to be trained with every specific game anymore, it's all a generic algorithm now.

>what are games made before the year 2020

>N-NO U!
>kys faggot
sounds like someone's mad

yeah, DLSS renders at half the resolution
so that graph is full res vs half res+DLSS

what does that have to do with anything? DLSS is going to be possible on all new and currently produced Nvidia GPUs (save for the 1660) so there's no extra effort for the consumer.
Older games aren't very likely to need upscaling to mitigate performance anyway

>raytracing seemed like a total gimmick
Not a gimmick, it makes a lot more sense to render vidya that way once hardware is powerful enough

540p is quarter resolution retard
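quick arithmetic so nobody has to take either post on faith: 1920 × 1080 = 2,073,600 pixels, 960 × 540 = 518,400 pixels, and 518,400 / 2,073,600 = exactly 1/4. "half resolution" per axis means a quarter of the pixels actually rendered.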

lol, i won

>what are games made before the year 2020
Literally every game made before 2020 will run on those new Nvidia RTX cards at like 200fps anyway so you dont need DLSS for them, DLSS is for future games

Attached: 131342535.jpg (1106x1012, 69.91K)

its using special cores on the gpu to fake a higher resolution. So you'd be running it at 540p real resolution, but it'd simulate 1080p.

normally this would be overrated marketing faggotry like that dumb INFINITE RESOLUTION shit, but DLSS 2.0 is actually doing it at a high enough level to be usable. there are still situations where it looks a tiny bit off, but there are also situations where it gets a tiny bit more fidelity

Attached: dlsstwopointoh.png (2529x1417, 2.9M)

DLSS will make developers lazier, optimizing less and less, and before you know it DLSS will be a requirement, not a boost

The ultimate test for DLSS (2.0) is to take a normie gamer, put him in front of a 1080p-to-4K DLSS game like Control, and ask him how the game looks at "native 4K". If most say it looks great then DLSS is based; if most realize it's not native 4K then it failed

I thought I'd stick to 1080p for my next upgrade, but now I'm considering 1440p with DLSS.

they're already doing the bare minimum, remember when 16x anisotropic was basically free, or 4x anti aliasing?

jesus fucking christ, mustards have sunk low
parading about another mcguffin from the guys who sold you 1k gpus?
well not you specifically since someone with a 2080 wouldnt be jumping for joy at upscaling 540

Same, DLSS 1440p + a 144hz freesync monitor seems the perfect choice

>well not you specifically since someone with a 2080 wouldnt be jumping for joy at upscaling 540
When the new consoles and AMD's next shit cards are struggling to run games at 20fps in 2025, NVIDIA chads will just enable DLSS and get near-native image quality with 50%+ more fps, stay seething

>new consoles
and there it is
the mustard on a shit-rig reveals who he's truly butthurt for
hey good luck with the new mcguffin!

>or 4x anti aliasing?
The fuck are you talking about, I've been playing PC games since Gothic 1 released and AA always absolutely tanked your framerate

im not sucking nvidia's dick, rtx was shit pre-super (and even super is a bit pricy), 1000 series was shit

I'm just surprised they finally came up with a gimmick that actually looks neat instead of just rigging benchmarks/performance against AMD

And people still say Neural Networks are a meme

I have a GTX 1080 and will upgrade to a 3000+ and have guaranteed 60fps for the next 10 years

That's all I wanted, something the rtx shit could do instead of rtx. If only it were universally implemented from the control panel and not just in like 4 games

hahahah this is precious
the guy excited about an up-scaling mcguffin, is also getting the new mcguffin gpu and has big ole dreams about a mcguffin performance life

The whole point of 2.0 is that it's much easier to implement; also UE4 will soon have official support for it, so probably most UE4 games will use it. If Unity follows then basically all games will have it

cool

apparently op was wrong, it just means that the RTX stuff gets a lot more mileage because they got the DLSS shit not sucking dick now

so if 1080p with full ray tracing was running at 30fps, this would adequately fake 1080p at 540p or 720p or some shit while running at 60fps
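back-of-envelope version of that, with the big caveat that shading isn't the whole frame time and DLSS itself costs a few ms, so the real gain lands well below this ceiling (which is why people actually see something like 30 -> 60, not 30 -> 120):

    # toy upper-bound model, assumes frame time scales purely with shaded pixels
    native_fps = 30
    frame_time_ms = 1000 / native_fps        # 33.3 ms per frame at 1080p
    pixel_fraction = 0.25                    # 540p internal vs 1080p output
    print(1000 / (frame_time_ms * pixel_fraction))   # ~120 fps ceiling in this toy model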

I'm still on a 760 playing games at 60 fps 1080p because I turn off prettiness gimmicks like AA

its not like pic related though

its at the point where its actually enough visual quality to be viable

Attached: xBRZ%2022[1].png (1119x493, 48.29K)

2080 TI owner here.

DLSS 2.0 gives me delicious 60FPS in Control with Ray Tracing looking awesome in 4k on a 65 incher.
Hell, gaming in 4k on a 65inch OLED with HDR is gorgeous.
I packed it all into a mini-ITX H210i case (after modding out the PSU shroud and with an SFX PSU) so I can easily carry the rig upstairs to my office.

Fuck your consoles, I have a portable beast (and I'm planning to turn the attic into a VR room - hopefully after the corona crisis dies down).

You're upscaling while losing almost no detail/sharpness. If you don't understand why running a game 40 to 50% faster with no visible difference is a big deal, you're a retard. Put it this way: if this gain weren't a free software update, people would go apeshit over a 3080 Ti being 40% faster than a 2080 Ti. Now it's free.

you will be literally cheerleading for Sony/MS when this tech comes to consoles 3 years after we got it

so why isn't it available for any game right now? right, because that's marketing bullshit

Because it still needs to be implemented in the games. It can't be added on a driver level.

they got it working on mechwarrior 5 (unreal 4) and control (remedy's proprietary shit)

I'm treating it with cautious optimism since it's working in actual games and not some one-off tech demo. I'm not going to treat it as the second coming, but the videos and performance are very real in those two games, so cautious optimism as I said

Because it came out a week ago retard

Thanks for clarification

>nvidiots calling each other cucks
makes sense, the only ones that get fucked when you buy those cards are yourselves.

Attached: 1584951551877.png (888x894, 520.32K)

Only downside is the sharpness seems a little bit higher than I like, but some fags datamined a setting called "r.NGX.DLSS.Sharpness" last week so hopefully that gets exposed soon
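no clue if they'll actually expose it, but if it behaves like a standard UE4 console variable (pure guess on my part, and that would only apply to the UE4 games like MW5, not Control) you'd expect to set it roughly like this in Engine.ini:

    [SystemSettings]
    ; value range is a guess, probably somewhere in 0..1
    r.NGX.DLSS.Sharpness=0.25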

>its using special cores on the gpu to fake a higher resolution. So you'd be running it at 540p real resolution, but it'd simulate 1080p.

It doesn't fake it, it has a huge temporal component. By looking at 1/4 second of 60fps footage it has 15 frames to work with; just from displaced pixels it can have 4x more information than native footage, and by also looking at nearby pixels it can have 16 or 32 times more. A big chunk of that info won't be accurate and will create some ghosting, but a lot of it is still useful and allows for the magic where the upscale ends up more detailed than native footage.

Obviously the algorithm has no clue what it's looking at, so resolving text or similarly complex structured stuff is beyond it, even though frame accumulation while the camera is moving could in principle resolve it.
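for anyone wondering what "huge temporal component" means in practice, here's a toy single-channel sketch of the accumulation idea. this is NOT nvidia's actual method, the function names and the fixed blend weight are made up; real DLSS feeds the history, motion vectors and jittered samples into a trained model instead of a hand-picked blend:

    import numpy as np

    def accumulate(history, low_res, motion_px, alpha=0.1):
        # history:   H x W image accumulated at output resolution
        # low_res:   (H//2) x (W//2) current frame rendered at lower resolution
        # motion_px: (dy, dx) whole-pixel camera motion since the previous frame
        # 1. reproject: shift last frame's result along the motion vector
        reprojected = np.roll(history, shift=motion_px, axis=(0, 1))
        # 2. naive upscale of the new low-res frame (nearest neighbour)
        upscaled = np.repeat(np.repeat(low_res, 2, axis=0), 2, axis=1)
        # 3. blend: keep most of the history, fold in a bit of the new frame;
        #    bad reprojections are exactly what shows up as ghosting
        return (1 - alpha) * reprojected + alpha * upscaled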

who cares? if it becomes a standard, devs will optimize their games with dlss in mind and it will be as if dlss never happened.