>uses formerly worthless gimmick rtx components to upres the game using nvidia 'tensor cores' (their version of the neural net shit people were using to upres old video and old game footage)
>looks decent enough in videos/screenshots
apparently according to reviews of it, it looks a bit off with extreme brightness, but on the flip side can render certain details like hair much better from a distance, and it offers at least a 40% performance boost
raytracing seemed like a total gimmick, so if this pans out even a little they salvaged that trash
DLSS has always been a part of the 20 series. raytracing cores are still being used for raytracing, even while in DLSS mode. Shows how little you know and how worthless your opinionated post is.
Jacob Phillips
not memeing, what's better about updated on vs off, looks subtly different but not better
Isaac Davis
I already bought the rtx 2070 super faggot, no need to suck them off. Raytracing provided no value and this does, you're right.
dlss is running at like half the resolution, so it gets a massive performance boost, so the bottom is like 540p while top is real 1080p
the original version was trash, but now they've updated it to be better
It's an upscaling tool. It can turn 1080p into 4K through deep learning algorithms
Jonathan Adams
oh cool. guessing 1060 6gb doesn't support this?
Easton Rogers
nope unfortunately, they just introduced it with the rtx 20x0 line
they've only got it in three or so games right now (MW5, Control, and one other) so it's not really useful to you right now anyway unless you really like you some mechwarrior
Brayden Davis
>off
>pure 4K
>updated on
>1080p upscale to 4K
you get a massive performance boost
Robert Nguyen
thanks for betatesting my 3080 ti, cuck
Brody Barnes
DLSS looks worse than just lowering your resolution even adaptive sharpening has more worth
John Sanders
Thanks for betatesting my 4080Ti, cuck
Ethan Wilson
Ironically it shows how little you know. That you don't know the original DLSS was 1.0 and it's now up to 2.0 offering the extreme improvement in quality and performance as visible in the bottom of OPs picture. kys faggot
Levi Wright
The RTX and tensor cores are completely different things. Some Titan and Quadro cards had tensor cores before they added RTX
Leo Foster
>DLSS has changed
yes, obviously. that doesn't detract from the fact it's been a part of the 20 series cards from the get go. Did you even read my post?
Austin Nelson
No it doesn't.
James Allen
>shitpost: the post
Gavin Nelson
not worth the trouble until DLSS becomes usable on games that don't have to develop native support for it.
Carson Jackson
Yes I did! But you didn't read OPs post. kys faggot
Ethan Long
dlss 1.0? yeah of course. dlss 2.0? not only is the gap much more closed, but it actually looks BETTER in some ways
2.0 looks like it solved a lot of the issues, like mouths flapping and eyes blinking looking off, or the middle image in the OP's shit
it's not going to let you replace a titan with a 2060 rtx, but it seems that as long as devs put in the effort it will be a 10
Luis Robinson
Thanks for betatesting my 5080ti, cuck
Zachary Johnson
not entirely true. It's a part of all forthcoming Nvidia cards, so there's no exact 'trouble' you'd have to get into to be eligible.
Henry Hall
Kinda this, we need AI for this shit and optimization. Imagine a world where PC gets the same extensive optimization as consoles, I bet a 1080Ti would kick next gen around.
Leo Torres
So dlss on is faster than no dlss at all?
Ian Walker
Fortunately with DLSS 2.0 that's easier than ever because the AI doesn't need to be trained with every specific game anymore, it's all a generic algorithm now.
Michael Gomez
>what are games made before the year 2020
Nicholas Hughes
>N-NO U! >kys faggot sounds like someone's mad
Cameron Cox
yeah, DLSS renders at half the resolution so that graph is full res vs half res+DLSS
Jayden Baker
what does that have to do with anything? DLSS is going to be possible on all new and currently produced Nvidia GPUs (save for the 1660) so there's no extra effort for the consumer. Older games aren't very likely to need upscaling to mitigate performance anyway
Thomas Carter
>raytracing seemed like a total gimmick
Not a gimmick, it makes a lot more sense to render vidya that way once hardware is powerful enough
Angel Sanchez
540p is quarter resolution retard
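For what it's worth, the pixel arithmetic here is checkable. A quick sanity check (toy Python, nothing DLSS-specific): "540p" halves each axis of 1080p, which quarters the total pixel count, so whether it's "half resolution" or "quarter resolution" depends on whether you mean linear dimensions or pixels.

```python
# 960x540 halves each axis of 1920x1080, so it has a quarter of the pixels.
full = 1920 * 1080   # 2,073,600 pixels
half = 960 * 540     #   518,400 pixels

assert full // half == 4   # quarter the pixels, half the linear resolution
```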
Liam Moore
lol, i won
Dominic Howard
>what are games made before the year 2020
Literally every game made before 2020 will run on those new Nvidia RTX cards at like 200fps anyway, so you don't need DLSS for them. DLSS is for future games
its using special cores on the gpu to fake a higher resolution. So you'd be running it at 540p real resolution, but it'd simulate 1080p.
normal this would be overrated marketing faggotry like that dumb INFINITE RESOLUTION shit, but DLSS 2.0 is actually doing this shit at a high enough level to be usable. there are still situations where it looks a tiny bit off, but it also has situations where it gets a tiny bit more fidelity
DLSS will make developers lazier about optimizing, and before you know it DLSS will be a requirement, not a boost
Blake Allen
The ultimate test for DLSS (2.0) is to take a normie gamer, put him in front of a 1080p-to-4K DLSS game like Control, and ask him how the game looks compared to native 4K. If most say it looks great, then DLSS is based; if most realize it's not native 4K, then it failed
Robert Wright
I thought I'd stick to 1080p for my next upgrade but I'm considering 1440p with dlss now.
Tyler Reed
they're already doing the bare minimum, remember when 16x anisotropic was basically free, or 4x anti aliasing?
Jonathan Russell
jesus fucking christ, mustards have sunk low, parading about another mcguffin from the guys who sold you $1k gpus? well, not you specifically, since someone with a 2080 wouldn't be jumping for joy at upscaling 540p
Luis Perez
Same, DLSS 1440p + a 144hz freesync monitor seems the perfect choice
Tyler Sanchez
>well not you specifically since someone with a 2080 wouldnt be jumping for joy at upscaling 540
When new consoles and the next AMD shit cards struggle to run games at 20fps in 2025, NVIDIA chads will just enable DLSS and have near-native image quality with 50% more fps, stay seething
Dominic Reyes
>new consoles
and there it is, the mustard on a shit-rig reveals who he's truly butthurt for. hey, good luck with the new mcguffin!
Luke Cook
>or 4x anti aliasing?
Fuck are you talking about, i've been playing PC games since Gothic 1 released and AA always absolutely tanked your framerate
Matthew Rivera
im not sucking nvidia's dick, rtx was shit pre-super (and even super is a bit pricy), 1000 series was shit
I'm just surprised they finally got a gimmick that isn't just rigging benchmarks/performance for amd stuff that looks neat
Jason Hall
And people still say Neural Networks are a meme
Carter Diaz
I have a GTX 1080 and will upgrade to a 3000+ and have guaranteed 60fps for the next 10 years
Aiden Perry
That's all I wanted, something the rtx shit could do instead of rtx. If only it were universally implemented from the control panel and not just like 4 games
Eli Perez
hahahah this is precious. the guy excited about an up-scaling mcguffin is also getting the new mcguffin gpu and has big ole dreams about a mcguffin performance life
Joshua Edwards
The whole point of 2.0 is that it's much easier to implement. Also, UE4 will soon have official support for it, so probably most UE4 games will use it. If Unity follows, then basically all games will have it
Mason Lee
cool
Landon Cooper
apparently op was wrong, it just means that the RTX stuff gets a lot more mileage because they got the DLSS shit not sucking dick now
so if 1080p with full ray tracing was running at 30fps, this would adequately fake 1080p at 540p or 720p or some shit while running at 60fps
Jayden Lewis
I'm still on a 760 playing games at 60 fps 1080p because I turn off prettiness gimmicks like AA
Sebastian Watson
its not like pic related though
its at the point where its actually enough visual quality to be viable
DLSS 2.0 gives me delicious 60FPS in Control with Ray Tracing looking awesome in 4k on a 65 incher. Hell, gaming in 4k on a 65inch OLED with HDR is gorgeous. I packed it all into a mini-ITX h210i case (after modding out the PSU shroud and with a SFX PSU) so I can easily carry the rig upstairs to my office.
Fuck your consoles, I have a portable beast (and I'm planning to turn the attic into a VR room - hopefully after the corona crisis dies down).
Nathan James
You're upscaling whilst losing almost no detail/sharpness. If you don't understand how running a game 40 to 50% faster for no visible difference is a big deal, you're a retard. Let's say they didn't give DLSS 2.0 away for free: people would go apeshit if the 3080ti was 40% faster than a 2080ti. Now it's free
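That 40-50% figure is plausible under a simple cost model. A toy back-of-envelope sketch (made-up numbers, not measured data, not NVIDIA's pipeline): if part of the frame time is fixed overhead (geometry, CPU, post) and the rest scales with shaded pixels, rendering 1080p internally and paying a small upscale cost beats shading native 4K by roughly that margin.

```python
# Toy frame-time model with made-up constants: fixed per-frame overhead
# plus a cost proportional to shaded pixels. Not measured data.
def frame_time_ms(pixels, fixed_ms=12.0, ms_per_mpix=1.3):
    """Estimated milliseconds to render one frame."""
    return fixed_ms + (pixels / 1e6) * ms_per_mpix

UPSCALE_MS = 1.0  # assumed cost of the DLSS upscale pass

native_ms = frame_time_ms(3840 * 2160)               # shade every 4K pixel
dlss_ms   = frame_time_ms(1920 * 1080) + UPSCALE_MS  # 1080p internal + upscale

fps_native = 1000 / native_ms   # ~44 fps
fps_dlss   = 1000 / dlss_ms     # ~64 fps, about 45% faster than native
```

Shrink the fixed overhead and the gap widens; grow it and the gap narrows, which is one reason the measured boost varies by game.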
Jeremiah Hill
you will be literally cheerleading for Sony/MS when this tech comes to consoles 3 years after we got it
Charles Morgan
so why isn't it available for any game right now? right, because that's marketing bullshit
Jonathan Bennett
Because it still needs to be implemented in the games. It can't be added on a driver level.
Chase Reyes
they got it working on mechwarrior 5 (unreal 4) and control (remedy's proprietary shit)
im treating it with cautious optimism since it's working in actual games and not some one-off tech demo. im not going to treat it as the second coming, but the videos and performance are very real in those two games, so cautious optimism as I said
Carter Sullivan
Because it came out a week ago retard
Lucas Williams
Thanks for clarification
Nicholas Green
>nvidiots calling each other cucks
makes sense, the only ones that get fucked when you buy those cards are yourselves.
Only downside is the sharpness seems a little bit higher than I like, but some fags datamined a function called "r.NGX.DLSS.Sharpness" last week so hopefully that gets in soon
Justin Stewart
>its using special cores on the gpu to fake a higher resolution. So you'd be running it at 540p real resolution, but it'd simulate 1080p.
It doesn't fake it, it has a huge temporal component. By looking at 1/4 second of 60fps footage it has 15 frames to work with; just by looking at displaced pixels it can have 4x more information than native footage, and by looking at nearby pixels it can have 16x or 32x more. A big chunk of that info won't be accurate and will create some ghosting, but a lot of it will still be useful, and that's the magic that lets the upscale be more detailed than native footage.
Obviously the algorithm has no clue what it's looking at, so resolving text or similarly complex structured stuff is beyond it, even if frame accumulation while the camera is moving could in principle resolve it.
Thomas King
who cares? if it becomes a standard, devs will optimize their games with dlss in mind and it will be as if dlss never happened.