4K > FPS

Attached: 867435324.jpg (590x647, 104.15K)

>Play on 4k
It looks good, but I can go back to 1080/1440p

>Play on 60 or more fps
Literally impossible to go back to 30
Hell, if you have a good pc, anything below 90 feels like 30fps

>singleplayer rpg
>fancy visuals with a stable 30+ fps
>competitive multiplayer game
>readable visuals with a stable 60-120 fps

true, if i get around 70 FPS i freak out

FPBP

>PS4 Pro
>4K
good one

We hope you’re looking forward to fps spikes and no more than 40 FPS (that’s being generous)

>Have to wait until the PC version for 60FPS
>Have to hope Square actually puts effort into the port and doesn't leave it fucked like they did with XV

unpopular but true opinion

>Literally impossible to go back to 30
What a bunch of crap.
I've seen that same dumb shit said about 60fps too.

Sure, better performance is more enjoyable, but it doesn't change a game unless the framerate is really awful, like 10fps

gameplay>artstyle>sound design>60fps>grafix(4kmeme)

Are trophies not syncing?

>>Play on 60 or more fps
>Literally impossible to go back to 30
>Hell, if you have a good pc, anything below 90 feels like 30fps
Man, that shit never used to bother me. I was one of those '30fps is fine' guys. Now after getting into PC gaming two years ago, 30fps looks like dogshit. Even fucking 50 looks too stuttery and unacceptable.

Went from using my console daily to almost never.

It's not even close to 4K, it's dynamic 1440p-1600p

Hell, if you get invested in VR then 60 becomes the new 30 since anything below is pretty much unbearable.

Wait until you get used to locked frametimes. I used to be fine as long as my framerate was high, now even +-0.2ms frametime variation bothers me.
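The frametime autism above can be made concrete. Here's a hypothetical sketch (the function name and the +-0.2ms threshold framing are mine, taken from the post, not from any real tool) that measures the worst frametime deviation from a 60fps target given a list of frame timestamps:

```python
# Hypothetical sketch: measure frametime jitter from a list of frame
# timestamps (in seconds). A locked 60fps target means every frametime
# should be ~16.667 ms; anything else is the jitter the poster is
# complaining about.

def frametime_jitter_ms(timestamps, target_fps=60.0):
    """Return the worst deviation (ms) from the target frametime."""
    target_ms = 1000.0 / target_fps
    frametimes_ms = [
        (b - a) * 1000.0 for a, b in zip(timestamps, timestamps[1:])
    ]
    return max(abs(ft - target_ms) for ft in frametimes_ms)

# Example: one frame lands 0.5 ms late, the next 0.5 ms early --
# framerate still averages 60, but the pacing is off.
stamps = [0.0, 0.016667, 0.033834, 0.050001]
print(round(frametime_jitter_ms(stamps), 3))  # -> 0.5
```

In a real game you'd feed this from a frame-capture tool rather than hand-written timestamps; the point is that average framerate and frame pacing are separate measurements.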

>Mindless Fetchquests
>Linear as fuck
>Mass effect andromeda animations
>Spastic camera
>Button Mashing
>Low quality textures

4 days left until our based gaming journalists expose this piece of shit

Attached: blunder.png (1142x775, 130.69K)

FF7 was always linear

4K***

Attached: file.jpg (623x749, 35.92K)

more lies about the ps4 pro playing games in 4k. upscaling is not 4k. your tv already will upscale a 480p image to "display in 4k"
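The "upscaling is not 4K" claim is just arithmetic: count the pixels actually rendered per frame. A quick sketch (the 1440p figure here echoes the dynamic-resolution claim made earlier in this thread, not any official spec):

```python
# Back-of-the-envelope check: how many pixels are actually shaded per
# frame in each mode. Checkerboard rendering shades roughly half the
# samples of the output resolution and reconstructs the rest.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)        # 8,294,400 pixels shaded
dynamic_1440p = pixels(2560, 1440)    # 3,686,400 pixels shaded
checkerboard_4k = native_4k // 2      # ~half the samples of native 4K

print(dynamic_1440p / native_4k)      # -> 0.444... : under half of native 4K
print(checkerboard_4k / native_4k)    # -> 0.5
```

So a dynamic 1440p image shades less than half the pixels of native 4K, whatever the output on the box says.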

>4K
>It's neo-interlacing, aka checkerboard.

The pos4 has neither.

Nuh uh, FFX haters always told me FFX was the first linear FF.

FFVII Remake doesn't have the texture resolution or quality to justify playing it at 4K.

Is it 4k? I thought digital foundry said it ran up to like 1600p or some shit

>4K

is it worth it when it looks like this?

Attached: 1585701778773.jpg (3840x2160, 1.28M)

I wonder how brainlets like you will cope with all next-gen cards using variable rate shading. Will your anus bleed till you die?

Something not running at 4K is not 4K, something not running at 1080p is not 1080p. Something running at 38 FPS is not 30 FPS. This is not hard to understand, and no amount of render scale blur shit and fad of the month interlacing rebranding will change that.

That's what I'm saying. How will your brain cope with variable rate shading i.e full shading done only on a small percentage of the screen. Will you cope to death over it?

4K is easier to market than 60 fps.
Hell, most console players don't even know what fps are.

Attached: scuttlebug.png (800x600, 775.88K)

>variable rate shading
Oh dear god please tell me you console kids aren't actually excited for this travesty.

Having everything outside the center of the screen be a roller coaster of varying rendering quality is a retarded idea to end all retarded ideas. Your eyeballs look around the screen; my eyes look left and right while my character doesn't; a TV screen is not peripheral vision. You can and will look at the low-rendered areas and suffer for it.

Got lucky and got my deluxe edition pre-order in today so I don't have to buy a fucking digital piece of shit edition.

But bro don't you see it's all to pave the way for all games being VEE ARR.

I'm going to wait for the inevitable PS5 pro version.

It's nvidia tech, you brainlet, neither the Xbox Series X nor the PS5 will have it.

Cinematic unstable 24fps.

The clearest, crispest 4K presentation of 2005 graphics you've ever seen.

60 FPS is the worst meme Total Biscuit left behind. 30 is fine, having 60 FPS literally changes nothing if the devs actually know what they're doing.

>upscaling

why bother?

>total biscuit
>literally who
I hope you die from asscancer too.

No, the only meme in gaming today is 4K.
Fucking consoomers and TV company shills...
60fps>2160p

Even in death he's more of a (you) than you could ever hope to be.

>It's nvidia tech you brainlet
It's literally part of the DirectX speclist, it's not vendor exclusive.
microsoft.github.io/DirectX-Specs/d3d/VariableRateShading.html
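For reference, the linked spec defines coarse shading rates as WxH blocks, where one pixel-shader invocation covers W*H pixels (the 2x4/4x2/4x4 rates are optional "additional rates" on capable hardware). A quick sketch of what each rate costs relative to full-rate shading (the function name is mine, for illustration):

```python
# Sketch of the VRS trade-off: at a coarse shading rate of WxH, one
# pixel-shader invocation is reused across W*H pixels, so shading work
# drops to 1/(W*H) of full rate in that region of the screen.

SHADING_RATES = ["1x1", "1x2", "2x1", "2x2", "2x4", "4x2", "4x4"]

def shading_cost_fraction(rate):
    """Fraction of full-rate pixel-shader work a WxH rate performs."""
    w, h = (int(n) for n in rate.split("x"))
    return 1.0 / (w * h)

for rate in SHADING_RATES:
    print(rate, shading_cost_fraction(rate))  # 1x1 -> 1.0 ... 4x4 -> 0.0625
```

That's the whole argument in both directions: 4x4 cuts shading cost to a sixteenth in a region, which is why it's marketed, and why it's visible if that region is somewhere you're actually looking.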

>Actually believing this blurry shit is 4K

Just look at that foliage on the second floor up there, looks like when I start a game for the first time and it launches at 1080p but with 60% render scale on.

Literal cuck for brains.

4k is just the resolution. The graphics can look all pixelated shit desu. :^)

My tv does 1080/120 my monitor does 1440p/144. 60 feels slow and 30 feels like something is broken. I'll always take higher framerates over more pixels

I'm personally avoiding seeing any framerates above 60fps because that is a dragon I don't want to start chasing.

I grew up on 60fps games; when the PS2 came along and framerates started being 30 more often than not, I started moving to PC gaming to get that higher framerate. 1080/120 looks really, really good even on my 65 inch TV. Sure, I'd love 4K/120, but even with a 1080 Ti it's hard to get that many frames, and only the next wave of TVs will do it. I'm gonna go with the next LG OLED because they already said it will do 4K/120 over HDMI

Holy soulless. It's like Squaretendo actually hired that man.

Playing it now and the framerate is a steady 30, which is good I guess. The framerate drop when sliding down ladders in the demo is fixed in the full game as well. Can't wait for it to hit PC though. I'm hoping it will support 120fps.

Good for you.
For me that seems like a needless money sink right now but I'm sure I'll get to it whenever games start treating higher framerates as standard.

Even just getting a PS2 after years of GC was a nightmare. So many games ran at 50Hz here in PAL land, when GC had 60Hz standard for all first party games and more. All that while the image was super blurry for the time. Dreamcast looked better because it had a super crisp image.

The human eye can't see above 480p

Video games in general is a needless money sink

I played Dreamcast on my PC CRT at 480p and still think it looks sharper than PS2

>Video games in general is a needless money sink
Yes and that's why I don't need anymore needless money sinks like 4K, 144fps and OLED.

You don't need any of it; video games as a whole are a want. I'd also argue that OLED is the most important new tech for games, since current LCD tech is fucking garbage.

the human eye actually cant see any difference above 160p, but marketing has led retards to believe otherwise

>variable rate shading
u can't be serious

Attached: lolretard#.png (684x370, 128.04K)

>OLED
kek, I bought a Pocophone because it's the only flagship with an IPS panel.

Attached: pJtuuTt.jpg (4048x3036, 503.2K)

Enjoy your shit blacks, oled is best for media consumption.

OLED is oversaturated garbage for tech illiterates.

there's linear (ff7), and then there's the entire game being a hallway (ffx).

Very possible. I played Bloodborne on 30 and i don't understand what people bitch about so much. 30 is fine depending on the game.

30FPS is choppy as fuck, hell even 60 is unbearable when you go to 120 or 144
t. racing sim autist

Dude, I played at 60 all my life. I spent a single weekend at a friend's house with a 144hz monitor and when I came back I thought something was wrong with my PC. It took way too much time to go back to "60 fps is good enough"